Program

In-memory computing with memristor circuit primitives

The dramatic rise of data-intensive workloads has revived interest in special-purpose hardware and architectures that can sustain improvements in computational speed and energy efficiency. Several promising special-purpose approaches take inspiration from the brain, which outperforms digital computing in power and performance on key tasks such as pattern matching. One of these approaches, inspired by the brain’s dataflow architecture, is “in-memory computing,” which stores and computes stable computational kernels within specialized circuit geometries. While traditional CMOS ASICs implementing in-memory computing deliver some performance gains, they still suffer from low power efficiency. New proposals that leverage non-volatile memristive devices for in-memory computation are therefore highly attractive across a variety of application domains. Originally developed as digital (binary) high-density non-volatile memories, metal-oxide memristive devices have since demonstrated a wide range of behaviors and properties – such as widely tunable analog resistance and non-linear dynamics – that motivate their use in novel functions and new computational models. Many recent in-memory computing studies have focused on crossbar circuit architectures, demonstrating their application to neural networks, scientific computing, and signal processing. Other circuit primitives, however, such as content addressable memories (CAMs), have shown further promise for mapping a diverse range of complementary computational models, including finite state machines, pattern matching, and hashing. In addition, our team’s recently invented analog CAM circuit has been shown to accelerate interpretable machine learning models. In this talk, I will review the varied co-design opportunities – from algorithms and architectures to circuits and devices – for enabling low-power, high-throughput computation in a variety of application domains.
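As a rough illustration of the crossbar primitive mentioned above: memristor conductances encode a matrix, input voltages drive the rows, and Kirchhoff's current law sums the per-device Ohm's-law currents on each column, yielding a matrix-vector product in a single analog step. The following is a minimal numerical sketch of that principle (the conductance range and variable names are illustrative assumptions, not a device model):

```python
import numpy as np

# Sketch of in-memory matrix-vector multiplication on a memristor crossbar.
# G[r, c] is the programmed conductance of the device at row r, column c;
# v[r] is the read voltage applied to row r. Each device contributes a
# current G[r, c] * v[r] (Ohm's law), and the column wire sums these
# contributions (Kirchhoff's current law), so the column currents realize
# i = G^T v without moving the matrix out of memory.

rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # conductances in siemens (illustrative range)
v = np.array([0.2, 0.1, 0.0, 0.3])        # row read voltages in volts

i_out = G.T @ v                           # column currents = analog dot products
```

In a real array the result is digitized by per-column ADCs; here the matrix product simply stands in for that analog summation.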
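The analog CAM idea can likewise be sketched functionally: each cell stores an acceptance range rather than a single bit, and a row matches only when every element of the analog query falls inside its cell's range, which is what makes the circuit a natural fit for rule-based, interpretable models such as decision trees. A hedged behavioral model (the function name and data layout are assumptions for illustration, not the published circuit):

```python
import numpy as np

# Behavioral sketch of an analog CAM search. Each row i stores per-feature
# acceptance ranges [low[i, j], high[i, j]]; a query vector matches row i
# only if every feature lies within its stored range. In the circuit this
# comparison happens in parallel across all rows in one search cycle.

def acam_match(low, high, query):
    """Return one boolean per stored row: True where the query matches."""
    return np.all((query >= low) & (query <= high), axis=1)

low  = np.array([[0.0, 0.2],    # row 0 accepts x0 in [0.0, 0.4], x1 in [0.2, 0.8]
                 [0.5, 0.0]])   # row 1 accepts x0 in [0.5, 1.0], x1 in [0.0, 0.3]
high = np.array([[0.4, 0.8],
                 [1.0, 0.3]])
query = np.array([0.3, 0.5])

matches = acam_match(low, high, query)  # row 0 matches, row 1 does not
```

Encoding a decision-tree branch as one such row turns tree inference into a single parallel CAM lookup.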