A new spin on neural networks

J. Grollier, CNRS/Thales lab, Palaiseau, France

An emerging branch of research studies how hardware neural networks can be assembled by leveraging physical principles for low-power operation, and how they can learn to perform cognitive tasks directly through physical principles such as energy minimization (1).

In this talk, I will highlight that spin-based systems are particularly well suited to these purposes (2, 3). In the first part, I will describe how the ability of spintronic devices to transmit and process radio-frequency signals can be exploited to build hardware neural networks. I will show that a small experimental multilayer neural network, composed of magnetic tunnel junctions emulating both synapses and neurons, can natively classify radio-frequency inputs with high accuracy, laying the groundwork for large-scale spintronic neural networks. In the second part, I will show that an Ising machine based on coupled spins (here, the D-Wave computer) can be trained to perform supervised learning by driving it to minimize its energy and the network error simultaneously, using the Equilibrium Propagation algorithm (4). We obtain software-equivalent accuracy on a subset of the MNIST database, thus demonstrating, for the first time, state-of-the-art supervised-learning accuracy on an Ising machine and opening new perspectives for applying these spin systems to artificial intelligence.
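The core of Equilibrium Propagation is to contrast two relaxed states of an energy-based system: a free phase, where only the inputs are clamped, and a weakly nudged phase, where the output is additionally pulled toward the target with strength beta; the weight update is proportional to the difference of energy gradients between the two phases. The following is a minimal numerical sketch on a toy continuous energy model, not the D-Wave implementation described in the talk; the network size, energy function, and hyperparameters are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy-based model: E(s) = 0.5*||s||^2 - s . (W^T x),
# with inputs x clamped and output units s free to relax.
n_in, n_out = 2, 1
W = rng.normal(scale=0.1, size=(n_in, n_out))  # couplings (illustrative init)

def relax(x, y=None, beta=0.0, steps=50, lr=0.2):
    """Let the output units settle to a minimum of E (+ beta * cost)."""
    s = np.zeros(n_out)
    for _ in range(steps):
        grad = s - x @ W            # dE/ds for the toy energy above
        if y is not None:
            grad += beta * (s - y)  # nudge toward the target (clamped phase)
        s -= lr * grad
    return s

def eqprop_step(x, y, beta=0.5, eta=0.5):
    """One Equilibrium Propagation update: contrast free and nudged phases."""
    global W
    s_free = relax(x)                  # free phase: inputs clamped only
    s_nudged = relax(x, y, beta)       # weakly clamped phase
    # dE/dW_ij = -x_i * s_j, so the contrastive update is:
    W += eta * (np.outer(x, s_nudged) - np.outer(x, s_free)) / beta

# Train on a toy linear task y = x0 + x1 (purely illustrative).
X = rng.normal(size=(200, n_in))
Y = X.sum(axis=1, keepdims=True)
for x, y in zip(X, Y):
    eqprop_step(x, y)
```

For this quadratic energy, the contrastive update reduces exactly to gradient descent on the squared output error, which is the sense in which nudging the physical relaxation toward the target trains the network without explicit backpropagation.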

1. D. Marković, A. Mizrahi, D. Querlioz, J. Grollier, Nature Reviews Physics, 1–12 (2020).
2. J. Torrejon et al., Nature 547, 428–431 (2017).
3. M. Romera et al., Nature 563, 230 (2018).
4. M. Ernoult, J. Grollier, D. Querlioz, Y. Bengio, B. Scellier, in Advances in Neural Information Processing Systems 32, pp. 7081–7091 (2019).