Artificial neural networks learn by using a computer to minimize a loss function until the desired output is achieved. Alternatively, many forms of neuromorphic computing use local learning rules inspired by biological learning. We adopt a third approach, focusing on far simpler networks that exploit physics both to perform the forward computation and to obtain local learning rules that replace backpropagation. Our Coupled Learning framework, related to Equilibrium Propagation, can potentially be implemented in mechanical and fluidic networks. It has been realized by our collaborators in laboratory electrical networks, one using digital variable resistors and the other using transistors, paving the way for microfabricated VLSI realizations.
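To illustrate the flavor of such a local rule, here is a minimal numerical sketch, not the authors' implementation, of coupled learning on a hypothetical two-resistor voltage divider. In this toy, each edge updates its conductance using only its own voltage drops in a "free" state (input applied) and a "clamped" state (output nudged toward the target by a factor eta), standing in for gradient descent on the contrast between the two states' dissipated power. All parameter names and values are illustrative assumptions.

```python
# Toy coupled-learning sketch on a two-conductance voltage divider
# (illustrative only; parameters and setup are assumptions, not the
# laboratory networks described in the text).

V_S = 1.0       # source voltage at the top node
TARGET = 0.8    # desired output voltage at the middle node
ETA = 0.5       # clamping (nudge) amplitude
ALPHA = 0.5     # learning rate

k1, k2 = 1.0, 1.0  # conductances: source-to-output and output-to-ground

for _ in range(500):
    # Free state: physics solves the circuit (voltage divider).
    v_free = V_S * k1 / (k1 + k2)
    # Clamped state: output nudged a fraction ETA toward the target.
    v_clamped = v_free + ETA * (TARGET - v_free)

    # Voltage drops across each edge in both states.
    dv1_f, dv1_c = V_S - v_free, V_S - v_clamped
    dv2_f, dv2_c = v_free, v_clamped

    # Local rule: each edge compares only its OWN free and clamped
    # drops -- no global gradient or backpropagation is needed.
    k1 += (ALPHA / (2 * ETA)) * (dv1_f**2 - dv1_c**2)
    k2 += (ALPHA / (2 * ETA)) * (dv2_f**2 - dv2_c**2)
    k1, k2 = max(k1, 1e-6), max(k2, 1e-6)  # keep conductances physical

print(f"learned output: {V_S * k1 / (k1 + k2):.4f} (target {TARGET})")
```

Running this toy, the output voltage converges toward the target as the two conductances adjust in opposite directions, which is the qualitative behavior the framework relies on; real networks apply the same kind of edge-local update across many elements at once.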