Performance Walls in Machine Learning and Neuromorphic Systems

Shantanu Chakrabartty, Gert Cauwenberghs

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


At the fundamental level, an energy imbalance exists between training and inference in machine learning (ML) systems. While inference involves recall using a fixed or learned set of parameters that can be energy-optimized using compression and sparsification techniques, training involves searching over the entire set of parameters and hence requires repeated memorization, caching, pruning, and annealing. In this paper, we introduce three 'performance walls' that determine the training energy efficiency, namely, the memory-wall, the update-wall, and the consolidation-wall. While emerging compute-in-memory ML architectures can address the memory-wall bottleneck (i.e., the energy dissipated due to repeated memory access), the approach is agnostic to the energy dissipated due to the number and precision of the training updates (the update-wall) and to the energy dissipated when transferring information between short-term and long-term memories (the consolidation-wall). To overcome these performance walls, we propose a learning-in-memory (LIM) paradigm that endows ML system memories with metaplasticity and whose thermodynamic properties match the physics and energetics of learning.
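The three-wall decomposition in the abstract can be illustrated with a back-of-the-envelope energy accounting. The sketch below is purely illustrative and is not taken from the paper: all function names, per-operation energy constants, and scaling assumptions (e.g., that update energy scales with precision in bits, and that a fixed fraction of updates is consolidated to long-term memory) are hypothetical placeholders chosen to make the accounting concrete.

```python
# Hypothetical back-of-the-envelope model of the three "performance walls".
# All names and constants below are illustrative assumptions, not values
# or an API from the paper.

def training_energy(
    num_params: int,
    num_updates: int,
    e_mem_access: float = 1e-12,     # J per parameter read/write (memory-wall)
    e_update: float = 1e-13,         # J per bit of arithmetic update (update-wall)
    update_precision_bits: int = 8,  # assumed: update cost scales with precision
    e_consolidate: float = 1e-11,    # J per short- to long-term transfer (consolidation-wall)
    consolidation_fraction: float = 0.01,  # assumed fraction of updates consolidated
) -> dict:
    """Decompose total training energy into the three wall contributions."""
    memory_wall = num_updates * num_params * e_mem_access
    update_wall = num_updates * num_params * update_precision_bits * e_update
    consolidation_wall = num_updates * num_params * consolidation_fraction * e_consolidate
    return {
        "memory_wall_J": memory_wall,
        "update_wall_J": update_wall,
        "consolidation_wall_J": consolidation_wall,
        "total_J": memory_wall + update_wall + consolidation_wall,
    }

# A compute-in-memory design mainly lowers e_mem_access; in this toy model the
# update-wall and consolidation-wall terms are untouched, mirroring the
# abstract's point that such architectures address only the memory-wall.
budget = training_energy(num_params=1_000_000, num_updates=10_000)
```

Under these toy constants, lowering `e_mem_access` shrinks only the memory-wall term, leaving the other two contributions as the dominant residual cost.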

Original language: English
Title of host publication: ISCAS 2023 - 56th IEEE International Symposium on Circuits and Systems, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665451093
State: Published - 2023
Event: 56th IEEE International Symposium on Circuits and Systems, ISCAS 2023 - Monterey, United States
Duration: May 21, 2023 to May 25, 2023

Publication series

Name: Proceedings - IEEE International Symposium on Circuits and Systems
ISSN (Print): 0271-4310


Conference: 56th IEEE International Symposium on Circuits and Systems, ISCAS 2023
Country/Territory: United States


Keywords

  • Energy Efficiency
  • Machine Learning
  • Memory
  • Neuromorphic Systems
  • Thermodynamics
  • Training

