Resonant Machine Learning Based on Complex Growth Transform Dynamical Systems

Oindrila Chatterjee, Shantanu Chakrabartty

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Traditional energy-based learning models associate a single energy metric with each configuration of variables involved in the underlying optimization process. Such models associate the lowest energy state with the optimal configuration of variables under consideration and are thus inherently dissipative. In this article, we propose an energy-efficient learning framework that exploits structural and functional similarities between a machine-learning network and a general electrical network satisfying Tellegen's theorem. In contrast to standard energy-based models, the proposed formulation associates two energy components, namely, active and reactive energy, with the network. The formulation ensures that the network's active power is dissipated only during the process of learning, whereas the reactive power is maintained at zero at all times. As a result, in steady state, the learned parameters are stored and self-sustained by electrical resonance determined by the network's nodal inductances and capacitances. Based on this approach, this article introduces three novel concepts: 1) a learning framework where the network's active-power dissipation is used as a regularization for a learning objective function that is subjected to a zero total reactive-power constraint; 2) a dynamical system based on complex-domain, continuous-time growth transforms that optimizes the learning objective function and drives the network toward electrical resonance under steady-state operation; and 3) an annealing procedure that controls the tradeoff between active-power dissipation and the speed of convergence. As a representative example, we show how the proposed framework can be used for designing resonant support vector machines (SVMs), where the support vectors correspond to an LC network with self-sustained oscillations. We also show that this resonant network dissipates less active power than its non-resonant counterpart.
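The growth-transform dynamics referenced in the abstract build on multiplicative, simplex-preserving updates of the Baum-Eagon type. The sketch below is a minimal real-valued illustration of one such update applied to a toy quadratic cost; it is not the article's complex-domain, continuous-time formulation, and the toy cost, the shift constant `lam`, and all variable names are illustrative assumptions.

```python
import numpy as np

def growth_transform_step(p, grad, lam):
    """One multiplicative growth-transform update on the probability simplex.

    p    : current point on the simplex (p >= 0, sum(p) == 1)
    grad : gradient of the cost at p
    lam  : shift chosen large enough that (lam - grad) stays positive
    """
    num = p * (lam - grad)      # element-wise multiplicative update
    return num / num.sum()      # renormalize so the iterate stays on the simplex

# Toy cost: C(p) = 0.5 * ||p - t||^2 for a fixed target t on the simplex.
t = np.array([0.7, 0.2, 0.1])
p = np.full(3, 1.0 / 3.0)       # start at the uniform distribution
lam = 10.0                      # conservative shift; larger values give smaller steps

for _ in range(200):
    grad = p - t                # gradient of the toy quadratic cost
    p = growth_transform_step(p, grad, lam)

print(p)                        # drifts toward t while remaining on the simplex
```

In this real-valued sketch the update simply trades off descent on the cost against staying on the simplex; the article's contribution is to lift such updates to the complex domain so that the steady state corresponds to an electrically resonant (zero reactive-power) network rather than a static minimum.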

Original language: English
Article number: 9095268
Pages (from-to): 1289-1303
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 32
Issue number: 3
DOIs
State: Published - Mar 2021

Keywords

  • Complex-domain machine learning
  • Tellegen's theorem
  • coupled oscillators
  • electrical resonance
  • energy-based learning models
  • energy-efficient learning models
  • resonant networks
  • support vector machines (SVMs)
