TY - JOUR
T1 - Decentralized global optimization based on a growth transform dynamical system model
AU - Chatterjee, Oindrila
AU - Chakrabartty, Shantanu
N1 - Funding Information:
Manuscript received September 13, 2017; revised January 25, 2018; accepted March 14, 2018. Date of publication April 16, 2018; date of current version November 16, 2018. This work was supported by research grants from the National Science Foundation under Grant ECCS 1550096 and Grant CSR 140527. (Corresponding author: Shantanu Chakrabartty.) The authors are with the Department of Electrical and Systems Engineering, Washington University in St. Louis, St. Louis, MO 63130 USA (e-mail: [email protected]).
Publisher Copyright:
© 2018 IEEE.
PY - 2018/12
Y1 - 2018/12
N2 - Conservation principles, such as conservation of charge, energy, or mass, provide a natural way to couple and constrain spatially separated variables. In this paper, we propose a dynamical system model that exploits these constraints for solving nonconvex and discrete global optimization problems. Unlike traditional simulated annealing or quantum annealing-based global optimization techniques, the proposed method optimizes a target objective function by continuously evolving a driver functional over a conservation manifold, using a generalized variant of growth transformations. As a result, the driver functional asymptotically converges toward a Dirac-delta function centered at the global optimum of the target objective function. We provide an outline of the proof of convergence for the dynamical system model and investigate different properties of the model using a benchmark nonlinear optimization problem. We also demonstrate how a discrete variant of the proposed dynamical system can be used to implement decentralized optimization algorithms, where an ensemble of spatially separated entities (for example, biological cells or simple computational units) can collectively implement specific functions, such as winner-take-all and ranking, by exchanging signals only with their immediate substrate or environment. The proposed dynamical system model could potentially be used to implement continuous-time optimizers, annealers, and neural networks.
AB - Conservation principles, such as conservation of charge, energy, or mass, provide a natural way to couple and constrain spatially separated variables. In this paper, we propose a dynamical system model that exploits these constraints for solving nonconvex and discrete global optimization problems. Unlike traditional simulated annealing or quantum annealing-based global optimization techniques, the proposed method optimizes a target objective function by continuously evolving a driver functional over a conservation manifold, using a generalized variant of growth transformations. As a result, the driver functional asymptotically converges toward a Dirac-delta function centered at the global optimum of the target objective function. We provide an outline of the proof of convergence for the dynamical system model and investigate different properties of the model using a benchmark nonlinear optimization problem. We also demonstrate how a discrete variant of the proposed dynamical system can be used to implement decentralized optimization algorithms, where an ensemble of spatially separated entities (for example, biological cells or simple computational units) can collectively implement specific functions, such as winner-take-all and ranking, by exchanging signals only with their immediate substrate or environment. The proposed dynamical system model could potentially be used to implement continuous-time optimizers, annealers, and neural networks.
KW - Analog computing
KW - annealing algorithms
KW - dynamical systems
KW - global optimization
KW - growth transforms
KW - substrate computing
UR - http://www.scopus.com/inward/record.url?scp=85045644306&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2018.2817367
DO - 10.1109/TNNLS.2018.2817367
M3 - Article
C2 - 29993647
AN - SCOPUS:85045644306
SN - 2162-237X
VL - 29
SP - 6052
EP - 6061
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 12
M1 - 8338337
ER -