TY - GEN
T1 - A Deep Reservoir Computing Architecture for Dynamic Generative Modeling
AU - Zhang, Wei
AU - Kuan, Yuan Hung
AU - Chang, Su Hsin
AU - Li, Jr-Shin
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Reservoir computing networks (RCNs) have been recognized as a popular machine learning tool for modeling the temporal evolution of dynamic data due to their close relationship with dynamical systems. A defining characteristic of RCNs is their fixed (training-parameter-free) hidden layers, which offer significant computational benefits. However, this feature also introduces drawbacks, such as increased warm-up time, limited long-term memory, and sensitivity to hyperparameters. To balance these advantages and disadvantages and expand the application domains of RCNs, we develop a novel deep reservoir computing network (DRCN) architecture that integrates control-theoretic concepts and techniques into RCNs. This architecture is designed as a cascade of shallow RCNs and is represented as a piecewise time-invariant control system. We further propose a layer-by-layer training strategy for the DRCN, resulting in an iterative deep learning algorithm for modeling dynamical systems. This enables us to exploit the DRCN as a generative model to generate out-of-sample data using the learned dynamical systems. The performance and efficiency of the DRCN-based dynamic generative model are demonstrated through various learning problems arising from time-series analysis and control systems, using both synthetic and real-world datasets.
AB - Reservoir computing networks (RCNs) have been recognized as a popular machine learning tool for modeling the temporal evolution of dynamic data due to their close relationship with dynamical systems. A defining characteristic of RCNs is their fixed (training-parameter-free) hidden layers, which offer significant computational benefits. However, this feature also introduces drawbacks, such as increased warm-up time, limited long-term memory, and sensitivity to hyperparameters. To balance these advantages and disadvantages and expand the application domains of RCNs, we develop a novel deep reservoir computing network (DRCN) architecture that integrates control-theoretic concepts and techniques into RCNs. This architecture is designed as a cascade of shallow RCNs and is represented as a piecewise time-invariant control system. We further propose a layer-by-layer training strategy for the DRCN, resulting in an iterative deep learning algorithm for modeling dynamical systems. This enables us to exploit the DRCN as a generative model to generate out-of-sample data using the learned dynamical systems. The performance and efficiency of the DRCN-based dynamic generative model are demonstrated through various learning problems arising from time-series analysis and control systems, using both synthetic and real-world datasets.
KW - control systems
KW - echo state networks
KW - multivariate time-series
KW - reservoir computing
UR - https://www.scopus.com/pages/publications/105023975228
U2 - 10.1109/IJCNN64981.2025.11227706
DO - 10.1109/IJCNN64981.2025.11227706
M3 - Conference contribution
AN - SCOPUS:105023975228
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - International Joint Conference on Neural Networks, IJCNN 2025 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2025 International Joint Conference on Neural Networks, IJCNN 2025
Y2 - 30 June 2025 through 5 July 2025
ER -