
Abstract

In signal processing, a large number of samples can be generated by a Monte Carlo method and then encoded as a Gaussian mixture model for compactness in computation, storage, and communication. With a large number of samples to learn from, the computational efficiency of Gaussian mixture learning becomes important. In this paper, we propose a new method of Gaussian mixture learning that works both accurately and efficiently for large datasets. The proposed method combines hierarchical clustering with the expectation-maximization algorithm, with hierarchical clustering providing an initial guess for the expectation-maximization algorithm. We also propose adaptive splitting for hierarchical clustering, which enhances the quality of the initial guess and thus improves both the accuracy and efficiency of the combination. We validate the performance of the proposed method in comparison with existing methods through numerical examples of Gaussian mixture learning and its application to distributed particle filtering.
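The core idea described above, hierarchical clustering supplying the initial guess that the expectation-maximization algorithm then refines, can be sketched as follows. This is a minimal illustration, not the paper's method: the adaptive splitting is replaced here by plain agglomerative clustering, and all names, data, and parameters are illustrative assumptions.

```python
# Sketch: EM for a Gaussian mixture, initialized from a hierarchical
# clustering of the samples. Assumes scikit-learn; the paper's adaptive
# splitting is approximated by ordinary agglomerative clustering.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Stand-in for Monte Carlo samples: draws from a two-component mixture.
samples = np.vstack([
    rng.normal(loc=-3.0, scale=0.5, size=(500, 2)),
    rng.normal(loc=+3.0, scale=1.0, size=(500, 2)),
])

k = 2
# Step 1: hierarchical clustering provides an initial partition.
labels = AgglomerativeClustering(n_clusters=k).fit_predict(samples)
init_means = np.array([samples[labels == j].mean(axis=0) for j in range(k)])

# Step 2: EM refines the Gaussian mixture from that initial guess.
gmm = GaussianMixture(n_components=k, means_init=init_means).fit(samples)
print(np.sort(gmm.means_[:, 0]))  # component means along the first axis
```

A good initial guess of this kind typically reduces the number of EM iterations needed and lowers the risk of converging to a poor local optimum, which is the efficiency argument the abstract makes for large sample sets.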

Original language: English
Pages (from-to): 116-121
Number of pages: 6
Journal: Signal Processing
Volume: 150
DOIs
State: Published - Sep 2018

Keywords

  • Adaptive hierarchical clustering
  • Adaptive splitting
  • Distributed particle filtering
  • Expectation-maximization algorithm
  • Gaussian mixture learning

