Article
Version 3
This version is not peer-reviewed
An Efficient Gaussian Mixture Model and Its Application to Neural Network
Version 1 : Received: 15 February 2023 / Approved: 16 February 2023 / Online: 16 February 2023 (06:59:43 CET)
Version 2 : Received: 10 March 2023 / Approved: 16 March 2023 / Online: 16 March 2023 (02:21:19 CET)
Version 3 : Received: 6 August 2024 / Approved: 7 August 2024 / Online: 13 August 2024 (08:06:48 CEST)
How to cite: Lu, W.; Ding, D.; Wu, F.; Yuan, G. An Efficient Gaussian Mixture Model and Its Application to Neural Network. Preprints 2023, 2023020275. https://doi.org/10.20944/preprints202302.0275.v3
Abstract
Gaussian mixture models (GMMs) are powerful tools particularly suited to problems where data distributions are multi-modal. Inspired by Fourier expansion, we propose a concept called GMM expansion for approximating arbitrary densities, and introduce a simple learning method under this framework. Theoretical and numerical analyses show that densities under GMM expansion can approximate arbitrary densities to a given accuracy. The proposed learning algorithm achieves better accuracy and time efficiency than classic density approximation methods for Gaussian mixture models, such as the Expectation Maximization (EM) algorithm and Bayesian variational inference (BVI). The proposed framework also integrates more readily with neural networks, and three neural network applications are built to demonstrate its usability. In one application, the accuracy of density estimation within a neural network is significantly improved. Another shows that latent variables can easily be treated as random variables, opening up further possibilities for latent-space manipulation, embedding techniques, and other potential uses.
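To make the idea described in the abstract concrete, the following is a minimal NumPy sketch of the general GMM-expansion intuition: a fixed grid of Gaussian components plays the role of a Fourier-like basis, and only the mixture weights are learned from samples. The function names (`gmm_expansion_fit`, `gmm_expansion_pdf`), the evenly spaced means, the shared-bandwidth heuristic, and the responsibility-style weight update are all illustrative assumptions, not the paper's actual learning algorithm, which is proposed as a more efficient alternative to EM-like updates.

```python
import numpy as np

# Illustrative sketch only: Gaussian components on a fixed grid of means
# (analogous to a fixed Fourier basis); only the mixture weights are learned.
# Grid, bandwidth, and the EM-like weight update are assumptions, not the
# paper's method.

def gmm_expansion_fit(samples, n_components=30, n_iters=50):
    """Fit mixture weights over fixed, evenly spaced Gaussian components."""
    lo, hi = samples.min(), samples.max()
    means = np.linspace(lo, hi, n_components)            # fixed "basis" means
    sigma = (hi - lo) / n_components                     # shared bandwidth (heuristic)
    weights = np.full(n_components, 1.0 / n_components)  # uniform initial weights

    # Component densities at every sample: shape (n_samples, n_components)
    comp = np.exp(-0.5 * ((samples[:, None] - means[None, :]) / sigma) ** 2)
    comp /= sigma * np.sqrt(2 * np.pi)

    for _ in range(n_iters):
        resp = comp * weights                            # unnormalized responsibilities
        resp /= resp.sum(axis=1, keepdims=True)
        weights = resp.mean(axis=0)                      # update weights only
    return means, sigma, weights

def gmm_expansion_pdf(x, means, sigma, weights):
    """Evaluate the fitted mixture density at the points in x."""
    comp = np.exp(-0.5 * ((x[:, None] - means[None, :]) / sigma) ** 2)
    comp /= sigma * np.sqrt(2 * np.pi)
    return comp @ weights

# Example: approximate a bimodal density from samples.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2, 0.5, 5000), rng.normal(1.5, 0.8, 5000)])
means, sigma, weights = gmm_expansion_fit(samples)
xs = np.linspace(-4, 4, 9)
print(np.round(gmm_expansion_pdf(xs, means, sigma, weights), 3))
```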
Keywords
GMM; Decomposition; Density approximation; Neural Network
Subject
Computer Science and Mathematics, Computer Science
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.