Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Improving Deep Echo State Network with Neuronal Similarity-based Iterative Pruning Merging Algorithm

Version 1 : Received: 22 January 2023 / Approved: 30 January 2023 / Online: 30 January 2023 (02:34:13 CET)

A peer-reviewed article of this Preprint also exists.

Shen, Q.; Zhang, H.; Mao, Y. Improving Deep Echo State Network with Neuronal Similarity-Based Iterative Pruning Merging Algorithm. Appl. Sci. 2023, 13, 2918.

Abstract

Recently, a layer-stacked ESN model named the deep echo state network (DeepESN) has been established. As a model at the intersection of recurrent neural networks and deep neural networks, DeepESN is of significant interest to both areas. Optimizing network structure remains a common task in artificial neural networks, and the question of how many neurons should be used in each layer of a DeepESN must be addressed. In this paper, we aim to solve the problem of choosing an optimal size for a DeepESN. Inspired by the sensitivity-based iterative pruning algorithm, a neuronal similarity-based iterative pruning merging algorithm (NS-IPMA) is proposed, which iteratively prunes or merges the most similar neurons in a DeepESN. Two chaotic time series prediction tasks are used to demonstrate the effectiveness of NS-IPMA. The results show that a DeepESN pruned by NS-IPMA outperforms an unpruned DeepESN of the same network size, and that NS-IPMA is a feasible and superior approach to improving the generalization performance of DeepESN.
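The full NS-IPMA procedure is given in the peer-reviewed article; the sketch below only illustrates one pruning/merging step of the kind the abstract describes. The cosine-similarity measure, the weight-averaging merge rule, and the function name prune_merge_step are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def prune_merge_step(W, states, merge=True):
    """One NS-IPMA-style step (hypothetical sketch): find the pair of
    reservoir neurons with the most similar activation traces, then
    either merge them into one neuron or prune one of them.

    W      : (N, N) recurrent weight matrix of one reservoir layer
    states : (T, N) collected reservoir states over T time steps
    """
    N = W.shape[0]
    # Pairwise cosine similarity between neuron activation traces
    # (assumed similarity measure; the paper may define its own).
    unit = states / (np.linalg.norm(states, axis=0, keepdims=True) + 1e-12)
    sim = unit.T @ unit
    np.fill_diagonal(sim, -np.inf)  # ignore self-similarity
    i, j = np.unravel_index(np.argmax(sim), sim.shape)

    if merge:
        # Merge: fold neuron j's incoming and outgoing weights into neuron i
        # by averaging (assumed merge rule).
        W[:, i] = (W[:, i] + W[:, j]) / 2.0
        W[i, :] = (W[i, :] + W[j, :]) / 2.0
    # Prune (or finish the merge) by deleting neuron j.
    keep = [k for k in range(N) if k != j]
    return W[np.ix_(keep, keep)]
```

In an iterative scheme, a step like this would be applied repeatedly, re-collecting reservoir states after each reduction, until the target layer size is reached.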

Keywords

reservoir computing; deep echo state network; neuronal similarity-based iterative pruning merging algorithm; chaotic time series forecast

Subject

Computer Science and Mathematics, Data Structures, Algorithms and Complexity
