
PrimeNet: Adaptive Multi-Layer Deep Neural Structure for Enhanced Feature Selection in Early Convolution Stage


A peer-reviewed article of this preprint also exists.

Submitted: 28 July 2021
Posted: 30 July 2021

Abstract
Very deep neural networks sometimes suffer from ineffective backpropagation of gradients through all of their layers, whereas the strong performance of shallower multilayer structures demonstrates their ability to strengthen gradient signals in the early stages of training, where they are easily backpropagated for global loss corrections. Shallow neural structures are a good starting point for encouraging sturdy feature characteristics of the input. In this research, a shallow deep neural structure called PrimeNet is proposed. PrimeNet dynamically identifies and encourages quality visual indicators from the input, which are then used by the subsequent deep network layers, and strengthens gradient signals in the lower stages of the training pipeline. In addition, layerwise training is performed using locally generated errors: the gradient is not backpropagated to previous layers, and the hidden-layer weights are updated during the forward pass, making this structure a backpropagation-free variant. PrimeNet has obtained state-of-the-art results on various image datasets, attaining the dual objective of (1) a compact, dynamic deep neural structure that (2) eliminates the problem of backward locking. The PrimeNet unit is proposed as an alternative to traditional convolution and dense blocks for faster, more memory-efficient training, outperforming previously reported results for adaptive methods in parallel and multilayer deep neural systems.
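The local-error, backpropagation-free scheme described in the abstract can be illustrated with a short sketch. The following is a minimal PyTorch illustration, not the authors' implementation: the auxiliary classifier head, the Adam optimizer, and the block composition are assumptions made for exposition. Each block computes a local loss, updates its own weights during the forward pass, and detaches its output so that no gradient crosses block boundaries, which is what removes backward locking.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LocalConvBlock(nn.Module):
    """Conv block trained with a locally generated error signal.

    The block owns its optimizer and updates itself inside forward();
    detach() on the output stops gradients from flowing to earlier
    blocks, so there is no global backward pass (no backward locking).
    The auxiliary head (GAP + linear classifier) is a hypothetical
    choice for this sketch.
    """
    def __init__(self, in_ch, out_ch, num_classes, lr=1e-3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(),
        )
        # Auxiliary head that produces the local error.
        self.head = nn.Linear(out_ch, num_classes)
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x, targets=None):
        out = self.conv(x)
        if self.training and targets is not None:
            # Local loss: global average pooling -> class logits.
            logits = self.head(out.mean(dim=(2, 3)))
            loss = F.cross_entropy(logits, targets)
            self.opt.zero_grad()
            loss.backward()   # gradients stay inside this block
            self.opt.step()   # weights updated during the forward pass
        return out.detach()   # block boundary: no global backprop

# Usage sketch: two stacked blocks, trained purely by local errors.
blocks = [LocalConvBlock(3, 32, num_classes=10),
          LocalConvBlock(32, 64, num_classes=10)]
x = torch.randn(8, 3, 32, 32)      # dummy image batch
y = torch.randint(0, 10, (8,))     # dummy labels
for b in blocks:
    b.train()
    x = b(x, y)                    # each block updates itself in place
```

Because each block's update completes inside its own forward call, a stack of such blocks never waits on a global backward pass, which is the property the abstract refers to as eliminating backward locking.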
Keywords: 
Subject: Computer Science and Mathematics - Algebra and Number Theory
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.