Preprint Article

Stochastic Activation Function Layers for Convolutional Neural Networks

A peer-reviewed article of this preprint also exists.

Submitted: 14 February 2020

Posted: 17 February 2020

Abstract
In recent years, the field of deep learning has achieved considerable success in pattern recognition, image segmentation, and many other classification tasks, with numerous studies and practical applications in image, video, and text classification. In this study, we propose a method for modifying the architecture of top-performing CNN models with the aim of designing new models to be used as stand-alone networks or as components of an ensemble. We propose to replace each activation layer of a CNN (usually a ReLU layer) with a different activation function stochastically drawn from a set of activation functions; in this way, the resulting CNN has a different set of activation function layers.
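The core idea above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the candidate pool (ReLU, leaky ReLU, ELU) and the helper `randomize_activations` are assumptions chosen for demonstration, since the abstract does not specify the exact set of activation functions.

```python
import math
import random

# Candidate activation functions (an assumed pool for illustration;
# the paper's exact set of activations is not specified in the abstract).
def relu(x):
    return max(0.0, x)

def leaky_relu(x):
    return x if x >= 0 else 0.01 * x

def elu(x):
    return x if x >= 0 else math.exp(x) - 1.0

POOL = {"relu": relu, "leaky_relu": leaky_relu, "elu": elu}

def randomize_activations(n_layers, pool=POOL, seed=None):
    """For each of the n_layers activation layers in a CNN, draw one
    activation function uniformly at random from the pool, yielding a
    network with a distinct set of activation layers."""
    rng = random.Random(seed)
    names = sorted(pool)
    return [names[rng.randrange(len(names))] for _ in range(n_layers)]

# Ensemble use: each member network gets its own random assignment,
# so the members differ only in their activation layers.
ensemble = [randomize_activations(4, seed=s) for s in range(3)]
```

Because each member of the ensemble draws its activation layers independently, the resulting networks are structurally identical except for their nonlinearities, which is what injects diversity into the ensemble.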
Keywords: 
Subject: Computer Science and Mathematics  -   Computer Vision and Graphics
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.
Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.
