OALib Journal
ISSN: 2333-9721
Performance Analysis of Various Activation Functions in Generalized MLP Architectures of Neural Networks

Keywords: Activation Functions, Multi-Layered Perceptron, Neural Networks, Performance Analysis


Abstract:

The activation function is used to transform the activation level of a unit (neuron) into an output signal. A number of activation functions are in common use with artificial neural networks (ANN). The most common choices of activation function for the multi-layered perceptron (MLP) are used as transfer functions in research and engineering. Among the reasons for this popularity are their boundedness in the unit interval, the fast computability of the function and its derivative, and a number of amenable mathematical properties in the realm of approximation theory. However, considering the huge variety of problem domains to which the MLP is applied, it is reasonable to suspect that specific problems call for a single specific activation function, or a set of them. The aim of this study is to analyze the performance of generalized MLP architectures trained with the back-propagation algorithm, using various activation functions for the neurons of the hidden and output layers. For the experimental comparisons, the bi-polar sigmoid, uni-polar sigmoid, tanh, conic section, and radial basis function (RBF) activations were used.
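For reference, most of the activation functions compared in the paper have standard closed forms. The sketch below is a minimal plain-Python illustration, not the paper's implementation: the function names, the Gaussian form chosen for the RBF, and its `center`/`width` defaults are assumptions, and the conic section function is omitted because it depends on additional per-neuron parameters the abstract does not specify.

```python
import math

def uni_polar_sigmoid(x):
    # logistic function; output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def bi_polar_sigmoid(x):
    # output in (-1, 1); algebraically equal to tanh(x / 2)
    return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))

def tanh_act(x):
    # hyperbolic tangent; output in (-1, 1)
    return math.tanh(x)

def gaussian_rbf(x, center=0.0, width=1.0):
    # one common radial-basis form (Gaussian); center and width
    # are illustrative defaults, not values from the paper
    return math.exp(-((x - center) ** 2) / (2.0 * width ** 2))
```

Note how the bi-polar sigmoid and tanh differ only by a rescaling of the input, while the RBF responds to distance from a center rather than to a weighted sum, which is what makes its behavior in an MLP hidden layer qualitatively different.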

