Improvements on Activation Functions in ANN: An Overview

Lexuan YANG

Abstract


Activation functions are an essential part of artificial neural networks. Over the years, research has been conducted to find new functions that perform better. Several mainstream activation functions, such as the sigmoid and ReLU, have been widely used for decades. Meanwhile, researchers have proposed many modified versions of these functions to further improve performance. In this paper, the limitations of the mainstream activation functions are discussed, along with the main characteristics and relative performances of their modifications.
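For illustration, a minimal NumPy sketch (not from the paper itself) of the two mainstream functions named above, plus one cited modification, the Leaky ReLU of Maas et al. (2013), might look like:

```python
import numpy as np

def sigmoid(x):
    # Classic squashing activation; saturates for large |x|,
    # which underlies the vanishing-gradient problem.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit (Glorot et al., 2011): zero for negative
    # inputs, identity otherwise. Neurons can "die" because the
    # gradient is exactly 0 for x < 0.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU (Maas et al., 2013): a small slope alpha on the
    # negative side keeps gradients from vanishing entirely.
    return np.where(x >= 0, x, alpha * x)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(relu(x))        # negative inputs clipped to 0
print(leaky_relu(x))  # negative inputs scaled by alpha
```

The slope `alpha` is the free parameter that later variants (e.g. PReLU in He et al., 2015) learn rather than fix by hand.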


Keywords


Activation functions; ANN; piecewise-linear


References


Clevert, D.-A., Unterthiner, T., & Hochreiter, S. (2015). Fast and accurate deep network learning by exponential linear units (ELUs). arXiv preprint arXiv:1511.07289.

Glorot, X., Bordes, A., & Bengio, Y. (2011). Deep sparse rectifier neural networks. Proceedings of the fourteenth international conference on artificial intelligence and statistics.

Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning (p.195). Cambridge: The MIT Press.

Gulcehre, C., et al. (2016). Noisy activation functions. Proceedings of the 33rd International Conference on Machine Learning.

He, K., et al. (2015). Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. Proceedings of the IEEE International Conference on Computer Vision.

He, Y., Cheng, L. F., Zhang, P. L., & Li, Y. (2019). Application of activation function of deep neural network. Measurement and control technology, 38(4), 50-53.

Hornik, K., Stinchcombe, M., & White, H. (1989). Multilayer feedforward networks are universal approximators. Neural Networks, 2(5), 359-366.

Huang, Y., et al. (2017). A study of training algorithm in deep neural networks based on sigmoid activation function. Computer Measurement & Control, 2, 132-135.

Liu, X. W., Guo, D. B., & Li, C. (2019). An Improvement of the Activation Function in Convolutional Neural Networks. Journal of Test and Measurement Technology, 33(2), 121-125.

Liu, Y. Q., Wang, T. H., & Xu, X. (2019). New adaptive activation function of deep learning neural networks. Journal of Jilin University (Science Edition), 57(4), 857-859.

Maas, A. L., Hannun, A. Y., & Ng, A. Y. (2013). Rectifier nonlinearities improve neural network acoustic models. Proc. ICML, 30(1).

Mai, Y. C., Chen, Y. H., & Zhang, L. (2019). Bio-inspired activation function with strong anti-noise ability. Computer Science, (7), 206-210.

McCulloch, W. S., & Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biology, 52(1-2), 99-115.

Ramachandran, P., Zoph, B., & Le, Q. V. (2017). Swish: A self-gated activation function. arXiv preprint arXiv:1710.05941.

Shi, Q. (2017). Research and verification of image classification optimization algorithm based on convolutional neural network. Beijing: Beijing Jiaotong University.

Wang, H. X., Zhou, J. Q., Gu, C. H., & Lin, H. (2019). Design of activation function in CNN for image classification. Journal of Zhejiang University (Engineering Science), 53(7), 1363-1373. DOI: 10.3785/j.issn.1008-973X.2019.07.016

Xu, B., et al. (2015). Empirical evaluation of rectified activations in convolutional network. arXiv preprint arXiv:1505.00853.

Xu, Y. J., & Xu, F. F. (2019). Optimization of activation function in neural network based on ArcReLU function. Journal of Data Acquisition & Processing, (3), 15.

Zhang, T., Yang, J., Song, W. A., & Song, C. F. (2019). Research on improved activation function TReLU. Journal of Chinese Computer Systems, 40(1), 58-63.




DOI: http://dx.doi.org/10.3968/11667



Copyright (c) 2020 Management Science and Engineering

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.





