A comparison and introduction of the activation functions used in neurons.
Step function (sign)
Pros: the ideal activation function in principle
Cons: discontinuous, has no derivative at x = 0, hard to optimize
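As a minimal sketch, the sign step can be written out directly (the function name is mine):

```python
def sign_step(x: float) -> float:
    # Sign step activation: -1 for x < 0, +1 for x > 0, 0 at x = 0.
    # It is piecewise constant, so its derivative is 0 wherever it exists
    # and undefined at x = 0 -- the optimization problem noted above.
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0
```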
Pros:
Cons:
Pros:
Cons:
ReLU, popularized by Krizhevsky et al., 2012 [7]
Pros: does not saturate in the positive region; very cheap to compute; converges much faster than sigmoid/tanh in practice [7]
Cons: output is not zero-centered; units can die (a ReLU whose input stays negative receives zero gradient and stops updating)
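A minimal ReLU sketch with its subgradient (function names are mine; taking 0 at x = 0 is a common convention):

```python
def relu(x: float) -> float:
    # ReLU: max(0, x). Cheap to compute and non-saturating for x > 0.
    return x if x > 0 else 0.0

def relu_grad(x: float) -> float:
    # Subgradient of ReLU: 1 for x > 0, else 0. A unit whose inputs stay
    # negative gets zero gradient -- the "dying ReLU" failure mode.
    return 1.0 if x > 0 else 0.0
```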
Leaky ReLU [Maas et al., 2013], [He et al., 2015]
Pros: a small non-zero slope for x < 0 keeps gradients flowing, avoiding dead units
Parametric ReLU [He et al., 2015] learns the negative slope as a parameter; the paper reported surpassing human-level accuracy on ImageNet classification.
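Leaky ReLU and PReLU share the same form; the only difference is whether the negative slope alpha is a fixed constant or a learned parameter. A sketch (the function name and the default alpha = 0.01 follow common usage):

```python
def leaky_relu(x: float, alpha: float = 0.01) -> float:
    # Leaky ReLU: identity for x > 0, small slope alpha * x for x <= 0.
    # PReLU uses the same formula but learns alpha by backpropagation.
    return x if x > 0 else alpha * x
```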
Exponential Linear Units [Clevert et al., 2015]
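A sketch of ELU (the function name is mine; the default alpha = 1.0 is a common setting from the paper):

```python
import math

def elu(x: float, alpha: float = 1.0) -> float:
    # ELU: identity for x > 0, alpha * (exp(x) - 1) otherwise.
    # Unlike ReLU it produces negative outputs (pushing mean activations
    # toward zero) and saturates smoothly to -alpha for very negative x.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```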
Maxout, proposed by Ian J. Goodfellow et al. at ICML 2013 [5]
Pros: generalizes ReLU and Leaky ReLU (both are special cases of its piecewise-linear form); does not saturate and does not die
Cons: multiplies the number of parameters per neuron by k (doubles them for k = 2)
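A maxout unit takes the max over k affine pieces of the input; a minimal sketch (names are mine; k is implied by the number of weight rows):

```python
def maxout(x, weights, biases):
    # Maxout: max_i (w_i . x + b_i) over k affine pieces.
    # With k = 2, w_1 = w, w_2 = 0, b_1 = b_2 = 0 it reduces to ReLU,
    # which is why maxout generalizes ReLU and Leaky ReLU.
    return max(
        sum(wj * xj for wj, xj in zip(w, x)) + b
        for w, b in zip(weights, biases)
    )
```

For example, with weights [[1.0], [0.0]] and zero biases on a 1-d input, this is exactly ReLU.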
Noisy ReLUs have been used with some success in restricted Boltzmann machines for computer vision tasks. [3]
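A noisy ReLU adds Gaussian noise before the rectification, i.e. max(0, x + N(0, sigma)); a sketch (the function name is mine):

```python
import random

def noisy_relu(x: float, sigma: float = 0.1) -> float:
    # Noisy ReLU: max(0, x + Gaussian noise), as used by Nair & Hinton
    # (2010) in restricted Boltzmann machines. sigma = 0 recovers
    # plain ReLU.
    return max(0.0, x + random.gauss(0.0, sigma))
```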
Practical advice from CS231n Lecture 5 [6]: use ReLU (be careful with learning rates); try Leaky ReLU, Maxout, or ELU; try tanh but don't expect much; don't use sigmoid.
[1] http://ufldl.stanford.edu/wiki/index.php/神经网络
[2] https://en.wikipedia.org/wiki/Rectifier_(neural_networks)
[3] Vinod Nair and Geoffrey Hinton (2010). Rectified linear units improve restricted Boltzmann machines. ICML. PDF
[4] He, Kaiming; Zhang, Xiangyu; Ren, Shaoqing; Sun, Jian (2015). "Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification". PDF
[5] Goodfellow I J, Warde-Farley D, Mirza M, et al. Maxout networks[J]. arXiv preprint arXiv:1302.4389, 2013. PDF
[6] CS231n Winter 2016 Lecture 5 Neural Networks VIDEO PDF
[7] Krizhevsky A, Sutskever I, Hinton G E. Imagenet classification with deep convolutional neural networks[C]//Advances in neural information processing systems. 2012: 1097-1105. PDF
[8] What exactly is the role of the activation function in artificial neural networks? Why is ReLU better than tanh and sigmoid? https://www.zhihu.com/question/29021768