Author: PyTorch (PY Torch)
Board: DataScience
Title: Re: [Essay Call] Self-Normalizing Neural Networks
Time: Mon Jul 9 13:45:54 2018
Thanks, MXNet, for the detailed explanation.
I'd like to ask MXNet about something that has puzzled me for a while:
SELU is pitched as "making feed-forward networks great again",
but does it still have a self-normalizing effect when applied to convolution layers?
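(For reference, my understanding of the activation from the paper: a scaled ELU whose
constants are chosen as a fixed point so that activations converge to zero mean and
unit variance:

    selu(x) = lambda * x                      if x > 0
              lambda * alpha * (exp(x) - 1)   if x <= 0

with alpha ≈ 1.6733 and lambda ≈ 1.0507.)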
Judging by the experience of the author of this post, who used a DCGAN:
https://ajolicoeur.wordpress.com/cats/
“All my initial attempts at generating cats in 128 x 128 with DCGAN failed.
However, simply by replacing the batch normalizations and ReLUs with SELUs,
I was able to get slow (6+ hours) but steady convergence with the same learning
rates as before.
SELUs are self-normalizing and thus remove the need for batch normalization.”
It looks like SELU can be used in convolution layers and still self-normalize,
but is this also supported by the math? The derivation in the SELU paper seems to
assume plain feed-forward (fully-connected) layers. Intuitively, each conv output
is still a weighted sum over its fan-in, so perhaps the same mean/variance
fixed-point argument carries over, but I'm not sure.
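For concreteness, here is a minimal PyTorch sketch of the swap the blog post
describes (channel sizes are made up; the LeCun-normal init follows the paper's
recommendation and is my addition, not something the blog post shows):

import torch.nn as nn

# Typical DCGAN block: conv + batch norm + ReLU
block_bn = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
    nn.BatchNorm2d(128),
    nn.ReLU(inplace=True),
)

# SELU variant: drop batch norm entirely, let the activation self-normalize
block_selu = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
    nn.SELU(inplace=True),
)

# The paper pairs SELU with LeCun-normal init, i.e. std = sqrt(1 / fan_in)
for m in block_selu.modules():
    if isinstance(m, nn.Conv2d):
        fan_in = m.in_channels * m.kernel_size[0] * m.kernel_size[1]
        nn.init.normal_(m.weight, mean=0.0, std=(1.0 / fan_in) ** 0.5)
        nn.init.constant_(m.bias, 0.0)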
--
※ Origin: PTT (ptt.cc), From: 140.112.25.100
※ Article URL: https://webptt.com/cn.aspx?n=bbs/DataScience/M.1531115157.A.8F5.html
※ Edited: PyTorch (140.112.25.100), 07/09/2018 13:48:18