Publication Details
Text Language | English |
---|---|
Authors | Yoshihiro Yamada, Masakazu Iwamura, Takuya Akiba and Koichi Kise |
Title | ShakeDrop Regularization for Deep Residual Learning |
Journal | IEEE Access |
Vol. | 7 |
No. | 1 |
Pages | 186126–186136 |
Peer reviewed | Yes |
Month & Year | December 2019 |
Abstract | Overfitting is a crucial problem in deep neural networks, even in the latest network architectures. In this paper, to relieve the overfitting effect of ResNet and its improvements (i.e., Wide ResNet, PyramidNet, and ResNeXt), we propose a new regularization method called ShakeDrop regularization. ShakeDrop is inspired by Shake-Shake, an effective regularization method that can, however, be applied only to ResNeXt. ShakeDrop is more effective than Shake-Shake and can be applied not only to ResNeXt but also to ResNet, Wide ResNet, and PyramidNet. An important key is to achieve stability of training: because effective regularization often causes unstable training, we introduce a training stabilizer, which is an unusual use of an existing regularizer. Through experiments under various conditions, we demonstrate the conditions under which ShakeDrop works well. |
DOI | 10.1109/ACCESS.2019.2960566 |
- Entry for BibTeX
@Article{Yamada2019,
  author  = {Yoshihiro Yamada and Masakazu Iwamura and Takuya Akiba and Koichi Kise},
  title   = {ShakeDrop Regularization for Deep Residual Learning},
  journal = {IEEE Access},
  year    = {2019},
  month   = dec,
  volume  = {7},
  number  = {1},
  pages   = {186126--186136},
  doi     = {10.1109/ACCESS.2019.2960566}
}
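
The abstract above only names the mechanism, so a concrete illustration may help. Below is a minimal PyTorch sketch of a ShakeDrop unit (PyTorch is an assumed framework choice, not part of this record). It follows the formulation reported in the paper: during training, the residual-branch output is scaled by (b + α - bα) in the forward pass and by (b + β - bβ) in the backward pass, with b drawn from a Bernoulli distribution, α uniform in [-1, 1], and β uniform in [0, 1]; at test time the branch is scaled by the expected coefficient. For brevity the drop probability is a fixed constant here, whereas the paper schedules it per layer (linear decay, as in stochastic depth), and drawing one coefficient per sample is also a simplifying assumption.

```python
import torch
import torch.nn as nn


class ShakeDropFunction(torch.autograd.Function):
    """Scales the residual-branch output by (b + alpha - b*alpha) on the
    forward pass and by (b + beta - b*beta) on the backward pass, with
    b ~ Bernoulli(1 - p_drop), alpha ~ U(-1, 1), beta ~ U(0, 1)."""

    @staticmethod
    def forward(ctx, x, p_drop):
        # One Bernoulli gate and one alpha per sample (a simplifying choice;
        # input is assumed to be NCHW).
        b = torch.bernoulli(
            torch.full((x.size(0), 1, 1, 1), 1.0 - p_drop, device=x.device)
        )
        alpha = torch.empty_like(b).uniform_(-1.0, 1.0)
        ctx.save_for_backward(b)
        return x * (b + alpha - b * alpha)

    @staticmethod
    def backward(ctx, grad_output):
        (b,) = ctx.saved_tensors
        # An independent beta replaces alpha in the backward pass.
        beta = torch.empty_like(b).uniform_(0.0, 1.0)
        return grad_output * (b + beta - b * beta), None


class ShakeDrop(nn.Module):
    """Module wrapper: random scaling during training, expected scaling at
    inference (which reduces to 1 - p_drop because E[alpha] = 0)."""

    def __init__(self, p_drop=0.5):
        super().__init__()
        self.p_drop = p_drop

    def forward(self, x):
        if self.training:
            return ShakeDropFunction.apply(x, self.p_drop)
        return x * (1.0 - self.p_drop)
```

In a residual block, such a module would typically be applied to the branch output just before the residual addition, e.g. `out = shortcut + shake_drop(branch(x))`.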