2017 SGDR: Stochastic Gradient Descent with Warm Restarts
- (Loshchilov & Hutter, 2017) ⇒ Ilya Loshchilov, and Frank Hutter. (2017). “SGDR: Stochastic Gradient Descent with Warm Restarts.” In: Conference Track Proceedings of the 5th International Conference on Learning Representations (ICLR 2017).
Subject Headings: Learning Rate Schedule; Cosine Annealing; Gradient Descent Algorithm.
Notes
- Online Resource(s):
Cited By
- Google Scholar: ~ 1,788 Citations.
Quotes
Abstract
Restart techniques are common in gradient-free optimization to deal with multimodal functions. Partial warm restarts are also gaining popularity in gradient-based optimization to improve the rate of convergence in accelerated gradient schemes to deal with ill-conditioned functions. In this paper, we propose a simple warm restart technique for stochastic gradient descent to improve its anytime performance when training deep neural networks. We empirically study its performance on the CIFAR-10 and CIFAR-100 datasets, where we demonstrate new state-of-the-art results at 3.14% and 16.21%, respectively. We also demonstrate its advantages on a dataset of EEG recordings and on a downsampled version of the ImageNet dataset. Our source code is available at: https://github.com/loshchil/SGDR
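The schedule behind SGDR anneals the learning rate within each run with a cosine curve, η_t = η_min + ½(η_max − η_min)(1 + cos(π · T_cur / T_i)), then "warm restarts" back to η_max and lengthens the next run by a factor T_mult. The snippet below is a minimal Python sketch of that schedule, not the authors' released code; the default values (eta_max=0.1, T_0=10, T_mult=2) are illustrative choices.

```python
import math

def sgdr_lr(epoch, eta_min=0.0, eta_max=0.1, T_0=10, T_mult=2):
    """Cosine-annealed learning rate with warm restarts (SGDR sketch).

    Within run i of length T_i, the rate follows
        eta_t = eta_min + 0.5 * (eta_max - eta_min) * (1 + cos(pi * T_cur / T_i)),
    restarting at eta_max once T_cur reaches T_i and growing the run
    length by T_mult after each restart. Parameter defaults here are
    illustrative, not prescribed values.
    """
    T_i, T_cur = T_0, epoch
    # Walk through completed runs to find where `epoch` falls.
    while T_cur >= T_i:
        T_cur -= T_i
        T_i *= T_mult
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * T_cur / T_i))

if __name__ == "__main__":
    # Example: the rate decays over epochs 0-9, restarts at epoch 10,
    # then decays again over the longer run 10-29.
    for epoch in range(30):
        print(epoch, round(sgdr_lr(epoch), 4))
```

In practice T_cur can also be updated fractionally at each mini-batch rather than once per epoch, which gives a smoother decay within each run.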
References
BibTeX
@inproceedings{2017_SGDRStochasticGradientDescentwi,
  author    = {Ilya Loshchilov and Frank Hutter},
  title     = {SGDR: Stochastic Gradient Descent with Warm Restarts},
  booktitle = {Conference Track Proceedings of the 5th International Conference on Learning Representations (ICLR 2017)},
  publisher = {OpenReview.net},
  year      = {2017},
  url       = {https://openreview.net/forum?id=Skq89Scxx},
}
Page | Author | Title | Year
---|---|---|---
2017 SGDRStochasticGradientDescentwi | Frank Hutter, Ilya Loshchilov | SGDR: Stochastic Gradient Descent with Warm Restarts | 2017