Mini-Batch Stochastic Gradient Descent Algorithm
A Mini-Batch Stochastic Gradient Descent Algorithm is a stochastic gradient descent algorithm that estimates the gradient at each update step from a small, randomly sampled mini-batch of training examples, rather than from a single example or from the full training set (see the illustrative sketch after this list).
- …
- Counter-Example(s):
  - a Batch Gradient Descent Algorithm (which uses full-dataset gradients).
  - a Single-Example (Online) Stochastic Gradient Descent Algorithm.
- See: Batch Normalization.
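The following is a minimal illustrative sketch of a mini-batch SGD update loop for least-squares linear regression; it is not taken from the references below, and the function name minibatch_sgd, the synthetic data, and the hyperparameter values (batch_size=32, lr=0.1, epochs=20) are assumptions made for the example.

```python
import numpy as np

def minibatch_sgd(X, y, batch_size=32, lr=0.1, epochs=20, rng=None):
    """Minimal mini-batch SGD for least-squares linear regression.

    Each update uses a gradient estimated from a randomly drawn
    mini-batch of `batch_size` examples, rather than a single example
    (pure SGD) or the full dataset (batch gradient descent).
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        perm = rng.permutation(n)                      # reshuffle each epoch
        for start in range(0, n, batch_size):
            idx = perm[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            grad = Xb.T @ (Xb @ w - yb) / len(idx)     # mini-batch gradient of 1/2 MSE
            w -= lr * grad
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    w_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
    y = X @ w_true + 0.01 * rng.normal(size=1000)
    print(minibatch_sgd(X, y, rng=rng))                # recovers w_true approximately
```

A larger batch size reduces the variance of each gradient estimate but yields fewer parameter updates per pass over the data, which is the trade-off discussed in the Li et al. (2014) quote below.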
References
2014
- (Li et al., 2014) ⇒ Mu Li, Tong Zhang, Yuqiang Chen, and Alexander J. Smola. (2014). “Efficient Mini-batch Training for Stochastic Optimization.” In: Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ISBN:978-1-4503-2956-9 doi:10.1145/2623330.2623612
- QUOTE: Stochastic gradient descent (SGD) is a popular technique for large-scale optimization problems in machine learning. In order to parallelize SGD, minibatch training needs to be employed to reduce the communication cost. However, an increase in minibatch size typically decreases the rate of convergence.
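To illustrate the parallelization point in this quote, the sketch below simulates one data-parallel mini-batch step in which each simulated worker computes a gradient on its own shard and only the averaged gradients (not the examples) are exchanged. This is a hypothetical toy example, not the algorithm of Li et al. (2014); the function name parallel_minibatch_step and its parameters are assumptions.

```python
import numpy as np

def parallel_minibatch_step(X, y, w, lr=0.1, workers=4, shard_size=32, rng=None):
    """One simulated data-parallel mini-batch SGD step for least squares.

    Each "worker" computes a gradient on its own randomly drawn shard of
    the mini-batch; only the d-dimensional gradients are averaged, which
    is the communication saving that mini-batching provides.
    """
    rng = np.random.default_rng() if rng is None else rng
    grads = []
    for _ in range(workers):
        idx = rng.choice(len(X), size=shard_size, replace=False)
        Xb, yb = X[idx], y[idx]
        grads.append(Xb.T @ (Xb @ w - yb) / shard_size)
    return w - lr * np.mean(grads, axis=0)   # average gradients, then apply one update
```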
2011
- (Cotter et al., 2011) ⇒ Andrew Cotter, Ohad Shamir, Nati Srebro, and Karthik Sridharan. (2011). “Better Mini-batch Algorithms via Accelerated Gradient Methods.” In: Advances in Neural Information Processing Systems (NIPS 2011), pp. 1647-1655.