2013 ConstrainedStochasticGradientDe
- (Mu et al., 2013) ⇒ Yang Mu, Wei Ding, Tianyi Zhou, and Dacheng Tao. (2013). “Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem.” In: Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ISBN: 978-1-4503-2174-7. doi:10.1145/2487575.2487635
Subject Headings:
Notes
Cited By
- http://scholar.google.com/scholar?q=%222013%22+Constrained+Stochastic+Gradient+Descent+for+Large-scale+Least+Squares+Problem
- http://dl.acm.org/citation.cfm?id=2487575.2487635&preflayout=flat#citedby
Quotes
Author Keywords
- Large-scale least squares; least squares methods; online learning; parameter learning; stochastic optimization; stochastic programming
Abstract
The least squares problem is one of the most important regression problems in statistics, machine learning, and data mining. In this paper, we present the Constrained Stochastic Gradient Descent (CSGD) algorithm to solve the large-scale least squares problem. CSGD improves on Stochastic Gradient Descent (SGD) by imposing a provable constraint that the linear regression line passes through the mean point of all the data points. This yields a regret bound of $O(\log T)$, the best among all first-order approaches, and the fastest convergence speed. Empirical studies justify the effectiveness of CSGD by comparing it with SGD and other state-of-the-art approaches. An example is also given to show how to use CSGD to optimize SGD-based least squares problems to achieve better performance.
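The constraint the abstract describes (the regression line passing through the mean of the data) can be illustrated with a simple sketch: mean-centering the data before running SGD guarantees a fit through the mean point, and the intercept is then recovered from the means. This is an illustrative sketch of the underlying idea, not the authors' CSGD update rule; the function name, learning rate, and synthetic data are assumptions for demonstration.

```python
import numpy as np

def centered_sgd_least_squares(X, y, lr=0.1, epochs=50, seed=0):
    """SGD for least squares on mean-centered data.

    Centering enforces that the fitted line passes through the mean
    point (x_bar, y_bar) -- the constraint CSGD exploits. This is a
    hypothetical sketch, not the paper's CSGD algorithm itself.
    """
    rng = np.random.default_rng(seed)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean              # center the data
    w = np.zeros(X.shape[1])
    n = len(yc)
    for _ in range(epochs):
        for i in rng.permutation(n):             # one pass over shuffled samples
            grad = (Xc[i] @ w - yc[i]) * Xc[i]   # single-sample squared-loss gradient
            w -= lr * grad
    b = y_mean - x_mean @ w                      # intercept so the line passes through the mean
    return w, b

# Usage: recover y = 2x + 1 from lightly noised data
X = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
y = 2 * X[:, 0] + 1 + 0.01 * np.random.default_rng(1).standard_normal(100)
w, b = centered_sgd_least_squares(X, y)
```

By construction, the fitted line satisfies `x_mean @ w + b == y_mean` exactly, which is the geometric property the paper turns into a constraint on the iterates.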
References
Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year
---|---|---|---|---|---|---|---|---|---
Dacheng Tao, Wei Ding, Yang Mu, Tianyi Zhou | | | Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem | | | | 10.1145/2487575.2487635 | 2013 ConstrainedStochasticGradientDe | 2013