2017 DistributedStochasticVarianceRe
- (Lee et al., 2017) ⇒ Jason D. Lee, Qihang Lin, Tengyu Ma, and Tianbao Yang. (2017). “Distributed Stochastic Variance Reduced Gradient Methods by Sampling Extra Data with Replacement.” In: The Journal of Machine Learning Research, 18(1).
Subject Headings: Distributed Stochastic Variance Reduced Gradient Algorithm.
Notes
Cited By
- http://scholar.google.com/scholar?q=%222017%22+Distributed+Stochastic+Variance+Reduced+Gradient+Methods+by+Sampling+Extra+Data+with+Replacement
- http://dl.acm.org/citation.cfm?id=3122009.3176866&preflayout=flat#citedby
Quotes
Abstract
We study the round complexity of minimizing the average of convex functions under a new setting of distributed optimization where each machine can receive two subsets of functions. The first subset is from a random partition and the second subset is randomly sampled with replacement. Under this setting, we define a broad class of distributed algorithms whose local computation can utilize both subsets, and we design a distributed stochastic variance reduced gradient method belonging to this class. When the condition number of the problem is small, our method achieves the optimal parallel runtime, amount of communication, and rounds of communication among all distributed first-order methods, up to constant factors. When the condition number is relatively large, we provide a lower bound on the number of rounds of communication needed by any algorithm in this class. We then present an accelerated version of our method whose number of rounds of communication matches the lower bound up to logarithmic terms, which establishes that this accelerated algorithm has the lowest round complexity among all algorithms in our class under this new setting.
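The round structure described in the abstract can be made concrete with a short sketch. Below is a minimal, single-process simulation of one such distributed SVRG scheme for a least-squares objective f_i(w) = 0.5 * (a_i^T w - b_i)^2: each simulated machine holds a block of a random partition of the data plus an extra subset sampled with replacement, runs variance-reduced local updates, and the reference point is re-synchronized each round. The function name `distributed_svrg`, all parameter defaults, and the uniform sampling from the combined local pool are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def distributed_svrg(A, b, n_machines=4, extra_per_machine=50,
                     n_rounds=20, local_steps=100, lr=0.1, seed=0):
    """Sketch: simulate the two-subset distributed SVRG round structure
    on the least-squares loss f_i(w) = 0.5 * (a_i @ w - b_i) ** 2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    perm = rng.permutation(n)
    partitions = np.array_split(perm, n_machines)      # random partition
    extras = [rng.integers(0, n, extra_per_machine)    # sampled with replacement
              for _ in range(n_machines)]
    w_ref = np.zeros(d)
    for _ in range(n_rounds):
        # Communication round 1: aggregate the exact full gradient at w_ref.
        full_grad = A.T @ (A @ w_ref - b) / n
        local_iterates = []
        for k in range(n_machines):
            pool = np.concatenate([partitions[k], extras[k]])
            w = w_ref.copy()
            for _ in range(local_steps):
                i = rng.choice(pool)    # local stochastic sample
                a_i = A[i]
                # Variance-reduced gradient: local term, corrected by the
                # same sample's gradient at w_ref plus the full gradient.
                g = (a_i * (a_i @ w - b[i])
                     - a_i * (a_i @ w_ref - b[i])
                     + full_grad)
                w -= lr * g
            local_iterates.append(w)
        # Communication round 2: average local solutions into the new reference.
        w_ref = np.mean(local_iterates, axis=0)
    return w_ref
```

In this sketch each outer iteration costs two rounds of communication (broadcasting the full gradient and averaging the local iterates), which is the quantity whose complexity the abstract analyzes.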
References
| | Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|---|
| 2017 DistributedStochasticVarianceRe | Tianbao Yang, Jason D. Lee, Qihang Lin, Tengyu Ma | 18(1) | 2017 | Distributed Stochastic Variance Reduced Gradient Methods by Sampling Extra Data with Replacement | | The Journal of Machine Learning Research | | | | 2017 |