UBC Faculty Research and Publications

Convergence rate of stochastic gradient with constant step size

Schmidt, Mark

Abstract

We show that the basic stochastic gradient method, applied to a strongly-convex differentiable function with a constant step-size, achieves a linear convergence rate (in function value and in the iterates) up to a constant proportional to the step-size (under standard assumptions on the gradient).
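The behaviour described in the abstract can be illustrated numerically: with a fixed step-size, the error contracts linearly at first and then plateaus at a level proportional to the step-size. The sketch below is an illustration of this phenomenon on a simple strongly-convex quadratic, not code from the report; the function, noise model, and parameter values are all hypothetical choices.

```python
import random

def sgd_constant_step(x0, x_star, alpha, sigma, iters, rng):
    """Run SGD with constant step-size alpha on f(x) = 0.5*(x - x_star)^2.

    The exact gradient is (x - x_star); we add zero-mean Gaussian noise
    of scale sigma to mimic a stochastic gradient (hypothetical model).
    """
    x = x0
    for _ in range(iters):
        g = (x - x_star) + rng.gauss(0.0, sigma)  # noisy gradient
        x = x - alpha * g                         # constant step-size update
    return x

rng = random.Random(0)
# Start far from the optimum x_star = 0; after many iterations the
# iterate settles into a neighbourhood whose radius scales with alpha.
x_final = sgd_constant_step(x0=10.0, x_star=0.0, alpha=0.1,
                            sigma=1.0, iters=2000, rng=rng)
```

For this quadratic the update is a contraction with factor (1 - alpha) plus injected noise of size alpha*sigma, so the stationary error has variance roughly alpha*sigma^2/2: shrinking alpha shrinks the final neighbourhood but slows the linear phase, matching the trade-off the abstract describes.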

Rights

Attribution-NoDerivs 2.5 Canada