Stochastic Gradient Descent and Its Variants in Machine Learning

Abstract

Stochastic gradient descent (SGD) is a fundamental algorithm that has had a profound impact on machine learning. This article surveys important results on SGD and its variants that have arisen in machine learning.
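To fix ideas, the basic SGD update is w ← w − η ∇f_i(w), where f_i is the loss on a single randomly sampled example and η is the step size. The following minimal sketch (illustrative only; the one-dimensional least-squares problem, the synthetic data, and the step size are assumptions, not taken from the article) applies this update to fit a linear model:

```python
import random

# Hypothetical toy problem: fit y ≈ w * x by minimizing squared error,
# using synthetic data generated with true slope w = 3.
data = [(x, 3.0 * x) for x in range(1, 11)]

w = 0.0        # initial parameter
eta = 0.005    # step size (learning rate), chosen small enough to converge
random.seed(0)

for step in range(1000):
    x, y = random.choice(data)     # sample one example: the "stochastic" part
    grad = 2.0 * (w * x - y) * x   # gradient of (w*x - y)^2 with respect to w
    w -= eta * grad                # SGD update: w <- w - eta * grad

print(w)  # w approaches the true slope 3.0
```

Each iteration uses the gradient of the loss on a single example rather than the full dataset, which is what makes the method attractive at large scale: the per-step cost is independent of the number of training examples.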

Keywords

Stochastic optimization · Gradient descent · Large-scale optimization

This article belongs to the Special Issue "Recent Advances in Machine Learning."