Directional analysis of stochastic gradient descent via von Mises-Fisher distributions in deep learning
Although stochastic gradient descent (SGD) is a driving force behind the recent success of deep learning, our understanding of its dynamics in a high-dimensional parameter space is limited. In recent years, some researchers have used the stochasticity of minibatch gradients, or the signal-to-noise ratio, to better characterize the learning dynamics of SGD. Inspired by these works, we analyze SGD from a geometrical perspective by inspecting the stochasticity of the norms and directions of minibatch gradients. We propose a model of the directional concentration of minibatch gradients through the von Mises-Fisher (vMF) distribution, and show that the directional uniformity of minibatch gradients increases over the course of SGD. We empirically verify this result using deep convolutional networks and observe that the gradient stochasticity correlates more strongly with the proposed directional uniformity than with the stochasticity of the gradient norm, suggesting that the directional statistics of minibatch gradients are a major factor behind SGD.
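To make the directional-concentration idea concrete, the sketch below shows one way to estimate the vMF concentration parameter κ from a collection of minibatch gradients, using the standard closed-form approximation to its maximum-likelihood estimate (due to Banerjee et al.). This is an illustrative assumption on our part, not the paper's implementation; the function name and the synthetic example data are hypothetical.

```python
import numpy as np

def vmf_concentration(gradients):
    """Approximate the vMF concentration (kappa) of minibatch gradient directions.

    gradients: array of shape (n_batches, n_params); each row is one minibatch
    gradient flattened into a vector.

    A large kappa means the directions are tightly concentrated around a mean
    direction; kappa near 0 means they are nearly uniform on the unit sphere.
    """
    # Keep only the direction of each gradient by projecting onto the unit sphere.
    directions = gradients / np.linalg.norm(gradients, axis=1, keepdims=True)
    p = directions.shape[1]                          # ambient dimension
    r_bar = np.linalg.norm(directions.mean(axis=0))  # mean resultant length in [0, 1]
    # Closed-form approximation to the MLE of kappa: r_bar * (p - r_bar^2) / (1 - r_bar^2).
    return r_bar * (p - r_bar**2) / (1.0 - r_bar**2)

# Example: isotropic Gaussian vectors have nearly uniform directions in high
# dimension, so the estimated kappa should be small.
rng = np.random.default_rng(0)
grads = rng.normal(size=(64, 10_000))
print(vmf_concentration(grads))
```

Under this sketch, tracking the estimated kappa across training iterations would give one quantitative proxy for the directional uniformity discussed in the abstract: a decreasing kappa corresponds to minibatch gradient directions becoming more uniform.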