Asynchronous Methods in Gradient Descent

Date

2021-07

Publisher

Indian Statistical Institute, Kolkata.

Abstract

Su, Boyd and Candès ’14 [1] showed that, as the step sizes are made smaller and smaller, Nesterov Accelerated Gradient Descent converges to a second-order ODE. More recently, Arjevani et al. have established convergence results for delayed vanilla gradient descent. Our idea is to take a delayed version of Nesterov Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case.
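To make the delay model concrete, here is a minimal sketch of vanilla gradient descent with a fixed gradient delay, the setting the abstract attributes to Arjevani et al.; the dissertation's object of study is the Nesterov analogue of this same delay model. The function name `delayed_gd`, the fixed-delay assumption (each update reads an iterate a constant number of steps stale), and all parameter values are illustrative choices, not the dissertation's actual construction.

```python
import numpy as np

def delayed_gd(grad, x0, lr=0.1, delay=2, n_iters=500):
    """Vanilla gradient descent where each update uses a gradient
    evaluated at the iterate `delay` steps in the past -- a simple
    fixed-delay model of asynchronous updates (illustrative sketch)."""
    iterates = [np.asarray(x0, dtype=float)]
    for k in range(n_iters):
        stale_x = iterates[max(0, k - delay)]      # stale read of the iterate
        iterates.append(iterates[-1] - lr * grad(stale_x))
    return iterates[-1]

# Convex quadratic f(x) = ||x||^2 / 2, so grad f(x) = x.
x_star = delayed_gd(lambda v: v, np.array([5.0, -3.0]))
```

For this quadratic the update decouples into the scalar recursion x_{k+1} = x_k − lr·x_{k−delay}, whose characteristic roots all lie inside the unit circle for these parameter values, so the iterates contract toward the minimizer despite the stale gradients; replacing the plain gradient step with a Nesterov-style momentum step in the same stale-read setting is the delayed scheme the abstract proposes to analyze.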

Description

Dissertation under the supervision of Swagatam Das

Keywords

Nesterov Accelerated Gradient Descent, Asynchronous, Hogwild, DownPour SGD

Citation

33p.
