Title: Asynchronous Methods in Gradient Descent
Author: Ghosh, Koushik
Date issued: 2021-07
Date deposited: 2022-03-24
Extent: 33 p.
URI: http://hdl.handle.net/10263/7303
Language: en
Type: Other
Keywords: Nesterov Accelerated Gradient Descent; Asynchronous; Hogwild; DownPour SGD
Description: Dissertation under the supervision of Swagatam Das.
Abstract: Su, Boyd and Candes '14 [1] showed that, as the step size is made smaller and smaller, Nesterov Accelerated Gradient Descent converges to a second-order ODE. On the other hand, Arjevani has recently shown some convergence results for delayed vanilla gradient descent. Our idea is to take a delayed version of Nesterov Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case.
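
For reference, the limiting second-order ODE established in [1] for Nesterov's scheme is a standard result, restated here rather than quoted from the record:

\[
\ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f\bigl(X(t)\bigr) = 0,
\qquad X(0) = x_0,\quad \dot{X}(0) = 0,
\]

obtained from the iteration $x_k = y_{k-1} - s\,\nabla f(y_{k-1})$, $y_k = x_k + \frac{k-1}{k+2}(x_k - x_{k-1})$ in the limit $s \to 0$ under the time identification $t \approx k\sqrt{s}$.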
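
The record does not specify the delayed scheme itself. As a purely illustrative sketch, one plausible delayed variant queries the gradient at an iterate that is a fixed number of steps stale; the delay, step size, and quadratic test objective below are assumptions, not taken from the dissertation:

import numpy as np

def delayed_nag(grad, x0, step=0.01, delay=2, iters=500):
    # Illustrative sketch only: a Nesterov-style iteration in which the
    # gradient is evaluated at a query point that is `delay` iterations
    # stale (delay=0 recovers the standard accelerated scheme). The exact
    # delayed scheme studied in the dissertation is not specified here.
    x_prev = np.asarray(x0, dtype=float)
    y_hist = [x_prev.copy()]                 # history of extrapolated points
    for k in range(1, iters + 1):
        y_stale = y_hist[max(0, len(y_hist) - 1 - delay)]
        x = y_hist[-1] - step * grad(y_stale)                 # delayed gradient step
        y_hist.append(x + (k - 1) / (k + 2) * (x - x_prev))   # momentum step
        x_prev = x
    return x_prev

# Toy check on f(x) = 0.5 * ||x||^2, whose gradient is x (a convex
# objective chosen only so the example is self-contained).
print(delayed_nag(lambda x: x, x0=[5.0, -3.0]))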