Asynchronous Methods in Gradient Descent

dc.contributor.author: Ghosh, Koushik
dc.date.accessioned: 2022-03-24T05:01:58Z
dc.date.available: 2022-03-24T05:01:58Z
dc.date.issued: 2021-07
dc.description: Dissertation under the supervision of Swagatam Das [en_US]
dc.description.abstract: Su, Boyd and Candès (2014) [1] showed that, as the step size tends to zero, Nesterov's Accelerated Gradient Descent converges to a second-order ODE. More recently, Arjevani has shown convergence results for vanilla gradient descent with delayed gradients. Our idea is to take a delayed version of Nesterov's Accelerated Gradient Descent, derive its corresponding ODE, and prove convergence in the convex case. [en_US]
dc.identifier.citation: 33p. [en_US]
dc.identifier.uri: http://hdl.handle.net/10263/7303
dc.language.iso: en [en_US]
dc.publisher: Indian Statistical Institute, Kolkata. [en_US]
dc.relation.ispartofseries: Dissertation;CS-1906
dc.subject: Nesterov Accelerated Gradient Descent [en_US]
dc.subject: Asynchronous [en_US]
dc.subject: Hogwild [en_US]
dc.subject: DownPour SGD [en_US]
dc.title: Asynchronous Methods in Gradient Descent [en_US]
dc.type: Other [en_US]

Files

Original bundle

Name: Koushik Ghosh-cs-19-21.pdf
Size: 659.03 KB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Format: Item-specific license agreed upon to submission