
gdOptimization

Implementation of back-propagation paired with several first-order optimization algorithms - gradient descent, gradient descent with momentum, Nesterov's accelerated gradient, RMSProp, and Adam - to train a deep neural network for classification on the Fashion-MNIST dataset.
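For reference, below is a minimal NumPy sketch of the five update rules named above. It is illustrative only, not this repository's actual code: the function names, hyperparameter defaults, and the particular momentum/Nesterov conventions shown are assumptions (these conventions vary across references).

```python
# Sketch of the five optimizer update rules, applied to one parameter array.
# Hyperparameter names (lr, beta, beta1, beta2, eps) are assumed, not taken
# from this repository.
import numpy as np

def sgd(w, grad, lr=0.01):
    """Plain gradient descent: step directly against the gradient."""
    return w - lr * grad

def momentum(w, grad, v, lr=0.01, beta=0.9):
    """Momentum: an exponential moving average of gradients replaces the raw gradient."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def nesterov(w, grad_at_lookahead, v, lr=0.01, beta=0.9):
    """Nesterov accelerated gradient: the caller evaluates the gradient at the
    look-ahead point (w - lr * beta * v) before calling this update."""
    v = beta * v + grad_at_lookahead
    return w - lr * v, v

def rmsprop(w, grad, s, lr=0.001, beta=0.9, eps=1e-8):
    """RMSProp: scale each step by a running average of squared gradients."""
    s = beta * s + (1 - beta) * grad**2
    return w - lr * grad / (np.sqrt(s) + eps), s

def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum and RMSProp combined, with bias correction (t = step count, from 1)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2       # second-moment estimate
    m_hat = m / (1 - beta1**t)                  # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                  # bias-corrected second moment
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```

In a training loop, back-propagation produces the gradient of the loss with respect to each layer's weights and biases, and one of these rules is then applied per-parameter array, with the state terms (v, s, m, t) carried across iterations.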
