manthanb/gdOptimization

Implementation of back-propagation through different convex optimization algorithms - gradient descent, gradient descent with momentum, Nesterov's accelerated momentum, RMSProp and ADAM - for a deep neural network that performs classification on the Fashion-MNIST dataset.
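For orientation, here is a minimal NumPy sketch of the five update rules the description names, in the form usually given in the literature. The function names, signatures, and hyperparameter defaults below are illustrative assumptions, not the code in this repository.

```python
# Sketch of the five optimizer update rules for a parameter vector w and
# gradient g. All names here are hypothetical, not from this repository.
import numpy as np

def sgd_step(w, g, lr=0.01):
    # Plain gradient descent: step against the gradient.
    return w - lr * g

def momentum_step(w, g, v, lr=0.01, beta=0.9):
    # Momentum: an exponentially accumulated gradient smooths the direction.
    v = beta * v + g
    return w - lr * v, v

def nesterov_step(w, g_lookahead, v, lr=0.01, beta=0.9):
    # Nesterov: same update as momentum, but the gradient is evaluated
    # at the look-ahead point w - lr * beta * v (passed in as g_lookahead).
    v = beta * v + g_lookahead
    return w - lr * v, v

def rmsprop_step(w, g, s, lr=0.001, beta=0.9, eps=1e-8):
    # RMSProp: scale each coordinate by a running average of squared gradients.
    s = beta * s + (1 - beta) * g**2
    return w - lr * g / (np.sqrt(s) + eps), s

def adam_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # ADAM: bias-corrected first- and second-moment estimates (t starts at 1).
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g**2
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

if __name__ == "__main__":
    # Toy check: minimize f(w) = ||w||^2 with ADAM; w should approach zero.
    w = np.ones(3)
    m = v = np.zeros(3)
    for t in range(1, 501):
        g = 2 * w  # gradient of ||w||^2
        w, m, v = adam_step(w, g, m, v, t, lr=0.05)
    print(w)
```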