forked from schaul/py-optim

Gradient-based optimization algorithms in Python


bitfort/py-optim

py-optim

A collection of (stochastic) gradient descent algorithms with a unified interface.

Objective

Provide a flexible framework for experimenting with algorithm design for optimization problems that rely on (stochastic) gradients. Aspects considered:

  • minibatches
  • learning rates: fixed, adaptive, or annealed
  • preconditioning
  • momentum
  • averaging
  • ...

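As an illustration of the kind of unified interface the framework aims at, here is a minimal sketch (not py-optim's actual API; the function and parameter names are hypothetical) of minibatch SGD combining several of the aspects listed above: minibatching, an annealed learning rate, and momentum.

```python
import numpy as np

def sgd(grad_fn, x0, data, batch_size=10, lr0=0.1, anneal=1e-3,
        momentum=0.9, epochs=100, rng=None):
    """Minibatch SGD with momentum and 1/t learning-rate annealing.

    Hypothetical interface for illustration only: grad_fn(x, batch)
    returns a stochastic gradient estimate on a minibatch.
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)  # momentum buffer
    t = 0                 # global step counter for annealing
    for _ in range(epochs):
        idx = rng.permutation(len(data))  # reshuffle each epoch
        for start in range(0, len(data), batch_size):
            batch = data[idx[start:start + batch_size]]
            lr = lr0 / (1.0 + anneal * t)        # annealed step size
            v = momentum * v - lr * grad_fn(x, batch)
            x = x + v
            t += 1
    return x

# Toy usage: minimize sum_i (x - d_i)^2, whose minimizer is the mean.
data = np.array([1.0, 2.0, 3.0, 4.0])
grad = lambda x, batch: 2 * np.mean(x - batch)  # minibatch gradient
x_star = sgd(grad, x0=0.0, data=data)            # converges near 2.5
```

Swapping in a different learning-rate schedule or disabling momentum then only requires changing arguments, not the optimization loop, which is the kind of flexibility the framework targets.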
Dependencies
