# py-optim

A collection of (stochastic) gradient descent algorithms with a unified interface.

## Objective

Provide a very flexible framework for experimenting with algorithm design for optimization problems that rely on (stochastic) gradients. Issues considered (see the sketch after this list):

- minibatches
- learning rates: fixed, adaptive, annealing
- preconditioning
- momentum
- averaging
- ...
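To make the listed ingredients concrete, here is a minimal, self-contained sketch in plain NumPy of minibatch SGD with momentum and an annealing learning rate. It is purely illustrative and does not use py-optim's actual API; the function `sgd` and all parameter names below are hypothetical.

```python
import numpy as np

def sgd(grad_fn, w, data, batch_size=32, lr0=0.1, anneal=1e-3,
        momentum=0.9, epochs=10, rng=None):
    # Hypothetical minibatch SGD loop, not py-optim's interface.
    rng = rng or np.random.default_rng(0)
    velocity = np.zeros_like(w)
    step = 0
    for _ in range(epochs):
        rng.shuffle(data)                              # new minibatch order each epoch
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]     # minibatch
            lr = lr0 / (1.0 + anneal * step)           # annealing learning rate
            velocity = momentum * velocity - lr * grad_fn(w, batch)
            w = w + velocity                           # momentum update
            step += 1
    return w

# Toy least-squares problem: recover true_w from noisy observations.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])
X = rng.normal(size=(1000, 2))
y = X @ true_w + 0.1 * rng.normal(size=1000)
data = np.hstack([X, y[:, None]])

def grad_fn(w, batch):
    Xb, yb = batch[:, :2], batch[:, 2]
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(batch)

print(sgd(grad_fn, np.zeros(2), data))   # should approach [2, -3]
```

The framework's goal is to let such pieces (batching, schedules, preconditioning, momentum, averaging) be swapped independently behind a unified interface, rather than hard-coded into one loop as above.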

## Dependencies