Andres-Hernandez/py-optim
py-optim

A collection of (stochastic) gradient descent algorithms with a unified interface.

Objective

Provide a flexible framework for experimenting with algorithm design on optimization problems that rely on (stochastic) gradients. Design issues considered:

  • minibatches
  • learning rates (fixed, adaptive, annealed)
  • preconditioning
  • momentum
  • averaging
  • ...
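The ingredients above can be combined in a few lines of NumPy. The following is a minimal sketch (not py-optim's actual API — the function name, parameters, and annealing schedule are illustrative assumptions) of stochastic gradient descent with minibatches, an annealed learning rate, and momentum, applied to a least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = A @ w_true + noise.
A = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = A @ w_true + 0.01 * rng.normal(size=200)

def sgd(A, y, lr0=0.2, momentum=0.8, batch_size=20, epochs=100):
    """Hypothetical minibatch SGD with momentum and 1/t annealing."""
    w = np.zeros(A.shape[1])
    v = np.zeros_like(w)
    t = 0
    for _ in range(epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch_size):
            b = idx[start:start + batch_size]
            # Minibatch gradient of the mean squared error.
            g = 2 * A[b].T @ (A[b] @ w - y[b]) / len(b)
            lr = lr0 / (1 + 0.01 * t)      # annealed learning rate
            v = momentum * v - lr * g      # momentum (heavy-ball) update
            w = w + v
            t += 1
    return w

w_hat = sgd(A, y)
```

Swapping the annealing schedule, the momentum rule, or the minibatch sampler in a sketch like this is exactly the kind of experimentation the framework aims to make systematic.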

Dependencies
