badalyananna/optimization-semi-supervised-learning

This repository contains four optimization algorithms for semi-supervised learning: Gradient Descent, Randomized BCGD, Cyclic BCGD, and BCGD with the Gauss-Southwell rule.
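As a minimal illustration of the baseline method (a generic sketch, not this repository's code), gradient descent repeatedly steps against the gradient of the objective:

```python
import numpy as np

def gradient_descent(grad, x0, stepsize=0.1, n_iter=100):
    """Plain gradient descent: x <- x - stepsize * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - stepsize * grad(x)
    return x

# Toy example: minimize f(x) = ||x||^2, whose gradient is 2x;
# the minimizer is the origin.
x_min = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
```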

common.py contains the functions that compute the gradient, the objective function, and the stepsizes.
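A common fixed-stepsize choice for this family of methods is 1/L, where L is the Lipschitz constant of the gradient; the sketch below illustrates that choice on a hypothetical quadratic objective and is not necessarily what common.py implements:

```python
import numpy as np

# Hypothetical quadratic objective f(x) = 0.5 * x^T A x - b^T x,
# whose gradient is A x - b; the Lipschitz constant of the gradient
# is the largest eigenvalue of A.
A = np.array([[2.0, 0.0], [0.0, 4.0]])
b = np.array([1.0, 1.0])

def objective(x):
    return 0.5 * x @ A @ x - b @ x

def gradient(x):
    return A @ x - b

L = np.linalg.eigvalsh(A).max()  # largest eigenvalue of A
stepsize = 1.0 / L               # classical fixed stepsize
```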

GradientDescent contains the implementation of the Gradient Descent algorithm.

RandomBCGD.py contains the implementation of the Randomized BCGD algorithm.

CyclicBCDG.py contains our implementation of the Cyclic BCGD algorithm.

GaussSouthwellBCDG.py contains our implementation of the Gauss Southwell BCGD algorithm.
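The three BCGD variants differ only in how the coordinate (block) to update is chosen at each iteration. A hedged sketch of the three selection rules, assuming single-coordinate blocks (the repository's actual block structure may differ):

```python
import numpy as np

def bcgd(grad, x0, stepsize, n_iter, rule="cyclic", rng=None):
    """Block coordinate gradient descent with three selection rules.

    Each iteration updates a single coordinate i:
      - "cyclic":          i cycles through 0, 1, ..., d-1, 0, ...
      - "random":          i is drawn uniformly at random
      - "gauss-southwell": i = argmax_j |grad(x)_j| (steepest coordinate)
    """
    x = np.asarray(x0, dtype=float)
    rng = rng or np.random.default_rng(0)
    d = x.size
    for k in range(n_iter):
        g = grad(x)
        if rule == "cyclic":
            i = k % d
        elif rule == "random":
            i = int(rng.integers(d))
        else:  # Gauss-Southwell rule
            i = int(np.argmax(np.abs(g)))
        x[i] -= stepsize * g[i]
    return x
```

The Gauss-Southwell rule typically makes more progress per iteration than random or cyclic selection, at the cost of evaluating the full gradient to find the steepest coordinate.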

config.yml contains the parameters used to run the algorithms in main.ipynb.

main.ipynb shows the results of running the algorithms on randomly generated data points as well as on a real dataset.

Authors

The current version of the code is implemented by Anna Badalyan, based on the group work done by:
  1. Abhishek Varma Dasaraju
  2. Anna Badalyan
  3. Brenda Eloisa Tellez Juarez
  4. Rebecca Di Francesco
