intsystems/gradhpo
Title of the research project: Short-Horizon Gradient-Based Hyperparameter Optimization
Type of research work: Research Project
Authors: Eynullayev Altay, Rubtsov Denis, Karpeev Gleb

Abstract

Hyperparameter optimization is a fundamental challenge in modern machine learning: suitable hyperparameters must be selected using a validation dataset. Gradient-based methods address this via bilevel optimization, enabling optimization over billion-dimensional search spaces, far beyond the reach of classical approaches such as grid search or Bayesian optimization. This project implements and wraps key gradient-based HPO algorithms as a reusable JAX library: T1-T2 with the DARTS numerical approximation, Generalized Greedy Gradient-Based HPO, and Online HPO with Hypergradient Distillation. The library provides a unified API suitable for a broad class of tasks, with full documentation and automated testing.
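To illustrate the short-horizon idea behind these methods (not the gradhpo API itself), the sketch below computes a hypergradient in plain JAX: it takes one unrolled SGD step on the training loss, then differentiates the validation loss through that step with respect to a hyperparameter (here, a hypothetical log weight decay on a linear model). All function and variable names are illustrative assumptions, not part of the library.

```python
import jax
import jax.numpy as jnp

def train_loss(w, x, y, weight_decay):
    # Regularized squared error on the training split.
    pred = x @ w
    return jnp.mean((pred - y) ** 2) + weight_decay * jnp.sum(w ** 2)

def val_loss_after_step(log_wd, w, lr, x_tr, y_tr, x_val, y_val):
    # Parameterize weight decay on a log scale so it stays positive.
    weight_decay = jnp.exp(log_wd)
    g = jax.grad(train_loss)(w, x_tr, y_tr, weight_decay)
    w_new = w - lr * g                      # one unrolled inner SGD step
    # Validation loss of the updated weights: the outer objective.
    return jnp.mean((x_val @ w_new - y_val) ** 2)

# d(validation loss)/d(log weight decay), differentiated through the inner step.
hypergrad = jax.grad(val_loss_after_step)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (3,))
x_tr = jax.random.normal(key, (8, 3)); y_tr = x_tr @ jnp.ones(3)
x_val = jax.random.normal(jax.random.PRNGKey(1), (8, 3)); y_val = x_val @ jnp.ones(3)

g = hypergrad(jnp.log(1e-2), w, 0.1, x_tr, y_tr, x_val, y_val)
print(g)  # a scalar hypergradient
```

Longer horizons unroll more inner steps before differentiating; the methods collected here (T1-T2, greedy gradient-based HPO, hypergradient distillation) differ mainly in how they approximate or amortize this hypergradient.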

Library Planning

Can be found here.

Software modules developed as part of the study

  1. A Python package gradhpo with the full implementation here.
  2. Code for all experiment visualisations here; it can be run in Colab.

About

A collection of short-horizon gradient-based hyperparameter optimization algorithms
