
Research problem: Short-Horizon Gradient-Based Hyperparameter Optimization
Type of work: Research Project
Authors: Eynullayev Altay, Rubtsov Denis, Karpeev Gleb

Abstract

Hyperparameter optimization (HPO) is a fundamental challenge in modern machine learning: suitable hyperparameters must be selected using a validation dataset. Gradient-based methods cast this as a bilevel optimization problem, enabling search over billion-dimensional hyperparameter spaces, far beyond the reach of classical approaches such as grid search or Bayesian optimization. This project implements key gradient-based HPO algorithms as a reusable JAX library: T1-T2 with the DARTS numerical approximation, Generalized Greedy Gradient-Based HPO, and Online HPO with Hypergradient Distillation. The library provides a unified API suitable for a broad class of tasks, with full documentation and automated testing.
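To illustrate the short-horizon, gradient-based idea behind methods such as T1-T2, here is a minimal self-contained sketch in JAX. It is not the library's actual API: the problem (ridge regression with the L2 strength as the hyperparameter) and all names (`train_loss`, `val_loss`, `t1t2_hypergrad`) are illustrative assumptions. The key point is that one inner training step is unrolled and the validation loss is differentiated through it with respect to the hyperparameter.

```python
import jax
import jax.numpy as jnp

# Toy bilevel problem (assumed for illustration): inner weights w minimize a
# training loss with an L2 penalty whose strength lam is the hyperparameter;
# the outer objective is the validation loss.

def train_loss(w, lam, x, y):
    return jnp.mean((x @ w - y) ** 2) + lam * jnp.sum(w ** 2)

def val_loss(w, x, y):
    return jnp.mean((x @ w - y) ** 2)

def t1t2_hypergrad(w, lam, train_data, val_data, lr=0.1):
    """Short-horizon (T=1) hypergradient: differentiate the validation loss
    through a single inner SGD step with respect to lam."""
    def outer(lam):
        g = jax.grad(train_loss)(w, lam, *train_data)  # inner gradient
        w_next = w - lr * g                            # one unrolled step
        return val_loss(w_next, *val_data)
    return jax.grad(outer)(lam)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 4))
w_true = jnp.array([1.0, -2.0, 0.5, 3.0])
y = x @ w_true
w = 0.5 * jnp.ones(4)  # nonzero so the penalty term affects the step
hg = t1t2_hypergrad(w, 0.1, (x[:16], y[:16]), (x[16:], y[16:]))
print(hg)
```

In a full HPO loop this hypergradient would drive an outer update of `lam` (e.g. `lam -= outer_lr * hg`) interleaved with inner training steps; longer horizons or the DARTS finite-difference trick replace the exact second-order term when unrolling is too costly.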

Library Planning

The library plan can be found here.

Software modules developed as part of the study

  1. A Python package gradhpo containing the full implementation, available here.
  2. Code with all experiment visualisations, available here (can be run in Colab).