| Project title: | Short-Horizon Gradient-Based Hyperparameter Optimization |
|---|---|
| Type of work: | Research Project |
| Authors: | Eynullayev Altay, Rubtsov Denis, Karpeev Gleb |
Hyperparameter optimization is a fundamental challenge in modern machine learning: given a validation dataset, one must select suitable hyperparameters. Gradient-based methods cast this as a bilevel optimization problem, which makes it possible to optimize over billion-dimensional search spaces, far beyond the reach of classical approaches such as grid search or Bayesian optimization. This project implements and packages key gradient-based HPO algorithms as a reusable JAX library: T1-T2 with the DARTS numerical approximation, Generalized Greedy Gradient-Based HPO, and Online HPO with Hypergradient Distillation. The library provides a unified API suitable for a broad class of tasks, with full documentation and automated testing.
The library can be found here.
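To illustrate the core idea behind short-horizon gradient-based HPO, here is a minimal, self-contained JAX sketch (not the library's actual API): the inner problem is a few SGD steps on a regularized training loss, and the hypergradient of the validation loss with respect to the regularization strength is obtained by differentiating through the unrolled steps. All names and the toy data are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# Toy data (hypothetical): a tiny least-squares problem with a
# train/validation split.
X_tr = jnp.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y_tr = jnp.array([1.0, 2.0, 3.0])
X_val = jnp.array([[1.0, 2.0]])
y_val = jnp.array([5.0])

def train_loss(w, lam):
    # Inner objective: MSE plus L2 penalty; lam is the hyperparameter.
    resid = X_tr @ w - y_tr
    return jnp.mean(resid ** 2) + lam * jnp.sum(w ** 2)

def val_loss(w):
    # Outer objective: validation MSE, a function of the trained weights.
    return jnp.mean((X_val @ w - y_val) ** 2)

def short_horizon_val_loss(lam, w0, lr=0.1, steps=5):
    # Unroll a short horizon of inner SGD steps, then evaluate on
    # validation data; JAX differentiates through the whole unroll.
    w = w0
    for _ in range(steps):
        w = w - lr * jax.grad(train_loss)(w, lam)
    return val_loss(w)

w0 = jnp.zeros(2)
# Hypergradient: d(validation loss) / d(lam) through the unrolled steps.
hypergrad = jax.grad(short_horizon_val_loss)(jnp.array(0.5), w0)
print(float(hypergrad))
```

A full implementation would replace the explicit unroll with the cheaper approximations named above (e.g. the T1-T2 / DARTS finite-difference form of the second-order term), but the bilevel structure is the same.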