This repository contains the source code for the Poisson inverse problem experiment in the paper "Fast minimization of expected logarithmic loss via stochastic dual averaging," accepted at AISTATS 2024.
- Tested on Julia Version 1.9.2
- Set the dimension and the number of samples in lines 23 to 25 of main.jl
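For orientation, the parameter block in main.jl might look like the following sketch. The variable names and values here are hypothetical, chosen only to illustrate the kind of settings being edited; check the actual names used in main.jl.

```julia
# Hypothetical sketch of the experiment parameters (lines 23–25 of main.jl);
# the actual variable names and defaults may differ.
d = 100     # dimension of the unknown nonnegative vector
n = 10_000  # number of samples (rows of the measurement matrix)
```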
$ cd pip/
$ julia ./install.jl
$ julia ./main.jl
- Expectation maximization (EM): L. A. Shepp and Y. Vardi, Maximum likelihood reconstruction for emission tomography, IEEE Trans. Med. Imaging, 1982 (link) and Thomas M. Cover, An algorithm for maximizing expected log investment return, IEEE Trans. Inf. Theory, 1984 (link)
- Primal-dual hybrid gradient method (PDHG): Antonin Chambolle and Thomas Pock, A first-order primal-dual algorithm for convex problems with applications to imaging, J. Math. Imaging Vis., 2011 (link)
- NoLips: Heinz H. Bauschke, Jérôme Bolte, Marc Teboulle, A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications, Math. Oper. Res., 2017 (link)
- Entropic mirror descent with Armijo line search (EMD): Yen-Huan Li and Volkan Cevher, Convergence of the exponentiated gradient method with Armijo line search, J. Optim. Theory Appl., 2019 (link)
- Frank-Wolfe (FW): Renbo Zhao and Robert M. Freund, Analysis of the Frank–Wolfe method for convex composite optimization involving a logarithmically-homogeneous barrier, Math. Program., 2023 (link)
- Stochastic primal-dual hybrid gradient (SPDHG): Antonin Chambolle, Matthias J. Ehrhardt, Peter Richtárik, and Carola-Bibiane Schönlieb, Stochastic primal-dual hybrid gradient algorithm with arbitrary sampling and imaging applications, SIAM J. Optim., 2018 (link)
- Stochastic Soft-Bayes (SSB): Yen-Huan Li, Online positron emission tomography by online portfolio selection, Int. Conf. Acoustics, Speech, and Signal Processing (ICASSP), 2020 (link)
- Stochastic LB-OMD (SLBOMD): Chung-En Tsai, Hao-Chung Cheng, and Yen-Huan Li, Faster stochastic first-order method for maximum-likelihood quantum state tomography, Int. Conf. Quantum Information Processing (QIP), 2023 (link)
- 1-sample LB-SDA: this work
- $d$-sample LB-SDA: this work