Arya Akhavan
About me
I am currently a postdoctoral researcher at CMAP, École Polytechnique de Paris. I am interested in all aspects of optimization, and mainly in its applications to statistical inference.
I did my PhD at CREST, ENSAE, Institut Polytechnique de Paris, and at the Istituto Italiano di Tecnologia (IIT), Genova, where I was fortunate to be advised by Alexandre B. Tsybakov and Massimiliano Pontil. I defended my thesis on February 3, 2023. You can find my PhD thesis here.
Education
PhD in Theoretical Statistics
ENSAE, Institut Polytechnique de Paris, and Istituto Italiano di Tecnologia (IIT), Genova, September 2019 – February 2023
Supervisors: Alexandre B. Tsybakov and Massimiliano Pontil
Master (M2) MASH, Learning and Applications
Paris Dauphine University and École Normale Supérieure Paris-Saclay, Fall 2017 – December 2018
First Year of a Master's in Mathematics
Sharif University of Technology, 2016 – 2017
B.Sc. in Mathematics
Department of Mathematics, Statistics and Computer Sciences, University of Tehran, 2011 – 2016
High School Diploma in Mathematics and Physics
Rahyar High School, Tehran, Iran, 2007 – 2011
Work Experience
Research Fellow at Istituto Italiano di Tecnologia (IIT), Genova, Spring 2019
Supervisors: Alexandre B. Tsybakov and Massimiliano Pontil
Internship at ENSAE, Institut Polytechnique de Paris, Spring and Fall 2018
Supervisor: Alexandre B. Tsybakov
Publications and Preprints
Estimating the minimizer and the minimum value of a regression function under passive design
A. Akhavan, D. Gogolashvili, and A. B. Tsybakov
arXiv preprint arXiv:2211.16457. To appear in the Journal of Machine Learning Research (JMLR). 2024
Gradient-free optimization of highly smooth functions: improved analysis and a new algorithm
A. Akhavan, E. Chzhen, M. Pontil, and A. B. Tsybakov
arXiv preprint arXiv:2306.02159. Under revision at the Journal of Machine Learning Research (JMLR). 2023
Group meritocratic fairness in linear contextual bandits
R. Grazzi, A. Akhavan, J. Falk, L. Cella, and M. Pontil
Advances in Neural Information Processing Systems (NeurIPS). 2022
A gradient estimator via L1-randomization for online zero-order optimization with two point feedback
A. Akhavan, E. Chzhen, M. Pontil, and A. B. Tsybakov
Advances in Neural Information Processing Systems (NeurIPS). 2022
Distributed zero-order optimization under adversarial noise
A. Akhavan, M. Pontil, and A. B. Tsybakov
Advances in Neural Information Processing Systems (NeurIPS). 2021
Exploiting higher order smoothness in derivative-free optimization and continuous bandits
A. Akhavan, M. Pontil, and A. B. Tsybakov
Advances in Neural Information Processing Systems (NeurIPS). 2020