The price of computational efficiency in modern machine learning

February 13, 2023
3:00-4:00pm ET
Cummings 601
Speaker: Ilias Zadik
Host: Diane Souvaine

Abstract

In recent years, we have experienced remarkable growth in the number and size of available datasets. This growth has led to an intense and challenging pursuit of data-analysis methods that are provably both computationally efficient (i.e., polynomial-time) and statistically accurate. Notably, the analysis of polynomial-time methods has revealed intriguing phenomena in several learning tasks, such as the apparent failure of such methods to reach the information-theoretically optimal statistical guarantees, that is, the presence of a non-trivial “computational-statistical trade-off”.

In this talk, I will present new algorithmic results of this kind for the well-studied planted clique model and for the fundamental sparse regression model. For planted clique, we reveal a surprisingly severe failure of the Metropolis process to work in polynomial time, even when simple degree heuristics succeed. In particular, our result resolves a well-known, 30-year-old open problem on the performance of the Metropolis process for this model, posed by Jerrum in 1992. For sparse regression, we show the failure of large families of polynomial-time estimators, such as MCMC and low-degree polynomial methods, to improve upon the best-known polynomial-time regression methods. As an outcome, our work offers rigorous evidence that regression methods popular in practice, such as the LASSO, optimally balance their computational and statistical resources.
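To make the planted clique setting concrete, here is a minimal sketch (not from the talk; all parameter choices are illustrative) of the model and of the kind of simple degree heuristic the abstract alludes to: plant a k-clique in an Erdős–Rényi graph G(n, 1/2) and guess the clique as the k highest-degree vertices, an approach known to succeed once k is on the order of sqrt(n log n).

import numpy as np

rng = np.random.default_rng(0)

def planted_clique(n, k):
    # Sample G(n, 1/2) and plant a clique on k uniformly chosen vertices.
    A = np.triu(rng.integers(0, 2, size=(n, n)), 1)
    A = A + A.T                        # symmetric 0/1 adjacency, zero diagonal
    clique = rng.choice(n, size=k, replace=False)
    A[np.ix_(clique, clique)] = 1      # fully connect the planted set
    np.fill_diagonal(A, 0)
    return A, set(clique)

def degree_heuristic(A, k):
    # Guess the planted clique as the k vertices of largest degree.
    return set(np.argsort(A.sum(axis=1))[-k:])

n, k = 2000, 250                       # k well above sqrt(n log n) ~ 123
A, truth = planted_clique(n, k)
guess = degree_heuristic(A, k)
print(f"degree heuristic recovered {len(guess & truth)} of {k} clique vertices")

The abstract's point is that even at clique sizes where such naive heuristics succeed, the Metropolis process provably fails to find the clique in polynomial time.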

Bio: Ilias Zadik is a postdoctoral researcher in the Mathematics Department at MIT, mentored by Professor Mossel and Professor Sun. Prior to this, he was an independent postdoctoral fellow at the Center for Data Science at NYU. He received his PhD in 2019 from the Operations Research Center at MIT under the supervision of Professor Gamarnik.

His research spans high-dimensional statistics, the theory of machine learning, and applied probability. He is particularly interested in the study of computationally efficient estimators in machine learning, the interplay of differential privacy and high-dimensional estimation, and the emergence of sharp statistical phase transitions.