Adrian S. Lewis

Past Awards

2020
John von Neumann Theory Prize: Winner(s)

The 2020 INFORMS John von Neumann Theory Prize is awarded to Adrian S. Lewis for his fundamental and sustained contributions to continuous optimization, operations research, and, more broadly, computational science. His work has pushed the frontiers of nonlinear optimization and convex analysis and developed path-breaking theory that has led to much subsequent work. The clarity and elegance of his writing are well known and admired. Through scholarly papers, research monographs, and mentorship, he has influenced several generations of optimization researchers, as well as practitioners.

Professor Lewis has published seminal work on a wide range of topics, including eigenvalue optimization, quasi-Newton algorithms, gradient sampling methods and control, activity identification via partial smoothness, alternating projection methods, conditioning and error bounds, semi-algebraic variational analysis and the Kurdyka-Łojasiewicz inequality, and hyperbolic polynomials. His results on convex analysis over Hermitian matrices opened the door to the subdifferential analysis of such functions, as well as to a duality and sensitivity theory for optimization problems involving them. Together with Burke and Overton, he produced a series of papers leading to a deep understanding of the variational behavior of spectral functions, including the spectral radius. His convergence guarantees for alternating/cyclic projection methods, in both convex and nonconvex settings, apply to the problem of finding a point in the intersection of finitely many sets, a prototypical problem in computational mathematics.

A consistent theme in Professor Lewis's work is bringing variational-analytic tools and computation closer together. For example, his recent paper with Drusvyatskiy and Ioffe proves that, under a natural transversality condition described in variational-analytic terms, the method of alternating projections converges locally at a linear rate. His more recent work has focused on understanding the impact of variational-analytic notions of stability on the linear and quadratic convergence rates of Gauss-Newton-type methods for minimizing compositions of convex functions and smooth maps. These results have implications for a number of fundamental problems, including phase retrieval, matrix factorization, and robust principal component analysis.
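The alternating projections method mentioned in the citation admits a very short illustration. The Python sketch below is purely illustrative and is not drawn from Professor Lewis's papers: the two convex sets (a Euclidean ball and a halfspace), the starting point, and the tolerance are all assumptions chosen for the example. It alternates exact Euclidean projections between the two sets until successive iterates stabilize; under a transversality condition of the kind the citation describes, such iterates converge linearly to a point in the intersection.

import numpy as np

def project_ball(x, center, radius):
    # Euclidean projection onto the ball {y : ||y - center|| <= radius}.
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def project_halfspace(x, a, b):
    # Euclidean projection onto the halfspace {y : a.y <= b}.
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def alternating_projections(x, center, radius, a, b, tol=1e-10, max_iter=1000):
    # Alternate exact projections onto the two sets until the
    # iterates stop moving (up to the given tolerance).
    for _ in range(max_iter):
        y = project_ball(x, center, radius)
        x_new = project_halfspace(y, a, b)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: unit ball at the origin intersected with {y : y1 + y2 <= 0.5}.
x_star = alternating_projections(
    np.array([3.0, 3.0]),
    center=np.zeros(2), radius=1.0,
    a=np.array([1.0, 1.0]), b=0.5)
print(x_star)  # a point (approximately) in both sets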

2018
INFORMS Computing Society Prize: Winner(s)