Sparse and Smoothing Methods for Nonlinear Optimization of Complex Models

Research grant PTDC/MAT/116736/2010 funded by FCT.

January 2012 - December 2014

The goal of this research project is to study and apply techniques from sparse optimization and smoothing methods to optimization problems governed by complex models or involving expensive-to-evaluate functions.

Sparse optimization has seen enormous development due to its application in the field of compressed sensing. However, little is known outside the now classical linear l1 or lp recovery settings, in particular when nonlinearity occurs or derivatives are absent. Smoothing techniques are also well studied in variational analysis and classical nonsmooth optimization, but have not yet been applied to the type of problems of interest to us.

Thus, we seek answers to open problems in nonlinear optimization and scientific computing by framing sparse optimization and smoothing methods in new contexts and applying them from novel perspectives.

In many application problems in derivative-free optimization (DFO) there is little or no correlation between problem variables, and such (sparsity) structure is unknown in advance. One of the main contributions of this project will be the development of l1-minimization interpolation modeling for functions with sparse second-order derivatives. Preliminary results show that, when the Hessian is sparse, it is possible to compute quadratic interpolants from random sample sets using significantly fewer than O(n^2) points, while enjoying, with high probability, the same accuracy properties.
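The idea above can be sketched numerically. The following toy example (all dimensions and the test function are hypothetical choices for illustration) recovers a sparse Hessian from far fewer interpolation conditions than the n(n+1)/2 entries would normally require, by solving an l1-minimization problem cast as a linear program with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 10                      # dimension (hypothetical)
N = n * (n + 1) // 2        # number of distinct Hessian entries (55 here)
m = 30                      # random sample points, well below N

# hypothetical sparse symmetric Hessian with only 4 nonzero distinct entries
H = np.zeros((n, n))
for i, j, v in [(0, 0, 2.0), (1, 1, 1.0), (1, 4, 0.8), (3, 3, -1.5)]:
    H[i, j] = H[j, i] = v
g = rng.standard_normal(n)  # gradient at the base point, assumed known

def f(x):                   # quadratic test function, f(0) = 0
    return g @ x + 0.5 * x @ H @ x

# interpolation conditions: f(d) - g.d = (1/2) d'Hd = <phi(d), h>
# where h stacks the distinct (upper-triangular) Hessian entries
idx = [(i, j) for i in range(n) for j in range(i, n)]
D = rng.standard_normal((m, n))
A = np.array([[0.5 * d[i] * d[j] if i == j else d[i] * d[j]
               for (i, j) in idx] for d in D])
b = np.array([f(d) - g @ d for d in D])

# min ||h||_1  s.t.  A h = b, as an LP with h = u - v, u, v >= 0
res = linprog(c=np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=(0, None), method="highs")
h = res.x[:N] - res.x[N:]

H_rec = np.zeros((n, n))
for k, (i, j) in enumerate(idx):
    H_rec[i, j] = H_rec[j, i] = h[k]
print(np.max(np.abs(H_rec - H)))   # small: Hessian recovered from m << N points
```

Note that the underdetermined system (30 equations, 55 unknowns) has infinitely many quadratic interpolants; the l1 objective selects the sparse one with high probability, in the spirit of compressed sensing.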

Such a result has a number of repercussions inside and outside optimization and deserves extensive study. An obvious application is in fact in DFO, where the repeated building of such models is expensive and therefore poses challenging computational issues. To our knowledge, this is the first application of compressed sensing to optimization. We remark that the sparse modeling component of this part of the project is linear, while its use is clearly nonlinear. Most of the work here is part of a collaboration with K. Scheinberg and involves students in Coimbra, Princeton (Afonso Bandeira), and Lehigh. Also included in this part of the project is a very challenging application of interest to us, data assimilation (an ongoing collaboration with CERFACS and S. Gratton), where reducing the problem size and understanding variable interaction are essential, opening the door to the use of our sparse Hessian techniques.

As said before, in the past few years the notion of sparsity has been explored extensively from both theoretical and application standpoints. Despite their impressive effectiveness, the resulting approaches are limited to linear observation operators. However, most phenomena are nonlinear, which greatly limits the utility of these linear sparse approaches for a broad range of applications. In fact, while sparsity may account for a notable amount of structure, it accounts only for a rather simplistic form of it.

In this project we will also study a generalization of sparse recovery and compressed sensing to nonlinear observation operators. Problems of this type emerge in a broad range of applications, such as hydrology and electrical impedance tomography, among many others. Thus, we will be interested in the solution of nonlinear l1-minimization problems, for which there are virtually no algorithms available. In this part of the project we will move somewhat in the opposite direction from before, in the sense that the sparse modeling component is clearly nonlinear, while the linear approach appears in the algorithmic development, namely in the linear sparse subproblem approximation. This component of the project will be joint work with IBM Research (L. Horesh and A. R. Conn) and will involve a number of Coimbra's students, given the complexity of both the modeling and the optimization aspects of the task.
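To illustrate what a "linear sparse subproblem approximation" might look like, here is a minimal sketch (the observation operator, dimensions, and iteration count are all hypothetical): a nonlinear l1-minimization problem min ||x||_1 s.t. F(x) = y is attacked by successive linearization, where each subproblem is a standard linear basis-pursuit problem solved as an LP:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m = 40, 25
A = rng.standard_normal((m, n)) / np.sqrt(m)

# hypothetical mildly nonlinear observation operator and its Jacobian
def F(x):
    z = A @ x
    return z + 0.1 * z**2

def J(x):
    z = A @ x
    return A + 0.2 * z[:, None] * A      # = (I + 0.2 diag(Ax)) A

x_true = np.zeros(n)
x_true[[2, 7, 11]] = [1.0, -0.5, 0.8]    # sparse signal to recover
y = F(x_true)

def l1_affine(Jk, rhs):
    """Linear sparse subproblem: min ||x||_1 s.t. Jk x = rhs (x = u - v)."""
    N = Jk.shape[1]
    res = linprog(np.ones(2 * N), A_eq=np.hstack([Jk, -Jk]), b_eq=rhs,
                  bounds=(0, None), method="highs")
    return res.x[:N] - res.x[N:]

# successive linearization: F(xk) + J(xk)(x - xk) = y
x = np.zeros(n)
for _ in range(10):
    Jk = J(x)
    x = l1_affine(Jk, y - F(x) + Jk @ x)

print(np.max(np.abs(x - x_true)))        # small if recovery succeeded
```

The starting iterate makes the first subproblem ordinary basis pursuit; subsequent subproblems correct for the nonlinearity. This is only a sketch, and the project's concern is precisely that no algorithms with guarantees exist for this class of problems.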

On the other hand, we will apply smoothing methods to DFO problems, aiming at providing (at least partial) answers to two open problems: (i) the development and analysis of interpolation-based trust-region methods for nonsmooth functions; (ii) the analysis of the worst-case complexity of DFO methods (in particular direct search), again in the presence of nonsmoothness. Smoothing methods and related techniques will also play an important role in the other components of the project, as we heavily need fast and reliable methods for the solution of l1- and lp-minimization subproblems.

**Papers:**

- Le Thi Hoai An, A. I. F. Vaz, and L. N. Vicente, Optimizing radial basis functions by D.C. programming and its use in direct search for global derivative-free optimization, TOP, 20 (2012) 190-214.
- A. S. Bandeira, K. Scheinberg, and L. N. Vicente, Computation of sparse low degree interpolating polynomials and their application to derivative-free optimization, Mathematical Programming, 134 (2012) 223-257.
- A. S. Bandeira, K. Scheinberg, and L. N. Vicente, On partially sparse recovery, preprint 11-13, Dept. Mathematics, Univ. Coimbra.
- A. S. Bandeira, K. Scheinberg, and L. N. Vicente, Convergence of trust-region methods based on probabilistic models, SIAM Journal on Optimization, 24 (2014) 1238-1264.
- E. Bergou, S. Gratton, and L. N. Vicente, Levenberg-Marquardt methods based on probabilistic gradient models and inexact subproblem solution, with application to data assimilation, submitted.
- R. P. Brito and L. N. Vicente, Efficient cardinality/mean-variance portfolios, in System Modeling and Optimization, Springer series IFIP Advances in Information and Communication Technology, edited by C. Pötzsche, C. Heuberger, B. Kaltenbacher, and F. Rendl, 2014.
- A. R. Conn and L. N. Vicente, Bilevel derivative-free optimization and its application to robust optimization, Optimization Methods and Software, 27 (2012) 561-577.
- Y. Diouane, S. Gratton, and L. N. Vicente, Globally convergent evolution strategies, Mathematical Programming, 152 (2015) 467-490.
- Y. Diouane, S. Gratton, and L. N. Vicente, Globally convergent evolution strategies for constrained optimization, Computational Optimization and Applications, 62 (2015) 323-346.
- M. Dodangeh and L. N. Vicente, Worst case complexity of direct search under convexity, Mathematical Programming, 155 (2016) 307-332.
- R. Garmanjani and L. N. Vicente, Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization, IMA Journal of Numerical Analysis, 33 (2013) 1008-1028.
- S. Gratton, C. W. Royer, L. N. Vicente, and Z. Zhang, Direct search based on probabilistic descent, SIAM Journal on Optimization, 25 (2015) 1515-1541.
- S. Gratton and L. N. Vicente, A surrogate management framework using rigorous trust-region steps, Optimization Methods and Software, 29 (2014) 10-23.
- S. Gratton and L. N. Vicente, A merit function approach for direct search, SIAM Journal on Optimization, 24 (2014) 1980-1998.
- Le Thi Hoai An, Huynh Van Ngai, Pham Dinh Tao, A. I. F. Vaz, and L. N. Vicente, Globally convergent DC trust-region methods, Journal of Global Optimization, 59 (2014) 209-225.
- L. N. Vicente, Worst case complexity of direct search, EURO Journal on Computational Optimization, 1 (2013) 143-153.
- L. N. Vicente and A. L. Custódio, Analysis of direct searches for discontinuous functions, Mathematical Programming, 133 (2012) 299-325.