Luís Nunes Vicente

Departamento de Matemática
Faculdade de Ciências e Tecnologia
Universidade de Coimbra
3001-501 Coimbra, Portugal

Office: 5.12
Phone: 351 239 791171
Fax: 351 239 793069
E-mail: lnv{at}mat.uc.pt

Welcome to my page.

See my Google Scholar Profile.

See my Vitae in pdf format or a Short Bio. See also a two-page CV in Portuguese.

There is more information as one scrolls down ... papers, DFO book ...


August 3, 2015: I am on vacation... from academic bureaucracy! I am reading "Évariste", a historical novel based on the life of Évariste Galois (by François-Henri Désérable, Gallimard, 2015).

July 26, 2015: In the seventies/eighties, papers on algorithms for nonlinear optimization would start with "local convergence", and in the eighties/nineties with "global convergence". At that time, there was huge excitement about the polynomial complexity of interior-point methods. I still remember well Professor Richard Tapia once saying in class that such polynomial complexity was a qualified form of global convergence (indeed, the starting point was arbitrary).

In the nonlinear nonconvex case, algorithms based on Newton or quasi-Newton methods were analyzed purely from the local, global, and global-to-local viewpoints, which made sense. Even so, twenty or thirty years ago, optimizers were already developing WCC bounds/global rates for nonlinear optimization algorithms, but mostly for first-order methods and typically under convexity and smoothness.

With the appearance of information-processing applications (most notably compressed sensing and machine learning), first-order methods became (and still are) very popular.

From there, it was a quick step to start developing WCC bounds/global rates for other algorithms (not necessarily first-order ones). Global rates are indeed a tool of interest in the nonlinear case, capable of providing information when the classical global convergence or local analyses do not. They helped, for instance, to better understand cubic regularization methods. In the derivative-free case, as the methods are "zero-order", one can only hope to derive rates of a global nature.

Now papers on algorithms for nonlinear optimization start with "worst case complexity" or "global rate"...

... click here for all news ...

March 19, 2015: In the paper Direct search based on probabilistic descent (see papers), there is a new proof technique for establishing global rates and worst case complexity bounds for randomized algorithms in which the new iterate depends on some object (directions, models) whose quality is favorable with a certain probability. The technique is based on counting the number of iterations for which the quality is favorable and examining the probabilistic behavior of this number; a rough sketch of the flavor of such an argument is given below.
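To give only an illustrative sketch (in notation of my own choosing, not the precise statement of the paper): suppose that, conditionally on the past, each iteration is favorable with probability at least p_0, and let Z_k count the favorable iterations among the first k. A Chernoff-type lower-tail bound then controls the probability that Z_k falls short:

    \[
      Z_k \;=\; \sum_{j=0}^{k-1} \mathbb{1}\{\text{iteration } j \text{ is favorable}\},
      \qquad
      \mathbb{P}\big(\text{iteration } j \text{ is favorable} \,\big|\, \mathcal{F}_{j-1}\big) \;\ge\; p_0,
    \]
    \[
      \mathbb{P}\big( Z_k \le \lambda k \big) \;\le\; \exp\!\left( -\frac{(p_0-\lambda)^2}{2\,p_0}\, k \right),
      \qquad 0 < \lambda < p_0 .
    \]

If the deterministic part of the analysis guarantees that the desired accuracy is reached once Z_k exceeds a certain threshold, then the event that the accuracy has not been reached after k iterations is contained in {Z_k <= lambda k} for a suitable lambda, and its probability decays exponentially in k, which is what yields a global rate/WCC bound holding with overwhelming probability.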

March 19, 2015: I'm starting a miniature blog here... I hope I'll have something interesting to say once in a while...


Research Summary
My research interests include the development and analysis of numerical methods for large-scale nonlinear programming, sparse optimization, PDE-constrained optimization, and derivative-free optimization, as well as applications in computational sciences, engineering, and finance.

Biography
I obtained my Ph.D. in Computational and Applied Mathematics from Rice University in 1996. I held visiting positions at the IBM T.J. Watson Research Center and the IMA/University of Minnesota in 2002/2003, and at the Courant Institute of Mathematical Sciences/NYU and CERFACS in 2009/2010. I have been on the faculty of the Department of Mathematics of the University of Coimbra since 1996.


Publications and Talks
Research, Students, and Software

A list of former research students.

Software available:
  • Direct multisearch for multiobjective optimization: dms
  • (Large-scale) nonlinear programming/optimization: ipfilter
  • Global derivative-free optimization: PSwarm
  • Direct search: sid-psm
Principal Investigator (PI) of the following research grants:
Editorial Activity
Meetings
Teaching


Most Cited/Selected Papers

Links

Other Stuff