Open Access
Mesh Adaptive Direct Search Algorithms for Constrained Optimization
Author(s) -
Charles Audet,
J. E. Dennis
Publication year - 2006
Publication title -
SIAM Journal on Optimization
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.066
H-Index - 136
eISSN - 1095-7189
pISSN - 1052-6234
DOI - 10.1137/040603371
Subject(s) - mathematics , mathematical optimization , convergence (economics) , iterated function , limit (mathematics) , set (abstract data type) , algorithm , computer science , mathematical analysis , economics , programming language , economic growth
This paper addresses the problem of minimization of a nonsmooth function under general nonsmooth constraints when no derivatives of the objective or constraint functions are available. We introduce the mesh adaptive direct search (MADS) class of algorithms, which extends the generalized pattern search (GPS) class by allowing local exploration, called polling, in an asymptotically dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity for infeasible points and treating the problem as unconstrained. The main GPS convergence result is to identify limit points $\hat{x}$, where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case, nonnegative combinations of these directions span the whole space, the fact that there can only be finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many linear constraints for GPS. The main result of this paper is that the general MADS framework is flexible enough to allow the generation of an asymptotically dense set of refining directions along which the Clarke derivatives are nonnegative. We propose an instance of MADS for which the refining directions are dense in the hypertangent cone at $\hat{x}$ with probability 1 whenever the iterates associated with the refining directions converge to a single $\hat{x}$. The instance of MADS is compared to versions of GPS on some test problems. We also illustrate the limitation of our results with examples.
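The extreme barrier approach described in the abstract can be sketched in a few lines: wrap the objective so that infeasible points evaluate to infinity, then run a derivative-free poll loop on the wrapped function. The sketch below is illustrative only and is not the authors' algorithm — in particular, it uses random unit directions as a crude stand-in for MADS's mesh-coupled, asymptotically dense direction generation, and the function names (`extreme_barrier`, `poll`, `mads_like`) are hypothetical.

```python
import math
import random

def extreme_barrier(f, is_feasible):
    """Extreme barrier: infeasible points get +inf, so the constrained
    problem can be treated as if it were unconstrained."""
    def fb(x):
        return f(x) if is_feasible(x) else math.inf
    return fb

def poll(fb, x, delta, n_dirs=4, rng=random):
    """One poll step: evaluate fb at x + delta*d for a few random unit
    directions d (a crude stand-in for MADS's dense poll directions).
    Returns an improving trial point if one is found, else None."""
    n = len(x)
    for _ in range(n_dirs):
        d = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(di * di for di in d)) or 1.0
        trial = [xi + delta * di / norm for xi, di in zip(x, d)]
        if fb(trial) < fb(x):
            return trial
    return None

def mads_like(f, is_feasible, x0, delta=1.0, iters=200, seed=0):
    """Toy direct-search loop: expand the poll size after a successful
    poll, shrink it after a failed one. Real MADS couples mesh and poll
    sizes so the refining directions become asymptotically dense."""
    rng = random.Random(seed)
    fb = extreme_barrier(f, is_feasible)
    x = list(x0)
    for _ in range(iters):
        better = poll(fb, x, delta, rng=rng)
        if better is not None:
            x, delta = better, delta * 2.0
        else:
            delta *= 0.5
    return x
```

For example, minimizing $x_1^2 + x_2^2$ subject to $x_1 + x_2 \geq 1$ from a feasible start drives the iterates toward the constrained minimizer near $(0.5, 0.5)$; note that every accepted iterate is feasible by construction, since infeasible trials evaluate to infinity and are never accepted.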
