Abstract.
We propose feasible descent methods for constrained minimization that do not make explicit use of the derivative of the objective function. The methods iteratively sample the objective function value along a finite set of feasible search arcs and decrease the sampling stepsize if an improved objective function value is not sampled. The search arcs are obtained by projecting search direction rays onto the feasible set and the search directions are chosen such that a subset approximately generates the cone of first-order feasible variations at the current iterate. We show that these methods have desirable convergence properties under certain regularity assumptions on the constraints. In the case of linear constraints, the projections are redundant and the regularity assumptions hold automatically. Numerical experience with the methods in the linearly constrained case is reported.
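To make the sampling scheme concrete, the following is a minimal sketch of a derivative-free feasible descent iteration for the simplest linearly constrained case, box constraints, where projection reduces to clipping. It samples the objective along projected coordinate directions and halves the stepsize when no improvement is found. The function name, the fixed coordinate directions, and the simple-decrease acceptance test are illustrative assumptions, not the paper's method, which uses general search arcs and regularity conditions on the constraints.

```python
import numpy as np

def pattern_search_box(f, x0, lo, hi, alpha=1.0, tol=1e-8, max_iter=1000):
    """Illustrative derivative-free feasible descent for
    min f(x) subject to lo <= x <= hi (hypothetical sketch).

    Samples f at the projections of x +/- alpha * e_i onto the box;
    moves to the first strictly better point, otherwise halves the
    sampling stepsize alpha.
    """
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)  # start feasible
    fx = f(x)
    while alpha >= tol and max_iter > 0:
        max_iter -= 1
        improved = False
        for i in range(x.size):
            for s in (+1.0, -1.0):  # directions +e_i and -e_i
                trial = x.copy()
                trial[i] += s * alpha
                trial = np.clip(trial, lo, hi)  # project onto feasible box
                ft = f(trial)
                if ft < fx - 1e-12:  # improved value sampled: accept
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            alpha *= 0.5  # no improvement: decrease sampling stepsize
    return x, fx
```

For example, minimizing f(x) = (x - 2)^2 over [0, 1] drives the iterate to the active bound x = 1, the constrained minimizer, without ever evaluating a derivative.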
Received: November 12, 1999 / Accepted: April 6, 2001 / Published online: October 26, 2001
Cite this article
Lucidi, S., Sciandrone, M. & Tseng, P. Objective-derivative-free methods for constrained optimization. Math. Program. 92, 37–59 (2002). https://doi.org/10.1007/s101070100266