Abstract
In this paper, sufficient conditions for local and superlinear convergence to a Kuhn-Tucker point are established for a broadly defined class of algorithms, each of which combines a quadratic programming algorithm, used to solve a subproblem repeatedly, with a variable metric update that develops the Hessian appearing in that subproblem. In particular, the DFP update and an update attributed to Powell are shown to yield a superlinearly convergent subclass of algorithms, provided that the iteration starts sufficiently close to the solution and that the initial Hessian in the subproblem is sufficiently close to the Hessian of the Lagrangian at that point.
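As a concrete illustration of the variable metric step described above (a minimal sketch, not the paper's own implementation), the following shows the DFP update applied to an approximation `B` of the Hessian of the Lagrangian. Here `s` is the step between successive iterates and `y` is the corresponding change in the gradient of the Lagrangian; the curvature safeguard `s·y > 0` is an assumption added for numerical robustness and is not part of the paper's analysis.

```python
import numpy as np

def dfp_update(B, s, y, tol=1e-12):
    """DFP update of a Lagrangian Hessian approximation B.

    s : step between successive iterates, x_{k+1} - x_k
    y : change in the Lagrangian gradient over that step

    The updated matrix stays symmetric and satisfies the
    secant condition B_new @ s = y.
    """
    sy = s @ y
    if sy <= tol:                # safeguard: skip update if curvature fails
        return B
    I = np.eye(len(s))
    V = I - np.outer(y, s) / sy  # rank-one factor in the DFP formula
    return V @ B @ V.T + np.outer(y, y) / sy

# Tiny demonstration with a 3-variable step.
s = np.array([1.0, 0.0, 2.0])
y = np.array([2.0, 1.0, 1.0])
B_new = dfp_update(np.eye(3), s, y)
```

After one update, `B_new @ s` equals `y` (the secant condition), which is the property that a local superlinear convergence analysis of this kind builds on.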
References
K.J. Arrow, F.J. Gould and S.M. Howe, “A general saddle point result for constrained optimization”, Mathematical Programming 5 (1973) 225–234.
C.G. Broyden, “A class of methods for solving nonlinear simultaneous equations”, Mathematics of Computation 19 (1965) 577–593.
C.G. Broyden, J.E. Dennis and J.J. Moré, “On the local and superlinear convergence of quasi-Newton methods”, Journal of the Institute of Mathematics and its Applications 12 (1973) 223–245.
A.R. Colville, “A comparative study on nonlinear programming codes”, IBM New York Scientific Center, Tech. Rept. 320-2949 (1968).
R.W. Cottle, “The principal pivoting method of quadratic programming”, in: G.B. Dantzig and A.F. Veinott, eds., Mathematics of the decision sciences, part 1 (Am. Math. Soc., Providence, R.I., 1968) 144–162.
W.C. Davidon, “Variable metric method for minimization”, A.E.C. Res. and Dev. Report ANL-5990 (1959).
J.E. Dennis, “On some methods based on Broyden's secant approximation to the Hessian”, in: F.A. Lootsma, ed., Numerical methods for nonlinear optimization (Academic Press, New York, 1972) 19–34.
J.E. Dennis and J.J. Moré, “A characterization of superlinear convergence and its application to quasi-Newton methods”, Mathematics of Computation 28 (126) (1974).
L.C.W. Dixon, “All the quasi-Newton family generate identical points”, Journal of Optimization Theory and Applications 10 (1972) 34–40.
A.V. Fiacco and G.P. McCormick, Nonlinear programming: Sequential unconstrained minimization techniques (Wiley, New York, 1968).
R. Fletcher and M.J.D. Powell, “A rapidly convergent descent method for minimization”, The Computer Journal 6 (1963) 163–168.
U.M. Garcia-Palomares, “Superlinearly convergent quasi-Newton method for nonlinear programming”, Ph.D. dissertation, University of Wisconsin, Madison, Wisc. (1973).
U.M. Garcia-Palomares and O.L. Mangasarian, “Superlinearly convergent quasi-Newton algorithms for nonlinearly constrained optimization problems”, Computer Sciences Technical Report 195, University of Wisconsin, Madison, Wisc. (1974).
P.E. Gill and W. Murray, “Quasi-Newton methods for linearly constrained optimization”, in: P.E. Gill and W. Murray, eds., Numerical methods for constrained optimization (Academic Press, London, 1974).
D. Goldfarb, “Extension of Davidon's variable metric method to maximization under linear inequality and equality constraints”, SIAM Journal on Applied Mathematics 17 (1969) 739–764.
S.P. Han, “Superlinearly convergent variable metric methods for general nonlinear programming problems”, Ph.D. dissertation, University of Wisconsin, Madison, Wisc. (1974).
S.P. Han, “Dual variable metric algorithms for constrained optimization”, SIAM Journal on Control and Optimization, to appear.
S.P. Han, “A globally convergent method for nonlinear programming”, Journal of Optimization Theory and Applications, to appear.
S.P. Han, “A hybrid method for constrained optimization problems”, in preparation.
L.A. Liusternik and V.J. Sobolev, Elements of functional analysis (Frederick Ungar, New York, 1961).
F.A. Lootsma, “A survey of methods for solving constrained minimization problems via unconstrained minimization”, in: F.A. Lootsma, ed., Numerical methods for nonlinear optimization (Academic Press, New York, 1972) 313–347.
O.L. Mangasarian, private communication.
J.M. Ortega and W.C. Rheinboldt, Iterative solution of nonlinear equations in several variables (Academic Press, New York, 1970).
J.D. Pearson, “Variable metric methods of minimization”, The Computer Journal 12 (1969) 171–178.
M.J.D. Powell, “A new algorithm for unconstrained optimization”, in: J.B. Rosen, O.L. Mangasarian and K. Ritter, eds., Nonlinear programming (Academic Press, New York, 1970).
M.J.D. Powell, “A Fortran subroutine for unconstrained minimization, requiring first derivatives of the objective function”, A.E.R.E. Harwell Report R64-69 (1970).
S.M. Robinson, “A quadratically convergent algorithm for general nonlinear programming problems”, Mathematical Programming 3 (1972) 145–156.
S.M. Robinson, “Perturbed Kuhn-Tucker points and rates of convergence for a class of nonlinear-programming algorithms”, Mathematical Programming 7 (1974) 1–16.
R.T. Rockafellar, “New applications of duality in nonlinear programming”, Symposium on Mathematical Programming, The Hague, September 1970.
G.W. Stewart, “A modification of Davidon's minimization method to accept difference approximations of derivatives”, Journal of the Association for Computing Machinery 14 (1967) 72–83.
C. van de Panne, Methods for linear and quadratic programming (North-Holland, Amsterdam, 1975).
R.B. Wilson, “A simplicial method for concave programming”, Ph.D. dissertation, Harvard University, Cambridge, Mass. (1963).
Additional information
This research was supported in part by the National Science Foundation under Grants ENG 75-10486 and GJ 35292.
Cite this article
Han, SP. Superlinearly convergent variable metric algorithms for general nonlinear programming problems. Mathematical Programming 11, 263–282 (1976). https://doi.org/10.1007/BF01580395