Computer Science Department






TECHNICAL REPORT



PROJECTED HESSIAN UPDATING ALGORITHMS FOR
NONLINEARLY CONSTRAINED OPTIMIZATION



by



Jorge Nocedal* and Michael Overton†



November 1983
Report No. 95



NEW YORK UNIVERSITY




Department of Computer Science
Courant Institute of Mathematical Sciences

251 MERCER STREET, NEW YORK, N.Y. 10012









* Department of Electrical Engineering and Computer Science,
Northwestern University, Evanston, Illinois 60201. The work of
this author was supported in part during 1982-83 at the Courant
Institute of Mathematical Sciences under National Science Foundation
Grant No. MCS-81-17526 and U.S. Department of Energy Contract
DE-AC02-76-ER03077-V.


† Courant Institute of Mathematical Sciences, New York University,
251 Mercer St., New York, N.Y. 10012. The work of this author was
supported in part by National Science Foundation Grant Nos. MCS-81-01924
and MCS-83-02021.




1. INTRODUCTION

Consider the following optimization problem with nonlinear
equality constraints:

    min f(x)   subject to   c(x) = 0,                          (1.1)

where f: ℝⁿ → ℝ and c: ℝⁿ → ℝᵐ are twice differentiable functions
and m < n. We will further assume that the Hessian matrices ∇²f and
{∇²cᵢ}, i = 1,...,m, are Lipschitz continuous matrix functions of x.
Although this problem arises frequently in many applications, a more
common version involves inequality constraints. We confine our
attention to (1.1), partly because it is easier to address and partly
because a good understanding of algorithms for solving (1.1) is
essential for the design of methods for the more general problem. Note
that we are concerned only with finding a local minimum of (1.1).
Finding the global minimum is a much more difficult task.
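As a concrete illustration of this setting (a minimal sketch, not part of the
report), consider the toy instance of (1.1): minimize f(x) = x₁ + x₂ subject to
c(x) = x₁² + x₂² − 2 = 0, whose local minimizer is x* = (−1, −1). The Python/NumPy
code below checks stationarity of the Lagrangian and feasibility at x*; the
Lagrangian convention L(x, λ) = f(x) − λᵀc(x), the multiplier value λ* = −1/2,
and all function names are illustrative assumptions.

    # Illustrative toy instance of problem (1.1) -- not from the report.
    # Minimize f(x) = x1 + x2 subject to c(x) = x1^2 + x2^2 - 2 = 0.
    # Local minimizer x* = (-1, -1); with L(x, lam) = f(x) - lam * c(x),
    # the corresponding multiplier is lam* = -1/2.
    import numpy as np

    def f(x):
        return x[0] + x[1]

    def grad_f(x):
        return np.array([1.0, 1.0])

    def c(x):
        return np.array([x[0]**2 + x[1]**2 - 2.0])

    def jac_c(x):                       # 1 x 2 Jacobian of the constraint
        return np.array([[2.0 * x[0], 2.0 * x[1]]])

    x_star = np.array([-1.0, -1.0])
    lam_star = np.array([-0.5])

    # First-order conditions: grad_x L(x*, lam*) = 0 and c(x*) = 0.
    grad_L = grad_f(x_star) - jac_c(x_star).T @ lam_star
    print("||grad_x L|| =", np.linalg.norm(grad_L))     # ~0
    print("||c(x*)||    =", np.linalg.norm(c(x_star)))  # ~0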

In Section 2 we discuss some of the Newton-type methods which have
been proposed to solve (1.1). In Section 3 we describe a new
Broyden-type method, which was suggested by the work of Goodman (1982).
This method updates an n × (n-m) matrix which approximates a "one-sided
projected Hessian" of a Lagrangian function for (1.1). The method is
easily shown to have a local one-step Q-superlinear convergence
property. We also give a new proof of the Boggs-Tolle-Wang necessary
and sufficient condition for Q-superlinear convergence of a class of
quasi-Newton methods for solving (1.1). In Section 4 we describe an
algorithm which updates an approximation to a "two-sided projected
Hessian," a symmetric matrix of order n-m which can be expected to be
positive definite near a solution to (1.1). This idea was sug
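To make the projected Hessian quantities mentioned above more tangible, here is
a minimal sketch under assumed notation (not the report's own development): let
A = c'(x) denote the m × n constraint Jacobian, W the Hessian of a Lagrangian,
and Z an n × (n−m) orthonormal basis for the null space of A, computed here from
a QR factorization of Aᵀ. Then W Z has the dimensions of an n × (n−m) "one-sided
projected Hessian" and ZᵀW Z those of the symmetric "two-sided projected Hessian"
of order n−m. The helper null_space_basis and the toy data (reusing the earlier
example) are hypothetical.

    # Illustrative sketch of one-sided and two-sided projected Hessians.
    import numpy as np

    def null_space_basis(A):
        """Orthonormal basis Z for the null space of the m x n matrix A."""
        m, n = A.shape
        Q, _ = np.linalg.qr(A.T, mode='complete')   # Q is n x n
        return Q[:, m:]                             # last n-m columns span null(A)

    # Toy data from the earlier instance: x = (-1, -1), lam = -1/2.
    x = np.array([-1.0, -1.0])
    lam = -0.5
    A = np.array([[2.0 * x[0], 2.0 * x[1]]])        # 1 x 2 constraint Jacobian
    # Hessian of the Lagrangian: hess f - lam * hess c = 0 - lam * 2I = I here.
    W = np.zeros((2, 2)) - lam * 2.0 * np.eye(2)

    Z = null_space_basis(A)                         # n x (n-m), here 2 x 1
    one_sided = W @ Z                               # n x (n-m) matrix
    two_sided = Z.T @ W @ Z                         # (n-m) x (n-m), symmetric
    print("two-sided projected Hessian:\n", two_sided)  # positive definite (1x1, > 0)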

