Jorge Nocedal
Projected Hessian updating algorithms for nonlinearly constrained optimization (page 4 of 4)


parameter is to allow summing the infinite series in the proof of
bounded deterioration (4.24). However, these methods are certainly
sensitive to the value of η. If η is too small the BFGS update may
never be used and the convergence becomes extremely slow. For example,
this happens when η = 10~^ for Problem HS111. On the other hand, if η
is too large, the method may fail, as the following example shows.



Problem P5. n = 2, m = 1. This example was provided by Richard Byrd
(private communication).

    min f(x) = l^\ + l^\ - S5^C,

    subject to c1(x) = =-^ + 1 = 0.

Solution: x* = (5, 1).

Starting point: x0 = (5.0001, 1.00001).



If Method C1 (for example) is applied to Problem P5 with B0 taken as
the true Hessian at x0 and η = 1, ν = 10~^, the first update causes B1
to become almost exactly singular, since y^T s is nearly zero. The
convergence theorem does not apply with this value of η, since
||H0|| = 1, ||M|| = 1, ||C0|| = 5, and 5η is not less than the required
bound, violating (4.23). When η = 10~^, the BFGS update formula is not
used, B1 = B0, and the algorithm terminates with ||c|| < TOL.
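The role of η in a safeguarded update can be sketched in a few lines. The test used below (apply the BFGS formula only when ||y|| <= η ||s||) is a hypothetical stand-in for Update Rule 1, chosen only to reproduce the qualitative behavior described above (a small η suppresses the update; a large η admits updates even when y^T s is nearly zero); it is not the paper's actual criterion.

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of the (reduced) Hessian approximation B."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

def guarded_update(B, s, y, eta):
    """Hypothetical stand-in for Update Rule 1: apply the BFGS formula
    only when ||y|| <= eta * ||s||, so the rank-one correction terms
    stay bounded; otherwise keep B unchanged."""
    if np.linalg.norm(y) > eta * np.linalg.norm(s):
        return B  # skip the update: the pair (s, y) is unreliable
    return bfgs_update(B, s, y)
```

With a very small η the guard always fires and B is never updated (the slow-convergence regime noted above); with a moderate η the update proceeds normally.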






Finally, we outline an example where ||s|| = 0 although ||y|| > 0. The
constraints c1(x) and c2(x) are shown in Figure 1.

Figure 1

It is clearly possible to construct a smooth function f with these
properties, although obtaining a formula for f would be complicated.

T
We therefore have x* = (0,0)^T. Let x0 = (1,1). By construction, Z^T g
vanishes at x0 as well as at x*. Regardless of the value of B0, the
step to the next point x1 is orthogonal to the constraint contours,
giving x1 = (1,-1). Now f can be chosen so that g(x1) has a nonzero
first component, and thus ||y|| > 0, although ||s|| = 0. We can perturb
x0 slightly so that ||s|| is a very small positive number while ||y|| is
of moderate size. Thus if the BFGS update formula is used, ||B1|| is
very large. However, if Update Rule 1 is used, with any reasonable
value for η, B1 = B0 and the difficulty is avoided.
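The blow-up in ||B1|| is easy to reproduce numerically. The vectors below are made up for illustration (they are not derived from the f of this example); they only realize the situation described above, where ||s|| is tiny and ||y|| is of moderate size.

```python
import numpy as np

B0 = np.eye(2)
s = np.array([1e-8, 0.0])  # very small step
y = np.array([1.0, 0.0])   # gradient difference of moderate size

# Raw BFGS update: the term y y^T / (y^T s) has norm ||y||^2 / (y^T s),
# which here is about 10^8, so ||B1|| becomes enormous.
Bs = B0 @ s
B1 = B0 - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
print(np.linalg.norm(B1) > 1e7)  # True: the approximation has blown up

# Skipping the update, as Update Rule 1 does here for any reasonable η,
# keeps B1 = B0 and avoids the difficulty.
```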

It is clearly important for practical purposes to develop a
variant of Algorithm 4.1 which automatically controls the value of η in
some suitable way. We leave this as a topic for future research.



ACKNOWLEDGEMENTS

We would like to thank Philip Gill, Jonathan Goodman, Walter
Murray, Richard Tapia and Margaret Wright for various helpful
conversations and suggestions.






REFERENCES



D.P. Bertsekas (1982). Constrained Optimization and Lagrange
Multiplier Methods, Academic Press, New York and London.

M.C. Bartholomew-Biggs (1982). Recursive quadratic programming
methods for nonlinear constraints, in: M.J.D. Powell (ed.),
Nonlinear Optimization 1981, pp. 213-222, Academic Press, London
and New York.

M.C. Biggs (1972). Constrained minimization using recursive equality
quadratic programming, in: F.A. Lootsma (ed.), Numerical Methods for
Nonlinear Optimization, Academic Press, London and New York.

P.T. Boggs, J.W. Tolle and P. Wang (1982). On the local convergence
of quasi-Newton methods for constrained optimization, SIAM J.
Control and Optimization 20, pp. 161-171.

C.G. Broyden, J.E. Dennis and J.J. Moré (1973). On the local and
superlinear convergence of quasi-Newton methods, J. Inst. Math.
Appl. 12, pp. 223-245.

T.F. Coleman and A.R. Conn (1982). On the local convergence of a
quasi-Newton method for the nonlinear programming problem, Computer
Science Dept. Report TR 82-509, Cornell University, Ithaca, N.Y.

T.F. Coleman and D. Sorensen (1982). A note on the computation of an
orthonormal basis for the null space of a matrix, Computer Science
Dept. Report TR 82-509, Cornell University, Ithaca, N.Y.

J.E. Dennis and J.J. Moré (1977). Quasi-Newton methods, motivation
and theory, SIAM Review 19, pp. 46-89.

R. Fletcher (1974). Methods related to Lagrangian functions, in: P.E.
Gill and W. Murray (eds.), Numerical Methods for Constrained
Optimization, Academic Press, London and New York, pp. 29-66.

R. Fletcher (1981). Practical Methods of Optimization, vol. 2, Wiley,
Chichester and New York.

R. Fletcher and Womersley (1982). An algorithm for composite nonsmooth
optimization problems, Dept. of Mathematics Report NA/60,
University of Dundee, Dundee, Scotland.

R. Fontecilla, T. Steihaug and R.A. Tapia (1983). A convergence
theory for a class of quasi-Newton methods for constrained
optimization, Report 83-15, Dept. of Mathematical Sciences, Rice
University, Houston, Texas.

D. Gabay (1982). Reduced quasi-Newton methods with feasibility
improvement for nonlinearly constrained optimization, Math.
Programming Study 16, pp. 18-44.




U.M. Garcia-Palomares and O.L. Mangasarian (1974). Superlinearly
convergent quasi-Newton algorithms for nonlinearly constrained
optimization problems, Computer Sciences Tech. Report,
University of Wisconsin, Madison, Wisconsin.

P.E. Gill and W. Murray (1974a). Newton-type methods for
unconstrained and linearly constrained optimization, Math.
Programming 7, pp. 311-350.

P.E. Gill and W. Murray (1
