一類無約束優(yōu)化的修正共軛梯度法
孫中波1,2,祝英杰3,朱振超2,高海音1
(1.東北師范大學人文學院數學教育系,長春 130117;2.吉林大學通信工程學院,長春 130012;3.長春大學理學院,長春 130022)
針對無約束優(yōu)化問題,提出一種新的充分下降共軛梯度法.該算法在每次迭代過程中,產生的搜索方向均為充分下降方向.在適當條件下,證明了算法的全局收斂性.數值結果表明算法是可行和有效的.
共軛梯度法;全局收斂;無約束優(yōu)化;充分下降方向
Consider the following unconstrained optimization problem:
min f(x),  x ∈ R^n,  (1)
where R^n is the n-dimensional Euclidean space. Problem (1) is generally solved by the iterative scheme
x_{k+1} = x_k + α_k d_k,  k = 0, 1, 2, …,  (2)
where α_k is the step size obtained by a line search and d_k is the search direction.
In conjugate gradient methods the search direction takes the form d_0 = -g_0 and d_k = -g_k + β_k d_{k-1} for k ≥ 1, where g_k = ∇f(x_k) and β_k is a scalar parameter. References [1-7] proposed different choices of β_k (for example, the FR, PRP, HS, CD, LS and DY formulas), each leading to a different conjugate gradient method.
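For illustration only, the following Python sketch implements a generic nonlinear conjugate gradient iteration of the form (2), using the classical FR choice of β_k and a simple Armijo backtracking line search; the routine name cg_minimize and the line search parameters are assumptions of this sketch, and it is not the modified formula proposed as Algorithm 1 in this paper. Recall that a direction d_k is commonly called a sufficient descent direction if g_k^T d_k ≤ -c‖g_k‖² for some constant c > 0 (see, e.g., [8]).

import numpy as np

def cg_minimize(f, grad, x0, eps=1e-6, max_iter=1000, rho=0.25, shrink=0.5):
    # Generic nonlinear conjugate gradient iteration of the form (2):
    #   x_{k+1} = x_k + alpha_k * d_k,  d_0 = -g_0,  d_k = -g_k + beta_k * d_{k-1}.
    # beta_k is the Fletcher-Reeves choice [1]; rho and shrink are
    # illustrative Armijo backtracking parameters, not the paper's settings.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) <= eps:           # stop when the gradient is small
            break
        if g.dot(d) >= 0.0:                    # safeguard: restart if d is not a descent direction
            d = -g
        alpha = 1.0                            # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + rho * alpha * g.dot(d):
            alpha *= shrink
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves beta
        d = -g_new + beta_fr * d
        x, g = x_new, g_new
    return x, k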
Assumptions:
(H2) There exists a neighborhood N of Ω on which f is continuously differentiable and ∇f satisfies the Lipschitz condition, i.e., there exists a constant L > 0 such that ‖∇f(x) - ∇f(y)‖ ≤ L‖x - y‖ for all x, y ∈ N.
Some of the functions in [17] are used as test functions. The code was written in MATLAB 7.1 and run on a Pentium(R) 2.19 GHz CPU with 1 GB of memory. The test functions are a general quadratic function, the Rosenbrock function and the Freudenstein-Roth function. The initial point x0 is (1, 2) in all tests, and the minimizers x* of the objective functions are (0, 0), (1, 1) and (5, 4), respectively. The parameters in the code are chosen as t = 1.2, σ2 = 0.6, ρ = 1/4 and ε = 10⁻⁶. NI denotes the total number of iterations, CPU-TIME the actual running time, Er the accuracy of the numerical solution, and m the dimension of the variables. Tables 1-3 list the numerical results of Algorithm 1, the FR conjugate gradient method and the CD conjugate gradient method, respectively.
Table 1 Numerical results of Algorithm 1
Table 2 Numerical results of the FR conjugate gradient method
Table 3 Numerical results of the CD conjugate gradient method
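To make the experimental setup concrete, the sketch below defines the Rosenbrock and Freudenstein-Roth test functions and their gradients in the standard form of [17], and applies the generic cg_minimize routine from the earlier sketch to the initial point (1, 2); all names are illustrative, this is not the authors' MATLAB code, and the general quadratic test function is omitted.

import numpy as np

def rosenbrock(x):
    # f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2, minimizer (1, 1)
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

def freudenstein_roth(x):
    # f(x) = f1^2 + f2^2, global minimizer (5, 4)
    f1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
    f2 = -29.0 + x[0] + ((x[1] + 1.0) * x[1] - 14.0) * x[1]
    return f1 ** 2 + f2 ** 2

def freudenstein_roth_grad(x):
    f1 = -13.0 + x[0] + ((5.0 - x[1]) * x[1] - 2.0) * x[1]
    f2 = -29.0 + x[0] + ((x[1] + 1.0) * x[1] - 14.0) * x[1]
    return np.array([2.0 * (f1 + f2),
                     2.0 * f1 * (10.0 * x[1] - 3.0 * x[1] ** 2 - 2.0)
                     + 2.0 * f2 * (3.0 * x[1] ** 2 + 2.0 * x[1] - 14.0)])

# Example run with the generic cg_minimize sketch given earlier (not the
# paper's Algorithm 1), from the initial point (1, 2) used in the tests:
x_approx, iters = cg_minimize(rosenbrock, rosenbrock_grad, np.array([1.0, 2.0]))
print(x_approx, iters)   # the iterates should move toward the minimizer (1, 1)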
As Tables 1-3 show, Algorithm 1 is feasible and effective. The FR conjugate gradient method has good convergence but low computational efficiency; the convergence of the CD conjugate gradient method depends on the line search criterion, and its computational efficiency is also low. In terms of the number of iterations and CPU time, the proposed algorithm outperforms the classical conjugate gradient methods. Because the search directions generated by the proposed algorithm have the sufficient descent property and its convergence is independent of the line search criterion, the algorithm achieves better computational efficiency.
[1] Fletcher R, Reeves C M. Function Minimization by Conjugate Gradients [J]. The Computer Journal, 1964, 7(2): 149-154.
[2] Polyak B T. The Conjugate Gradient Method in Extremal Problems [J]. USSR Computational Mathematics and Mathematical Physics, 1969, 9: 94-112.
[3] Fletcher R. Practical Methods of Optimization: Constrained Optimization [M]. Vol.2. New York: Wiley, 1981.
[4] Polak E, Ribière G. Note sur la convergence de méthodes de directions conjuguées [J]. Rev Française Informat Recherche Opérationnelle, 1969, 16: 35-43.
[5] Liu Y, Storey C. Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory [J]. Journal of Optimization Theory and Applications, 1991, 69(1): 129-137.
[6] Dai Y H, Yuan Y. An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization [J]. Annals of Operations Research, 2001, 103: 33-47.
[7] Hestenes M R, Stiefel E. Methods of Conjugate Gradients for Solving Linear Systems [J]. J Res Nat Bur Stand, 1952, 49(6): 409-436.
[8] Hager W W, ZHANG Hongchao. A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search [J]. SIAM Journal on Optimization, 2005, 16(1): 170-192.
[9] Hager W W, ZHANG Hongchao. A Survey of Nonlinear Conjugate Gradient Methods [J]. Pacific J Optim, 2006, 2: 35-58.
[10] Birgin E G, Martínez J M. A Spectral Conjugate Gradient Method for Unconstrained Optimization [J]. Appl Math Optim, 2001, 43(2): 117-128.
[11] Raydan M. The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem [J]. SIAM J Optim, 1997, 7(1): 26-33.
[12] ZHANG Li, ZHOU Weijun. Two Descent Hybrid Conjugate Gradient Methods for Optimization [J]. J Comput Appl Math, 2008, 216(1): 251-264.
[13] ZHANG Li, ZHOU Weijun, LI Donghui. A Descent Modified Polak-Ribière-Polyak Conjugate Gradient Method and Its Global Convergence [J]. IMA Journal of Numerical Analysis, 2006, 26(4): 629-640.
[14] SUN Zhongbo, ZHU Tianxiao, WENG Shiyou, et al. A New Family of Sufficient Descent Conjugate Gradient Methods for Unconstrained Optimization [C]//Proceedings of the 31st Chinese Control Conference. Piscataway: IEEE Press, 2012: 2532-2536.
[15] SUN Zhongbo, ZHU Tianxiao, GAO Haiyin. A Sufficient Descent Hybrid Conjugate Gradient Method and Its Global Convergence for Unconstrained Optimization [C]//Proceedings of the 31st Chinese Control and Decision Conference. Piscataway: IEEE Press, 2012: 735-739.
[16] LI Donghui, TONG Xiaojiao, WAN Zhong. Numerical Optimization Algorithms and Theory [M]. 2nd ed. Beijing: Science Press, 2010. (in Chinese)
[17] Moré J J, Garbow B S, Hillstrom K E. Testing Unconstrained Optimization Software [J]. ACM Trans Math Softw, 1981, 7(1): 17-41.
CLC number: O224    Document code: A    Article ID: 1671-5489(2014)03-0460-05
DOI: 10.13413/j.cnki.jdxblxb.2014.03.10
Received: 2013-09-05.
SUN Zhongbo (born 1982), male, Han nationality, doctoral candidate and lecturer, engaged in research on optimization theory, algorithms and their applications, E-mail: zhongbosun2012@163.com. Corresponding author: GAO Haiyin (born 1961), female, Han nationality, Ph.D., professor, engaged in research on optimal control theory and its applications, E-mail: gaohaiyinhealthy@163.com.
Supported by the Science and Technology Research Project of the Education Department of Jilin Province during the Twelfth Five-Year Plan Period (Grant Nos. 2013577, 2013267, 2013287, 2014636).