A NONLINEAR CONJUGATE GRADIENT METHOD AND ITS GLOBAL CONVERGENCE ANALYSIS†
Journal of Applied Mathematics & Informatics. 2016. Jan, 34(1_2): 157-165
• Received : November 22, 2014
• Accepted : September 08, 2015
• Published : January 30, 2016
AJIE CHU
YIXIAO SU
SHOUQIANG DU

Abstract
In this paper, we develop a new hybrid conjugate gradient method for solving the unconstrained optimization problem. Under mild assumptions, we obtain the sufficient descent property of the method. Global convergence is established under both the Wolfe-type line search and the general Wolfe line search. Numerical results show that the method is efficient.
AMS Mathematics Subject Classification: 65K05, 90C30.
1. Introduction
We consider the unconstrained optimization problem

min f(x), x ∈ R^n,   (1.1)

where f : R^n → R is a continuously differentiable function. The nonlinear conjugate gradient method is very useful for solving (1.1), especially when n is large. For any given initial point x_1 ∈ R^n, the nonlinear conjugate gradient method generates iterates x_k, k = 1, 2, ..., by the following recursion
x_{k+1} = x_k + α_k d_k,   (1.2)

d_k = −g_k for k = 1,  d_k = −g_k + β_k d_{k−1} for k ≥ 2,   (1.3)

where g_k = ∇f(x_k) is the gradient of f at x_k and β_k is a scalar parameter typically given by one of the classical formulas (such as [1 - 5]).
β_k^{FR} = ║g_k║² / ║g_{k−1}║²,   β_k^{PRP} = g_k^T(g_k − g_{k−1}) / ║g_{k−1}║²,

β_k^{HS} = g_k^T(g_k − g_{k−1}) / d_{k−1}^T(g_k − g_{k−1}),   β_k^{DY} = ║g_k║² / d_{k−1}^T(g_k − g_{k−1}).
To achieve good computational performance while maintaining the attractive feature of strong global convergence, many hybridizations of the basic conjugate gradient methods have been proposed in recent years (see [6 - 10]). Motivated by these works, in this paper we present a new hybrid nonlinear conjugate gradient method, where β_k is given as
[(1.4): the proposed hybrid formula for β_k; the displayed equation is not recoverable from the source image.]
and α_k is computed by the Wolfe-type line search proposed in [11]:

[(1.5)-(1.6): the Wolfe-type sufficient decrease and curvature conditions with parameters δ, σ, γ; the displayed inequalities are not recoverable from the source images.]

where 0 < δ < σ < 1 and 0 < γ < 1. Based on the hybridization of β_k given by (1.4), we present nonlinear conjugate gradient methods under the Wolfe-type line search and the general Wolfe line search.
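As a concrete point of reference, the classical Wolfe conditions (sufficient decrease plus a curvature bound) can be checked numerically. The sketch below is a stand-in for the Wolfe-type conditions (1.5)-(1.6), whose exact γ-dependent form from [11] is not reproduced in this extract; the function name and the quadratic test problem are purely illustrative.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, delta=0.4, sigma=0.7):
    """Check the classical Wolfe conditions for a trial step alpha.

    This is a stand-in for the Wolfe-type conditions (1.5)-(1.6);
    the exact gamma-dependent form from [11] is not reproduced here.
    """
    g_d = grad(x) @ d                       # directional derivative at x
    sufficient_decrease = f(x + alpha * d) <= f(x) + delta * alpha * g_d
    curvature = grad(x + alpha * d) @ d >= sigma * g_d
    return sufficient_decrease and curvature

# Illustrative check on the convex quadratic f(x) = 0.5 x^T x, d = -g(x)
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x0 = np.array([2.0, -1.0])
print(satisfies_wolfe(f, grad, x0, -grad(x0), alpha=0.5))  # True
```

A step that overshoots (e.g. alpha = 2 here) violates the sufficient decrease condition and the check returns False.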
In Section 2, we state Method 2.1 and prove the global convergence of the proposed method with the Wolfe-type line search. In Section 3, some discussions and numerical results for Method 2.1 are given.
2. Method 2.1 and its global convergence analysis
Now, we give Method 2.1 for solving (1.1).
Method 2.1
Step 1. Choose an initial point x_1 ∈ R^n, ε ≥ 0, 0 < δ < σ < 1, and u, γ ∈ (0, 1).
Step 2. Set d_1 = −g_1 and k = 1; if ║g_1║ = 0, then stop.
Step 3. Compute α_k by (1.5) and (1.6), and let x_{k+1} = x_k + α_k d_k.
Step 4. Compute g_{k+1}; if ║g_{k+1}║ ≤ ε, then stop. Otherwise, go to the next step.
Step 5. Compute β_{k+1} by (1.4) and generate d_{k+1} by (1.3).
Step 6. Set k = k + 1 and go to Step 3.
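For illustration, the iteration above can be sketched in code. This is a minimal, hedged sketch only: the hybrid β_k of (1.4) and the Wolfe-type search (1.5)-(1.6) are not recoverable from this extract, so the PRP+ formula and simple Armijo backtracking are used as stand-ins, with a restart safeguard that enforces sufficient descent.

```python
import numpy as np

def cg_descent(f, grad, x1, u=0.2, eps=1e-6, max_iter=2000):
    """Skeleton of Method 2.1 with stand-ins: PRP+ replaces the hybrid
    formula (1.4), and Armijo backtracking replaces the Wolfe-type
    search (1.5)-(1.6), since neither is shown in this extract."""
    x = np.asarray(x1, dtype=float)
    g = grad(x)
    d = -g                                    # Step 2: d_1 = -g_1
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:          # Step 4: stopping test
            break
        if g @ d > -u * (g @ g):              # safeguard: restart with -g
            d = -g                            # to enforce sufficient descent
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 0.4 * alpha * (g @ d):
            alpha *= 0.5                      # Armijo backtracking stand-in
        x_new = x + alpha * d                 # Step 3
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # PRP+ stand-in for (1.4)
        d = -g_new + beta * d                 # Step 5: recursion (1.3)
        x, g = x_new, g_new
    return x

# Demo on a convex quadratic (hypothetical test problem, not from the paper)
A = np.diag([1.0, 10.0])
x_min = cg_descent(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                   np.array([3.0, -2.0]))
print(np.linalg.norm(A @ x_min))  # small: the stopping test was met
```

The restart safeguard mirrors the sufficient descent property proved below for the actual hybrid β_k.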
In order to establish the global convergence of Method 2.1, we need the following assumption, which is often used in the literature to analyze the global convergence of nonlinear conjugate gradient methods.
Assumption 2.2
(i) The level set Ω = {x ∈ R^n | f(x) ≤ f(x_1)} is bounded, i.e., there exists a positive constant C such that ║x║ ≤ C for all x ∈ Ω.
(ii) In some neighborhood N of Ω, f is continuously differentiable and its gradient is Lipschitz continuous, i.e., there exists a constant L > 0 such that

║∇f(x) − ∇f(y)║ ≤ L║x − y║

for all x, y ∈ N.
Theorem 2.1. Let the sequences {x_k} and {d_k} be generated by (1.2) and (1.3), with β_k computed by (1.4). Then we have

g_k^T d_k ≤ −u║g_k║²   (2.1)

for all k ≥ 1, where u ∈ (0, 1).
Proof. If k = 1, then from (1.3) we get g_1^T d_1 = −║g_1║² ≤ −u║g_1║², since u ∈ (0, 1), so (2.1) holds. If k ≥ 2, multiplying (1.3) by g_k^T gives g_k^T d_k = −║g_k║² + β_k g_k^T d_{k−1}; estimating the last term by means of (1.4) yields (2.1). [The displayed estimate is not recoverable from the source image.] □
Theorem 2.2. Suppose that Assumption 2.2 holds. For Method 2.1, we have

∑_{k≥1} (g_k^T d_k)² / ║d_k║² < +∞.   (2.2)
Proof. From Theorem 2.1 and Assumption 2.2(i), {f(x_k)} is bounded and monotonically decreasing, and hence convergent. By (1.6), we obtain a lower bound (2.3) on the step size, and from Assumption 2.2 we get the Lipschitz estimate (2.4). Combining (2.3) and (2.4) gives (2.5), and squaring both sides of (2.5) yields (2.6). [The displays (2.3)-(2.6) are not recoverable from the source images.] Summing (2.6) over k and using the convergence of {f(x_k)}, we conclude that (2.2) holds. □
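The finiteness asserted in Theorem 2.2 can be observed numerically in a simple setting. The sketch below is a hypothetical illustration, not part of the paper: it uses steepest descent (d_k = −g_k) with exact line search on a small convex quadratic, a setup in which each term (g_k^T d_k)²/║d_k║² reduces to ║g_k║², so the partial sums of the series visibly settle.

```python
import numpy as np

# Illustrating the Zoutendijk-type sum of Theorem 2.2 on a convex
# quadratic with steepest descent (d_k = -g_k) and exact line search;
# here each term (g_k^T d_k)^2 / ||d_k||^2 equals ||g_k||^2.
A = np.diag([1.0, 4.0])
x = np.array([2.0, 1.0])
terms = []
for _ in range(100):
    g = A @ x
    if np.linalg.norm(g) < 1e-12:
        break
    d = -g
    alpha = (g @ g) / (d @ A @ d)      # exact minimizer along d
    terms.append((g @ d) ** 2 / (d @ d))
    x = x + alpha * d
partial_sums = np.cumsum(terms)
print(partial_sums[-1])                # the partial sums settle down
```

The terms decay geometrically, consistent with the finite sum in (2.2).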
Remark 2.3. Suppose that Assumption 2.2 holds. For Method 2.1, we have

∑_{k≥1} ║g_k║⁴ / ║d_k║² < +∞.

Proof. From Theorem 2.1, we know that

g_k^T d_k ≤ −u║g_k║²,   (2.7)

where u ∈ (0, 1). Squaring both sides of (2.7), we have (g_k^T d_k)² ≥ u²║g_k║⁴. Dividing both sides of this inequality by ║d_k║², we get

(g_k^T d_k)² / ║d_k║² ≥ u² ║g_k║⁴ / ║d_k║².

From Theorem 2.2, we conclude that the stated sum is finite. □
Theorem 2.4. Suppose that Assumption 2.2 holds. If {x_k} is generated by Method 2.1, we have

lim inf_{k→∞} ║g_k║ = 0.   (2.8)

Proof. If (2.8) does not hold, there exists a constant ε̄ > 0 such that

║g_k║ ≥ ε̄   (2.9)

holds for all k ≥ 1. The two branches of the hybrid formula (1.4) are then treated separately. In each case, (1.3) and (1.4) yield a growth bound (2.10) on ║d_k║²; squaring (2.10), dividing both sides by the appropriate power of ║d_k║, and combining the result with (2.9) and (2.11), one concludes that the series ∑_{k≥1} (g_k^T d_k)² / ║d_k║² diverges. [The case conditions and the intermediate displays (2.10)-(2.11) are not recoverable from the source images.] This contradicts (2.2), so (2.8) holds. We finish the proof of the theorem. □
3. Discussions of the Method 2.1 and Numerical Results
The line search in Method 2.1 can also be given by the general Wolfe line search

f(x_k + α_k d_k) ≤ f(x_k) + δ α_k g_k^T d_k,   (3.1)

σ_1 g_k^T d_k ≤ g(x_k + α_k d_k)^T d_k ≤ −σ_2 g_k^T d_k,   (3.2)

where 0 < σ_1, σ_2 < 1.
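The general Wolfe conditions can be checked directly on a trial step. The helper below is illustrative and assumes a standard two-sided form of (3.1)-(3.2), namely sufficient decrease plus σ_1 g_k^T d_k ≤ g_{k+1}^T d_k ≤ −σ_2 g_k^T d_k; the displayed conditions in the source are image-only, so this form is an assumption.

```python
import numpy as np

def satisfies_general_wolfe(f, grad, x, d, alpha, delta=0.4, s1=0.5, s2=0.6):
    """Check a standard form of the general Wolfe conditions (3.1)-(3.2):
    sufficient decrease plus the two-sided curvature bound
    s1 * g'd <= g(x + alpha d)'d <= -s2 * g'd.
    The exact displayed form is an assumption (source images are lost)."""
    g_d = grad(x) @ d
    g_new_d = grad(x + alpha * d) @ d
    decrease = f(x + alpha * d) <= f(x) + delta * alpha * g_d
    curvature = s1 * g_d <= g_new_d <= -s2 * g_d
    return decrease and curvature

# Illustrative check: exact step on f(x) = 0.5 x^T x along d = -g
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x0 = np.array([2.0, -1.0])
print(satisfies_general_wolfe(f, grad, x0, -grad(x0), alpha=1.0))  # True
```

A step that is too short (e.g. alpha = 0.1 here) violates the left curvature bound and the check returns False.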
Theorem 3.1. Suppose that Assumption 2.2 holds. Consider Method 2.1, where α_k satisfies (3.1) and (3.2). Then we have

∑_{k≥1} (g_k^T d_k)² / ║d_k║² < +∞.

Proof. From (3.2), we get

(σ_1 − 1) g_k^T d_k ≤ (g_{k+1} − g_k)^T d_k ≤ L α_k ║d_k║²,   (3.3)

so

α_k ≥ ((σ_1 − 1) / L) · g_k^T d_k / ║d_k║².

From (3.1), we have

f(x_k) − f(x_{k+1}) ≥ −δ α_k g_k^T d_k.

By (3.3), we get

f(x_k) − f(x_{k+1}) ≥ (δ(1 − σ_1) / L) · (g_k^T d_k)² / ║d_k║².

That is, summing over k, the left-hand side telescopes and is bounded, since {f(x_k)} is bounded below on Ω. We have

∑_{k≥1} (g_k^T d_k)² / ║d_k║² < +∞. □
Discussion 3.1. By Theorem 3.1, we can also obtain the global convergence of Method 2.1 with (3.1) and (3.2).
Discussion 3.2. If the line search in Method 2.1 is replaced by the other Wolfe-type line search given in [12] [the displayed conditions are not recoverable from the source image], the method is still globally convergent.
Discussion 3.3. If, in Method 2.1, β_k is instead given by the alternative formula [not recoverable from the source image] and α_k satisfies (1.5)-(1.6) or (3.1)-(3.2), the global convergence of Method 2.1 still holds.
Numerical Results 3.4
Now, we test Method 2.1, where α_k satisfies (1.5)-(1.6) or (3.1)-(3.2), using double precision versions of the unconstrained optimization problems in the CUTE library [13].
For Method 2.1, α_k is computed by (1.5) and (1.6) with δ = 0.4 and σ = 0.7 in Table 3.1, and by (3.1) and (3.2) with σ_1 = 0.5 and σ_2 = 0.6 in Table 3.2.
[Table 3.1: results with the Wolfe-type line search (1.5)-(1.6). Table 3.2: results with the general Wolfe line search (3.1)-(3.2). The tables are not recoverable from the source images.]
The numerical results are given in the form NI/NF/NG/g, where NI, NF, and NG denote the numbers of iterations, function evaluations, and gradient evaluations, respectively, and g denotes the final gradient norm. All runs were terminated once a solution with ║g_k║ ≤ 10^{−3} was reached.
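A tally of this form can be produced by wrapping the objective and gradient in counting callables. The `Counted` helper below is hypothetical, not the authors' test harness; it only illustrates how NI/NF/NG counters might be collected, using gradient descent with Armijo backtracking on a toy quadratic.

```python
import numpy as np

class Counted:
    """Wrap a callable and count its evaluations (for the NF/NG tallies)."""
    def __init__(self, fn):
        self.fn, self.calls = fn, 0
    def __call__(self, x):
        self.calls += 1
        return self.fn(x)

# Hypothetical run on f(x) = 0.5 x^T A x with Armijo backtracking
A = np.diag([1.0, 4.0])
f = Counted(lambda x: 0.5 * x @ A @ x)
grad = Counted(lambda x: A @ x)
x, ni = np.array([4.0, -3.0]), 0
g = grad(x)
while np.linalg.norm(g) > 1e-3:
    alpha = 1.0
    while f(x - alpha * g) > f(x) - 0.4 * alpha * (g @ g):
        alpha *= 0.5                      # backtrack until Armijo holds
    x = x - alpha * g
    g = grad(x)
    ni += 1
print(f"NI/NF/NG/g = {ni}/{f.calls}/{grad.calls}/{np.linalg.norm(g):.2e}")
```

Counting through wrappers keeps the solver itself unchanged, which is how such statistics are typically gathered on CUTE-style test sets.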
BIO
Ajie Chu received her BS from Xingtan College, Qufu Normal University, China in 2013. Since 2013 she has been at Qingdao University, China. Her research interests include optimization and nonsmooth equations.
College of Mathematics, Qingdao University, Qingdao 266071, China.
e-mail: 13658689660@163.com
Yixiao Su received his BS from Dezhou College, China in 2013. Since 2013 he has been at Qingdao University, China. His research interests include optimization and complementarity problems.
College of Mathematics, Qingdao University, Qingdao 266071, China.
e-mail: 124890405@qq.com
Shouqiang Du received his PhD and is an associate professor. Since 2003 he has been at the College of Mathematics, Qingdao University. His research interests include numerical optimization and nonsmooth optimization.
College of Mathematics, Qingdao University, Qingdao 266071, China.
e-mail: dsq8933@163.com; sqdu@qdu.edu.cn
References
Fletcher R. , Reeves C.M. (1964) Function minimization by conjugate gradients Computer Journal 7 149 - 154    DOI : 10.1093/comjnl/7.2.149
Polak E. , Ribière G. (1969) Note sur la convergence de méthodes de directions conjuguées Revue Française d'Informatique et de Recherche Opérationnelle, 3e Année 16 35 - 43
Polyak B.T. (1969) The conjugate gradient method in extreme problems USSR Computational Mathematics and Mathematical Physics 9 94 - 112    DOI : 10.1016/0041-5553(69)90035-4
Hestenes M.R. , Stiefel E.L. (1952) Methods of conjugate gradients for solving linear systems Journal of Research of the National Bureau of Standards 49 409 - 436    DOI : 10.6028/jres.049.044
Dai Y.H. , Yuan Y. (1999) A nonlinear conjugate gradient method with a strong global convergence property SIAM Journal on Optimization 10 177 - 182    DOI : 10.1137/S1052623497318992
Gilbert J.C. , Nocedal J. (1992) Global convergence properties of conjugate gradient methods for optimization SIAM Journal on Optimization 2 21 - 42    DOI : 10.1137/0802003
Andrei N. (2010) Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization Numerical Algorithms 54 23 - 46    DOI : 10.1007/s11075-009-9321-0
Babaie-Kafaki S. , Ghanbari R. , Mahdavi-Amiri N. (2010) Two new conjugate gradient methods based on modified secant equations Journal of Computational and Applied Mathematics 234 1374 - 1386    DOI : 10.1016/j.cam.2010.01.052
Dai Y.H. , Yuan Y. (2001) An efficient hybrid conjugate gradient method for unconstrained optimization Annals of Operations Research 103 33 - 47    DOI : 10.1023/A:1012930416777
Li G. , Tang C. , Wei Z. (2007) New conjugacy condition and related new conjugate gradient methods for unconstrained optimization Journal of Computational and Applied Mathematics 202 523 - 539    DOI : 10.1016/j.cam.2006.03.005
Zhang X.J. , Xu A.N. (2005) Global convergence properties of a new class of nonlinear conjugate gradient methods Guangxi Sciences 12 282 - 283
Wang C.Y. , Du S.Q. , Chen Y.Y. (2004) Global convergence properties of three-term conjugate gradient method with new-type line search Journal of Systems Science and Complexity 17 412 - 420
Bongartz I. , Conn A.R. , Gould N. , Toint P.L. (1995) CUTE: constrained and unconstrained testing environment ACM Transactions on Mathematical Software 21 123 - 160    DOI : 10.1145/200979.201043