IOSR Journal of Engineering (IOSRJEN) www.iosrjen.org
ISSN (e): 2250-3021, ISSN (p): 2278-8719
Vol. 04, Issue 05 (May 2014), V7, pp. 38-43
New Homotopy Conjugate Gradient for Unconstrained Optimization using Hestenes-Stiefel and Conjugate Descent
Salah Gazi Sharee, Hussein Ageel Khatab
Dept. of Mathematics, Faculty of Science, University of Zakho, Kurdistan Region-Iraq
Abstract: - In this paper, we suggest a hybrid conjugate gradient method for unconstrained optimization based on the homotopy formula. We calculate the parameter $\beta_k$ as a convex combination of $\beta_k^{HS}$ (Hestenes-Stiefel) [5] and $\beta_k^{CD}$ (conjugate descent) [3].
Keywords: - Unconstrained optimization, line search, conjugate gradient method, homotopy formula.
I. INTRODUCTION
Suppose that $f : \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function whose gradient is denoted by $g(x)$. Consider the following nonlinear unconstrained optimization problem:

$$\min_{x \in \mathbb{R}^n} f(x) \qquad (1.1)$$

The iterates for solving (1.1) are given by

$$x_{k+1} = x_k + \alpha_k d_k, \quad k = 0, 1, \ldots, n \qquad (1.2)$$

where $\alpha_k$ is a positive step size obtained by a line search and $d_k$ is a search direction. The search direction at the very first iteration is the steepest descent direction $d_0 = -g_0$; the directions in subsequent iterations are computed according to

$$d_{k+1} = -g_{k+1} + \beta_k d_k, \quad k \ge 0 \qquad (1.3)$$

where $\beta_k \in \mathbb{R}$ is known as the conjugate gradient coefficient, and different choices of $\beta_k$ yield different conjugate gradient methods. Some well-known formulas are given as follows:
$$\beta_k^{PR} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2} \qquad (1.4)$$

$$\beta_k^{FR} = \frac{g_{k+1}^T g_{k+1}}{\|g_k\|^2} \qquad (1.5)$$

$$\beta_k^{HS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{(g_{k+1} - g_k)^T d_k} \qquad (1.6)$$

$$\beta_k^{DY} = \frac{g_{k+1}^T g_{k+1}}{(g_{k+1} - g_k)^T d_k} \qquad (1.7)$$

$$\beta_k^{CD} = \frac{g_{k+1}^T g_{k+1}}{-d_k^T g_k} \qquad (1.8)$$

$$\beta_k^{LS} = \frac{g_{k+1}^T (g_{k+1} - g_k)}{-d_k^T g_k} \qquad (1.9)$$
where $g_{k+1}$ and $g_k$ denote the gradients $\nabla f(x_{k+1})$ and $\nabla f(x_k)$ of $f$ at the points $x_{k+1}$ and $x_k$, respectively, and $\|\cdot\|$ denotes the Euclidean norm. The line search in conjugate gradient algorithms is often based on the standard Wolfe conditions:
$$f(x_k + \alpha_k d_k) - f(x_k) \le \delta \alpha_k g_k^T d_k \qquad (1.10)$$

$$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k \qquad (1.11)$$

where $0 < \delta < \sigma < 1$.
The strong Wolfe line search corresponds to

$$f(x_k + \alpha_k d_k) - f(x_k) \le \delta \alpha_k g_k^T d_k \qquad (1.12)$$

$$|g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k| \qquad (1.13)$$

where $0 < \delta < \sigma < 1$ [10].
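The experiments in Section 3.4 are implemented in FORTRAN95; purely as an illustrative sketch (function names and tolerances here are ours, not the paper's), the generic conjugate gradient loop (1.2)-(1.3) with a pluggable $\beta_k$ and a Wolfe line search might look as follows in Python. SciPy's `line_search` enforces Wolfe-type conditions with $\delta = c_1$ and $\sigma = c_2$.

```python
# A minimal sketch (not the authors' FORTRAN95 code) of the generic nonlinear
# CG iteration (1.2)-(1.3) with a pluggable beta formula and a Wolfe line search.
import numpy as np
from scipy.optimize import line_search

def beta_hs(g_new, g_old, d):
    """Hestenes-Stiefel coefficient (1.6)."""
    y = g_new - g_old
    return (g_new @ y) / (d @ y)

def beta_cd(g_new, g_old, d):
    """Conjugate descent coefficient (1.8)."""
    return (g_new @ g_new) / -(d @ g_old)

def nonlinear_cg(f, grad, x0, beta_fn, tol=1e-5, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # d_0 = -g_0 (steepest descent)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:          # stopping rule of Section 3.4
            return x
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                     # line search failed: restart
            d = -g
            alpha = 1e-4 / np.linalg.norm(g)  # small safeguarded step (ours)
        x_new = x + alpha * d                 # iterate (1.2)
        g_new = grad(x_new)
        d = -g_new + beta_fn(g_new, g, d) * d # direction update (1.3)
        x, g = x_new, g_new
    return x
```

For example, `nonlinear_cg(rosen, rosen_der, np.full(4, -1.0), beta_hs)` with SciPy's built-in `rosen`/`rosen_der` runs the HS variant on a 4-dimensional Rosenbrock problem.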
II. HYBRID CONJUGATE GRADIENT ALGORITHMS
Hybrid conjugate gradient algorithms are combinations of different conjugate gradient algorithms, proposed mainly in order to avoid the jamming phenomenon [1]. These methods are an important class of conjugate gradient algorithms [6], [9].
The methods of FR [4], DY [2] and CD [3] have strong convergence properties, but they may have modest practical performance due to jamming. On the other hand, the methods of PR [8], HS [5] and LS [7] may not always be convergent, but they often have better computational performance [1].
III. NEW HYBRID SUGGESTION
Our suggested method generates iterates $x_0, x_1, x_2, \ldots$ by means of the recurrence $x_{k+1} = x_k + \alpha_k d_k$, where the step size $\alpha_k > 0$ is determined by the Wolfe line search conditions (1.10) and (1.11), and the directions $d_k$ are generated by the rule

$$d_{k+1} = -g_{k+1} + \beta_k^{NEW} d_k \qquad (3.1)$$
where

$$\beta_k^{NEW} = (1 - \theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{CD}, \qquad 0 \le \theta_k \le 1,$$

that is, with $y_k = g_{k+1} - g_k$,

$$\beta_k^{NEW} = (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k} - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k} \qquad (3.2)$$
Observe that if $\theta_k = 0$ then $\beta_k^{NEW} = \beta_k^{HS}$, and if $\theta_k = 1$ then $\beta_k^{NEW} = \beta_k^{CD}$. On the other hand, if $0 < \theta_k < 1$, we can determine $\theta_k$, and hence $\beta_k^{NEW}$, as follows. We know that
$$d_{k+1} = -g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\, d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, d_k \qquad (3.3)$$
Our motivation is to choose the parameter $\theta_k$ in such a way that the direction $d_{k+1}$ given by (3.3) is the Newton direction. Therefore
$$-\nabla^2 f(x_{k+1})^{-1} g_{k+1} = -g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\, d_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, d_k$$
Multiplying both sides of the above equation by $d_k^T \nabla^2 f(x_{k+1})$, we get
$$-d_k^T g_{k+1} = -d_k^T \nabla^2 f(x_{k+1})\, g_{k+1} + (1 - \theta_k)\, d_k^T \nabla^2 f(x_{k+1})\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\, d_k - \theta_k\, d_k^T \nabla^2 f(x_{k+1})\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, d_k$$

$$-d_k^T g_{k+1} = -d_k^T \nabla^2 f(x_{k+1})\, g_{k+1} + (1 - \theta_k)\,\frac{g_{k+1}^T y_k}{d_k^T y_k}\,\big(d_k^T \nabla^2 f(x_{k+1})\, d_k\big) - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\,\big(d_k^T \nabla^2 f(x_{k+1})\, d_k\big)$$
Since $d_k^T \nabla^2 f(x_{k+1}) = y_k^T$, we then have
$$-d_k^T g_{k+1} = -y_k^T g_{k+1} + (1 - \theta_k)\, g_{k+1}^T y_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, y_k^T d_k$$

$$-d_k^T g_{k+1} = -\theta_k\, g_{k+1}^T y_k - \theta_k\,\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, y_k^T d_k$$

This implies that

$$\theta_k = \frac{d_k^T g_{k+1}}{g_{k+1}^T y_k + \dfrac{\|g_{k+1}\|^2}{d_k^T g_k}\, y_k^T d_k}$$

or

$$\theta_k = \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2\,(y_k^T d_k)} \qquad (3.4)$$
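As a small illustrative sketch only (the function name `beta_new` is ours, not the paper's), the hybrid coefficient (3.2) with $\theta_k$ computed from (3.4) can be written as follows; the fallbacks for $\theta_k$ outside $(0, 1)$ anticipate Step (7) of the algorithm in Section 3.3.

```python
import numpy as np

def beta_new(g_new, g_old, d):
    """Sketch of the hybrid coefficient (3.2) with theta_k from (3.4)."""
    y = g_new - g_old                     # y_k = g_{k+1} - g_k
    dg = d @ g_old                        # d_k^T g_k
    b_hs = (g_new @ y) / (d @ y)          # Hestenes-Stiefel (1.6)
    b_cd = (g_new @ g_new) / -dg          # conjugate descent (1.8)
    # theta_k from (3.4): (d^T g_new)(d^T g_old) over
    # (g_new^T y)(d^T g_old) + ||g_new||^2 (y^T d)
    theta = ((d @ g_new) * dg) / ((g_new @ y) * dg + (g_new @ g_new) * (y @ d))
    if theta <= 0.0:                      # outside [0, 1]: fall back (Step 7)
        return b_hs
    if theta >= 1.0:
        return b_cd
    return (1.0 - theta) * b_hs + theta * b_cd   # convex combination (3.2)
```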
3.1. Convergence of the new hybrid conjugate gradient algorithm
Theorem 3.1.1: Assume that $d_k$ is a descent direction and that the step size $\alpha_k$ in (1.2), with $\beta_k$ given by (3.2) and $\theta_k$ given by (3.4), is determined by the Wolfe line search (1.10) and (1.11). If $0 < \theta_k < 1$, then the direction $d_{k+1}$ given by (3.3) is a descent direction.
Proof:-
From (3.3) and (3.4) we have
$$d_{k+1} = -g_{k+1} + \left(1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\right)\frac{g_{k+1}^T y_k}{d_k^T y_k}\, d_k - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, d_k \qquad (3.1.1)$$
Multiplying both sides of (3.1.1) by $g_{k+1}^T$, we get
$$g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \left(1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\right)\frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2(g_{k+1}^T d_k)}{d_k^T g_k} \qquad (3.1.2)$$
$$g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2(g_{k+1}^T d_k)}{d_k^T g_k} \qquad (3.1.3)$$
The proof is complete if the step length $\alpha_k$ is chosen by an exact line search, which requires $d_k^T g_{k+1} = 0$. Now suppose the step length $\alpha_k$ is chosen by an inexact line search, so that $d_k^T g_{k+1} \ne 0$. The sum of the first two terms of equation (3.1.3) is less than or equal to zero because the Hestenes-Stiefel (HS) algorithm satisfies the descent condition, i.e.
$$-\|g_{k+1}\|^2 + \frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} \le 0. \qquad (3.1.4)$$
It remains to consider the third and fourth terms:

$$-\frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2(g_{k+1}^T d_k)}{d_k^T g_k}$$

$$= -\frac{(d_k^T g_{k+1})^2(d_k^T g_k)(g_{k+1}^T y_k)}{\big[(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)\big]\, d_k^T y_k} - \frac{(d_k^T g_{k+1})^2(d_k^T g_k)\,\|g_{k+1}\|^2}{(g_{k+1}^T y_k)(d_k^T g_k)^2 + \|g_{k+1}\|^2(y_k^T d_k)(d_k^T g_k)}$$

$$= -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k}\left[\frac{(d_k^T g_k)(g_{k+1}^T y_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)} + \frac{(d_k^T g_k)\,\|g_{k+1}\|^2}{\dfrac{(g_{k+1}^T y_k)(d_k^T g_k)^2}{d_k^T y_k} + \|g_{k+1}\|^2(d_k^T g_k)}\right]$$

$$= -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k} \qquad (3.1.5)$$
We know that $(d_k^T g_{k+1})^2$ is greater than or equal to zero and that $d_k^T y_k > 0$. Consequently, we have

$$-\frac{(d_k^T g_{k+1})^2}{d_k^T y_k} \le 0,$$

which implies that $g_{k+1}^T d_{k+1} \le 0$. Then the proof is completed. ∎
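Since the simplification from (3.1.3) to (3.1.5) carries the weight of the proof, a quick numerical check (our own sketch, with arbitrary random vectors; variable names are ours) confirms that the third and fourth terms of (3.1.3), with $\theta_k$ from (3.4), sum to $-(d_k^T g_{k+1})^2 / (d_k^T y_k)$:

```python
# Sanity check (ours, not from the paper) of the simplification (3.1.5):
# with theta from (3.4), the third and fourth terms of (3.1.3) sum to
# -(d^T g_new)^2 / (d^T y) for arbitrary vectors.
import numpy as np

rng = np.random.default_rng(1)
g_new, g_old, d = rng.standard_normal((3, 5))
y = g_new - g_old

D = (g_new @ y) * (d @ g_old) + (g_new @ g_new) * (y @ d)
theta = (d @ g_new) * (d @ g_old) / D                         # (3.4)
term3 = -theta * (g_new @ y) * (g_new @ d) / (d @ y)          # third term of (3.1.3)
term4 = -theta * (g_new @ g_new) * (g_new @ d) / (d @ g_old)  # fourth term of (3.1.3)

assert np.isclose(term3 + term4, -(d @ g_new) ** 2 / (d @ y))
```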
Theorem 3.1.2: Assume that the conditions in Theorem 3.1.1 hold and that

$$\frac{(y_k^T g_{k+1})(d_k^T g_{k+1})}{y_k^T d_k} \le \|g_{k+1}\|^2.$$

Then there exists a constant $c_1 > 0$ such that $g_{k+1}^T d_{k+1} \le -c_1 \|g_{k+1}\|^2$, i.e. the direction $d_{k+1}$ satisfies the sufficient descent condition.
Proof:
From (3.3) and (3.4) we have
$$d_{k+1} = -g_{k+1} + \left(1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\right)\frac{g_{k+1}^T y_k}{d_k^T y_k}\, d_k - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2}{d_k^T g_k}\, d_k \qquad (4.2.1)$$
Multiplying both sides of (4.2.1) by $g_{k+1}^T$, we get
$$g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 + \left(1 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\right)\frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2(g_{k+1}^T d_k)}{d_k^T g_k} \qquad (3.1.6)$$
$$g_{k+1}^T d_{k+1} \le -\|g_{k+1}\|^2 + \|g_{k+1}\|^2 - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{(g_{k+1}^T y_k)(g_{k+1}^T d_k)}{d_k^T y_k} - \frac{(d_k^T g_{k+1})(d_k^T g_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)}\cdot\frac{\|g_{k+1}\|^2(g_{k+1}^T d_k)}{d_k^T g_k} \qquad (3.1.7)$$
$$g_{k+1}^T d_{k+1} \le -\frac{(d_k^T g_{k+1})^2}{d_k^T y_k}\left[\frac{(d_k^T g_k)(g_{k+1}^T y_k)}{(g_{k+1}^T y_k)(d_k^T g_k) + \|g_{k+1}\|^2(y_k^T d_k)} + \frac{(d_k^T g_k)\,\|g_{k+1}\|^2}{\dfrac{(g_{k+1}^T y_k)(d_k^T g_k)^2}{d_k^T y_k} + \|g_{k+1}\|^2(d_k^T g_k)}\right]$$

After some operations, we get

$$g_{k+1}^T d_{k+1} \le \frac{-(d_k^T g_{k+1})(d_k^T g_{k+1})}{d_k^T y_k} \qquad (3.1.8)$$
Multiplying and dividing the right-hand side of the above inequality by $y_k^T g_{k+1}$, we get

$$g_{k+1}^T d_{k+1} \le \frac{-(d_k^T g_{k+1})(d_k^T g_{k+1})(y_k^T g_{k+1})}{(y_k^T g_{k+1})(d_k^T y_k)} \qquad (3.1.9)$$
By the hypothesis, (3.1.9) gives

$$g_{k+1}^T d_{k+1} \le \frac{-(d_k^T g_{k+1})}{y_k^T g_{k+1}}\,\|g_{k+1}\|^2$$

Multiplying and dividing the right-hand side by $y_k^T g_{k+1}$, we get

$$g_{k+1}^T d_{k+1} \le \frac{-(d_k^T g_{k+1})(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2$$
Since $d_k^T g_{k+1} < d_k^T y_k$, we then have

$$g_{k+1}^T d_{k+1} \le -\frac{(d_k^T y_k)(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2 \qquad (3.1.10)$$
Now, if $y_k^T g_{k+1} > 0$, let

$$c_1 = \frac{(d_k^T y_k)(y_k^T g_{k+1})}{(y_k^T g_{k+1})^2}.$$

Then (3.1.10) gives

$$g_{k+1}^T d_{k+1} \le -c_1\,\|g_{k+1}\|^2.$$
If $y_k^T g_{k+1} < 0$, then since $d_k^T y_k > 0$ we have $(d_k^T y_k)\, y_k^T g_{k+1} < d_k^T y_k$, and (3.1.10) gives

$$g_{k+1}^T d_{k+1} \le -\frac{d_k^T y_k}{(y_k^T g_{k+1})^2}\,\|g_{k+1}\|^2.$$

In this case let

$$c_1 = \frac{d_k^T y_k}{(y_k^T g_{k+1})^2}.$$
Hence

$$g_{k+1}^T d_{k+1} \le -c_1\,\|g_{k+1}\|^2.$$

Then the proof is completed. ∎
3.2 Theorem of global convergence
Since the new hybrid conjugate gradient algorithm satisfies the sufficient descent condition under the Wolfe conditions, the new hybrid conjugate gradient algorithm satisfies the global convergence property.
3.3 Algorithm of the New Hybrid Conjugate Gradient Method
Step (1): Set $k = 0$ and select the initial point $x_k$.
Step (2): Compute $g_k = \nabla f(x_k)$. If $g_k = 0$, then stop; else set $d_k = -g_k$.
Step (3): Compute $\alpha_k > 0$ satisfying the Wolfe line search conditions so as to minimize $f(x_{k+1})$.
Step (4): Set $x_{k+1} = x_k + \alpha_k d_k$.
Step (5): Compute $g_{k+1} = \nabla f(x_{k+1})$. If $g_{k+1} = 0$, then stop.
Step (6): Compute $\theta_k$ as in (3.4).
Step (7): If $0 < \theta_k < 1$, compute $\beta_k^{NEW}$ as in (3.2). If $\theta_k \ge 1$, set $\beta_k^{NEW} = \beta_k^{CD}$. If $\theta_k \le 0$, set $\beta_k^{NEW} = \beta_k^{HS}$.
Step (8): Set $d_{k+1} = -g_{k+1} + \beta_k^{NEW} d_k$.
Step (9): If $k = n$, go to Step (2) (restart); else set $k = k + 1$ and go to Step (3).
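For illustration only (the authors' implementation is in FORTRAN95 and is not reproduced here), Steps (1)-(9) might be sketched in Python as follows, reusing the `beta_new` helper from Section III and SciPy's Wolfe line search for Step (3); the inner loop implements the restart of Step (9).

```python
# Illustrative sketch of Steps (1)-(9); function and variable names are ours.
import numpy as np
from scipy.optimize import line_search

def new_hybrid_cg(f, grad, x0, tol=1e-5, max_outer=100):
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):               # each pass is one restart cycle
        g = grad(x)                          # Steps (1)-(2)
        if np.linalg.norm(g) <= tol:
            return x
        d = -g
        for k in range(n):                   # Step (9): restart when k = n
            alpha = line_search(f, grad, x, d, gfk=g)[0]   # Step (3)
            if alpha is None:                # line search failed: restart
                break
            x = x + alpha * d                # Step (4)
            g_new = grad(x)                  # Step (5)
            if np.linalg.norm(g_new) <= tol:
                return x
            d = -g_new + beta_new(g_new, g, d) * d   # Steps (6)-(8)
            g = g_new
    return x
```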
3.4 NUMERICAL RESULTS
This section is devoted to testing the implementation of the new formula. We compare the hybrid algorithm with the standard Hestenes-Stiefel (HS) and conjugate descent (CD) algorithms. The comparative tests involve well-known nonlinear problems (standard test functions) with different dimensions $4 \le n \le 5000$. All programs are written in FORTRAN95, and in all cases the stopping condition is $\|g_{k+1}\| \le 10^{-5}$. The table below reports the number of function evaluations (NOF) and the number of iterations (NOI). The experimental results in the table confirm that the new CG method is superior to the standard CG methods with respect to NOI and NOF.
Table: Comparative performance of the three algorithms (standard HS, standard CD, and the new formula).

| Test fun. | n | HS NOI | HS NOF | CD NOI | CD NOF | New NOI | New NOF |
|---|---|---|---|---|---|---|---|
| Powell | 4 | 65 | 170 | 994 | 2077 | 30 | 74 |
| | 100 | 105 | 276 | 3102 | 6942 | 108 | 242 |
| | 500 | 502 | 1062 | 134002 | 270117 | 502 | 1011 |
| | 1000 | 637 | 1332 | * | * | 241 | 532 |
| | 3000 | 879 | 1816 | * | * | 249 | 568 |
| | 5000 | 1008 | 2074 | * | * | 409 | 913 |
| Wood | 4 | 26 | 59 | 353 | 709 | 26 | 61 |
| | 100 | 27 | 61 | 928 | 2115 | 26 | 61 |
| | 500 | 28 | 63 | 1008 | 2277 | 27 | 6 |
| | 1000 | 28 | 63 | 561 | 1165 | 27 | 63 |
| | 3000 | 28 | 63 | 500 | 1038 | 27 | 63 |
| | 5000 | 28 | 63 | 517 | 1111 | 27 | 63 |
| Cubic | 4 | 15 | 43 | 540 | 1104 | 11 | 34 |
| | 100 | 14 | 40 | 801 | 2060 | 11 | 32 |
| | 500 | 14 | 40 | 1501 | 5287 | 11 | 32 |
| | 1000 | 14 | 40 | * | * | 11 | 32 |
| | 3000 | 14 | 40 | * | * | 11 | 32 |
| | 5000 | 14 | 40 | * | * | 11 | 32 |
| Rosen | 4 | 23 | 66 | 611 | 1262 | 23 | 64 |
| | 100 | 17 | 52 | 461 | 1374 | 21 | 62 |
| | 500 | * | * | 2063 | 6612 | 21 | 62 |
| | 1000 | * | * | 4036 | 13523 | 21 | 62 |
| | 3000 | * | * | * | * | 21 | 62 |
| | 5000 | * | * | * | * | 21 | 62 |
| Mile | 4 | 28 | 101 | * | * | 18 | 50 |
| | 100 | 142 | 346 | * | * | 149 | 355 |
| | 500 | 501 | 1108 | * | * | 501 | 1092 |
| | 1000 | 1001 | 2312 | * | * | 998 | 2290 |
| | 3000 | 1442 | 3252 | * | * | 1270 | 2834 |
| | 5000 | 1660 | 3688 | * | * | 1418 | 3130 |
| Non-Diagonal | 4 | 23 | 61 | 249 | 558 | 23 | 59 |
| | 100 | 22 | 60 | * | * | 18 | 51 |
| | 500 | 22 | 59 | * | * | 18 | 53 |
| | 1000 | 22 | 59 | * | * | 18 | 53 |
| | 3000 | 22 | 59 | * | * | 19 | 55 |
| | 5000 | 22 | 59 | * | * | 19 | 55 |
| Woolf | 4 | 12 | 27 | 49 | 99 | 12 | 27 |
| | 100 | 49 | 99 | 901 | 1879 | 49 | 99 |
| | 500 | 56 | 113 | 3893 | 8393 | 55 | 111 |
| | 1000 | 70 | 141 | 8001 | 19675 | 69 | 139 |
| | 3000 | 166 | 343 | * | * | 166 | 343 |
| | 5000 | 176 | 365 | * | * | 175 | 363 |
IV. CONCLUSION
In this paper we have presented a new hybrid conjugate gradient method in which the parameter $\beta_k$ is computed as a convex combination of $\beta_k^{HS}$ and $\beta_k^{CD}$, and we have compared its numerical performance with that of the well-known conjugate gradient algorithms of Hestenes-Stiefel (HS) and conjugate descent (CD). We saw that the performance profile of our method was higher than those of the well-established conjugate gradient algorithms HS and CD.
REFERENCES
[1] N. Andrei, Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl., 141, 2009, pp. 249-264.
[2] Y.H. Dai and Y. Yuan, A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property, SIAM J. Optimization, 10, 1999, pp. 177-182.
[3] R. Fletcher, Practical Methods of Optimization, Vol. 1: Unconstrained Optimization (John Wiley and Sons, New York, 1987).
[4] R. Fletcher and C. Reeves, Function Minimization by Conjugate Gradients, Comput. J., 7, 1964, pp. 149-154.
[5] M.R. Hestenes and E.L. Stiefel, Methods of Conjugate Gradients for Solving Linear Systems, J. Research Nat. Bur. Standards, 49, 1952, pp. 409-436.
[6] Y.F. Hu and C. Storey, Global Convergence Result for Conjugate Gradient Methods, J. Optim. Theory Appl., 71, 1991, pp. 399-405.
[7] Y. Liu and C. Storey, Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory, J. Optim. Theory Appl., 69, 1991, pp. 129-137.
[8] E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e Année, 16, 1969, pp. 35-43.
[9] D. Touati-Ahmed and C. Storey, Efficient Hybrid Conjugate Gradient Techniques, J. Optim. Theory Appl., 64, 1990, pp. 379-397.
[10] P. Wolfe, Convergence Conditions for Ascent Methods, SIAM Rev., 11, 1969, pp. 226-235.