Several Guaranteed Descent Conjugate Gradient Methods for Unconstrained Optimization

DOI: 10.1155/2014/825958


Abstract:

This paper investigates a general form of guaranteed descent conjugate gradient methods which satisfies the descent condition and which is strongly convergent whenever the weak Wolfe line search is fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and give their numerical results for large-scale unconstrained optimization.

1. Introduction

Consider the following unconstrained optimization problem:
$$\min_{x \in \mathbb{R}^n} f(x), \tag{1}$$
where $\mathbb{R}^n$ is the $n$-dimensional Euclidean space and $f: \mathbb{R}^n \to \mathbb{R}$ is continuously differentiable with available gradient $g(x) = \nabla f(x)$. Conjugate gradient methods are very efficient for solving problem (1) due to their simple iteration and low memory requirements. For any given starting point $x_0 \in \mathbb{R}^n$, they generate a sequence $\{x_k\}$ by the following recursive relations:
$$x_{k+1} = x_k + \alpha_k d_k, \tag{2}$$
$$d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \tag{3}$$
where $g_k = g(x_k)$, $\alpha_k$ is a step length obtained by means of a one-dimensional search, and $\beta_k$ is a scalar that characterizes the method. In general, the step length $\alpha_k$ in (2) is obtained by fulfilling the following weak Wolfe conditions [1, 2]:
$$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k, \qquad g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k, \tag{4}$$
where $0 < \delta < \sigma < 1$. Different choices of the scalar $\beta_k$ in (3) result in different nonlinear conjugate gradient methods. Well-known formulas for $\beta_k$ are the Fletcher-Reeves (FR), Hestenes-Stiefel (HS), Polak-Ribière-Polyak (PRP), Dai-Yuan (DY), and Liu-Storey (LS) formulas (see [3], [4], [5], [6], [7], and [8], resp.), given by
$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \quad \beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \quad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}, \quad \beta_k^{LS} = \frac{g_{k+1}^T y_k}{-g_k^T d_k}, \tag{5}$$
where $\|\cdot\|$ denotes the Euclidean norm and $y_k = g_{k+1} - g_k$. Their corresponding conjugate gradient methods are viewed as basic conjugate gradient methods. Among these, the PRP and HS methods perform very similarly and perform better than the other basic methods [9]. However, Powell [10] gave an example showing that the PRP and HS methods may cycle without approaching any solution point, and modified versions of both methods were subsequently presented by many researchers (see, e.g., [11–16]). The following (sufficient) descent condition,
$$g_k^T d_k \le -c \|g_k\|^2 \quad \text{for some constant } c > 0, \tag{6}$$
is very important for conjugate gradient methods, so we are particularly interested in conjugate gradient methods that satisfy it. To date, many descent conjugate gradient methods have been proposed; see [12, 16–19] and the references therein. One well-known guaranteed descent conjugate gradient method was proposed by Hager and Zhang [16, 20, 21] with
$$\beta_k^{HZ} = \frac{1}{d_k^T y_k} \left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^T y_k} \right)^T g_{k+1}. \tag{7}$$
The method is designed based on the HS method and satisfies the sufficient descent condition (6) with $c = 7/8$ for any (inexact) line search. In [18], Zhang and Li proposed a general case of the HZ method with
$$\beta_k = \frac{1}{d_k^T y_k} \left( y_k - t d_k \frac{\|y_k\|^2}{d_k^T y_k} \right)^T g_{k+1}, \tag{8}$$
where $y_k = g_{k+1} - g_k$ and $t > 1/4$ is a scalar to be specified. It also satisfies the sufficient descent condition (6) with $c = 1 - 1/(4t)$, and it reduces to the HZ method (7) when $t = 2$.
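To make the scheme concrete, here is a minimal Python sketch of iterations (2)-(3) with the parameterized formula (8) and a simple bisection-style weak Wolfe line search for (4). This is an illustration, not the authors' implementation: the function names (weak_wolfe, cg_descent_t) and default parameter values (delta, sigma, t, tol) are chosen here for exposition.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, delta=1e-4, sigma=0.9, max_iter=50):
    """Bracketing/bisection search for a step length satisfying the weak
    Wolfe conditions (4), with 0 < delta < sigma < 1."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    fx, gtd = f(x), grad(x) @ d          # gtd = g_k^T d_k (< 0 for descent d)
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + delta * alpha * gtd:
            hi = alpha                   # sufficient-decrease condition fails
        elif grad(x + alpha * d) @ d < sigma * gtd:
            lo = alpha                   # curvature condition fails
        else:
            return alpha                 # both weak Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def cg_descent_t(f, grad, x0, t=2.0, tol=1e-6, max_iter=10000):
    """Conjugate gradient method (2)-(3) with beta_k from (8); t = 2
    recovers the HZ formula (7), and any t > 1/4 keeps the sufficient
    descent condition (6) with c = 1 - 1/(4t)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                               # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x = x + alpha * d                # iteration (2)
        g_new = grad(x)
        y = g_new - g                    # y_k = g_{k+1} - g_k
        dty = d @ y                      # positive under weak Wolfe
        beta = (y - t * d * (y @ y) / dty) @ g_new / dty   # formula (8)
        d = -g_new + beta * d            # iteration (3)
        g = g_new
    return x

# Example: minimize f(x) = sum(x^4) + sum(x^2), whose minimizer is the origin.
f = lambda x: np.sum(x**4) + np.sum(x**2)
grad = lambda x: 4.0 * x**3 + 2.0 * x
x_star = cg_descent_t(f, grad, np.ones(100))
```

The bisection scheme above is only the simplest way to realize (4); the production line search in CG_DESCENT [21] is considerably more refined, but the sketch suffices to show how the descent parameter $t$ enters the direction update.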

References

[1]  P. Wolfe, “Convergence conditions for ascent methods,” SIAM Review, vol. 11, no. 2, pp. 226–235, 1969.
[2]  P. Wolfe, “Convergence conditions for ascent methods. II: some corrections,” SIAM Review, vol. 13, no. 2, pp. 185–188, 1971.
[3]  R. Fletcher and C. Reeves, “Function minimization by conjugate gradients,” The Computer Journal, vol. 7, pp. 149–154, 1964.
[4]  M. R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems,” Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.
[5]  E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées,” Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.
[6]  B. T. Polyak, “The conjugate gradient method in extremal problems,” USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.
[7]  Y. H. Dai and Y. Yuan, “A nonlinear conjugate gradient method with a strong global convergence property,” SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 2000.
[8]  Y. Liu and C. Storey, “Efficient generalized conjugate gradient algorithms, part 1: theory,” Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.
[9]  Y. H. Dai and Q. Ni, “Testing different conjugate gradient methods for large-scale unconstrained optimization,” Journal of Computational Mathematics, vol. 21, no. 3, pp. 311–320, 2003.
[10]  M. J. D. Powell, “Nonconvex minimization calculations and the conjugate gradient method,” in Numerical Analysis, vol. 1066 of Lecture Notes in Mathematics, pp. 122–141, Springer, Berlin, Germany, 1984.
[11]  J. C. Gilbert and J. Nocedal, “Global convergence properties of conjugate gradient methods for optimization,” SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.
[12]  L. Zhang, W. Zhou, and D.-H. Li, “A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence,” IMA Journal of Numerical Analysis, vol. 26, no. 4, pp. 629–640, 2006.
[13]  Y. F. Hu and C. Storey, “Global convergence result for conjugate gradient methods,” Journal of Optimization Theory and Applications, vol. 71, no. 2, pp. 399–405, 1991.
[14]  Y. H. Dai and Y. Yuan, “An efficient hybrid conjugate gradient method for unconstrained optimization,” Annals of Operations Research, vol. 103, no. 1–4, pp. 33–47, 2001.
[15]  Y.-H. Dai and L.-Z. Liao, “New conjugacy conditions and related nonlinear conjugate gradient methods,” Applied Mathematics & Optimization, vol. 43, no. 1, pp. 87–101, 2001.
[16]  W. W. Hager and H. Zhang, “A new conjugate gradient method with guaranteed descent and an efficient line search,” SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2006.
[17]  L. Zhang, W. Zhou, and D. Li, “Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search,” Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006.
[18]  L. Zhang and J. Li, “A new globalization technique for nonlinear conjugate gradient methods for nonconvex minimization,” Applied Mathematics and Computation, vol. 217, no. 24, pp. 10295–10304, 2011.
[19]  W. Nakamura, Y. Narushima, and H. Yabe, “Nonlinear conjugate gradient methods with sufficient descent properties for unconstrained optimization,” Journal of Industrial and Management Optimization, vol. 9, no. 3, pp. 595–619, 2013.
[20]  W. W. Hager and H. Zhang, “A survey of nonlinear conjugate gradient methods,” Pacific Journal of Optimization, vol. 2, no. 1, pp. 35–58, 2006.
[21]  W. W. Hager and H. Zhang, “Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent,” ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.
[22]  Y. H. Dai, “Nonlinear conjugate gradient methods,” in Wiley Encyclopedia of Operations Research and Management Science, J. J. Cochran, L. A. Cox Jr., P. Keskinocak, J. P. Kharoufeh, and J. C. Smith, Eds., vol. 8, John Wiley & Sons, Hoboken, NJ, USA, 2011.
[23]  G. Zoutendijk, “Nonlinear programming, computational methods,” in Integer and Nonlinear Programming, J. Abadie, Ed., pp. 37–86, North-Holland, Amsterdam, The Netherlands, 1970.
[24]  D.-H. Li and M. Fukushima, “A modified BFGS method and its global convergence in nonconvex minimization,” Journal of Computational and Applied Mathematics, vol. 129, no. 1-2, pp. 15–35, 2001.
[25]  H. Yabe and M. Takano, “Global convergence properties of nonlinear conjugate gradient methods with modified secant condition,” Computational Optimization and Applications, vol. 28, no. 2, pp. 203–225, 2004.
[26]  S. Babaie-Kafaki, R. Ghanbari, and N. Mahdavi-Amiri, “Two new conjugate gradient methods based on modified secant equations,” Journal of Computational and Applied Mathematics, vol. 234, no. 5, pp. 1374–1386, 2010.
[27]  K. Sugiki, Y. Narushima, and H. Yabe, “Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization,” Journal of Optimization Theory and Applications, vol. 153, no. 3, pp. 733–757, 2012.
[28]  Y. Narushima and H. Yabe, “Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization,” Journal of Computational and Applied Mathematics, vol. 236, no. 17, pp. 4303–4317, 2012.
[29]  I. Bongartz, A. R. Conn, N. Gould, and P. L. Toint, “CUTE: constrained and unconstrained testing environment,” ACM Transactions on Mathematical Software, vol. 21, no. 1, pp. 123–160, 1995.
[30]  N. I. M. Gould, D. Orban, and P. L. Toint, “CUTEr and SifDec: a constrained and unconstrained testing environment, revisited,” ACM Transactions on Mathematical Software, vol. 29, no. 4, pp. 373–394, 2003.
[31]  E. D. Dolan and J. J. Moré, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002.
