Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization

  • Original Paper
  • Published in Numerical Algorithms

Abstract

In this paper, we present two families of modified three-term conjugate gradient methods for solving unconstrained large-scale smooth optimization problems. We show that both families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positivity of \({y_{k}^{T}} s_{k}\). For uniformly convex functions, we prove that our families are globally convergent under the weak Wolfe-Powell line search and standard assumptions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Numerical experiments on 260 standard test problems, comparing against seven recently developed conjugate gradient methods, illustrate that the members of our families are numerically efficient and effective.
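The paper's specific families are not reproduced here, but the general shape of a three-term conjugate gradient iteration under a weak Wolfe line search can be sketched as follows. This is a minimal illustration, not the authors' method: the direction update uses the well-known Zhang-Zhou-Li three-term PRP form, which satisfies the sufficient descent condition \({d_{k}^{T}} g_{k} = -\|g_{k}\|^{2}\) by construction, and the bisection-style line search enforces the weak Wolfe conditions, which guarantee \({y_{k}^{T}} s_{k} > 0\) on smooth functions. The function names `weak_wolfe` and `three_term_cg` are illustrative.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection line search for the weak Wolfe conditions.

    A successful step t satisfies Armijo decrease and the curvature
    condition, which together imply y^T s > 0 for smooth f."""
    lo, hi, t = 0.0, np.inf, 1.0
    fx, gd = f(x), grad(x) @ d          # gd < 0 for a descent direction
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * gd:     # Armijo fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * gd:     # curvature fails: grow
            lo = t
            t = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic three-term CG iteration (Zhang-Zhou-Li type direction).

    The direction d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k
    satisfies d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 identically."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        t = weak_wolfe(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g                        # y_k = g_{k+1} - g_k
        beta = g_new @ y / (g @ g)           # PRP parameter
        theta = g_new @ d / (g @ g)
        d = -g_new + beta * d - theta * y    # three-term direction
        x, g = x_new, g_new
    return x
```

For example, on a convex quadratic `f(x) = 0.5 * x @ A @ x` with `A = np.diag([1.0, 10.0])`, starting from `[2.0, -1.0]`, the iteration drives the gradient norm below the tolerance and returns a point near the minimizer at the origin.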



Acknowledgments

The authors would like to thank the anonymous reviewers for their useful suggestions and valuable comments which were greatly helpful to improve the quality of this paper.

Author information

Corresponding author

Correspondence to M. R. Eslahchi.


Appendix

In this Appendix, we present in Table 7 all the numerical results of the first set of experiments of Section 4. The numbers in the first and second columns give the problem number (see Table 3) and its dimension. In the remaining columns, the symbol − indicates that the method failed to solve the problem.

Table 7 The numerical results of five modified three-term CG algorithms


Cite this article

Bojari, S., Eslahchi, M.R. Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization. Numer Algor 83, 901–933 (2020). https://doi.org/10.1007/s11075-019-00709-7

