A projected hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods with application to nonnegative matrix factorization

  • Original Research
  • Published in: Journal of Applied Mathematics and Computing

Abstract

Hybrid conjugate gradient methods form an efficient family of conjugate gradient methods for solving unconstrained optimization problems. In this work, a convex hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate parameters is presented, based on the memoryless BFGS update. To safeguard the sufficient descent property, the search direction is projected onto the subspace orthogonal to the gradient of the objective function. Convergence of the proposed method is analyzed under standard assumptions for general functions. Its practical merits are demonstrated computationally on a set of CUTEr test functions as well as on the well-known nonnegative matrix factorization problem.
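The paper's derivation is not reproduced on this page, but the abstract names concrete ingredients that are standard in the literature: the Hestenes–Stiefel parameter \(\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}\), the Dai–Yuan parameter \(\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}\) with \(y_k = g_{k+1} - g_k\), a convex combination of the two, and a projection of the search direction onto the subspace orthogonal to the current gradient. The Python sketch below is a minimal illustration of those ingredients under stated assumptions: the weight `theta` is taken as a plain input (the paper derives its hybridization parameter from the memoryless BFGS update), the steepest-descent fallback is a generic safeguard, and the projection is applied to the previous-direction term, a standard device in two-term descent conjugate gradient methods. None of this should be read as the authors' exact construction.

```python
import numpy as np

def hybrid_hs_dy_direction(g_new, g_old, d_old, theta=0.5, eps=1e-12):
    """Illustrative sketch of a projected convex HS/DY hybrid direction.

    g_new, g_old : gradients at the current and previous iterates
    d_old        : previous search direction
    theta        : hybridization weight in [0, 1] (an assumption here;
                   the paper derives it from the memoryless BFGS update)
    """
    y = g_new - g_old                      # gradient difference y_k
    denom = d_old @ y                      # shared HS/DY denominator d_k^T y_k
    if abs(denom) < eps:                   # generic safeguard: fall back to
        return -g_new                      # steepest descent near breakdown
    beta_hs = (g_new @ y) / denom          # Hestenes-Stiefel parameter
    beta_dy = (g_new @ g_new) / denom      # Dai-Yuan parameter
    beta = theta * beta_hs + (1.0 - theta) * beta_dy   # convex hybridization
    # Project the previous direction onto the subspace orthogonal to the
    # current gradient; then g_new @ d = -||g_new||^2 holds exactly, so the
    # sufficient descent property is enforced by construction.
    d_proj = d_old - ((g_new @ d_old) / (g_new @ g_new)) * g_new
    return -g_new + beta * d_proj

# Hypothetical check on f(x) = 0.5 ||x||^2, whose gradient is g(x) = x:
x_old, x_new = np.array([1.0, 2.0]), np.array([0.8, 1.5])
d = hybrid_hs_dy_direction(g_new=x_new, g_old=x_old, d_old=-x_old)
assert np.isclose(x_new @ d, -(x_new @ x_new))   # sufficient descent holds
```

Note the design point the projection buys: whatever value the hybrid parameter takes, the curvature-independent identity \(g_{k+1}^{\top} d_{k+1} = -\|g_{k+1}\|^2\) holds, which is the descent safeguard the abstract refers to.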


Data availability

The authors confirm that the data supporting the findings of this study are available within the manuscript.


Acknowledgements

This research was supported by the Research Council of Semnan University. The authors thank the anonymous reviewers for their valuable comments and suggestions that helped to improve the quality of this work. They are also grateful to Professor Michael Navon for providing the line search code.

Funding

This research was supported by the Postdoc grant no. 21272 from the Research Council of Semnan University.

Author information


Contributions

The authors confirm their contributions to the manuscript as follows:

  • Study conception and design: A. Ashrafi;
  • Convergence analysis: M. Khoshsimaye-Bargard;
  • Performing numerical tests and interpretation of results: M. Khoshsimaye-Bargard;
  • Draft manuscript preparation: M. Khoshsimaye-Bargard, A. Ashrafi.

All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Ali Ashrafi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Khoshsimaye-Bargard, M., Ashrafi, A. A projected hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods with application to nonnegative matrix factorization. J. Appl. Math. Comput. 71, 551–571 (2025). https://doi.org/10.1007/s12190-024-02245-7

