Abstract
Hybrid conjugate gradient methods form an efficient family of conjugate gradient methods for solving unconstrained optimization problems. In this work, based on the memoryless BFGS update, a convex hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient parameters is presented. To safeguard the sufficient descent property, the search direction is projected onto the subspace orthogonal to the gradient of the objective function. The convergence of the proposed method is analyzed under standard assumptions for general functions. The practical merits of the proposed method are demonstrated computationally on a set of CUTEr test functions as well as on the well-known nonnegative matrix factorization problem.
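As a brief orientation, the two ingredients named in the abstract can be sketched in generic notation (the hybridization weight written \(\theta_k\) below is a placeholder; the paper derives its specific choice from the memoryless BFGS update). With iterates \(x_{k+1} = x_k + \alpha_k d_k\), gradient \(g_k = \nabla f(x_k)\), and \(y_{k-1} = g_k - g_{k-1}\), the classical Hestenes–Stiefel and Dai–Yuan parameters are
\[
\beta_k^{\mathrm{HS}} = \frac{g_k^\top y_{k-1}}{d_{k-1}^\top y_{k-1}}, \qquad
\beta_k^{\mathrm{DY}} = \frac{\|g_k\|^2}{d_{k-1}^\top y_{k-1}},
\]
and a convex hybridization takes
\[
\beta_k = \theta_k\, \beta_k^{\mathrm{HS}} + (1-\theta_k)\, \beta_k^{\mathrm{DY}}, \qquad \theta_k \in [0,1].
\]
Projecting the \(d_{k-1}\) term onto the orthogonal complement of \(g_k\) gives the direction
\[
d_k = -g_k + \beta_k \Bigl(I - \frac{g_k g_k^\top}{\|g_k\|^2}\Bigr) d_{k-1},
\qquad\text{so that}\qquad
g_k^\top d_k = -\|g_k\|^2,
\]
which enforces the sufficient descent condition \(g_k^\top d_k \le -c\,\|g_k\|^2\) with \(c = 1\), independently of the line search and of the value of \(\beta_k\).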

Data availability
The authors confirm that the data supporting the findings of this study are available within the manuscript.
References
Amini, K., Faramarzi, P., Pirfalah, N.: A modified Hestenes-Stiefel conjugate gradient method with an optimal property. Optim. Methods Softw. 34(4), 770–782 (2019)
Aminifard, Z., Babaie-Kafaki, S.: A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions. Calcolo 56(2), 1–11 (2019)
Aminifard, Z., Babaie-Kafaki, S.: Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing. Numer. Algorithms 89(3), 1369–1387 (2022)
Aminifard, Z., Babaie-Kafaki, S., Dargahi, F.: Nonmonotone Quasi-Newton-based conjugate gradient methods with application to signal processing. Numer. Algorithms 93(4), 1527–1541 (2023)
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)
Babaie-Kafaki, S.: A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods. J. Optim. Theory Appl. 154, 916–932 (2012)
Babaie-Kafaki, S., Dargahi, F., Aminifard, Z.: On solving a revised model of the nonnegative matrix factorization problem by the modified adaptive versions of the Dai-Liao method. Numer. Algorithms, 1–15 (2024)
Babaie-Kafaki, S., Ghanbari, R.: Two hybrid nonlinear conjugate gradient methods based on a modified secant equation. Optimization 63(7), 1027–1042 (2014)
Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30(4), 673–681 (2015)
Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods. Numer. Algorithms 68, 481–495 (2015)
Babaie-Kafaki, S., Mahdavi-Amiri, N.: Two modified hybrid conjugate gradient methods based on a hybrid secant equation. Math. Model. Anal. 18(1), 32–52 (2013)
Cao, J., Wu, J.: A conjugate gradient algorithm and its applications in image restoration. Appl. Numer. Math. 152, 243–252 (2020)
Cheng, W.: A two-term PRP-based descent method. Numer. Funct. Anal. Optim. 28(11–12), 1217–1230 (2007)
Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Exl, L., Fischbacher, J., Kovacs, A., Oezelt, H., Gusenbauer, M., Schrefl, T.: Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization. Comput. Phys. Comm. 235, 179–186 (2019)
Fletcher, R.: Practical methods of optimization. John Wiley and Sons, New York (2000)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: A constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hafshejani, S.F., Gaur, D., Hossain, S., Benkoczi, R.: Barzilai and Borwein conjugate gradient method equipped with a non-monotone line search technique and its application on non-negative matrix factorization. arXiv:2109.05685 (2021)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Hu, W., Wu, J., Yuan, G.: Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration. Appl. Numer. Math. 158, 360–376 (2020)
Jian, J., Chen, Q., Jiang, X., Zeng, Y., Yin, J.: A new spectral conjugate gradient method for large-scale unconstrained optimization. Optim. Methods Softw. 32(3), 503–515 (2017)
Khoshsimaye-Bargard, M., Ashrafi, A.: A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update. Comput. Appl. Math. 40(8), 1–17 (2021)
Khoshsimaye-Bargard, M., Ashrafi, A.: A descent spectral Dai-Liao method based on the quasi–Newton aspects. Numer. Algorithms 94(1), 397–411 (2023)
Lee, D.D., Seung, H.S.: Learning the parts of objects by non-negative matrix factorization. Nature 401(6755), 788–791 (1999)
Li, L., Xie, X., Gao, T., Wang, J.: A modified conjugate gradient-based Elman neural network. Cogn. Syst. Res. 68, 62–72 (2021)
Li, X., Zhang, W., Dong, X.: A class of modified FR conjugate gradient method and applications to non-negative matrix factorization. Comput. Math. Appl. 73(2), 270–276 (2017)
Liu, H., Li, X.: Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization. Comput. Optim. Appl. 55(1), 173–196 (2013)
Liu, Y., Storey, C.: Efficient generalized conjugate gradient algorithms, part 1: theory. J. Optim. Theory Appl. 69, 129–137 (1991)
Narayanan, S., Kaelo, P.: A linear hybridization of Dai-Yuan and Hestenes-Stiefel conjugate gradient method for unconstrained optimization. Numer. Math. Theor. Meth. Appl. 14, 527–539 (2021)
Nataj, S., Lui, S.H.: Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point. Appl. Math. Comput. 369, 124829 (2020)
Nocedal, J., Wright, S.J.: Numerical optimization. Springer, New York (2006)
Paatero, P., Tapper, U.: Positive matrix factorization: a non-negative factor model with optimal utilization of error estimates of data values. Environmetrics 5(2), 111–126 (1994)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Fr. Inform. Rech. Oper. 3(16), 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extreme problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12(1), 241–254 (1977)
Salihu, N., Babando, H.A., Arzuka, I., Salihu, S.: A hybrid conjugate gradient method for unconstrained optimization with application. Bangmod. Int. J. Math. Comput. Sci. 9, 24–44 (2023)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, M., Liu, J., Wang, Y.: Two improved conjugate gradient methods with application in compressive sensing and motion control. Math. Probl. Eng. 2020, 1–11 (2020)
Sun, W., Yuan, Y.X.: Optimization theory and methods: nonlinear programming. Springer, New York (2006)
Tits, A.L., Yang, Y.: Globally convergent algorithms for robust pole assignment by state feedback. IEEE Trans. Autom. Control. 41(10), 1432–1452 (1996)
Wang, Y., Jia, Y., Hu, C., Turk, M.: Non-negative matrix factorization framework for face recognition. Intern. J. Pattern Recognit. Artif. Intell. 19(04), 495–511 (2005)
Xue, W., Wan, P., Li, Q., Zhong, P., Yu, G., Tao, T.: An online conjugate gradient algorithm for large-scale data analysis in machine learning. AIMS Math. 6(2), 1515–1537 (2021)
Yahaya, J., Arzuka, I., Isyaku, M.: Descent modified conjugate gradient methods for vector optimization problems. Bangmod. Int. J. Math. Comput. Sci. 9, 72–91 (2023)
Yang, Y.: A robust BFGS algorithm for unconstrained nonlinear optimization problems. Optimization 73(3), 851–873 (2024)
Yang, X., Luo, Z., Dai, X.: A global convergence of LS-CD hybrid conjugate gradient method. Adv. Numer. Anal. 2013, 517452 (2013)
Yuan, G., Li, P., Lu, J.: The global convergence of the BFGS method with a modified WWP line search for nonconvex functions. Numer. Algorithms 91(1), 353–365 (2022)
Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, C.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
Zhang, L., Zhou, W.: Two descent hybrid conjugate gradient methods for optimization. J. Comput. Appl. Math. 216(1), 251–264 (2008)
Zhu, Z., Zhang, D., Wang, S.: Two modified DY conjugate gradient methods for unconstrained optimization problems. Appl. Math. Comput. 373, 125004 (2020)
Acknowledgements
This research was supported by the Research Council of Semnan University. The authors thank the anonymous reviewers for their valuable comments and suggestions that helped to improve the quality of this work. They are also grateful to Professor Michael Navon for providing the line search code.
Funding
This research was supported by the Postdoc grant no. 21272 from the Research Council of Semnan University.
Author information
Authors and Affiliations
Contributions
The authors confirm their contributions to the manuscript as follows:
\(\bullet \) study conception and design: A. Ashrafi;
\(\bullet \) convergence analysis: M. Khoshsimaye-Bargard;
\(\bullet \) performing numerical tests and interpretation of results: M. Khoshsimaye-Bargard;
\(\bullet \) draft manuscript preparation: M. Khoshsimaye-Bargard, A. Ashrafi.
All authors reviewed the results and approved the final version of the manuscript.
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
Not applicable.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Khoshsimaye-Bargard, M., Ashrafi, A. A projected hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods with application to nonnegative matrix factorization. J. Appl. Math. Comput. 71, 551–571 (2025). https://doi.org/10.1007/s12190-024-02245-7
Keywords
- Nonlinear programming
- Hybrid conjugate gradient method
- Projecting strategy
- Sufficient descent property
- Global convergence
- Nonnegative matrix factorization