A sufficient descent modified nonlinear conjugate gradient method for solving large scale unconstrained optimization problems

Authors

  • Moyi A. U.
  • Abdullahi N.
  • Aliyu N.

DOI:

https://doi.org/10.33003/jobasr-2024-v2i3-61

Keywords:

Unconstrained optimization, Conjugate gradient method, Line search, Global convergence

Abstract

Nonlinear conjugate gradient (CG) methods are prominent iterative techniques widely used for solving large-scale unconstrained optimization problems. Their wide application across many fields is due to their simplicity, low memory requirements, low computational cost and global convergence properties. However, some CG methods do not possess the sufficient descent condition, global convergence properties or good numerical performance. To overcome these drawbacks, numerous studies and modifications have been conducted to improve these methods. In this research, a modified conjugate gradient parameter that possesses the sufficient descent condition and global convergence properties is presented. The global convergence result is established under the strong Wolfe-Powell (SWP) line search conditions. Extensive numerical experiments were conducted on a set of standard unconstrained optimization test functions. The results show that the method outperforms some well-known methods in terms of efficiency and robustness.
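The general scheme the abstract describes can be sketched in a few lines. This is a minimal illustration only, with assumptions the abstract does not confirm: the authors' modified CG parameter is not given here, so the well-known PRP+ update stands in for it, and a simple Armijo backtracking line search stands in for the strong Wolfe-Powell (SWP) conditions used in the paper's convergence analysis.

```python
# Generic nonlinear CG sketch. Assumptions (not from the paper):
# PRP+ beta in place of the authors' modified parameter; Armijo
# backtracking in place of the strong Wolfe-Powell line search.

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                       # start with steepest descent
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        slope = sum(gi * di for gi, di in zip(g, d))
        if slope >= 0:                          # safeguard: restart if d is not a descent direction
            d = [-gi for gi in g]
            slope = -sum(gi * gi for gi in g)
        alpha, fx = 1.0, f(x)
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        beta = max(0.0, sum(gn * yi for gn, yi in zip(g_new, y))
                   / sum(gi * gi for gi in g))  # PRP+ parameter
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Strictly convex quadratic test: f(x) = 0.5 x'Ax - b'x, A = [[3,1],[1,2]], b = (1,1)
f = lambda x: 0.5 * (3 * x[0] ** 2 + 2 * x[0] * x[1] + 2 * x[1] ** 2) - x[0] - x[1]
grad = lambda x: [3 * x[0] + x[1] - 1, x[0] + 2 * x[1] - 1]
x_star = nonlinear_cg(f, grad, [0.0, 0.0])      # exact minimizer is (0.2, 0.4)
```

The restart safeguard enforces the descent property that the abstract highlights: whenever the computed direction fails to be a descent direction, the method falls back to the negative gradient, which guarantees sufficient decrease under the Armijo test.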

Published

30.09.2024

Section

Articles

How to Cite

Moyi A. U., Abdullahi N., & Aliyu N. (2024). A sufficient descent modified nonlinear conjugate gradient method for solving large scale unconstrained optimization problems. JOURNAL OF BASICS AND APPLIED SCIENCES RESEARCH, 2(3), 36-44. https://doi.org/10.33003/jobasr-2024-v2i3-61