A Sufficient Descent Modified Nonlinear Conjugate Gradient Method for Solving Large Scale Unconstrained Optimization Problems
DOI: https://doi.org/10.33003/jobasr
Moyi, A. U.
Abdullahi, N.
Aliyu, N.
Abstract
Nonlinear conjugate gradient (CG) methods are prominent iterative techniques widely used for solving large-scale unconstrained optimization problems. Their wide application across many fields is due to their simplicity, low memory requirements, low computational cost, and global convergence properties. However, some CG methods do not possess the sufficient descent condition, global convergence properties, or good numerical performance. To overcome these drawbacks, numerous studies and modifications have been conducted to improve these methods. In this research, a modified conjugate gradient parameter that possesses the sufficient descent condition and global convergence properties is presented. The global convergence result is established under the strong Wolfe-Powell (SWP) line search condition. Extensive numerical experiments were conducted on a set of standard unconstrained optimization test functions. The results show that the method outperforms some well-known methods in terms of efficiency and robustness.
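To illustrate the general scheme the abstract refers to, the following is a minimal sketch of a nonlinear CG iteration using the classical Fletcher-Reeves parameter, not the modified parameter proposed in this paper (whose formula is not given in the abstract). For simplicity it minimizes the quadratic f(x) = ½xᵀAx − bᵀx, for which the exact line-search step along a direction d is available in closed form; a general implementation would replace this with a strong Wolfe-Powell line search.

```python
import numpy as np

def fletcher_reeves_cg(A, b, x0, tol=1e-10, max_iter=100):
    """Generic nonlinear CG sketch (Fletcher-Reeves beta) applied to the
    quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is g(x) = A x - b.
    This is an illustrative stand-in, NOT the modified parameter of the paper."""
    x = x0.astype(float)
    g = A @ x - b
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)         # exact step for the quadratic model
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves CG parameter
        d = -g_new + beta * d               # conjugate direction update
        g = g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])      # symmetric positive definite
b = np.array([1.0, 2.0])
x = fletcher_reeves_cg(A, b, np.zeros(2))   # converges in at most n = 2 steps
```

On a strictly convex quadratic with exact line search, CG terminates in at most n iterations; the descent property d_kᵀg_k < 0 holds at every step, which is the condition the paper's modification is designed to guarantee under inexact (SWP) line searches as well.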