Exploring Enhanced Conjugate Gradient Methods: A Novel Family of Techniques for Efficient Unconstrained Minimization
Abstract
Because the conjugate gradient method (CGM) is computationally efficient and simple to implement, it is often used to solve large-scale unconstrained minimization problems. Numerous researchers have created new conjugate gradient (CG) update parameters by modifying the original set, also referred to as classical CGMs, and this has led to several hybrid approaches. The main goal of this work is to construct a new family of techniques from which further new methods can be derived. To this end, the Hestenes-Stiefel update parameter and a new family involving the Polak-Ribiere-Polyak and Liu-Storey CGMs are considered. By varying the parameters of this CGM family, a novel method that possesses the sufficient descent property is obtained. A numerical experiment on a collection of unconstrained minimization problems (UMPs) is carried out to assess the efficacy of the new method against existing approaches. The results reveal that the new CG method outperforms the current ones.
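For context, a nonlinear CGM generates iterates x_{k+1} = x_k + alpha_k d_k with search directions d_0 = -g_0 and d_{k+1} = -g_{k+1} + beta_k d_k, where the choice of the scalar beta_k distinguishes the Hestenes-Stiefel (HS), Polak-Ribiere-Polyak (PRP), and Liu-Storey (LS) methods, and the sufficient descent property requires g_k^T d_k <= -c ||g_k||^2 for some constant c > 0. The sketch below shows a generic CG loop with these classical beta_k choices; it is illustrative only and does not reproduce the new family proposed in this work, whose formula is not stated in the abstract. The routine nonlinear_cg, the non-negativity safeguard on beta_k, and the Rosenbrock test problem are assumptions made for the example, and the line search is SciPy's scipy.optimize.line_search.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, beta_rule="HS", tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with a classical HS, PRP, or LS update parameter.

    Illustrative sketch only; the hybrid family proposed in the paper is not
    reproduced here, since its formula is not given in the abstract.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search along d (falls back to a small fixed step if it fails)
        alpha = line_search(f, grad, x, d)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                         # gradient difference y_k = g_{k+1} - g_k
        if beta_rule == "HS":                 # Hestenes-Stiefel
            beta = g_new @ y / (d @ y)
        elif beta_rule == "PRP":              # Polak-Ribiere-Polyak
            beta = g_new @ y / (g @ g)
        else:                                 # Liu-Storey
            beta = g_new @ y / -(d @ g)
        beta = max(beta, 0.0)                 # common safeguard: restart when beta < 0
        d = -g_new + beta * d                 # new search direction
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function from the standard starting point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), beta_rule="PRP"))
```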
References
Hestenes, M. R., & Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49, 409-436. https://nvlpubs.nist.gov/nistpubs/jres/049/jresv49n6p409_a1b.pdf
Andrei, N. (2008b). 40 conjugate gradient algorithms for unconstrained optimization; a survey on their definition. ICI Technical Report, 13, 1-8. https://camo.ici.ro/neculai/p13a08.pdf
Dai, Y. H. (2011). Nonlinear conjugate gradient methods. Wiley Encyclopedia of Operations Research and Management Science. https://doi.org/10.1002/9780470400531.eorms0183
Fletcher, R., & Reeves, C. M. (1964). Function minimization by conjugate gradients. The Computer Journal, 7, 149-154. https://doi.org/10.1093/COMJNL/7.2.149
Polak, E., & Ribiere, G. (1969). Note sur la convergence de méthodes de directions conjuguées. ESAIM: Mathematical Modelling and Numerical Analysis, 3, 35-43. https://doi.org/10.1051/m2an/196903r100351
Polyak, B. T. (1969). The conjugate gradient method in extreme problems. Computational Mathematics and Mathematical Physics, 9, 94-112. https://doi.org/10.1016/0041-5553(69)90035-4
Fletcher, R. (1987). Practical methods of optimization (2nd ed.). New York: John Wiley & Sons.
Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications, 69(1), 129-137. https://doi.org/10.1007/BF00940464
Dai, Y. H., & Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10, 177-182. https://doi.org/10.1137/S1052623497318992
Nocedal, J. (1992). Theory of algorithms for unconstrained optimization. Acta Numerica, 1, 199-242.
Djordjevic, S. S. (2017). New hybrid conjugate gradient method as a convex combination of LS and CD methods. Filomat, 31(6), 1813-1825. https://doi.org/10.2298/FIL1706813D
Mohammed, N. S., Mustapha, M., Mohd, R., & Shazlyn, M. S. (2020). A new hybrid coefficient of conjugate gradient method. Indonesian Journal of Electrical Engineering and Computer Science, 18(3), 1454-1463. https://doi.org/10.11591/ijeecs.v18.i3.pp1454-1463
Akinduko, O. B. (2021). A new conjugate gradient method with sufficient descent property. Earthline Journal of Mathematical Sciences, 6(1), 163-174. https://doi.org/10.34198/EJMS.6121.163174
Stanimirović, P. S., Ivanov, B., Ma, H., & Mosić, D. (2020). A survey of gradient methods for solving nonlinear optimization. Electronic Research Archive, 28(4), 1573-1624. https://doi.org/10.3934/era.2020115
Dai, Y. H., & Yuan, Y. (1998). A class of globally convergent conjugate gradient methods. Research Report ICM-98-030, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences.
Nazareth, J. L. (1999). Conjugate gradient methods. In C. Floudas & P. Pardalos (Eds.), Encyclopedia of optimization. Boston: Kluwer Academic Publishers.
Dai, Y. H., & Yuan, Y. (2001). A three-parameter family of hybrid conjugate gradient methods. Mathematics of Computation, 70, 1155-1167. https://doi.org/10.1090/S0025-5718-00-01253-9
Djordjevic, S. S. (2018). New hybrid conjugate gradient method as a convex combination of HS and FR methods. Journal of Applied Mathematics and Computation, 2, 366-378. https://doi.org/10.26855/jamc.2018.09.002
Djordjevic, S. S. (2019). New hybrid conjugate gradient method as a convex combination of LS and FR methods. Acta Mathematica Scientia, 39(1), 214-228. https://doi.org/10.1007/s10473-019-0117-6
Salihu, N., Odekunle, M., Waziri, M., & Haliu, A. (2020). A new hybrid conjugate gradient method based on secant equation for solving large scale unconstrained optimization problems. Iranian Journal of Optimization, 12(1), 33-44.
Salihu, N., Odekunle, M. R., Saleh, A. M., & Salihu, S. (2021). A Dai-Liao hybrid Hestenes-Stiefel and Fletcher-Reeves methods for unconstrained optimization. International Journal of Industrial Optimization, 2(1), 33-50. https://doi.org/10.12928/IJIO.V2I1.3054
Sabi'u, J., Muangchoo, K., Shah, A., Abubakar, A. B., & Aremu, K. O. (2021). An inexact optimal hybrid conjugate gradient method for solving symmetric nonlinear equations. Symmetry, 13, 1829. https://doi.org/10.3390/sym13101829
Jardow, F. N., & Al-Naemi, G. M. (2020). A new hybrid conjugate gradient algorithm for unconstrained optimization with inexact line search. Indonesian Journal of Electrical Engineering and Computer Science, 20(2), 939-947. https://doi.org/10.11591/ijeecs.v20.i2
Mohammed, S. I., Bakar, N. A., Mamat, M., Hassan, B. A., Malik, M., & Ahmed, A. M. (2021). A new hybrid conjugate gradient algorithm for optimization models and its application to regression analysis. Indonesian Journal of Electrical Engineering and Computer Science, 23, 1100-1109. https://doi.org/10.11591/ijeecs.v23.i2.pp1100-1109
Min, S., Liu, J., & Wang, Y. (2020). Two improved conjugate gradient methods with application in compressive sensing and motion control. Mathematical Problems in Engineering, 2020, 1-11.
Jiang, X., & Jian, J. (2019). Improved Fletcher-Reeves and Dai-Yuan conjugate gradient methods with the strong Wolfe line search. Journal of Computational and Applied Mathematics, 348, 525-534. https://doi.org/10.1016/j.cam.2018.09.012
Rivaie, M., Mamat, M., Leong, W. J., & Ismail, M. (2012). A new class of nonlinear conjugate gradient coefficients with global convergence properties. Applied Mathematics and Computation, 218, 11323-11332. https://doi.org/10.1016/j.amc.2012.05.030
Mandara, A. V., Mamat, M., Waziri, M. Y., Mohammed, M. A., & Yakubu, U. A. (2018). A new conjugate gradient coefficient with exact line search for unconstrained optimization. Far East Journal of Mathematical Sciences (FJMS), 105(2), 193-206. https://doi.org/10.17654/ms105020193
Onuoha, O. B. (2023). A sufficient descent Dai-Liao type conjugate gradient update parameter. Earthline Journal of Mathematical Sciences, 13(2), 353-368. https://doi.org/10.34198/ejms.13223.353368
Onuoha, O. B. (2024). Global convergence properties of a Dai-Liao type CGM for unconstrained optimization. Recent Advances in Natural Sciences, 2, 30-39. https://doi.org/10.61298/rans.2024.2.1.30
Onuoha, O. B., Aborisade, Y. J., Egenti, G., & Ayinla, R. O. (2024). Integration of modified classical conjugate gradient methods for unconstrained optimization. Arid Zone Journal of Basic and Applied Research, 3(1), 26-41. https://doi.org/10.55639/607.474645
Roman, S. (2005). Advanced linear algebra (2nd ed.). Springer.
Chong, E. K., & Zak, S. H. (2001). An introduction to optimization (2nd ed.). New York: Wiley-Interscience.
Bongartz, I., Conn, A. R., Gould, N. I. M., & Toint, P. L. (1995). CUTE: Constrained and unconstrained testing environments. ACM Transactions on Mathematical Software, 21, 123-160. https://doi.org/10.1145/200979.201043
Andrei, N. (2008a). An unconstrained optimization test functions collection. Advanced Modeling and Optimization, 10(1), 147-161. https://camo.ici.ro/journal/vol10/v10a10.pdf
Dolan, E. D., & Moré, J. J. (2002). Benchmarking optimization software with performance profiles. Mathematical Programming, 91, 201-213. https://doi.org/10.1007/s101070100623