A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization

Authors

Ghada Moayid Al-Naemi, University of Mosul

DOI:

https://doi.org/10.15587/1729-4061.2022.254017

Keywords:

conjugate gradient method, descent direction, global property, strong Wolfe-Powell line search, unconstrained optimization

Abstract

Optimization is now considered a branch of computational science. This field seeks to answer the question "what is best?" for problems in which the quality of any answer can be expressed numerically. One of the most well-known methods for solving nonlinear, unconstrained optimization problems is the conjugate gradient (CG) method. The Hestenes-Stiefel (HS-CG) formula is one of the oldest and most effective CG formulas. With an exact line search, the HS method achieves global convergence; however, this is not guaranteed with an inexact line search (ILS). Furthermore, the HS method does not always satisfy the descent property. The goal of this work is to create a new (modified) formula by reformulating the classical HS-CG parameter and adding a new term to the classical HS-CG formula. It is essential that the proposed method generates a sufficient descent property (SDP) search direction under the strong Wolfe-Powell line search (sWPLS) at every iteration, and that the global convergence property (GCP) can be guaranteed for general non-convex functions. With the inexact sWPLS, the modified HS-CG (mHS-CG) method possesses the SDP regardless of the line search type and guarantees the GCP. When the sWPLS is used, the modified formula has the further advantage of keeping the modified scalar non-negative. This paper is significant in that it quantifies how much the new modification improves on the performance of the standard HS method. Numerical experiments comparing the mHS-CG method under the sWPLS with the standard HS method show that the CG method with the mHS-CG conjugate parameter is more robust and effective than the CG method without it.
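The abstract does not reproduce the modified formula itself, but the overall scheme it refers to, a nonlinear CG iteration whose search direction is built from a Hestenes-Stiefel-type scalar and whose step size satisfies the strong Wolfe-Powell conditions, can be sketched as below. This is a minimal illustration in Python, assuming the classical HS-CG parameter, SciPy's strong-Wolfe line search, and the Rosenbrock function as a test problem; the helper `hs_beta` only marks the point where the paper's modified (mHS-CG) scalar, whose exact form is not given in the abstract, would be substituted, and the non-negativity truncation is a stand-in for the property the modification is stated to guarantee.

```python
# Sketch of a nonlinear CG loop with the classical Hestenes-Stiefel (HS-CG)
# parameter and a strong Wolfe-Powell line search. The paper's modified
# mHS-CG scalar would replace hs_beta(); its exact form is not reproduced
# in the abstract, so only the classical formula is shown here.
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions


def hs_beta(g_new, g_old, d_old):
    """Classical HS-CG parameter: beta = g_{k+1}^T y_k / (d_k^T y_k)."""
    y = g_new - g_old
    denom = d_old @ y
    return 0.0 if abs(denom) < 1e-16 else (g_new @ y) / denom


def cg_hs(f, grad, x0, tol=1e-6, max_iter=500, c1=1e-4, c2=0.1):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe-Powell step: sufficient decrease plus |g(x+a*d)^T d| <= c2*|g^T d|.
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                     # line search failed; restart with a small step
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Non-negative scalar: a simple truncation stands in for the mHS-CG guarantee.
        beta = max(hs_beta(g_new, g, d), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Assumed test problem: the two-dimensional Rosenbrock function.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(cg_hs(f, grad, np.array([-1.2, 1.0])))
```

Here `c2 = 0.1` follows the common choice for CG methods under the strong Wolfe conditions; the parameter values used in the paper's experiments are not stated in the abstract.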

Supporting Agency

  • The author would like to express gratitude to the University of Mosul's College of Computer Sciences and Mathematics for their encouragement and support.

Author Biography

Ghada Moayid Al-Naemi, University of Mosul

Doctor of Mathematics, Assistant Professor

Department of Mathematics

References

  1. Hestenes, M. R., Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49 (6), 409. doi: https://doi.org/10.6028/jres.049.044
  2. Fletcher, R. (1964). Function minimization by conjugate gradients. The Computer Journal, 7 (2), 149–154. doi: https://doi.org/10.1093/comjnl/7.2.149
  3. Dai, Y.-H. (2001). New Conjugacy Conditions and Related Nonlinear Conjugate Gradient Methods. Applied Mathematics and Optimization, 43 (1), 87–101. doi: https://doi.org/10.1007/s002450010019
  4. Al-Naemi, G. M. (2014). A Modified Hestenes-Stiefel Conjugate Gradient Method and its Global convergence for unconstrained optimization. Iraqi Journal of Science, 55 (1), 202–217. Available at: https://iasj.net/iasj/download/9be4a3f4393e9e31
  5. Li, Y., Du, S. (2019). Modified HS conjugate gradient method for solving generalized absolute value equations. Journal of Inequalities and Applications, 2019 (1). doi: https://doi.org/10.1186/s13660-019-2018-6
  6. Wang, G., Shan, R., Huang, W., Liu, W., Zhao, J. (2017). Two new spectral conjugate gradient algorithms based on Hestenes–Stiefel. Journal of Algorithms & Computational Technology, 11 (4), 345–352. doi: https://doi.org/10.1177/1748301817725296
  7. Salleh, Z., Alhawarat, A. (2016). An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property. Journal of Inequalities and Applications, 2016 (1). doi: https://doi.org/10.1186/s13660-016-1049-5
  8. Japri, N. A., Basri, S., Mamat, M. (2021). New modification of the Hestenes-Stiefel with strong Wolfe line search. AIP Conference Proceedings. doi: https://doi.org/10.1063/5.0053211
  9. Wu, X., Zhu, Y., Yin, J. (2021). A HS-PRP-Type Hybrid Conjugate Gradient Method with Sufficient Descent Property. Computational Intelligence and Neuroscience, 2021, 1–8. doi: https://doi.org/10.1155/2021/2087438
  10. Zoutendijk, G. (1970). Nonlinear programming, computational methods. Integer and Nonlinear Programming, 143, 37–86.
  11. Malik, M., Mamat, S., Abas, S., Sulaiman, I. M., Sukono (2020). A new coefficient of the conjugate gradient method with the sufficient descent condition and global convergence properties. Engineering Letters, 28 (3), 704–714.
  12. Hager, W., Zhang, H. (2006). A survey of non-linear conjugate gradient methods. Pacific Journal of Optimization, 2, 35–58.
  13. Gilbert, J. C., Nocedal, J. (1992). Global Convergence Properties of Conjugate Gradient Methods for Optimization. SIAM Journal on Optimization, 2 (1), 21–42. doi: https://doi.org/10.1137/0802003
  14. Bongartz, I., Conn, A. R., Gould, N., Toint, P. L. (1995). CUTE. ACM Transactions on Mathematical Software, 21 (1), 123–160. doi: https://doi.org/10.1145/200979.201043
  15. Andrei, N. (2013). Test functions for unconstrained optimization. ICI Technical Report, 3–5.
  16. Powell, M. J. D. (1977). Restart procedures for the conjugate gradient method. Mathematical Programming, 12 (1), 241–254. doi: https://doi.org/10.1007/bf01593790
  17. Al-Namat, F. N., Al-Naemi, G. M. (2020). Global Convergence Property with Inexact Line Search for a New Hybrid Conjugate Gradient Method. OALib, 07 (02), 1–14. doi: https://doi.org/10.4236/oalib.1106048
  18. Jardow, F. N., Al-Naemi, G. M. (2021). A new hybrid conjugate gradient algorithm as a convex combination of MMWU and RMIL nonlinear problems. Journal of Interdisciplinary Mathematics, 24 (3), 637–655. doi: https://doi.org/10.1080/09720502.2020.1815346

Published

2022-04-28

How to Cite

Al-Naemi, G. M. (2022). A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization. Eastern-European Journal of Enterprise Technologies, 2 (4 (116)), 14–21. https://doi.org/10.15587/1729-4061.2022.254017

Issue

Vol. 2 No. 4 (116) (2022)

Section

Mathematics and Cybernetics - applied aspects