A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization
DOI: https://doi.org/10.15587/1729-4061.2022.254017
Keywords: conjugate gradient method, descent direction, global property, strong Wolfe-Powell line search, unconstrained optimization
Abstract
Optimization is now considered a branch of computational science. This discipline seeks to answer the question "what is best?" by studying problems in which the quality of any answer can be expressed numerically. One of the most well-known methods for solving nonlinear, unconstrained optimization problems is the conjugate gradient (CG) method. The Hestenes-Stiefel (HS-CG) formula is one of the oldest and most effective of these formulas. With an exact line search, the HS method achieves global convergence; however, this is not guaranteed under an inexact line search (ILS). Furthermore, the HS method does not always satisfy the descent property. The goal of this work is to create a new (modified) formula by reformulating the classic HS-CG parameter and adding a new term to the classic HS-CG formula. It is critical that the proposed method generate a search direction with the sufficient descent property (SDP) under the strong Wolfe-Powell line search (sWPLS) at every iteration, and that the global convergence property (GCP) for general non-convex functions be guaranteed. The modified HS-CG (mHS-CG) method has the SDP regardless of the line search type and guarantees the GCP under the inexact sWPLS. When using the sWPLS, the modified formula has the additional advantage of keeping the modified conjugate parameter non-negative. This paper is significant in that it quantifies how much the new modification improves on the performance of the standard HS method. Numerical experiments comparing the mHS-CG method under the sWPLS with the standard HS method on unconstrained optimization problems show that the CG method with the mHS-CG conjugate parameter is more robust and effective than the CG method without it.
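The abstract does not state the modified mHS-CG parameter itself, so the following is only a minimal sketch of the framework it builds on: the classical Hestenes-Stiefel CG iteration with a simple bisection-style strong Wolfe-Powell line search, including the non-negativity truncation of the conjugate parameter that the abstract highlights. The test problem and all tolerances are illustrative assumptions.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Bisection-style line search enforcing the strong Wolfe-Powell conditions:
    f(x+a*d) <= f(x) + c1*a*g'd   and   |grad(x+a*d)'d| <= -c2*grad(x)'d."""
    phi0, dphi0 = f(x), dot(grad(x), d)
    lo, hi, alpha = 0.0, None, 1.0
    for _ in range(max_iter):
        xa = [xi + alpha * di for xi, di in zip(x, d)]
        phi, dphi = f(xa), dot(grad(xa), d)
        if phi > phi0 + c1 * alpha * dphi0:
            hi = alpha                          # Armijo fails: step too long
        elif dphi < c2 * dphi0:
            lo = alpha                          # slope still too negative: too short
        elif dphi > -c2 * dphi0:
            hi = alpha                          # slope too positive: overshoot
        else:
            return alpha                        # both strong Wolfe conditions hold
        alpha = 2 * alpha if hi is None else 0.5 * (lo + hi)
    return alpha

def hs_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Classical HS-CG with beta truncated at zero (a standard safeguard,
    not the paper's modified formula)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                       # initial steepest-descent direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [a - b for a, b in zip(g_new, g)]   # y_k = g_{k+1} - g_k
        dy = dot(d, y)
        # HS parameter beta = g_{k+1}'y_k / d_k'y_k, kept non-negative
        beta = max(dot(g_new, y) / dy, 0.0) if dy != 0 else 0.0
        d = [-a + beta * b for a, b in zip(g_new, d)]
        if dot(g_new, d) >= 0:                  # restart if direction is not descent
            d = [-a for a in g_new]
        g = g_new
    return x

# Illustrative ill-scaled quadratic: f(x) = x1^2 + 10*x2^2, minimum at the origin
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
x_star = hs_cg(f, grad, [3.0, -2.0])
```

The strong curvature condition `|grad(x+a*d)'d| <= -c2*grad(x)'d` is what distinguishes the strong Wolfe-Powell search from the weak one: it forbids steps whose directional derivative is large in either sign, which is exactly the regime in which the classical HS denominator can misbehave.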
Supporting Agency
- The author would like to express gratitude to the University of Mosul's College of Computer Sciences and Mathematics for their encouragement and support.
References
- Hestenes, M. R., Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49 (6), 409–436. doi: https://doi.org/10.6028/jres.049.044
- Fletcher, R. (1964). Function minimization by conjugate gradients. The Computer Journal, 7 (2), 149–154. doi: https://doi.org/10.1093/comjnl/7.2.149
- Dai, Y.-H. (2001). New Conjugacy Conditions and Related Nonlinear Conjugate Gradient Methods. Applied Mathematics and Optimization, 43 (1), 87–101. doi: https://doi.org/10.1007/s002450010019
- Al-Naemi, G. M. (2014). A Modified Hestenes-Stiefel Conjugate Gradient Method and its Global convergence for unconstrained optimization. Iraqi Journal of Science, 55 (1), 202–217. Available at: https://iasj.net/iasj/download/9be4a3f4393e9e31
- Li, Y., Du, S. (2019). Modified HS conjugate gradient method for solving generalized absolute value equations. Journal of Inequalities and Applications, 2019 (1). doi: https://doi.org/10.1186/s13660-019-2018-6
- Wang, G., Shan, R., Huang, W., Liu, W., Zhao, J. (2017). Two new spectral conjugate gradient algorithms based on Hestenes–Stiefel. Journal of Algorithms & Computational Technology, 11 (4), 345–352. doi: https://doi.org/10.1177/1748301817725296
- Salleh, Z., Alhawarat, A. (2016). An efficient modification of the Hestenes-Stiefel nonlinear conjugate gradient method with restart property. Journal of Inequalities and Applications, 2016 (1). doi: https://doi.org/10.1186/s13660-016-1049-5
- Japri, N. A., Basri, S., Mamat, M. (2021). New modification of the Hestenes-Stiefel with strong Wolfe line search. AIP Conference Proceedings. doi: https://doi.org/10.1063/5.0053211
- Wu, X., Zhu, Y., Yin, J. (2021). A HS-PRP-Type Hybrid Conjugate Gradient Method with Sufficient Descent Property. Computational Intelligence and Neuroscience, 2021, 1–8. doi: https://doi.org/10.1155/2021/2087438
- Zoutendijk, G. (1970). Nonlinear programming, computational methods. In: Integer and Nonlinear Programming, 37–86.
- Malik, M., Mamat, S., Abas, S., Sulaiman, I. M., Sukono (2020). A new coefficient of the conjugate gradient method with the sufficient descent condition and global convergence properties. Engineering Letters, 28 (3), 704–714.
- Hager, W., Zhang, H. (2006). A survey of non-linear conjugate gradient methods. Pacific Journal of Optimization, 2, 35–58.
- Gilbert, J. C., Nocedal, J. (1992). Global Convergence Properties of Conjugate Gradient Methods for Optimization. SIAM Journal on Optimization, 2 (1), 21–42. doi: https://doi.org/10.1137/0802003
- Bongartz, I., Conn, A. R., Gould, N., Toint, P. L. (1995). CUTE. ACM Transactions on Mathematical Software, 21 (1), 123–160. doi: https://doi.org/10.1145/200979.201043
- Andrei, N. (2013). Test functions for unconstrained optimization. ICI Technical Report, 3–5.
- Powell, M. J. D. (1977). Restart procedures for the conjugate gradient method. Mathematical Programming, 12 (1), 241–254. doi: https://doi.org/10.1007/bf01593790
- Al-Namat, F. N., Al-Naemi, G. M. (2020). Global Convergence Property with Inexact Line Search for a New Hybrid Conjugate Gradient Method. OALib, 07 (02), 1–14. doi: https://doi.org/10.4236/oalib.1106048
- Jardow, F. N., Al-Naemi, G. M. (2021). A new hybrid conjugate gradient algorithm as a convex combination of MMWU and RMIL nonlinear problems. Journal of Interdisciplinary Mathematics, 24 (3), 637–655. doi: https://doi.org/10.1080/09720502.2020.1815346
License
Copyright (c) 2022 Ghada Moayid Al-Naemi
This work is licensed under a Creative Commons Attribution 4.0 International License.