Metaheuristic optimization algorithm based on the two-step Adams-Bashforth method in training multi-layer perceptrons

Authors

Hisham M. Khudhur, Kais I. Ibraheem
DOI:

https://doi.org/10.15587/1729-4061.2022.254023

Keywords:

algorithm, Adams-Bashforth method, approximation, classification, global, metaheuristic, multilayer, perceptron, training, optimization

Abstract

The metaheuristic optimization algorithm based on the two-step Adams-Bashforth scheme (MOABT) proposed here is used for the first time for multilayer perceptron (MLP) training. In computer science and mathematical optimization, a metaheuristic is a high-level procedure or set of guidelines designed to find, generate, or select a search method that yields high-quality solutions to an optimization problem, especially when the available information is insufficient or incomplete, or when computational capacity is limited. Many metaheuristic methods involve stochastic operations, so the solution obtained depends on the random variables generated during the search. Because a metaheuristic explores a broad range of feasible solutions simultaneously, it can often find good solutions with less computational effort than iterative methods and exact algorithms, which makes it a useful approach to optimization problems. Several characteristics distinguish metaheuristic strategies in the search process, and the common goal is to explore the search space efficiently in order to find an optimal or near-optimal solution. The techniques that make up metaheuristic algorithms range from simple local searches to complex learning processes. Eight benchmark data sets are used to evaluate the proposed approach: five classification data sets and three function-approximation data sets. The numerical results were compared with those of the well-known evolutionary trainer Grey Wolf Optimizer (GWO). The statistical study revealed that the MOABT algorithm can outperform other algorithms in avoiding local optima and in the speed of convergence to the global optimum. The results also show that the considered problems can be classified and approximated with high accuracy.
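
This page gives only the high-level idea, but the classical two-step Adams-Bashforth scheme underlying MOABT advances a state x by x_{n+2} = x_{n+1} + h*(3*f_{n+1} - f_n)/2, combining the derivative f at the two most recent steps. The following Python sketch is a hypothetical illustration, not the authors' exact MOABT: it applies this two-step update to a population of candidate MLP weight vectors, with the pull toward the best-known solution (plus exploration noise) standing in for f. The 2-3-1 network, the XOR-style data, and names such as ab2_step and mlp_mse are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def mlp_mse(w, X, y):
        # Fitness: mean squared error of a 2-3-1 MLP whose weights are packed in w
        W1 = w[:6].reshape(2, 3); b1 = w[6:9]
        W2 = w[9:12].reshape(3, 1); b2 = w[12]
        hidden = np.tanh(X @ W1 + b1)
        out = (hidden @ W2).ravel() + b2
        return np.mean((out - y) ** 2)

    def ab2_step(x, f_prev, f_curr, h=0.1):
        # Two-step Adams-Bashforth update: x_{n+2} = x_{n+1} + h*(3*f_{n+1} - f_n)/2
        return x + h * (3.0 * f_curr - f_prev) / 2.0

    # XOR-like toy data, standing in for the paper's benchmark data sets
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0.0, 1.0, 1.0, 0.0])

    dim, pop_size, iters = 13, 20, 200            # 13 = 6 + 3 + 3 + 1 packed weights
    pop = rng.normal(0.0, 1.0, (pop_size, dim))   # population of candidate weight vectors
    f_prev = np.zeros((pop_size, dim))            # previous "derivative" term

    best = min(pop, key=lambda w: mlp_mse(w, X, y)).copy()
    for _ in range(iters):
        # Surrogate derivative: direction toward the best solution plus exploration noise
        f_curr = (best - pop) + 0.1 * rng.normal(0.0, 1.0, (pop_size, dim))
        pop = ab2_step(pop, f_prev, f_curr)
        f_prev = f_curr
        cand = min(pop, key=lambda w: mlp_mse(w, X, y))
        if mlp_mse(cand, X, y) < mlp_mse(best, X, y):
            best = cand.copy()

    print("best MSE on the toy task:", mlp_mse(best, X, y))

Retaining the previous direction f_n in each move is what distinguishes the two-step scheme from a one-step (Euler-like) update, and it is the ingredient an Adams-Bashforth-based trainer of this type would exploit.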

Supporting Agency

  • We extend our sincere thanks and appreciation to the University of Mosul, the College of Computer Science and Mathematics, and the College of Education for Pure Sciences for their cooperation and support in completing this paper.

Author Biographies

Hisham M. Khudhur, University of Mosul

Lecturer

Department of Mathematics

College of Computer Science and Mathematics

Kais I. Ibraheem, University of Mosul

Assistant Professor, Dean of the College

Department of Computer Science

College of Education for Pure Science

References

  1. Yilmaz, M., Kayabasi, E., Akbaba, M. (2019). Determination of the effects of operating conditions on the output power of the inverter and the power quality using an artificial neural network. Engineering Science and Technology, an International Journal, 22 (4), 1068–1076. doi: https://doi.org/10.1016/j.jestch.2019.02.006
  2. Vahora, S. A., Chauhan, N. C. (2019). Deep neural network model for group activity recognition using contextual relationship. Engineering Science and Technology, an International Journal, 22 (1), 47–54. doi: https://doi.org/10.1016/j.jestch.2018.08.010
  3. Maleki, Ghazvini, Ahmadi, Maddah, Shamshirband. (2019). Moisture Estimation in Cabinet Dryers with Thin-Layer Relationships Using a Genetic Algorithm and Neural Network. Mathematics, 7 (11), 1042. doi: https://doi.org/10.3390/math7111042
  4. Farzaneh-Gord, M., Mohseni-Gharyehsafa, B., Arabkoohsar, A., Ahmadi, M. H., Sheremet, M. A. (2020). Precise prediction of biogas thermodynamic properties by using ANN algorithm. Renewable Energy, 147, 179–191. doi: https://doi.org/10.1016/j.renene.2019.08.112
  5. Basheer, I. A., Hajmeer, M. (2000). Artificial neural networks: fundamentals, computing, design, and application. Journal of Microbiological Methods, 43 (1), 3–31. doi: https://doi.org/10.1016/s0167-7012(00)00201-3
  6. Li, J., Cheng, J., Shi, J., Huang, F. (2012). Brief Introduction of Back Propagation (BP) Neural Network Algorithm and Its Improvement. Advances in Computer Science and Information Engineering, 553–558. doi: https://doi.org/10.1007/978-3-642-30223-7_87
  7. Aljarah, I., Faris, H., Mirjalili, S. (2016). Optimizing connection weights in neural networks using the whale optimization algorithm. Soft Computing, 22 (1), 1–15. doi: https://doi.org/10.1007/s00500-016-2442-1
  8. Wang, L., Zeng, Y., Chen, T. (2015). Back propagation neural network with adaptive differential evolution algorithm for time series forecasting. Expert Systems with Applications, 42 (2), 855–863. doi: https://doi.org/10.1016/j.eswa.2014.08.018
  9. Yang, X. (2010). Nature-Inspired Metaheuristic Algorithms. Luniver Press, 75.
  10. Wang, G.-G., Gandomi, A. H., Alavi, A. H., Hao, G.-S. (2013). Hybrid krill herd algorithm with differential evolution for global numerical optimization. Neural Computing and Applications, 25 (2), 297–308. doi: https://doi.org/10.1007/s00521-013-1485-9
  11. Reed, R., Marks, R. J. (1999). Neural Smithing. The MIT Press. doi: https://doi.org/10.7551/mitpress/4937.001.0001
  12. Dey, N., Ashour, A. S., Bhattacharyya, S. (Eds.) (2020). Applied Nature-Inspired Computing: Algorithms and Case Studies. Springer, 275. doi: https://doi.org/10.1007/978-981-13-9263-4
  13. Dey, N. (Ed.) (2018). Advancements in Applied Metaheuristic Computing. IGI Global. doi: https://doi.org/10.4018/978-1-5225-4151-6
  14. Gandomi, A. H., Yang, X.-S., Alavi, A. H. (2013). Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Engineering with Computers, 29 (1), 17–35. doi: https://doi.org/10.1007/s00366-011-0241-y
  15. Mirjalili, S., Mirjalili, S. M., Lewis, A. (2014). Grey Wolf Optimizer. Advances in Engineering Software, 69, 46–61. doi: https://doi.org/10.1016/j.advengsoft.2013.12.007
  16. Mirjalili, S., Lewis, A. (2016). The Whale Optimization Algorithm. Advances in Engineering Software, 95, 51–67. doi: https://doi.org/10.1016/j.advengsoft.2016.01.008
  17. Yang, X.-S. (2009). Firefly Algorithms for Multimodal Optimization. Lecture Notes in Computer Science, 169–178. doi: https://doi.org/10.1007/978-3-642-04944-6_14
  18. Yang, X., Hossein Gandomi, A. (2012). Bat algorithm: a novel approach for global engineering optimization. Engineering Computations, 29 (5), 464–483. doi: https://doi.org/10.1108/02644401211235834
  19. Yu, J., Wang, S., Xi, L. (2008). Evolving artificial neural networks using an improved PSO and DPSO. Neurocomputing, 71 (4-6), 1054–1060. doi: https://doi.org/10.1016/j.neucom.2007.10.013
  20. Valian, E., Mohanna, S., Tavakoli, S. (2011). Improved Cuckoo Search Algorithm for Feed forward Neural Network Training. International Journal of Artificial Intelligence & Applications, 2 (3), 36–43. doi: https://doi.org/10.5121/ijaia.2011.2304
  21. Jaddi, N. S., Abdullah, S., Hamdan, A. R. (2015). Multi-population cooperative bat algorithm-based optimization of artificial neural network model. Information Sciences, 294, 628–644. doi: https://doi.org/10.1016/j.ins.2014.08.050
  22. Mirjalili, S. (2015). How effective is the Grey Wolf optimizer in training multi-layer perceptrons. Applied Intelligence, 43 (1), 150–161. doi: https://doi.org/10.1007/s10489-014-0645-7
  23. Nandy, S., Sarkar, P. P., Das, A. (2012). Analysis of a nature inspired firefly algorithm based back-propagation neural network training. arXiv. doi: https://doi.org/10.48550/arXiv.1206.5360
  24. Heidari, A. A., Faris, H., Aljarah, I., Mirjalili, S. (2018). An efficient hybrid multilayer perceptron neural network with grasshopper optimization. Soft Computing, 23 (17), 7941–7958. doi: https://doi.org/10.1007/s00500-018-3424-2
  25. Xu, J., Yan, F. (2018). Hybrid Nelder–Mead Algorithm and Dragonfly Algorithm for Function Optimization and the Training of a Multilayer Perceptron. Arabian Journal for Science and Engineering, 44 (4), 3473–3487. doi: https://doi.org/10.1007/s13369-018-3536-0
  26. Tang, R., Fong, S., Dey, N. (2018). Metaheuristics and Chaos Theory. Chaos Theory. doi: https://doi.org/10.5772/intechopen.72103
  27. Karaagac, B. (2019). Two step Adams Bashforth method for time fractional Tricomi equation with non-local and non-singular Kernel. Chaos, Solitons & Fractals, 128, 234–241. doi: https://doi.org/10.1016/j.chaos.2019.08.007
  28. Butcher, J. C. (2016). Numerical Methods for Ordinary Differential Equations. John Wiley & Sons. doi: https://doi.org/10.1002/9781119121534
  29. Ahmadianfar, I., Heidari, A. A., Gandomi, A. H., Chu, X., Chen, H. (2021). RUN beyond the metaphor: An efficient optimization algorithm based on Runge Kutta method. Expert Systems with Applications, 181, 115079. doi: https://doi.org/10.1016/j.eswa.2021.115079
  30. Belew, R. K., McInerney, J., Schraudolph, N. N. (1990). Evolving networks: Using the genetic algorithm with connectionist learning. CSE Technical report #CS90-174. Available at: https://nic.schraudolph.org/pubs/BelMcISch92.pdf

Published

2022-04-28

How to Cite

Khudhur, H. M., & Ibraheem, K. I. (2022). Metaheuristic optimization algorithm based on the two-step Adams-Bashforth method in training multi-layer perceptrons. Eastern-European Journal of Enterprise Technologies, 2 (4 (116)), 6–13. https://doi.org/10.15587/1729-4061.2022.254023

Issue

Vol. 2 No. 4 (116) (2022)

Section

Mathematics and Cybernetics - applied aspects