Estimating local approximation accuracy with the network of hybrid neuron-like units

Authors

  • Sergey Vitalyevich Popov, Kharkiv National University of Radio Electronics, Lenin av. 14, Kharkov, 61166, Ukraine
  • Kristina Aleksandrovna Shkuro, Kharkiv National University of Radio Electronics, Lenin av. 14, Kharkov, 61166, Ukraine

DOI:

https://doi.org/10.15587/1729-4061.2013.18350

Keywords:

local accuracy estimation, evolutionary architecture optimization, approximation reliability enhancement

Abstract

Accuracy is one of the most important properties of the solution to any practical problem. Neuro-fuzzy networks usually produce point estimates of the process under consideration, and their accuracy is assessed on average over the whole dataset. This is the simplest way to estimate accuracy, and it is justified in most cases; it is insufficient, however, in situations where approximation accuracy is clearly non-uniform across the dataset. In this paper, a network of hybrid neuron-like units is considered, extended to deliver local accuracy estimates. The architecture is constrained by a priori information about the properties of the input signals and the system being modeled, and is then optimized at the synaptic level by an evolutionary algorithm. Introducing a priori information into the evolutionary process enables a gray-box approach to systems modeling. Local accuracy estimation provides vital information for subsequent decision making and increases the method's value for users. The proposed approach is quite general and can be applied to many popular neural networks, e.g. MLP, FIR networks, or any other neural and neuro-fuzzy networks (including emerging ones) that are special cases of the network of hybrid neuron-like units.
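
The hybrid neuron-like architecture itself is detailed in the paper and is not reproduced here. As a minimal, hypothetical sketch of the general idea of local (input-dependent) accuracy estimation, as opposed to a single dataset-wide error figure, the Python fragment below fits a simple regression model to data whose noise level varies with the input and then smooths the squared residuals over the input space with a Gaussian kernel. The model, the kernel-smoothing choice, and all names are illustrative assumptions, not the authors' method.

```python
# Hypothetical illustration (not the authors' method): global vs. local accuracy
# estimation for a regression model on data with input-dependent noise.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: the noise level grows with |x|, so accuracy is non-uniform.
x = rng.uniform(-3.0, 3.0, size=400)
y = np.sin(x) + rng.normal(scale=0.05 + 0.15 * np.abs(x), size=x.shape)

# Point model: cubic polynomial fitted by least squares.
coeffs = np.polyfit(x, y, deg=3)
residuals = y - np.polyval(coeffs, x)

# Global accuracy estimate: one RMSE for the whole dataset.
global_rmse = np.sqrt(np.mean(residuals ** 2))

def local_rmse(x_query, bandwidth=0.3):
    """Kernel-smoothed RMSE around x_query (Gaussian weights on squared residuals)."""
    w = np.exp(-0.5 * ((x - x_query) / bandwidth) ** 2)
    return np.sqrt(np.sum(w * residuals ** 2) / np.sum(w))

print(f"global RMSE: {global_rmse:.3f}")
for xq in (-2.5, 0.0, 2.5):
    print(f"local RMSE at x = {xq:+.1f}: {local_rmse(xq):.3f}")
```

Such a local estimate tells a user whether a particular prediction falls in a well-approximated region of the input space or in one where the model is unreliable, which is the decision-making benefit the abstract refers to.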

Author Biographies

Sergey Vitalyevich Popov, Kharkiv National University of Radio Electronics, Lenin av. 14, Kharkov, Ukraine, 61166

Doctor of Science, Chief Researcher

Control Systems Research Laboratory

Kristina Aleksandrovna Shkuro, Kharkiv National University of Radio Electronics, Lenin av. 14, Kharkov, Ukraine, 61166

Ph.D. student

Control Systems Research Laboratory


Published

2013-10-29

How to Cite

Popov, S. V., & Shkuro, K. A. (2013). Estimating local approximation accuracy with the network of hybrid neuron-like units. Eastern-European Journal of Enterprise Technologies, 5(4(65)), 53–59. https://doi.org/10.15587/1729-4061.2013.18350

Section

Mathematics and Cybernetics - applied aspects