Estimating local approximation accuracy with the network of hybrid neuron-like units
DOI: https://doi.org/10.15587/1729-4061.2013.18350

Keywords: local accuracy estimation, evolutionary architecture optimization, approximation reliability enhancement

Abstract
Accuracy is one of the most important properties of the solution to any practical problem. Neuro-fuzzy networks usually generate point estimates of the process under consideration, and their accuracy is assessed on average over the whole dataset. This is the simplest way to estimate accuracy, and it is justified in most cases; however, it is insufficient in situations where the approximation accuracy is clearly non-uniform across the dataset. In this paper, a network of hybrid neuron-like units is considered and extended to deliver local accuracy estimates. Its architecture is constrained by a priori information about the properties of the input signals and the system being modeled, and is subsequently optimized at the synaptic level by an evolutionary algorithm. Introducing a priori information into the evolutionary process enables a gray-box approach to systems modeling. Local accuracy estimation provides vital information for subsequent decision making and increases the method's value for users. The proposed approach is quite general and can be applied to many popular neural networks, e.g. MLPs, FIR networks, or any other neural and neuro-fuzzy networks (including emerging ones) that are special cases of the network of hybrid neuron-like units.
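As a hedged illustration of the local accuracy idea only (not the authors' hybrid neuron-like architecture or its evolutionary optimization, which are described in the paper itself), the sketch below assumes a plain scikit-learn MLP as the point predictor and trains a second model on the absolute residuals, so that each new input receives its own error estimate instead of a single dataset-wide figure.

```python
# Minimal sketch of local (input-dependent) accuracy estimation for a regressor.
# Assumptions: a generic MLP stands in for the network of hybrid neuron-like units,
# and gradient training replaces the evolutionary optimization; only the idea of
# per-point accuracy estimates is illustrated.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic 1-D problem with input-dependent noise, so accuracy is non-uniform.
X = rng.uniform(-3.0, 3.0, size=(500, 1))
noise = rng.normal(0.0, 0.05 + 0.3 * (X[:, 0] > 0), size=500)  # noisier for x > 0
y = np.sin(X[:, 0]) + noise

# 1) Point-estimate model (the usual output of a neuro-fuzzy / neural network).
point_model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
point_model.fit(X, y)

# 2) Local-accuracy model: regress the absolute residuals on the inputs, giving an
#    input-dependent error estimate instead of one global average error.
residuals = np.abs(y - point_model.predict(X))
error_model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
error_model.fit(X, residuals)

# Query: each new point gets a prediction plus its own accuracy estimate.
X_new = np.array([[-2.0], [2.0]])
y_hat = point_model.predict(X_new)
local_err = error_model.predict(X_new)
for x, p, e in zip(X_new[:, 0], y_hat, local_err):
    print(f"x = {x:+.1f}: prediction = {p:+.3f}, estimated local error ~ {e:.3f}")
```

In this toy setup the estimated local error should come out larger for inputs with x > 0, reflecting the non-uniform accuracy that a single dataset-wide error figure would hide.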
License
Copyright (c) 2014 Sergey Vitalyevich Popov, Kristina Aleksandrovna Shkuro
This work is licensed under a Creative Commons Attribution 4.0 International License.
The assignment of copyright and the conditions of its transfer (identification of authorship) are set out in the License Agreement. In particular, the authors retain authorship of their manuscript and grant the journal the right of first publication of the work under the terms of the Creative Commons CC BY license. They may also enter into separate, additional agreements for the non-exclusive distribution of the work in the form in which it was published by this journal, provided that a link to the first publication of the article in this journal is preserved.

A license agreement is a document in which the author warrants that he or she owns all copyright in the work (manuscript, article, etc.).

By signing the License Agreement with TECHNOLOGY CENTER PC, the authors retain all rights to the further use of their work, provided that they reference the edition in which the work was published.

Under the terms of the License Agreement, the publisher TECHNOLOGY CENTER PC does not take over the authors' copyright; it receives permission from the authors to use and disseminate the publication through the world's scientific resources (its own electronic resources, scientometric databases, repositories, libraries, etc.).

In the absence of a signed License Agreement, or if the agreement lacks identifiers allowing the author's identity to be established, the editors have no right to work with the manuscript.

It is important to remember that there is another type of agreement between authors and publishers, in which copyright is transferred from the authors to the publisher. In that case, the authors lose ownership of their work and may not use it in any way.