Improvement of the accuracy of determining movement parameters of cuts on classification humps by methods of video analysis
Keywords: video analysis, optical flow, background subtraction, monitoring the parameters of cuts' movement, classification hump
The study demonstrates the need to develop and refine procedures that combine the methods of optical flow and background subtraction when monitoring objects of complex shape against a changing background and in the presence of small moving objects that are not subject to tracking. Research in this area can yield significant results in automating the control of movement parameters for objects such as railway transport.
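As a minimal sketch of the background-subtraction side of such a combination, the following uses an exponential running-average background model with a fixed threshold. The model, the threshold value, and the synthetic frame are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model."""
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, threshold=25.0):
    """Mark pixels deviating from the background model beyond the threshold."""
    return np.abs(frame - bg) > threshold

# Synthetic scene: a uniform background with one bright moving block (a "cut").
h, w = 48, 64
bg = np.full((h, w), 100.0)        # background model learned so far
frame = np.full((h, w), 100.0)
frame[10:20, 30:40] = 200.0        # the moving object

mask = foreground_mask(bg, frame)  # True only inside the 10x10 block
bg = update_background(bg, frame)  # slowly absorb static scene changes
print(int(mask.sum()))             # foreground pixel count -> 100
```

A slow update rate (`alpha`) lets static scene changes (lighting, stopped wagons) be absorbed into the background while genuinely moving objects stay in the foreground mask.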
For the described conditions of practical use, it is most advisable to synthesize the classical Lucas-Kanade optical flow method with the Horn-Schunck method for segmenting frames and identifying control zones. The study describes procedures for choosing the size of the control zones and for analysing joint movement within them, which makes it possible to identify the movement of a cut even when the cut has been formed from wagons of different categories.
The suggested algorithms were tested on the classification hump at the Odesa-Sortuvalna station (Ukraine). The obtained quantitative characteristics of cut recognition accuracy show that the conditional probability of correct operation of the suggested approach is 0.8332, compared with 0.44 for the classical Horn-Schunck method under the same conditions.
Copyright (c) 2016 Ivan Siroklyn, Sergii Zmii, Anton Lapko, Alexandr Kameniev, Sergey Panchenko
This work is licensed under a Creative Commons Attribution 4.0 International License.