Estimation of accuracy in determining the translational velocity of a video camera using data from optical flow
DOI: https://doi.org/10.15587/1729-4061.2017.108449

Keywords: UAV, optical navigation, dense optical flow, motion field, motion parameters

Abstract
The devised approaches are adapted to the complicated observation conditions of certain real-world tasks and remain fully operational in cases where existing standard algorithms fail to give reliable results. We propose a method for determining dynamic motion parameters based on a dense optical flow algorithm combined with texture analysis. To compute the optical flow, we employ a block-matching method with adaptively variable block size and an adaptive motion-vector search strategy that weights the measurements of image blocks, assigning each block a texture indicator. A standard block-matching method for estimating optical flow does not weight the image blocks. A measure of a block's texturization, and hence of the reliability of its computed motion vector, is determined from the condition number of the information matrix. To estimate motion parameters from the computed optical flow, we propose a least-squares method that accounts for noise in the measured data. The minimization weights each point's contribution to the error, giving greater importance to points where the optical flow speed is larger; this is most useful when high speeds are measured more accurately. The norm that produces the best results depends on the noise properties of the measured optical flow. When estimating the translational velocity of the entire image frame, the proposed method thus accounts for textural differences of the underlying surface as well as for noise in the measured data of each image block.
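The two key steps above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes the "information matrix" of a block is the structure tensor of its image gradients (so a small condition number indicates rich two-dimensional texture and a reliable motion vector), and it assumes a pure frame-wide translation model, for which the weighted least-squares estimate reduces to a weighted mean of the measured flow vectors. The function names, the eps threshold, and the mapping from condition number to weight are all hypothetical choices.

```python
import numpy as np

def texture_weight(block, eps=1e-6):
    """Reliability weight of an image block from the condition number
    of its gradient structure tensor (assumed 'information matrix')."""
    gy, gx = np.gradient(block.astype(float))
    G = np.array([[(gx * gx).sum(), (gx * gy).sum()],
                  [(gx * gy).sum(), (gy * gy).sum()]])
    lam = np.linalg.eigvalsh(G)        # eigenvalues, ascending, both >= 0
    if lam[0] <= eps:                  # flat or one-dimensional texture:
        return 0.0                     # motion vector unreliable, zero weight
    return lam[0] / lam[1]             # 1 / condition number, in (0, 1]

def frame_velocity(flows, weights):
    """Weighted least-squares estimate of a single translational flow v,
    minimizing sum_i w_i * ||u_i - v||^2; the closed-form solution is a
    weighted mean. Weights may combine texture reliability and, as in the
    text, extra emphasis on points with larger flow speed."""
    u = np.asarray(flows, float)
    w = np.asarray(weights, float)
    return (w[:, None] * u).sum(axis=0) / w.sum()
```

In use, each block's motion vector from block matching would enter `frame_velocity` with a weight derived from `texture_weight`, so poorly textured regions of the underlying surface contribute little to the frame-wide velocity estimate.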
We present simulation results of UAV motion over different types of underlying surface and estimate the accuracy of determining translational motion parameters with the optical sensor. The experimental results confirm that applying texture analysis when evaluating the motion field improves performance, since a reduced number of vectors is required, and yields higher accuracy than traditional block brute-force methods.
Copyright (c) 2017 Andrii Molchanov, Vyacheslav Kortunov, Farhadi Rahman Mohammadi
This work is licensed under a Creative Commons Attribution 4.0 International License.