Estimation of accuracy in determining the translational velocity of a video camera using data from optical flow

Authors

DOI:

https://doi.org/10.15587/1729-4061.2017.108449

Keywords:

UAV, optical navigation, dense optical flow, motion field, motion parameters

Abstract

The devised approaches are adapted to the complicated observation conditions of certain real-world tasks and remain fully operational in cases where existing standard algorithms fail to give reliable results. We propose a method for determining dynamic motion parameters based on a dense optical flow algorithm that uses texture analysis. To determine the optical flow, we employ a block matching method with an adaptively variable block size and an adaptive motion-vector search strategy that weights the measurements of image blocks, matching each block with a texture indicator. A standard block method for estimating optical flow does not weight the image blocks. The measure of an image block's texturedness, and consequently of the reliability of its computed motion vector, is determined from the condition number of the information matrix. To estimate motion parameters from the computed optical flow, we propose using a least-squares method that takes the noise of the measured data into account. The minimization weights each point's contribution to the error, giving greater importance to points where the optical flow speed is larger; this is most useful when measurements of high speeds are more accurate. The norm that produces the best results depends on the noise properties of the measured optical flow. When estimating the parameters of the translational velocity over the entire image frame, the proposed method accounts for textural differences of the underlying surface as well as for the noise in the measured data of each image block.
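The texture-based weighting described above can be sketched as follows. This is a minimal illustration, assuming the "information matrix" of a block is the 2×2 structure tensor of its image gradients (as in gradient-based flow estimation); the function name and the mapping from condition number to weight are illustrative, not the authors' exact formulation.

```python
import numpy as np

def block_texture_weight(block: np.ndarray, eps: float = 1e-12) -> float:
    """Condition-number-based texture indicator for one image block.

    Assumption: the information matrix is the structure tensor of the
    block's intensity gradients. Returns the inverse condition number,
    in [0, 1]: near 1 for well-textured blocks (reliable motion vector),
    near 0 for flat or edge-only blocks (unreliable motion vector).
    """
    gy, gx = np.gradient(block.astype(float))
    # Structure tensor G = sum over the block of [gx^2, gx*gy; gx*gy, gy^2]
    G = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    lmin, lmax = np.linalg.eigvalsh(G)  # eigenvalues in ascending order
    return float(lmin) / max(float(lmax), eps)

# A textured block should score higher than a featureless one
rng = np.random.default_rng(0)
textured = rng.standard_normal((16, 16))
flat = np.ones((16, 16))
print(block_texture_weight(textured), block_texture_weight(flat))
```

Blocks with weight near zero would contribute little to the frame-level velocity estimate, which is how unreliable motion vectors over low-texture surfaces get suppressed.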

We present simulation results of UAV motion over different types of underlying surface and estimate the accuracy of determining translational motion parameters using the optical sensor. Experimental results confirm that applying texture analysis when evaluating a motion field improves performance by using a reduced number of vectors, and proves more accurate than traditional block brute-force (full-search) methods.
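The speed-weighted least-squares step can be sketched as below. This is a simplified illustration under the assumption of pure camera translation over a flat surface at roughly constant depth, so the ideal motion field is uniform and the weighted least-squares estimate reduces to a weighted mean of the block motion vectors; the function name and the choice of flow magnitude as the weight are illustrative assumptions.

```python
import numpy as np

def estimate_translation(flow: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted least-squares estimate of a single translational flow.

    flow:    (N, 2) measured block motion vectors (pixels/frame)
    weights: (N,) per-block weights; here larger for faster vectors,
             so points with higher optical flow speed contribute more,
             mirroring the weighting described in the abstract.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    # Weighted mean minimizes sum_i w_i * ||flow_i - v||^2 over v
    return (w[:, None] * np.asarray(flow, dtype=float)).sum(axis=0)

# Noisy flow measurements around a true translation of (2.0, -1.0) px/frame
rng = np.random.default_rng(1)
true_v = np.array([2.0, -1.0])
flow = true_v + 0.3 * rng.standard_normal((50, 2))
weights = np.linalg.norm(flow, axis=1)  # weight by measured flow speed
v_hat = estimate_translation(flow, weights)
print(v_hat)
```

In practice these weights could also be multiplied by the per-block texture indicator, so that both measurement noise and surface texturedness shape the frame-level velocity estimate.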

Author Biographies

Andrii Molchanov, M. E. Zhukovsky National Aerospace University “Kharkiv Aviation Institute” Chkalova str., 17, Kharkiv, Ukraine, 61070

Assistant

Department of Aircraft Radio-Electronic Systems Manufacturing

Vyacheslav Kortunov, M. E. Zhukovsky National Aerospace University “Kharkiv Aviation Institute” Chkalova str., 17, Kharkiv, Ukraine, 61070

Doctor of Technical Sciences, Professor, Head of Department

Department of Aircraft Radio-Electronic Systems Manufacturing

Farhadi Rahman Mohammadi, M. E. Zhukovsky National Aerospace University “Kharkiv Aviation Institute” Chkalova str., 17, Kharkiv, Ukraine, 61070

Postgraduate student

Department of Aircraft Radio-Electronic Systems Manufacturing

References

  1. Casbeer, D. W., Li, S. M., Beard, R. W., McLain, T. W., Mehra, R. K. (2005). Forest fire monitoring with multiple small UAVs. Proceedings of the 2005, American Control Conference, 3530–3535. doi: 10.1109/acc.2005.1470520
  2. Chao, H., Chen, Y. Q. (2012). Remote Sensing and Actuation Using Unmanned Vehicles. Hoboken, New Jersey: Wiley-IEEE Press, 232.
  3. Franceschini, N., Ruffier, F., Serres, J. (2007). A Bio-Inspired Flying Robot Sheds Light on Insect Piloting Abilities. Current Biology, 17 (4), 329–335. doi: 10.1016/j.cub.2006.12.032
  4. Garratt, M. A., Chahl, J. S. (2008). Vision-based terrain following for an unmanned rotorcraft. Journal of Field Robotics, 25 (4-5), 284–301. doi: 10.1002/rob.20239
  5. Beyeler, A., Zufferey, J.-C., Floreano, D. (2009). Vision-based control of near-obstacle flight. Autonomous Robots, 27 (3), 201–219. doi: 10.1007/s10514-009-9139-6
  6. Herissé, B., Hamel, T., Mahony, R., Russotto, F.-X. (2012). Landing a VTOL Unmanned Aerial Vehicle on a Moving Platform Using Optical Flow. IEEE Transactions on Robotics, 28 (1), 77–89. doi: 10.1109/tro.2011.2163435
  7. Brox, T., Malik, J. (2011). Large Displacement Optical Flow: Descriptor Matching in Variational Motion Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33 (3), 500–513. doi: 10.1109/tpami.2010.143
  8. Butler, D. J., Wulff, J., Stanley, G. B., Black, M. J. (2012). A Naturalistic Open Source Movie for Optical Flow Evaluation. Lecture Notes in Computer Science, 7577, 611–625. doi: 10.1007/978-3-642-33783-3_44
  9. Sanada, A., Ishii, K., Yagi, T. (2010). Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor. Journal of Bionic Engineering, 7, S172–S176. doi: 10.1016/s1672-6529(09)60232-8
  10. Molchanov, A. A., Kortunov, V. I. (2015). Metod ocenki dvizheniya opticheskogo potoka s vzveshivaniem izmerenii blokov izobrazheniya. Sistemy obrobky іnformatsіi, 3 (128), 26–31.
  11. Hartley, R., Zisserman, A. (2004). Multiple View Geometry in Computer Vision. 2nd edition. Cambridge, U.K.: Cambridge Univ. Press, 655.
  12. Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22 (11), 1330–1334. doi: 10.1109/34.888718
  13. Heikkila, J., Silven, O. (1997). A four-step camera calibration procedure with implicit image correction. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1106–1112. doi: 10.1109/cvpr.1997.609468
  14. Von Mises, R. (1959). Theory of Flight. 1st edition. Dover Publications, 672.
  15. Farhadi, R. M., Kortunov, V. I., Mohammad, A. (2015). UAV motion model and estimation of its uncertainties with flight test data. 22nd Saint Petersburg International Conference on Integrated Navigation Systems, 131–133.
  16. Horn, B. (1986). Robot Vision. MIT Press, 509.
  17. Kortunov, V. I., Molchanov, A. O. (2015). Video camera motion detection according to the optical flow. 22nd St. Petersburg International Conference on Integrated Navigation Systems, 81–82.
  18. Kaplan, E. D., Hegarty, C. (2006). Understanding GPS: Principles and Applications. 2nd edition. Artech House, 726.

Downloads

Published

2017-08-30

How to Cite

Molchanov, A., Kortunov, V., & Mohammadi, F. R. (2017). Estimation of accuracy in determining the translational velocity of a video camera using data from optical flow. Eastern-European Journal of Enterprise Technologies, 4 (9 (88)), 37–45. https://doi.org/10.15587/1729-4061.2017.108449

Issue

Section

Information and controlling system