DOI: https://doi.org/10.15587/1729-4061.2017.108449

Estimation of accuracy in determining the translational velocity of a video camera using data from optical flow

Andrii Molchanov, Vyacheslav Kortunov, Farhadi Rahman Mohammadi

Abstract


The devised approaches are adapted to complicated observation conditions in certain real tasks and remain operational in cases where existing standard algorithms fail to produce reliable results. We propose a method for determining dynamic motion parameters based on a dense optical-flow algorithm combined with texture analysis. To compute the optical flow, we employ a block-matching method with an adaptively variable block size and an adaptive motion-vector search strategy that weights the measurements of image blocks, assigning each block a texture indicator. The standard block method for estimating optical flow does not weight the image blocks. A measure of a block's texturization, and hence of the reliability of its computed motion vector, is derived from the condition number of the information matrix. To estimate motion parameters from the computed optical flow, we propose a least-squares method that accounts for noise in the measured data. The minimization weights each point's contribution to the error, giving greater importance to points where the optical-flow speed is larger; this is most useful when measurements of high speeds are more accurate. The norm that produces the best results depends on the noise properties of the measured optical flow. When estimating the translational velocity over the entire image frame, the proposed method thus accounts for textural differences of the underlying surface, as well as for the noise in the measured data of each image block.
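The two key ingredients above — a texture indicator derived from the condition number of a block's information matrix, and a weighted least-squares fit of the translational velocity — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the 2×2 gradient-based structure matrix used as the information matrix, and the inverse-condition-number weighting are assumptions, and the adaptive block-size search and speed-dependent error weighting described in the abstract are omitted.

```python
import numpy as np

def block_texture_weight(block, eps=1e-12):
    """Texture indicator of an image block (hypothetical formulation).

    Builds the information (structure) matrix from the block's image
    gradients and returns its inverse condition number in [0, 1]:
    0 for untextured blocks, closer to 1 for well-textured ones.
    """
    gy, gx = np.gradient(block.astype(float))
    # Information matrix accumulated over the block's gradients
    G = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    s = np.linalg.svd(G, compute_uv=False)
    if s[-1] <= eps:          # singular matrix: no usable texture
        return 0.0
    return s[-1] / s[0]       # inverse condition number

def estimate_translation(flow_vectors, weights):
    """Weighted least-squares estimate of a common translational flow.

    For a pure-translation model the closed-form solution is simply the
    weighted mean of the per-block motion vectors.
    flow_vectors: (N, 2) measured motion vectors of the image blocks
    weights:      (N,)   texture-based reliability weights
    """
    w = np.asarray(weights, dtype=float)
    v = np.asarray(flow_vectors, dtype=float)
    return (w[:, None] * v).sum(axis=0) / w.sum()
```

Blocks whose structure matrix is (near-)singular — e.g. a uniform patch, or a pure one-directional ramp subject to the aperture problem — receive zero weight and therefore do not corrupt the velocity estimate.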

We present simulation results for UAV motion over different types of underlying surface and estimate the accuracy of determining translational motion parameters with the optical sensor. The experimental results confirm that applying texture analysis when evaluating a motion field improves performance by using a reduced number of vectors and yields higher accuracy than traditional block brute-force methods.

Keywords


UAV; optical navigation; dense optical flow; motion field; motion parameters


References


Casbeer, D. W., Li, S. M., Beard, R. W., McLain, T. W., Mehra, R. K. (2005). Forest fire monitoring with multiple small UAVs. Proceedings of the 2005, American Control Conference, 3530–3535. doi: 10.1109/acc.2005.1470520

Chao, H., Chen, Y. Q. (2012). Remote Sensing and Actuation Using Unmanned Vehicles. Hoboken, New Jersey: Wiley-IEEE Press, 232.

Franceschini, N., Ruffier, F., Serres, J. (2007). A Bio-Inspired Flying Robot Sheds Light on Insect Piloting Abilities. Current Biology, 17 (4), 329–335. doi: 10.1016/j.cub.2006.12.032

Garratt, M. A., Chahl, J. S. (2008). Vision-based terrain following for an unmanned rotorcraft. Journal of Field Robotics, 25 (4-5), 284–301. doi: 10.1002/rob.20239

Beyeler, A., Zufferey, J.-C., Floreano, D. (2009). Vision-based control of near-obstacle flight. Autonomous Robots, 27 (3), 201–219. doi: 10.1007/s10514-009-9139-6

Herissé, B., Hamel, T., Mahony, R., Russotto, F.-X. (2012). Landing a VTOL Unmanned Aerial Vehicle on a Moving Platform Using Optical Flow. IEEE Transactions on Robotics, 28 (1), 77–89. doi: 10.1109/tro.2011.2163435

Brox, T., Malik, J. (2011). Large Displacement Optical Flow: Descriptor Matching in Variational Motion Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33 (3), 500–513. doi: 10.1109/tpami.2010.143

Butler, D. J., Wulff, J., Stanley, G. B., Black, M. J. (2012). A Naturalistic Open Source Movie for Optical Flow Evaluation. Lecture Notes in Computer Science, 7577, 611–625. doi: 10.1007/978-3-642-33783-3_44

Sanada, A., Ishii, K., Yagi, T. (2010). Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor. Journal of Bionic Engineering, 7, S172–S176. doi: 10.1016/s1672-6529(09)60232-8

Molchanov, A. A., Kortunov, V. I. (2015). Metod ocenki dvizheniya opticheskogo potoka s vzveshivaniem izmerenii blokov izobrazheniya [A method for optical flow motion estimation with weighting of image block measurements]. Sistemy obrobky іnformatsіi, 3 (128), 26–31.

Hartley, R., Zisserman, A. (2004). Multiple View Geometry in Computer Vision. 2nd edition. Cambridge, U.K.: Cambridge Univ. Press, 655.

Zhang, Z. (2000). A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence, 22 (11), 1330–1334. doi: 10.1109/34.888718

Heikkila, J., Silven, O. (1997). A four-step camera calibration procedure with implicit image correction. Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1106–1112. doi: 10.1109/cvpr.1997.609468

Von Mises, R. (1959). Theory of Flight. 1st edition. Dover Publications, 672.

Farhadi, R. M., Kortunov, V. I., Mohammadi, A. (2015). UAV motion model and estimation of its uncertainties with flight test data. 22nd Saint Petersburg International Conference on Integrated Navigation Systems, 131–133.

Horn, B. (1986). Robot Vision. MIT Press, 509.

Kortunov, V. I., Molchanov, A. O. (2015). Video camera motion detection according to the optical flow. 22nd St. Petersburg International Conference on Integrated Navigation Systems, 81–82.

Kaplan, E. D., Hegarty, C. (2006). Understanding GPS: Principles and Applications. 2nd edition. Artech House, 726.









Copyright (c) 2017 Andrii Molchanov, Vyacheslav Kortunov, Farhadi Rahman Mohammadi

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

ISSN (print) 1729-3774, ISSN (on-line) 1729-4061