Accuracy assessment of marker recognition using ultra wide angle camera

DOI:

https://doi.org/10.15587/2706-5448.2022.259068

Keywords:

augmented reality, marker recognition, ArUco fiducial markers, recognition accuracy, surgical navigation, ultra-wide angle camera

Abstract

Modern devices that support augmented reality technology are widely used in various fields of human activity, including medicine. Head-mounted displays may provide an attractive alternative to traditional surgical navigation systems because they allow users to view the scene from a first-person perspective and interact with objects in their surroundings naturally. Thus, the object of research in this study is the recognition accuracy of fiducial markers in the zones where an ultra-wide angle camera distorts the most. The study is motivated by the need to increase the user's workspace for interaction with markers compared to the workspace provided by such a popular augmented reality device as the Microsoft HoloLens 2.

In this study, recognition accuracy is evaluated using square ArUco markers, taking into account different marker sizes and their positions in the camera view space. The marker positions include the center of the camera view space as well as the zones where lenses distort the most: the top left, top right, bottom left, and bottom right corners.
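For illustration, a minimal sketch of the kind of per-frame detection step described above, assuming the classic cv2.aruco Python API (OpenCV up to 4.6); the marker dictionary and the image file name are illustrative placeholders, not details taken from the study:

# Minimal sketch: detect ArUco markers in one captured frame.
# Assumes the classic cv2.aruco API (opencv-contrib-python <= 4.6);
# DICT_6X6_250 and "frame.png" are illustrative choices.
import cv2

image = cv2.imread("frame.png")                      # frame from the camera
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_6X6_250)
parameters = cv2.aruco.DetectorParameters_create()

# corners: per-marker 4x2 arrays of pixel coordinates;
# ids: detected marker identifiers, or None if nothing was recognized
corners, ids, rejected = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)

print("markers recognized:", 0 if ids is None else len(ids))

Repeating such a detection call for frames with markers of different sizes placed at the image center and in the four corners is one straightforward way to tally recognition successes and failures per zone.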

The obtained results show that recognition accuracy is high enough to be applicable for surgical navigation, and failures attributable to lens distortion occur in less than 0.2 % of all cases. This makes it possible to increase the workspace for interaction with markers compared to the Microsoft HoloLens 2. At the same time, the workspace for interaction cannot cover the entire view space of the camera, since recognition fails in cases where the marker's body is only partially visible in the captured image (i.e., the marker is positioned at the image boundaries).

Author Biographies

Svitlana Alkhimova, National Technical University of Ukraine «Igor Sikorsky Kyiv Polytechnic Institute»

PhD

Department of Biomedical Cybernetics

Illia Davydovych, National Technical University of Ukraine «Igor Sikorsky Kyiv Polytechnic Institute»

Department of Biomedical Cybernetics

Published

2022-06-16

How to Cite

Alkhimova, S., & Davydovych, I. (2022). Accuracy assessment of marker recognition using ultra wide angle camera. Technology Audit and Production Reserves, 3(2(65)), 6–10. https://doi.org/10.15587/2706-5448.2022.259068

Section

Information Technologies: Reports on Research Projects