Sign language dactyl recognition based on machine learning algorithms

Authors

Chingiz Kenshimov, Zholdas Buribayev, Yedilkhan Amirgaliyev, Aisulyu Ataniyazova, Askhat Aitimov

DOI:

https://doi.org/10.15587/1729-4061.2021.239253

Keywords:

Gesture recognition, sign language, feature extraction, hand tracking, algorithm evaluation

Abstract

In the course of this research, the American, Russian and Turkish sign languages were analyzed, and a program for recognizing the Kazakh dactyl (fingerspelling) sign language was implemented using machine learning methods. A dataset of 5000 images was collected for each gesture, and gesture recognition algorithms such as Random Forest, Support Vector Machine and Extreme Gradient Boosting were applied. Two data types were combined into a single database, which changed the architecture of the system as a whole. The quality of the algorithms was also evaluated.
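
The classifier comparison described in the abstract can be illustrated with a short sketch. This is not the authors' published code: the feature files, array shapes and hyperparameters below are assumptions, and each gesture image is presumed to have already been reduced to a fixed-length feature vector (for example, hand-landmark coordinates or pixel descriptors).

    # Minimal sketch of the three-classifier comparison (assumed setup).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from xgboost import XGBClassifier

    # Hypothetical pre-extracted gesture features and integer letter labels.
    X = np.load("features.npy")  # shape: (n_samples, n_features), assumed file
    y = np.load("labels.npy")    # shape: (n_samples,), assumed file

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42)

    models = {
        "Random Forest": RandomForestClassifier(n_estimators=100, random_state=42),
        "Support Vector Machine": SVC(kernel="rbf", C=10.0),
        "Extreme Gradient Boosting": XGBClassifier(n_estimators=100,
                                                   eval_metric="mlogloss"),
    }

    # Fit each model and report held-out accuracy for comparison.
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: test accuracy = {model.score(X_test, y_test):.4f}")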

This research was undertaken because existing scientific work on recognition systems for the Kazakh dactyl sign language is insufficient for a complete representation of the language. The Kazakh alphabet contains letters specific to the language, and these orthographic peculiarities create problems when developing recognition systems for Kazakh sign language.

The results showed that the Support Vector Machine and Extreme Gradient Boosting algorithms are superior in real-time performance, while the Random Forest algorithm achieves the highest recognition accuracy. The classification accuracy was 98.86 % for Random Forest, 98.68 % for Support Vector Machine and 98.54 % for Extreme Gradient Boosting. The quality metrics of these classical algorithms were likewise high.
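
Beyond overall accuracy, the per-class quality evaluation mentioned above can be reproduced with standard metrics; the sketch below assumes the fitted model and the held-out X_test/y_test split from the previous example.

    # Per-class precision, recall and F1-score (variables assumed from above).
    from sklearn.metrics import accuracy_score, classification_report

    y_pred = model.predict(X_test)
    print("Accuracy:", accuracy_score(y_test, y_pred))
    # One row of precision/recall/F1 per gesture class:
    print(classification_report(y_test, y_pred))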

The practical significance of this work lies in the fact that gesture recognition research based on the updated Kazakh alphabet has not yet been conducted. The results can therefore be used by other researchers for further work on the recognition of the Kazakh dactyl sign language, as well as by researchers engaged in the development of international sign language systems.

Author Biographies

Chingiz Kenshimov, Institute of Information and Computational Technologies

PhD

Department of Robotics and Artificial Intelligence

Zholdas Buribayev, Al-Farabi Kazakh National University

Master

Department of Computer Science

Yedilkhan Amirgaliyev, Institute of Information and Computational Technologies

Doctor of Technical Sciences, Professor

Department of Robotics and Artificial Intelligence

Aisulyu Ataniyazova, Al-Farabi Kazakh National University

Master's student

Department of Computer Science

Askhat Aitimov, Suleyman Demirel University

PhD student

Department of Computer Science

Published

2021-08-31

How to Cite

Kenshimov, C., Buribayev, Z., Amirgaliyev, Y., Ataniyazova, A., & Aitimov, A. (2021). Sign language dactyl recognition based on machine learning algorithms. Eastern-European Journal of Enterprise Technologies, 4(2(112)), 58–72. https://doi.org/10.15587/1729-4061.2021.239253