Document Type: Full Research Paper


1 M.Sc. Student, Faculty of Biomedical Engineering, Sahand University of Technology, Tabriz, Iran

2 Assistant Professor, Faculty of Biomedical Engineering, Sahand University of Technology, Tabriz, Iran



In recent years, devices that ease communication between deaf people and the general public by translating sign language have attracted growing research interest. However, low recognition accuracy, slow computation, and the high cost of the required hardware have hindered the commercialization of this research. A further challenge in building a practical tool is that a method must also perform well under leave-one-subject-out training, i.e., when classifying data from a subject not seen during training. This article therefore presents an efficient method for recognizing hand gestures for sign language translation that achieves strong performance under all training schemes while using a lower-dimensional feature set. In the proposed method, six features (mean absolute value, variance, root mean square, waveform length, kurtosis, and skewness) are extracted from the empirical wavelet transform of the electromyogram and inertial signals. Effective features are then selected with the ReliefF method, and a support vector machine classifier is used to recognize the hand gestures. On the PSL database and the DB2, DB3, DB5, and DB7 datasets of the NinaPro database, the proposed method achieves accuracies of 99.31%, 97.11%, 96.58%, 96.12%, and 97.32% with within-subject training; 99.78%, 97.22%, 95.46%, 97.23%, and 97.72% with all-subjects training; and 97.43%, 94.68%, 89.66%, 91.55%, and 94.81% with leave-one-subject-out training.


