Machine Learning Approach for Ibing Penca Stance Recognition Using Landmark Detection and Angle-Based Classification
DOI:
https://doi.org/10.35877/454RI.asci4539

Keywords:
Angle Heuristics, Classification, Ibing Penca, Keypoint, Project-Based Learning (PjBL), Stance

Abstract
This research addresses the problem of how students can learn independently when no assistant teacher is present. The main objective is to build an application that recognizes the stances of the Ibing Penca martial art, thereby supporting independent practice by detecting and classifying 62 Ibing Penca stances. Input data in the form of images or videos is captured with an Orbbec camera. The images are then processed with landmark detection to locate 33 keypoints on the body using the MediaPipe algorithm. From these keypoints, six joint angles are computed, covering the right arm, left arm, right leg, left leg, right foot, and left foot; each angle is calculated from the three relevant keypoints. The results show that the system classifies the 62 Ibing Penca stances with a success rate of 95.2% (58 stances), with an error rate of only 4.8% (4 stances), making it useful for learners who want to practice without an instructor. Future work should extend the system with additional movement variations and improve detection accuracy under more diverse environmental conditions.
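The abstract describes computing each joint angle from three relevant keypoints. The paper does not publish its implementation, but a minimal sketch of that angle heuristic might look as follows, assuming 2D landmark coordinates (as produced by MediaPipe Pose) and that the angle is measured at the middle keypoint of the triple; the function name `joint_angle` and the example coordinates are illustrative, not taken from the paper:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b, formed by the segments b->a and b->c.

    a, b, c are (x, y) landmark coordinates, e.g. shoulder-elbow-wrist
    for an arm angle. The result is folded into the range [0, 180].
    """
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360.0 - ang if ang > 180.0 else ang

# Example: a right angle at the "elbow" (0, 0), with the "shoulder"
# straight above and the "wrist" straight to the right.
print(joint_angle((0.0, 1.0), (0.0, 0.0), (1.0, 0.0)))  # 90.0
```

A stance classifier along the lines described in the abstract would compute six such angles per frame (right/left arm, right/left leg, right/left foot) and compare the resulting angle vector against reference ranges for each of the 62 stances.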
License
Copyright (c) 2026 Ratnadewi, Agus Prijono, Aan Darmawan Hangkawidjaja, Sri Rustiyanti, Deri Al Badri (Author)

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.


