Comparison of Single-Shot and Two-Shot Deep Neural Network Models for Pose Estimation in an Assisted Living Application
Keywords:
Pose Estimation, Real-Time, Transfer Learning, Key Point
Abstract
Estimating human posture from an image or video is an essential task in computer vision. Pose estimation detects body key points from camera images, enabling posture and gesture recognition technology for applications such as assisted living (e.g., fall detection), yoga pose identification, character animation, and autonomous drone control. The rapid development of AI-based pose estimation algorithms for image recognition has made fast and dependable solutions available for recognizing human body joints in captured video. A major challenge in human pose estimation is maintaining high accuracy in real time under changing ambient conditions. The goal of the proposed transfer learning-based pose estimation task is to achieve real-time speed with virtually no drop in accuracy. In this paper, an assisted living program (ALP) is implemented using a single-shot deep pose estimation network and an angular feature derived from pose key points. Experimental results show that the transfer learning-based model identifies key points and estimates posture at a frame rate of about 30 frames per second, with a detection accuracy of 96.81%.
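As a concrete illustration of the key-point angular feature mentioned above, the following is a minimal sketch, assuming 2D (x, y) key points as produced by a single-shot pose detector. The shoulder/hip/knee coordinates, the joints chosen, and any downstream threshold are illustrative assumptions, not the authors' exact pipeline.

```python
import math

def joint_angle(a, b, c):
    """Angle at vertex b (in degrees) formed by 2D key points a-b-c."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0])
        - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

# Illustrative (x, y) key points for shoulder, hip, and knee -- hypothetical values.
shoulder, hip, knee = (0.52, 0.30), (0.50, 0.55), (0.49, 0.80)

# A hip angle near 180 degrees indicates a straight torso-leg line (standing or lying);
# a sharply bent angle can feed a rule- or classifier-based posture label such as sitting.
print(f"hip angle: {joint_angle(shoulder, hip, knee):.1f} deg")
```

Computing such angles per frame keeps the feature vector small, which is one way a key-point-based classifier can sustain real-time frame rates on top of the detector.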
License
Copyright (c) 2024 May Phyo Ko, Chaw Su
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.