Improved Competitive Neural Network for Classification of Human Postures Based on Data from RGB-D Sensors

Authors

V. Dutta, J. Cydejko, T. Zielinska

Keywords: Human motion, Posture classification, Human activity, Competitive neural network, Classifier

Abstract

The cognitive goal of this paper is to assess whether marker-less motion capture systems provide sufficient data to recognize human postures in the side view. The research goal is to develop a new posture classification method that allows human activities to be analysed using data recorded by RGB-D sensors. The method is insensitive to the duration of the recorded activity and gives satisfactory results for the sagittal plane. An improved competitive Neural Network (cNN) was used. The method of preprocessing the data is discussed first. Then, a method for classifying human postures is presented. Finally, the classification quality obtained with various distance metrics is assessed.
Data sets covering a selection of human activities were created. Postures typical for these activities were identified using the classifying neural network. The classification quality obtained with the proposed cNN was compared against that of two other popular neural networks. The results confirmed the advantage of the cNN. The developed method makes it possible to recognize human postures by observing movement in the sagittal plane.
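This page does not include the authors' code. As a purely illustrative sketch of the kind of classifier named in the abstract, the Python snippet below implements a basic winner-take-all competitive layer with a pluggable distance metric; the feature dimensions, metric names, learning-rate schedule, and prototype-labelling step are assumptions for demonstration, not the improved cNN described in the paper.

```python
# Illustrative sketch only (not the authors' implementation): a winner-take-all
# competitive layer trained with classic competitive learning, using a
# configurable distance metric for the winner selection.
import numpy as np

def distance(x, W, metric="euclidean"):
    """Distance from sample x to every prototype row of W."""
    if metric == "euclidean":
        return np.linalg.norm(W - x, axis=1)
    if metric == "manhattan":
        return np.abs(W - x).sum(axis=1)
    raise ValueError(f"unknown metric: {metric}")

def train_competitive(X, n_units, metric="euclidean", lr=0.1, epochs=50, seed=0):
    """Competitive learning: only the winning prototype is pulled toward the sample."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)  # init from data
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            j = int(np.argmin(distance(x, W, metric)))   # winning unit
            W[j] += lr * (x - W[j])                      # move winner toward sample
        lr *= 0.95                                       # simple learning-rate decay
    return W

def label_units(X, y, W, metric="euclidean"):
    """Assign each prototype the majority class of the training samples it wins."""
    winners = np.array([np.argmin(distance(x, W, metric)) for x in X])
    labels = np.full(len(W), -1, dtype=int)
    for j in range(len(W)):
        hits = y[winners == j]
        if len(hits):
            labels[j] = np.bincount(hits).argmax()
    return labels

def classify(X, W, unit_labels, metric="euclidean"):
    return np.array([unit_labels[np.argmin(distance(x, W, metric))] for x in X])

if __name__ == "__main__":
    # Synthetic six-dimensional "posture" vectors standing in for features
    # extracted from RGB-D skeleton data (e.g., joint angles).
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(m, 0.1, size=(100, 6)) for m in (0.0, 0.5, 1.0)])
    y = np.repeat([0, 1, 2], 100)
    W = train_competitive(X, n_units=6, metric="euclidean")
    preds = classify(X, W, label_units(X, y, W))
    print("training accuracy:", (preds == y).mean())
```

The synthetic vectors in the usage block are placeholders; the paper's actual preprocessing of RGB-D recordings and its improvements to the competitive layer are described in the full text.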


Published
22.02.2024
Issue
17(3)
Section
Articles

How to Cite

Dutta, V., Cydejko, J., & Zielinska, T. (2024). Improved Competitive Neural Network for Classification of Human Postures Based on Data from RGB-D Sensors. Journal of Automation, Mobile Robotics and Intelligent Systems, 17(3), 15-28. https://doi.org/10.14313/JAMRIS/3-2023/19
