Action recognition of Taekwondo unit actions using action images constructed with time-warped motion profiles

Taekwondo has evolved from a traditional martial art into an official Olympic sport. This study introduces a novel action recognition model tailored for Taekwondo unit actions, utilizing joint-motion data acquired via wearable inertial measurement unit (IMU) sensors. The utilization of IMU sensor-measured motion data facilitates the capture of the intricate and rapid movements characteristic of Taekwondo techniques. The model, underpinned by a conventional convolutional neural network (CNN)-based image classification framework, synthesizes action images to represent individual Taekwondo unit actions. These action images are generated by mapping joint-motion profiles onto the RGB color space, thus encapsulating the motion dynamics of a single unit action within a solitary image. To further refine the representation of rapid movements within these images, a time-warping technique was applied, adjusting motion profiles in relation to the velocity of the action. The effectiveness of the proposed model was assessed using a dataset compiled from 40 Taekwondo experts, yielding remarkable outcomes: an accuracy of 0.998, a precision of 0.983, a recall of 0.982, and an F1 score of 0.982. These results underscore the time-warping technique's contribution to enhancing feature representation, as well as the proposed method's scalability and effectiveness in recognizing Taekwondo unit actions.
© Copyright 2024 Sensors. All rights reserved.
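
The abstract describes building "action images" by mapping IMU joint-motion profiles onto the RGB color space and resampling them with a velocity-dependent time warp before CNN classification. The paper's exact procedure is not reproduced here; the sketch below only illustrates one plausible reading of that pipeline. The function names (`time_warp`, `action_image`), the speed-based resampling rule, the x/y/z-to-R/G/B channel mapping, and the per-channel min-max scaling to 8-bit values are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def time_warp(profile, speed, n_out=64):
    """Resample a (T, 3) joint-motion profile so that faster segments
    receive more output samples (assumed warping rule, not the paper's)."""
    # Cumulative "effort": fast portions of the motion accumulate weight faster.
    w = np.cumsum(speed + 1e-8)
    w = (w - w[0]) / (w[-1] - w[0])            # normalize warped time to [0, 1]
    t_out = np.linspace(0.0, 1.0, n_out)
    # Interpolate each axis onto the uniformly spaced warped-time grid.
    return np.stack([np.interp(t_out, w, profile[:, k]) for k in range(3)], axis=1)

def action_image(joint_profiles, n_out=64):
    """Build an RGB action image: one row per joint, one column per warped
    time step, x/y/z motion mapped to the R/G/B channels (assumed mapping)."""
    rows = []
    for p in joint_profiles:                    # each p: (T, 3) array, e.g. accel x/y/z
        speed = np.linalg.norm(np.diff(p, axis=0, prepend=p[:1]), axis=1)
        warped = time_warp(p, speed, n_out)     # (n_out, 3)
        # Min-max normalize each channel to 0..255 so it fits an 8-bit image.
        lo, hi = warped.min(axis=0), warped.max(axis=0)
        rows.append(np.round(255 * (warped - lo) / (hi - lo + 1e-8)))
    return np.stack(rows).astype(np.uint8)      # (n_joints, n_out, 3) RGB image

# Example: 10 joints, 150 raw samples each -> one 10 x 64 RGB "action image"
img = action_image([np.random.randn(150, 3) for _ in range(10)])
print(img.shape)  # (10, 64, 3)
```

Such an image can then be fed to any standard CNN image classifier; the key design idea from the abstract is that the time warp allocates more image columns to the fast portions of a technique, so rapid kicks and strikes are not compressed into a few pixels.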

Bibliographic Details
Subjects:
Notations: combat sports
Tagging: neural networks
Published in: Sensors
Language: English
Published: 2024
Online Access: https://doi.org/10.3390/s24082595
Volume: 24
Issue: 8
Pages: 2595
Document types: article
Level: advanced