Gait Recognition from Motion Capture Data

Authors

BALÁŽIA Michal, SOJKA Petr

Year published: 2018
Type: Article in a peer-reviewed journal
Journal / Source: ACM Transactions on Multimedia Computing, Communications and Applications (TOMM), special issue on Representation, Analysis and Recognition of 3D Humans
Faculty / MU workplace:

Faculty of Informatics

DOI: http://dx.doi.org/10.1145/3152124
Field: Informatics
Keywords: gait recognition
Description: Gait recognition from motion capture data, as a pattern classification discipline, can be improved by the use of machine learning. This paper contributes to the state of the art with a statistical approach for extracting robust gait features directly from raw data by a modification of Linear Discriminant Analysis with the Maximum Margin Criterion. Experiments on the CMU MoCap database show that the suggested method outperforms thirteen relevant methods based on geometric features, as well as a method that learns features by a combination of Principal Component Analysis and Linear Discriminant Analysis. The methods are evaluated in terms of the distribution of biometric templates in their respective feature spaces, expressed by a number of class-separability coefficients and classification metrics. Results also indicate high portability of the learned features: we can learn in which aspects of walking people generally differ and extract those as general gait features. Recognizing people without group-specific features is convenient, as particular people might not always provide annotated learning data. As a contribution to reproducible research, our evaluation framework and database have been made publicly available. This research makes motion capture technology directly applicable to human recognition.
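The core idea of the Maximum Margin Criterion is to learn a projection that maximizes the trace of the difference between the between-class and within-class scatter matrices, so the leading eigenvectors of that difference serve as the feature-extraction basis. The sketch below is a minimal illustration of this idea on flattened gait samples, assuming NumPy and a generic (n_samples, n_dims) data layout; the function name mmc_features and its parameters are hypothetical, and this is not the authors' published implementation.

```python
import numpy as np

def mmc_features(X, y, n_components):
    """Sketch of Maximum-Margin-Criterion feature learning.

    X: (n_samples, n_dims) raw gait samples (e.g. flattened joint
    trajectories); y: identity labels. Returns a projection matrix
    whose columns are the top eigenvectors of S_b - S_w.
    """
    mean_total = X.mean(axis=0)
    dims = X.shape[1]
    S_b = np.zeros((dims, dims))  # between-class scatter
    S_w = np.zeros((dims, dims))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        diff = (mean_c - mean_total)[:, None]
        S_b += len(Xc) * diff @ diff.T
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
    # MMC maximizes tr(W^T (S_b - S_w) W): keep the eigenvectors
    # with the largest eigenvalues of the (symmetric) difference.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:n_components]]

# Hypothetical usage: learn the projection on annotated data, then
# compare projected gait templates, e.g. with a nearest-neighbour rule.
# W = mmc_features(X_train, y_train, n_components=20)
# templates = X_train @ W
```

Unlike classical LDA, this formulation avoids inverting the within-class scatter matrix, which is what makes it applicable directly to high-dimensional raw data.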
