A web-based system for annotation of dance multimodal recordings by dance practitioners and experts


Recent advances in technologies for capturing, analyzing and visualizing movement can revolutionize the way we create, practice and learn dance, and transmit bodily knowledge. The need for meaningful, searchable and re-usable libraries of motion-capture and video movement segments can only be met through the collaboration of technologists and dance practitioners. To this end, manual annotation of these segments by dance experts can play a four-fold role: a) enrich movement libraries with expert knowledge, b) create "ground-truth" datasets against which the results of automated algorithms can be compared, c) foster a dialogue across dance genres and disciplines on movement analysis and conceptualization, and d) raise questions about the subjectivity and diversity of verbal descriptions of movement segments. The web-based application presented in this work is an archival system with browsing, searching, visualization, personalization and textual annotation functionalities. Its main objective is to provide access to a repository of multimodal dance recordings including motion capture data, video and audio, with the further aim of supporting dance education. The tool has been designed and developed within an interdisciplinary project, following a user-centered, iterative design approach involving dance researchers and practitioners of four different dance genres.
© Copyright 2018 Proceedings of the 5th International Conference on Movement and Computing. Published by ACM. All rights reserved.
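As a purely illustrative sketch (the abstract does not disclose the system's actual data model), a segment-level textual annotation in such a repository of multimodal recordings might be represented roughly as follows; all type and field names here are assumptions, not the paper's design:

```typescript
// Hypothetical data-model sketch; names and fields are assumptions only.

/** Modalities mentioned in the abstract: motion capture, video, audio. */
type Modality = "mocap" | "video" | "audio";

/** A multimodal dance recording held in the repository (assumed shape). */
interface Recording {
  id: string;
  title: string;
  danceGenre: string;      // e.g. one of the four genres studied in the project
  modalities: Modality[];
  durationSeconds: number;
}

/** A textual annotation attached by a dance expert to a time segment (assumed shape). */
interface SegmentAnnotation {
  recordingId: string;
  annotatorId: string;     // dance practitioner or expert
  startSeconds: number;    // segment boundaries within the recording
  endSeconds: number;
  text: string;            // free-text verbal description of the movement
  createdAt: string;       // ISO 8601 timestamp
}

// Example annotation, purely illustrative.
const example: SegmentAnnotation = {
  recordingId: "rec-042",
  annotatorId: "expert-07",
  startSeconds: 12.5,
  endSeconds: 18.0,
  text: "Slow spiral of the torso leading into a weight shift to the left.",
  createdAt: "2018-06-28T10:15:00Z",
};

console.log(
  `Annotation covers ${example.endSeconds - example.startSeconds} s of ${example.recordingId}`
);
```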

Bibliographic Details
Keywords:
Notations: Natural sciences and technology; technical sports
Published in: Proceedings of the 5th International Conference on Movement and Computing
Language: English
Published: New York: ACM, 2018
Series: MOCO '18
Online access: https://doi.org/10.1145/3212721.3212722
Pages: 8
Document type: Article
Level: high