
Using interactive machine learning to sonify visually impaired dancers' movement


This preliminary research investigates the application of Interactive Machine Learning (IML) to sonify the movements of visually impaired dancers. Using custom wearable devices with localized sound, our observations demonstrate how sonification enables the communication of time-based information about movements, such as phrase length and periodicity, and of nuanced information, such as magnitudes and accelerations. The work raises a number of challenges regarding the application of IML to this domain. In particular, we identify a need for ensuring even rates of change in regression models when performing sonification, and a need to consider how to convey machine learning approaches to end users.
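The IML workflow the abstract refers to can be sketched as follows: a designer records example (movement feature, sound parameter) pairs, a regression model is fit to them, and the model then maps live sensor readings to sound parameters. Everything below is an illustrative assumption, not the authors' implementation; the feature (accelerometer magnitude), target (pitch), and demonstration values are hypothetical.

```python
# Hypothetical sketch of the IML regression-sonification loop: fit a simple
# least-squares line to user-demonstrated pairs, then interpolate in real time.
# Feature, target, and example values are assumptions for illustration only.

def fit_linear(pairs):
    """Least-squares fit of sound_param = a * feature + b from demo pairs."""
    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Demonstration pairs: accelerometer magnitude (g) -> synth pitch (Hz)
demos = [(0.0, 220.0), (1.0, 440.0), (2.0, 660.0)]
a, b = fit_linear(demos)

def sonify(magnitude):
    """Map a live sensor reading to a pitch via the learned regression."""
    return a * magnitude + b

print(sonify(0.5))  # 330.0
```

A linear map like this also illustrates the "even rate of change" concern the abstract raises: equal increments of movement produce equal increments of sound, whereas more expressive nonlinear regressors may change unevenly across the input range.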
© Copyright 2016 Proceedings of the 3rd International Symposium on Movement and Computing. Published by ACM Press. All rights reserved.

Bibliographic details
Keywords:
Notations: Parasport; Science and technology
Tags: sonification; machine learning; blind people
Published in: Proceedings of the 3rd International Symposium on Movement and Computing
Language: English
Published: New York: ACM Press, 2016
Series: MOCO '16
Online access: http://doi.acm.org/10.1145/2948910.2948960
Pages: 40
Document type: Article
Level: high