
Using interactive machine learning to sonify visually impaired dancers' movement

This preliminary research investigates the application of Interactive Machine Learning (IML) to sonify the movements of visually impaired dancers. Using custom wearable devices with localized sound, our observations demonstrate how sonification enables the communication of time-based information about movements, such as phrase length and periodicity, and nuanced information, such as magnitudes and accelerations. The work raises a number of challenges regarding the application of IML to this domain. In particular, we identify a need for ensuring even rates of change in regression models when performing sonification, and a need to consider how to convey machine learning approaches to end users.
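The abstract's point about even rates of change can be illustrated with a minimal sketch: a regression maps a movement feature (here, an assumed normalised acceleration magnitude) to a sound parameter (pitch), and the output is smoothed so the sonified parameter does not jump abruptly. All function names, features, and values below are illustrative assumptions, not the paper's actual IML pipeline.

```python
# Hypothetical sketch of regression-based sonification with output
# smoothing. Names and values are illustrative, not from the paper.

def fit_linear(xs, ys):
    """Least-squares fit y = a*x + b over demonstration pairs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def smooth(values, alpha=0.3):
    """Exponential smoothing to limit abrupt jumps in the mapped
    parameter -- one simple way to keep its rate of change even."""
    out, prev = [], values[0]
    for v in values:
        prev = alpha * v + (1 - alpha) * prev
        out.append(prev)
    return out

# Demonstration pairs a user might provide interactively:
# low acceleration maps to low pitch, high to high.
accel = [0.0, 0.5, 1.0]        # assumed normalised acceleration
pitch = [220.0, 440.0, 660.0]  # assumed target pitch in Hz
a, b = fit_linear(accel, pitch)

# Map a stream of live readings, then smooth the pitch trajectory.
stream = [0.1, 0.9, 0.2, 0.8]
mapped = [a * x + b for x in stream]
smoothed = smooth(mapped)
```

In an IML workflow, the demonstration pairs would come from the end user moving while a facilitator labels target sounds, and the model would be retrained as new examples arrive; the smoothing step addresses the uneven-rate-of-change issue the authors identify.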
© Copyright 2016 Proceedings of the 3rd International Symposium on Movement and Computing. Published by ACM Press. All rights reserved.

Bibliographic Details
Subjects:
Notations: sports for the handicapped; technical and natural sciences
Tagging: sonification; machine learning; blind people
Published in: Proceedings of the 3rd International Symposium on Movement and Computing
Language: English
Published: New York: ACM Press, 2016
Series: MOCO '16
Online Access: http://doi.acm.org/10.1145/2948910.2948960
Pages: 40
Document types: article
Level: advanced