EmbodiNet: Enriching Distributed Musical Collaboration Through Embodied Interactions

Abstract: This paper presents EmbodiNet, a novel system that augments distributed performance with dynamic, real-time, hands-free control over several aspects of the musicians’ sound, enabling them to seamlessly change volume, affect reverb and adjust their mix. Musical performance is a demanding activity necessitating multiple levels of communication among its participants, as well as a certain degree of creativity, playfulness and spontaneity. As a result, distributed musical performance presents a challenging application area for the “same time/different place” category of Computer-Supported Cooperative Work (CSCW). In fact, musicians wishing to play together over a network are typically limited by tools that differ little from standard videoconferencing. Instead, we propose leveraging the technology inherent to the distributed context towards meaningfully augmenting collaborative performance. In order to do so without introducing new paradigms that may require learning or that may distract musicians from their primary task, we have designed and evaluated embodied controls that capitalize on existing interpersonal interactions. Further designed to restore the spatial properties of sound that are typically absent in the distributed context, and to apply the notion of “shared space” found in CSCW research, EmbodiNet also helps confer a greater level of co-presence than standard distributed performance systems. This paper describes the implementation of EmbodiNet, along with the results of a long-term collaboration and experiment with a three-piece band. The long-term collaboration helped illustrate the benefits of augmenting an artistic form of distributed collaboration, and resulted in a system that not only helped enhance our users’ sense of enjoyment and self-expression, but one that they would also likely use in the future.
Document type:
Conference paper
15th Human-Computer Interaction (INTERACT), Sep 2015, Bamberg, Germany. Lecture Notes in Computer Science, LNCS-9297 (Part II), pp. 1-19, 2015, Human-Computer Interaction – INTERACT 2015. DOI: 10.1007/978-3-319-22668-2_1

https://hal.inria.fr/hal-01599860
Contributor: Hal Ifip
Submitted on: Monday, October 2, 2017 - 15:41:14
Last modified on: Tuesday, October 3, 2017 - 14:45:29

File

346942_1_En_1_Chapter.pdf
Files produced by the author(s)

License

Distributed under a Creative Commons Attribution 4.0 International License

Identifiers

HAL Id: hal-01599860
DOI: 10.1007/978-3-319-22668-2_1

Citation

Dalia El-Shimy, Jeremy Cooperstock. EmbodiNet: Enriching Distributed Musical Collaboration Through Embodied Interactions. 15th Human-Computer Interaction (INTERACT), Sep 2015, Bamberg, Germany. Lecture Notes in Computer Science, LNCS-9297 (Part II), pp. 1-19, 2015, Human-Computer Interaction – INTERACT 2015. DOI: 10.1007/978-3-319-22668-2_1. hal-01599860
