From the point of view of content, COMEDIA began by specifying a number of scenarios or “use cases” that the partners were interested in exploring individually and/or collectively in order to create the content for the public events. These are presented in extenso in the Interim Report for 2008. Out of the original list of fourteen scenarios, nine were realized in 2009, with variations or incorporating certain aspects of other scenarios. Each scenario type (in italics) is followed by one or more examples of its realization.
• COMEDIA Bus : the technical backbone of the common platform of the project aimed at making connection easy and efficient.
The implementation of this scenario was only partially realized, mainly because of the current state of the art in networking technologies and practices as a whole. Since the goal of the project is not a technological one, this situation will most likely persist throughout the rest of the project.
• Technology lecture/training network event : a typical conference situation with feedback from a distant audience.
This was typified in the IRCAM Forum workshops, where a double stream enabled Forum members to follow software training and demonstration lectures. Further examples took place at HCMF and IRCAM, where networking is now common practice in education, research and conferences.
• Robotic ensemble remote concert and/or installation : a collection of musical robots (automats) interacting over the network.
In the IEM project, robotic instruments are placed at different sites on the network. Each instrument is paired with a sensory set-up that captures audio and/or any other generative data source. The data is delivered to the other sites and interpreted according to the capabilities of the local robotic instrument, either directly or in interaction with locally generated data. Each site has a different sound context and character; how the data is rendered depends on the characteristics of the sending and receiving instruments (e.g. automatic piano player, drum-android, …)
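The data flow described above — capture at one site, delivery over the network, interpretation within the capabilities of the local robot — can be sketched roughly as follows. All names (message fields, instrument capability profiles, pitch ranges) are illustrative assumptions for the sketch, not details of the actual IEM implementation.

```python
import json

# Hypothetical capability profiles for two robotic instruments.
# The pitch ranges are illustrative, not the IEM values.
INSTRUMENTS = {
    "player_piano": {"pitch_min": 21, "pitch_max": 108},
    "drum_android": {"pitch_min": 35, "pitch_max": 59},
}

def encode_event(source, pitch, velocity):
    """Serialize a captured sensor/audio event for network delivery."""
    return json.dumps({"src": source, "pitch": pitch, "vel": velocity}).encode()

def interpret(packet, instrument):
    """Map an incoming event onto what the local robot can actually play:
    pitches outside its range are folded back in by octave transposition."""
    event = json.loads(packet.decode())
    spec = INSTRUMENTS[instrument]
    pitch = event["pitch"]
    while pitch < spec["pitch_min"]:
        pitch += 12
    while pitch > spec["pitch_max"]:
        pitch -= 12
    return {"pitch": pitch, "vel": event["vel"]}

# A low note captured at one site arrives at the drum-android,
# which transposes it up into its own playable range.
packet = encode_event("site_A", 24, 80)
print(interpret(packet, "drum_android"))  # {'pitch': 36, 'vel': 80}
```

In a real deployment the encoded packet would travel over a network transport (e.g. UDP) rather than being handed to `interpret` directly; the point here is only that each receiving site interprets the same data within its own instrument's constraints.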
• Multi-way performance with remote avatars : multi-site performance with certain aspects of the performance transmitted in “symbolic” form (visual and auditory) over the network.
Examples of this are aspects of the UGDIST, HFMT and IRCAM “Invisible Line” installation and the HCMF workshop (8/5/2009), where the avatar of the remote performer helps to differentiate between the « voices » of the local and the remote musicians.
• Remote performance : “classic” performance or rehearsal over the network (as practiced by the New World Symphony in Miami or the Metropolitan Opera in New York, for example).
Examples of this scenario are the IRCAM dress rehearsal with Henri Dutilleux (7/1/09) and the contrabass masterclass with François Rabbath (29/4/09) and the New World Symphony.
• Distributed ensemble & real time composition : musical performance depending on composition done in real time over the network with interaction from the performers.
The European Bridges Ensemble (HFMT and HCMF) performed a network concert with Johannes Kretz playing from Vienna, connected via network with the concert at the TU Berlin (an associated member). Software was written to adapt Quintet.net (a network-based composition environment) to a WFS sound spatialization system.
A distributed performance, part of the opening concert at SMC (the Sound and Music Computing Conference), included a version of “Disparate Bodies” (by Pedro Rebelo, SARC) and a Duet (Parker/Perry). The event was produced in collaboration with the University of Bournemouth and constituted an important milestone in addressing video transmission issues in network performance.
A series of distributed recording sessions took place between SARC, Rensselaer Polytechnic Institute (NY), and Banff (Canada). The main objective of these sessions was to record audio and visual materials at three locations simultaneously, with a view to better understanding how to present multiple perspectives in a network performance context.
• Two way interactive musical performance : features highly synchronized and reactive performance between two musicians.
This scenario was realized, for example, via a two-way network improvisation between the HFMT in Hamburg (Frank Gratkowski, saxophone) and CNMAT at Berkeley (David Wessel, electronics).
• Motion-enabled live electronics : using the motion capture data picked up on one site to control the sound (synthesis, spatialisation) at another site.
Via the installation “Invisible Line” (Cera & Canepa), a collaborative artistic project and scientific experiment between IRCAM, UGDIST and HFMT took place, focusing on the analysis of expressive full-body movement and gesture, and on sound and interface design, for creating a shared, networked and social experience. The musical composition served to reinforce and inform the interaction between the participants in the two locations.
During the Impuls workshop-concert “Enacted Electronics” at IEM, Motion-Enabled Live Electronics (MELE) was streamed over the Internet to remote sites, IRCAM and Medienkunstlabor, where it was interpreted locally for sound production. Additional tracking data was transferred to the remote sites to be interpreted for the different room situations. An Ambisonics system was used to stream the environmental sound, carrying the audio signals and the audio “personality” of the original room.
The HCMF Workshop with dancers and composers aimed at applying human-controlled lasers in network music performance. The laser beams function both as an instrument, producing sound in response to the dancers’ gestures and movements, and simultaneously as a means of visualization or visual counterpoint of the movements or musical gestures of remote performers. An avatar of the remote performer helps to differentiate between the « voices » of the local and the remote musicians. Connecting the sound with laser animation of the remote performer’s actions helps the audience to separate the players’ sounds from each other, and to better understand their musical discourse.
The CIANT Golem Workshop offered young artists a creative opportunity to work experimentally with interactive and networked sound using a motion-capture system, and produced propositions for new choreographies and sound interpretations. The workshop was run with the help of a dancer (Mandafounis of the Lemurius company) in collaboration with CIANT artists (Konjar and Princic), and consisted of motion-capture experiments focused on networked dance, choreography, and sound.
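The common pattern in these motion-enabled scenarios — tracking data sent over the network and interpreted locally for each room’s sound system — might be sketched as below. The particular mapping (a tracked position converted to an azimuth and a distance-based gain) and all parameter names are assumptions chosen for illustration, not the project’s actual mapping.

```python
import math

def motion_to_spat(x, y, ref_dist=1.0):
    """Interpret a tracked 2D position (metres, listener at the origin) as
    local spatialization parameters: an azimuth in degrees and a gain with
    a simple 1/r distance rolloff, clamped to 1.0. Each receiving site can
    apply this with its own room scaling (ref_dist)."""
    azimuth = math.degrees(math.atan2(y, x))
    dist = math.hypot(x, y)
    gain = min(1.0, ref_dist / dist) if dist > 0 else 1.0
    return {"azimuth": azimuth, "gain": gain}

# A performer tracked 2 m directly in front of the listener:
print(motion_to_spat(0.0, 2.0))  # {'azimuth': 90.0, 'gain': 0.5}
```

The point of interpreting the mapping at the receiving end, rather than sending finished audio parameters, is exactly the one made above: the same tracking stream can be rendered differently for each room situation.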
• Networked mediated social interaction : projects where the focus is not so much on the performers but the relationship that is created between the audiences in several different places.
An example of this is the aspect of the UGDIST, HFMT and IRCAM “Invisible Line” installation in which a person present in the installation at one site creates a “relationship” with the person present at the other site.
• Multiple-way performance in a virtual environment : creating an audience space other than that present at each node on the network.