TY - ADVS
T1 - The Madness of Crowds
A2 - Redhead, Tracy
A2 - Lidbo, Hakan
A2 - Leigh, Ginger
A2 - Brown, Trevor
A2 - Rutherford, Jonathan
PY - 2016
Y1 - 2016
N2 - This unique collaboration between Hakan Lidbo (SE), Tracy Redhead (AU/AT), Synthestruct (US), Trevor Brown (AU) and Jonathan Rutherford (AU), working at the Ars Electronica Futurelab, takes audience interaction to the next level. During this live improvised performance the audience will decide the composition of the music. The audience will jam with the performers using one of the world’s largest MIDI controllers – three giant, bouncing cubes developed by Hakan Lidbo and Per Olov Jernberg – and the Gesture-based Interactive Remixable Dancefloor (GIRD) gloves, an award-winning prototype developed by Tracy Redhead and Jonathan Rutherford. The audience will also drive the performance’s pulsing interactive, audio-reactive projections using “react();”, a dance visualizer designed by Synthestruct. The music has been composed and designed by Tracy Redhead and Trevor Brown, who will also be part of the improvised performance. The audience becomes the soul of the performance, highlighting the participatory nature of music and dance as an art form. The project uses Kinect and gesture sensors, the IRCAM R-IoT sensor, Processing, Ableton Live, Max 7 and Max for Live in combination with musicians performing in real time.
UR - http://www.tracyredhead.com/?q=content/madness-crowds
M3 - Performance
T2 - Ars Electronica
Y2 - 8 September 2016 through 12 September 2016
ER -