TY - ADVS
T1 - GIRD (Gesture-based Interactive Dance Floor)
T2 - Music Tech Fest - Berlin
A2 - Redhead, Tracy
A2 - Rutherford, Jonathan
PY - 2016
Y1 - 2016
N2 - As technology changes the way we produce and experience music, it presents many creative challenges to musicians and the music industry. In this post-digital environment, musicians now have the tools and opportunity to completely reinvent forms of recorded music. GIRD is a gesture-based interactive audio and lighting system that allows audiences to remix, explore and interact with the music and lights through dancing and gestures. GIRD also gives performers and producers tools to create interactive audio and musical works. The prototype consists of a glove containing IRCAM’s R-IoT sensor, a Max for Live patch controller and five NeoPixel LED lights. The patch allows musicians and producers to design dynamic and fluid music that can be explored using gesture and movement. The lighting in the environment plays a vital role: the system uses five individual lighting fixtures built from individually programmable “NeoPixel” LEDs. The fixtures have several modes: 1) creating atmosphere for the music, 2) gestural control of the lighting, and 3) providing interaction feedback to guide the user based on the music being interacted with. This project began in September 2015 when Tracy Redhead and Jonathan Rutherford participated in a hackathon at Music Tech Fest in Ljubljana (#mtfcentral). Their idea was chosen for a three-month incubation as part of the European Commission-funded #MusicBricks initiative. The GIRD prototype was launched at MTF Berlin in 2016 and was featured in the Transhumanism performance with Viktoria Modesta as part of the MTF Performance Lab.
AB - As technology changes the way we produce and experience music, it presents many creative challenges to musicians and the music industry. In this post-digital environment, musicians now have the tools and opportunity to completely reinvent forms of recorded music. GIRD is a gesture-based interactive audio and lighting system that allows audiences to remix, explore and interact with the music and lights through dancing and gestures. GIRD also gives performers and producers tools to create interactive audio and musical works. The prototype consists of a glove containing IRCAM’s R-IoT sensor, a Max for Live patch controller and five NeoPixel LED lights. The patch allows musicians and producers to design dynamic and fluid music that can be explored using gesture and movement. The lighting in the environment plays a vital role: the system uses five individual lighting fixtures built from individually programmable “NeoPixel” LEDs. The fixtures have several modes: 1) creating atmosphere for the music, 2) gestural control of the lighting, and 3) providing interaction feedback to guide the user based on the music being interacted with. This project began in September 2015 when Tracy Redhead and Jonathan Rutherford participated in a hackathon at Music Tech Fest in Ljubljana (#mtfcentral). Their idea was chosen for a three-month incubation as part of the European Commission-funded #MusicBricks initiative. The GIRD prototype was launched at MTF Berlin in 2016 and was featured in the Transhumanism performance with Viktoria Modesta as part of the MTF Performance Lab.
UR - https://mtflabs.net/gird/
UR - https://www.youtube.com/watch?v=g7YYoXM2kBU
M3 - Software
Y2 - 23 May 2016 through 28 February 2022
ER -