TY - ADVS
T1 - Semantic Machine Prototype Launch
T2 - FAST Industry Day
A2 - Redhead, Tracy
PY - 2018
Y1 - 2018
N2 - The Semantic Machine is a project developed by Tracy Redhead and Florian Thalmann. It uses the Semantic Player technology to create a song that changes based on the weather, time of day and location of the listener. The song changes just as we all do, depending on the time of day and the weather; it's as if it has a mind of its own. The work is part of Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption (FAST), a five-year research project funded by EPSRC. The project partners include Queen Mary University of London, The University of Nottingham, Oxford e-Research Centre, Abbey Road Red, Internet Archive, BBC, Audio Labs and Solid State Logic. “The project brings the very latest technologies to bear on the entire recorded music industry, end-to-end, producer to consumer, making the production process more fruitful, the consumption process more engaging, and the delivery and intermediation more automated and robust.” (FAST website). The Semantic Machine is a recorded work that adapts to the listener's environment: the location's coordinates, the local temperature and weather, and the time of day all affect how the music is played back. For example, if the work is experienced in the early morning, the tempo may differ from that heard in the late afternoon or evening; if it is snowing, the work may play back with a warmer tonal mix. All of these control factors are designed during the composition and modelling processes. The project has been developed over the past two years and has included a number of residencies at Queen Mary University of London.
UR - https://dynamic-music.github.io/semantic-machine
UR - http://www.tracyredhead.com/?q=content/semantic-machine
M3 - Digital or Visual Products
PB - Queen Mary University of London
CY - Abbey Road Studios
Y2 - 25 October 2018 through 25 October 2018
ER -