Visitors entering the space are tracked by infrared depth cameras. Their bodies are transformed into contours projected as white lines alongside the architecture.
Tracking data from the depth camera is used to generate MIDI values. Users trigger sounds by merging their contours with the virtual architecture: their movement, and its relation to certain architectural features, is translated into sound.
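The contour-to-MIDI mapping could look something like the following minimal sketch. The feature rectangle, note range, and position-to-note/velocity mapping are illustrative assumptions, not the installation's actual code:

```python
# Hypothetical sketch: emit MIDI note/velocity values when a tracked
# contour point overlaps a virtual architectural feature. The feature
# geometry and the mapping rules are assumptions for illustration.

def point_in_rect(p, rect):
    """Return True if point p = (x, y) lies inside rect = (x, y, w, h)."""
    x, y = p
    rx, ry, rw, rh = rect
    return rx <= x <= rx + rw and ry <= y <= ry + rh

def to_midi(p, rect, note_low=48, note_high=72):
    """Map horizontal position inside the feature to a MIDI note number
    and vertical position to velocity; return None when there is no
    overlap (i.e. no sound is triggered)."""
    if not point_in_rect(p, rect):
        return None
    x, y = p
    rx, ry, rw, rh = rect
    note = note_low + round((x - rx) / rw * (note_high - note_low))
    velocity = 127 - round((y - ry) / rh * 127)  # higher in frame = louder
    return note, velocity

feature = (100, 50, 200, 150)        # x, y, width, height in camera pixels
print(to_midi((200, 110), feature))  # contour point inside the feature
print(to_midi((50, 50), feature))    # outside: no sound -> None
```

In a real setup these values would be sent to a synthesizer as note-on messages; here the mapping alone conveys the idea that where the body meets the architecture determines what is heard.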
The augmented environment itself reacts to audio input. Sounds captured by a microphone are added as noise to the architectural line structure.
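One way to realize this audio-to-visual coupling is to measure the loudness of each microphone buffer and jitter the projected line points proportionally. This sketch is an assumption about the mechanism; buffer format and scaling are invented for illustration:

```python
# Hypothetical sketch: perturb projected contour lines with random
# jitter scaled by microphone loudness (RMS of an audio buffer).
import math
import random

def rms(samples):
    """Root-mean-square loudness of one audio buffer (values in -1..1)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def add_noise(points, loudness, max_jitter=8.0, rng=None):
    """Displace each (x, y) contour point by random jitter whose
    amplitude grows with the captured loudness (0.0 = silence)."""
    rng = rng or random.Random(0)
    jitter = loudness * max_jitter
    return [(x + rng.uniform(-jitter, jitter),
             y + rng.uniform(-jitter, jitter)) for x, y in points]

line = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
quiet = add_noise(line, rms([0.0, 0.0]))   # silence: line stays clean
loud = add_noise(line, rms([0.8, -0.8]))   # loud input: line gets noisy
```

Run per frame, this makes the projected architecture visibly tremble with the sound in the room, closing the loop between audio and visuals.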
Human interaction creates a feedback loop between visuals and audio, bringing the virtual representation of the space to life.
The installation can be extended to an augmented audiovisual dance performance. Dancers explore the possibilities of being part of the projection and interacting with the architecture.
Duration (minutes)
10
What is needed
power source
video projector, min. resolution 1024x768
VGA/DVI/HDMI cable to projector
tripod for projector/Kinect mounting
stereo audio speakers