This week, our Programming and Technical Implementation member, Aileen Chen, had the opportunity to experiment with the Kinect camera. Using the Kinect’s depth sensor, Aileen connected the camera to the interactivity she had previously developed and combined it with the wave visuals created by our Media Creation lead, Max Wentzlaff. As shown in the video below, the system now captures and displays an image of whatever the Kinect is tracking in real time.
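The post doesn’t say which framework Aileen built this in, but the core of depth-based tracking is straightforward: threshold each depth frame so that only pixels within a plausible “body” distance band survive, then draw that mask every frame. Below is a minimal Python sketch of the idea; `get_depth_frame` is a hypothetical stand-in for a real driver call (the Python freenect bindings expose something similar), and the depth units and band limits are assumptions.

```python
import numpy as np

# Hypothetical stand-in for a real Kinect driver call (e.g. freenect's
# sync_get_depth()); a synthetic frame keeps the sketch self-contained.
def get_depth_frame(h=480, w=640):
    return np.random.randint(0, 2048, size=(h, w), dtype=np.uint16)

def silhouette_mask(depth, near=500, far=1500):
    """Keep only pixels whose depth falls inside an assumed 'body' band.

    Anything nearer than `near` or farther than `far` is treated as
    background; what remains is the person standing in front of the
    sensor. Actual depth units depend on the driver, so these limits
    are placeholder values.
    """
    return (depth > near) & (depth < far)

depth = get_depth_frame()
mask = silhouette_mask(depth)
# mask is a boolean image: True where the tracked body is. Redrawing it
# every frame yields a live image of whatever the Kinect is tracking.
print(f"{mask.mean():.1%} of pixels fall inside the body band")
```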
Aileen also introduced a feature where users can move their bodies in front of the camera, causing the particles on the screen to shift in response. While the final direction for the interactivity hasn’t been fully determined yet, largely because the team is still working up the Kinect’s learning curve, this movement feature is an exciting option for the final implementation.
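The post doesn’t describe the mechanism Aileen used, but one common way to make particles respond to body movement is frame differencing: compare successive depth frames and displace any particle sitting where the depth changed. A rough sketch under that assumption, with synthetic frames standing in for the live Kinect feed:

```python
import numpy as np

rng = np.random.default_rng(42)

# 200 particles positioned in depth-image coordinates (row, col).
particles = rng.uniform([0, 0], [480, 640], size=(200, 2))

def apply_motion(particles, prev_depth, curr_depth, threshold=50, push=5.0):
    """Displace particles sitting on pixels whose depth changed between frames.

    Frame differencing is one simple way to turn body movement into a
    particle response: a large depth delta at a pixel means something
    moved there, so particles at that spot get nudged.
    """
    moved = np.abs(curr_depth.astype(int) - prev_depth.astype(int)) > threshold
    rows = particles[:, 0].astype(int).clip(0, moved.shape[0] - 1)
    cols = particles[:, 1].astype(int).clip(0, moved.shape[1] - 1)
    hit = moved[rows, cols]
    # Push the affected particles in a random direction, scaled by `push`.
    particles[hit] += rng.uniform(-push, push, size=(int(hit.sum()), 2))
    return particles

# Two synthetic frames: a block of pixels "moves" between them.
prev = np.full((480, 640), 1000, dtype=np.uint16)
curr = prev.copy()
curr[200:300, 250:350] = 800   # depth drop where a body part swept through
particles = apply_motion(particles, prev, curr)
```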
As a next step, Aileen plans to enhance the interactivity so that users can manipulate the wave-shaped particles directly, adding more ripple effects if feasible. If time permits, she would also like to incorporate audio reactivity using the ambient soundscape Max has created, further immersing users in a responsive, sensory experience.
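If the ripple effects do make it in, one classic technique that suits a particle or wave display is the two-buffer water-ripple simulation; this is just one plausible approach, not necessarily the one Aileen will take. A minimal sketch:

```python
import numpy as np

def step_ripple(prev, curr, damping=0.97):
    """One step of the classic two-buffer water-ripple algorithm.

    Each cell becomes the average of its neighbours in the current frame,
    minus its own value two frames ago, then decays; a disturbance spreads
    outward as expanding rings. np.roll wraps at the edges, which is
    acceptable for a sketch.
    """
    neighbours = (np.roll(curr, 1, axis=0) + np.roll(curr, -1, axis=0) +
                  np.roll(curr, 1, axis=1) + np.roll(curr, -1, axis=1))
    nxt = (neighbours / 2.0 - prev) * damping
    return curr, nxt  # the current frame becomes the new "previous" frame

h, w = 120, 160
prev, curr = np.zeros((h, w)), np.zeros((h, w))
curr[h // 2, w // 2] = 100.0   # a single "touch" seeds the ripple
for _ in range(30):
    prev, curr = step_ripple(prev, curr)
# `curr` is now a height field; mapping it to particle offsets or pixel
# brightness produces the visible rings.
print(f"peak ripple height after 30 steps: {curr.max():.2f}")
```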