
Week 4 Blog: Kinect Camera Testing Insights




This week, our programming and technical implementation member, Aileen Chen, focused on testing the Kinect sensor under various lighting conditions. She set up the Kinect near the first-floor windows of the Stratford Main Campus to see whether it could clearly track moving objects through the glass. During the test, she discovered that the Kinect's depth sensor only works reliably at short range; anything farther away wasn't captured in the TouchDesigner software. To work around this limitation, she switched the Kinect to tracking color instead, which allows it to detect moving objects from a greater distance.
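
For anyone curious what the color-tracking approach looks like in practice, here is a rough sketch of the idea outside TouchDesigner, written in Python with OpenCV. This is not our actual network; the camera index, HSV range, and area threshold are placeholder values. The point is just that isolating a color range and comparing masks between frames can flag distant moving objects without relying on depth.

import cv2

# Sketch of color-based motion tracking (illustration only, not our TouchDesigner setup).
cap = cv2.VideoCapture(0)  # placeholder camera index

prev_mask = None
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Isolate a color range in HSV space; unlike depth, this is not limited to short range.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 60, 60), (30, 255, 255))  # placeholder HSV bounds

    # Compare against the previous frame's mask to find regions that moved.
    if prev_mask is not None:
        motion = cv2.absdiff(mask, prev_mask)
        contours, _ = cv2.findContours(motion, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            if cv2.contourArea(c) > 500:  # ignore small flickers
                x, y, w, h = cv2.boundingRect(c)
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_mask = mask

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()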


While testing the color tracking, Aileen noticed in TouchDesigner that the particles were being displaced by cars passing outside the window. It was exciting to see the camera picking up the cars, which suggests the setup will likely track people walking by as well. Although she couldn't test the Kinect at the Stratford Main Campus at night, she ran night tests on the UWaterloo campus, where it performed much as it did during the day, provided there was some street lighting for the camera to pick up.
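
To give a sense of how a tracked position can push particles around, here is a small, hypothetical NumPy sketch. The radius, strength, and damping values are invented for illustration and are not the settings in our TouchDesigner project.

import numpy as np

# Illustration of displacing particles away from a tracked point (e.g. a passing car).
rng = np.random.default_rng(0)
particles = rng.uniform(0.0, 1.0, size=(500, 2))  # positions in a unit square
velocities = np.zeros_like(particles)

def displace(particles, velocities, tracked_pos, radius=0.15, strength=0.02):
    """Push particles away from the tracked point, fading with distance."""
    offsets = particles - tracked_pos
    dist = np.linalg.norm(offsets, axis=1, keepdims=True) + 1e-6
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0)  # only nearby particles react
    velocities += strength * falloff * offsets / dist
    velocities *= 0.95  # damping so the particles settle back down
    particles += velocities
    return particles, velocities

# Example: a "car" sweeping horizontally across the frame displaces nearby particles.
for frame in range(120):
    tracked = np.array([frame / 120.0, 0.5])
    particles, velocities = displace(particles, velocities, tracked)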


In addition to the Kinect testing, Aileen worked on enhancing the current wave particle prototype, adding effects to make it more visually appealing. She introduced a glow effect and a subtle trailing effect, so the moving particles now look like glowing fish in dark water. This is still a work in progress as we refine the visual side of the final product, but everything is shaping up nicely!
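
For readers interested in how trails and glow can be built up, here is a minimal sketch of the general feedback-buffer technique, again using Python with OpenCV rather than TouchDesigner. The fade factor, blur size, and colors are placeholders, not our actual settings: fading the previous frame a little each step produces the trail, and adding a blurred copy back on top produces the glow.

import cv2
import numpy as np

H, W = 480, 640
canvas = np.zeros((H, W, 3), dtype=np.float32)  # persistent feedback buffer

def render_frame(canvas, particle_px):
    canvas *= 0.92  # fade what was drawn before -> trailing effect
    for x, y in particle_px:
        cv2.circle(canvas, (int(x), int(y)), 2, (1.0, 0.8, 0.3), -1)  # placeholder color
    glow = cv2.GaussianBlur(canvas, (0, 0), sigmaX=6)  # blurred copy of the buffer
    out = np.clip(canvas + glow, 0.0, 1.0)  # add it back on top -> glow
    return canvas, out

# Example: particles drifting across the frame leave glowing trails behind them.
pts = np.random.rand(200, 2) * [W, H]
for t in range(300):
    pts[:, 0] = (pts[:, 0] + 1.5) % W  # drift to the right, wrapping around
    canvas, frame = render_frame(canvas, pts)
    cv2.imshow("glowing particles", frame)
    if cv2.waitKey(16) & 0xFF == ord("q"):
        break
cv2.destroyAllWindows()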



