Footage from danceroom spectroscopy development workshops at Bristol’s Arnolfini on 18 Mar 2012.
OK, my big task for the next couple of days is to build a way of merging multiple Kinect meshes into a single unified mesh.
What I have: methods for filtering the background out of a high-res depth image, meshes built from the depth data of multiple cameras, and an interface to orient them around a single origin.
What I need: methods to eliminate/prioritise faces, and to join the edges of the separate meshes into a unified surface.
I'm going to have to meditate on this problem a bit; I have some ideas but I'm not convinced yet.
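As a starting point for the "orient them around a single origin" step, here is a minimal sketch of the geometry involved: back-projecting each camera's depth image into 3D points and rigidly transforming them into a shared world frame. This is not the actual project code; the function names and the per-camera intrinsics/extrinsics are assumptions for illustration.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into camera-space 3D points.

    Assumes background pixels have already been filtered to depth 0,
    so they can simply be dropped here.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx  # standard pinhole back-projection
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # discard zeroed-out background pixels

def to_world(points, rotation, translation):
    """Rigidly transform camera-space points into the shared origin frame.

    rotation is a 3x3 matrix, translation a 3-vector, per camera —
    the values the orientation interface would supply.
    """
    return points @ rotation.T + translation
```

With each camera's point set expressed in the same frame, the remaining (harder) problem is exactly the one described above: deciding which overlapping faces to keep and stitching the mesh boundaries together.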
So, the London stint of the MADE residencies is over. There were a fair few technical challenges to overcome in this short time, most notably streaming Kinect meshes between 4 machines simultaneously and in real time.
I was not 100% happy with the result, but I had knowingly sacrificed frame rate for stability for the process demo on the last day. I think I could have pushed it quite a lot further, and given a little more time could have got the meshes streaming at maybe half frame rate (12fps). Other than that the application worked well. I still have a fair number of visual bugs to sort out, but as we were only paying lip service to the aesthetics of the work, those only matter to my own personal pride.
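For anyone curious what streaming mesh data between machines involves, here is a minimal sketch of one common approach: serialising the vertex buffer and sending it over TCP with a length prefix so the receiver knows where each frame ends. This is an illustrative assumption, not the residency code; decimating vertices before sending is one way to trade detail for frame rate, as described above.

```python
import socket
import struct
import numpy as np

def send_mesh(sock, vertices):
    """Send a float32 (N, 3) vertex array, prefixed with its byte length."""
    payload = vertices.astype(np.float32).tobytes()
    sock.sendall(struct.pack('!I', len(payload)) + payload)

def recv_mesh(sock):
    """Receive one length-prefixed frame and rebuild the (N, 3) array."""
    (size,) = struct.unpack('!I', _recv_exact(sock, 4))
    data = _recv_exact(sock, size)
    return np.frombuffer(data, dtype=np.float32).reshape(-1, 3)

def _recv_exact(sock, n):
    """Read exactly n bytes; TCP recv may return partial chunks."""
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed mid-message')
        buf += chunk
    return buf
```

With four machines each sending its own Kinect mesh to the others every frame, the bandwidth adds up quickly, which is why frame rate becomes the natural thing to sacrifice for stability.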
Here is some video of all 4 portals set up side by side.
You can see more video over at the project blog here
Thanks to the people of the National Theatre for hosting us; you were all very pleasant company. A special thanks to our dancers Sasha, Nick and Amina: we could have done none of this without your patient participation.
Check out the video interview with Dave, featured in Physics World, where he talks about the danceroom spectroscopy project here.
If you have an IOP subscription you can read the full article here.
Bodig’s wonderful Banu working with the pre-alpha, some bugs are visible to keen eyes.