3 Axis with Blender


#1

This question concerns multi-sensor integration.

Is it possible to restrict the data applied from a sensor (e.g. a Kinect) to a single axis in Blender?

The idea: a sensor capturing motion along a room's X axis could animate a rig in Blender on its X axis only, as a constraint. A second data set, from a camera mounted 90 degrees to the first (i.e. covering the Y axis), could then enhance the rig's animation on the Y axis, provided its effect can be limited to that axis, without negating the X-axis animation already established. If doable, this could be extended to the negative directions and to Z to work around the limitations of multi-sensor setups.
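A minimal sketch of how this per-axis idea could look on the Blender side, assuming the two sensor values have already been decoded into plain floats (the `Rig`/`hand` names and the `sensor_a_x`/`sensor_b_y` variables are hypothetical placeholders): each stream writes and keys only its own location channel, so the second stream never touches what the first one set.

```python
# Per-axis application sketch for Blender's Python API (bpy).
# The armature/bone names and sensor values below are placeholders.
import bpy

sensor_a_x = 0.42   # example value from the X-facing sensor
sensor_b_y = -0.17  # example value from the sensor mounted 90 degrees to it

def apply_axis(value, axis_index):
    """Write one sensor value into a single location channel of the bone,
    leaving the other channels and their keyframes untouched."""
    bone = bpy.data.objects["Rig"].pose.bones["hand"]
    bone.location[axis_index] = value
    # Key only the channel we changed, so the animation already
    # established on the other axes is not negated.
    bone.keyframe_insert(data_path="location", index=axis_index)

apply_axis(sensor_a_x, 0)  # X axis (index 0) from the first sensor
apply_axis(sensor_b_y, 1)  # Y axis (index 1) from the second sensor
```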

Has anyone considered this?


#2

You can customize the OSC addresses on the skeleton tracker page using OSC parameter tags:

Essentially you’ll want to change the address for each bone to something like /left_hand {X} for the first sensor, /left_hand {Y} for the second sensor, and so on.
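To make the routing concrete, here is a sketch of the receiving end using the third-party python-osc package; the concrete addresses `/left_hand_x` and `/left_hand_y` (and the port) are my assumptions about what the tagged addresses might resolve to, not the tracker's defaults:

```python
# OSC receiver sketch that routes per-sensor addresses to separate axes.
# Requires the third-party python-osc package (pip install python-osc).
from pythonosc import dispatcher, osc_server

latest = {"x": 0.0, "y": 0.0}  # last value received per axis

def on_left_hand(address, value):
    # "/left_hand_x" comes from the first sensor, "/left_hand_y"
    # from the camera mounted 90 degrees to it.
    axis = address.rsplit("_", 1)[-1]
    latest[axis] = value
    print(f"left_hand {axis} = {value:.3f}")

disp = dispatcher.Dispatcher()
disp.map("/left_hand_x", on_left_hand)
disp.map("/left_hand_y", on_left_hand)

# Listen on whatever port the skeleton tracker sends to (7000 is an assumption).
server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 7000), disp)
server.serve_forever()  # each incoming message updates only its own axis
```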

However, this won’t work as-is: the Kinect (like other structured-light IR cameras) projects an IR dot pattern that its own sensor expects to see. If two or more Kinects face the same direction, the dot patterns mix together and a sensor can’t tell which pattern is its own. A potential workaround is attaching a vibrating motor to each Kinect, so that each unit sees its own pattern sharply while the others are blurred: https://www.google.com/patents/CN104567720A?cl=en
