Blender Conference 2015 RealSense demos and slides available



Last October we had the privilege of participating in and presenting at the annual Blender Conference. We’ve now published the Blender .blend file examples created for the conference to demonstrate NI mate’s support for the Intel RealSense F200 sensor.

In all three examples you should use the included version of our Blender plugin rather than the one available on our website. This matters because the bundled scripts contain some minor tweaks for receiving shape keys and other elements. Make sure you uninstall any pre-existing version of the plugin, then load the .blend file, right-click the script in the text editor and choose “Run Script”.
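
If you prefer, the same thing can be done from Blender’s Python console. A minimal sketch, assuming the bundled script lives in a text block named “ni_mate_plugin.py” (the actual name in each .blend may differ):

```python
import bpy

# Run the bundled plugin script without the text editor's "Run Script" button.
# "ni_mate_plugin.py" is a placeholder; use the text block name in the .blend.
script = bpy.data.texts["ni_mate_plugin.py"]
exec("\n".join(line.body for line in script.lines))
```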

Our presentation slides are now available here: https://ni-mate.com/release/blends/bconf2016/NI%20mate%20at%20Blender%20Conference%202015.pdf and the main theatre presentation can be watched online here: https://www.youtube.com/watch?v=JF8rj0cJPs0

Camera: https://ni-mate.com/release/blends/bconf2016/camera.blend

This example .blend demonstrates a way of doing “fake 3D” by using the sensor’s head tracking and mapping it to a camera in Blender. Start NI mate from this project file. Alternatively, manually enable Skeleton Tracking and change the Head OSC address to “Camera”, with the output mode set to “Position”. Make sure the plugin is set to the correct port and start receiving data. You may want to switch to the “Default.002” UI layout to get a fullscreen window that looks good with the fake 3D effect.
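
For the curious, the core of the effect is just mapping the received head position onto the camera’s location each frame. A rough sketch of that mapping, where the scale factor and axis assignments are illustrative assumptions rather than the plugin’s actual values:

```python
import bpy

def apply_head_position(x, y, z, scale=2.0):
    """Fake 3D: offset the camera opposite the viewer's head movement."""
    cam = bpy.data.objects["Camera"]
    cam.location.x = -x * scale  # mirror sideways movement so the view shifts with the head
    cam.location.z = y * scale   # head height drives camera height
    cam.location.y = -z * scale  # sensor depth pulls the camera in and out (assumed axis mapping)
```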

Ton: https://ni-mate.com/release/blends/bconf2016/ton.blend

This example .blend uses a scan of Blender founder and developer Ton Roosendaal’s head, rigged with some simple shape keys. Start NI mate from this project file. Alternatively, set it up manually: in NI mate, make sure Skeleton Tracking and Face Shapes are enabled. The .blend file contains a text block called “oscpaths” with strings that you need to copy into NI mate’s OSC addresses for the “Face Shapes” component. These paths are Python expressions that are evaluated by the plugin.
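
As an illustration of how such an expression-based path could be evaluated, here is a rough sketch; the object and shape key names (“Ton”, “smile”) and the evaluation scheme are assumptions, not the plugin’s actual code:

```python
import bpy

# One of the "oscpaths" strings might resolve to a shape key like this:
path = 'bpy.data.objects["Ton"].data.shape_keys.key_blocks["smile"]'

received = 0.7            # a face-shape weight received over OSC, 0.0 to 1.0
target = eval(path)       # the Python expression evaluates to a shape key
target.value = received   # drive the shape key with the sensor value
```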

Hands: https://ni-mate.com/release/blends/bconf2016/handdemo.blend

This example shows a basic hand-tracking setup in the Blender game engine; the live sensor feed is also drawn. Start NI mate from this project file, or alternatively enable Skeleton Tracking yourself. Open the .blend file and press P in the 3D View to start the game engine.
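
To give an idea of what the game-engine side involves, here is a minimal sketch of a Python controller that snaps an object to a tracked hand position. The “hand_x”/“hand_y”/“hand_z” game properties are placeholders for wherever the plugin actually stores the tracking data:

```python
import bge

def update():
    # Attached to an Always sensor + Python controller on the hand object.
    own = bge.logic.getCurrentController().owner
    own.worldPosition = (own["hand_x"], own["hand_y"], own["hand_z"])

update()
```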

