NI mate plugin for Blender 2.8 released!

After a long wait, Blender 2.8 has finally been released. The NI mate plugin has now been updated to run on this new version of Blender. We now supply two versions of the plugin, and you can download the one you need below:

Unfortunately the removal of the Blender Game Engine means some features aren’t available anymore. The most important of these are the NI mate feeds, which currently aren’t easily implementable in Blender. Perhaps something will change in a later version of Blender that makes the feeds possible with Eevee.

Special shoutout goes to Denis Perchenko whose contributions made our work a lot easier.

In Blender 2.80, the plugin can be found in a new place. Please see the attached image.


Firing up my machine with Kinects :snowboarder:

A little fix was just pushed to the 2.79 plugin as the filename cannot have a period in it. If your 2.79 plugin didn’t work, simply redownload it and install it again.

Hello @Jesse, Leap Motion is not tracking in Blender 2.8… though it is tracking in NI mate 2.12. Probably a plugin issue!

I just installed 2.12 and tried our Leap with the 2.8 plugin. It worked without issues.

Could you verify the log window in NI mate says the OSC messages are being outputted? Make sure to tick the “OSC” box in the log window.

And yes, it’s working now… unfortunately the port numbers were mismatched, but I solved it.

Unfortunately my existing rig is not working… remapping from the start ☹️

Hey, brand new to motion capture here! NI Mate was super easy to use; I had it all figured out and working within minutes, which was really nice! But I have no clue what I’m doing beyond that. I’m using the Blender 2.8 add-on and made a short recording of myself using a Kinect, and now I have the keyframed empties in Blender. From there, how do I make an armature from the empties?

Which depth sensor are you using? If it’s a Kinect V2, Remington provides a skeleton for it. For other sensors, the NI mate forums have a skeleton that automatically tracks the user data and can create BVH files in real time.

I’m using a Kinect V2, the skeleton he provided just glitched into a weird position upon opening. Also, his website is currently under maintenance and the rig is unavailable for now.

If you want to get a rig in Blender, you could try exporting the NI mate default sample clips as BVH and then importing them into Blender. This way you have a rig that matches the NI mate data. You can do this from the motion editor tab as per this documentation page.
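Once you have the BVH file on disk, the import step can also be done from Blender’s Python console instead of the menus. This is just a minimal sketch; the file path is a placeholder you would replace with your own export location:

```python
import bpy

# Import a BVH clip exported from NI mate (the path below is a placeholder).
# This creates an armature with the clip's keyframes already applied.
bpy.ops.import_anim.bvh(filepath="/path/to/ni_mate_sample.bvh")
```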

The easiest way to use the empties is to output the skeleton data from NI mate as “basic + orientation” (on the skeleton tracker page) and then, in Blender, apply the position data only for the root joint in your armature and use “copy rotation” constraints on the rest of the bones (there is a scripted sketch of these steps after the list below). That is:

  • Create the rig in Blender
  • Receive the empties with position and orientation data
  • Go to the armature and add a copy location constraint to the root bone, copy the location of the two empties for the hips
  • Go to every other bone and use “copy rotation” constraint and set them to copy the rotation of the respective empties
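For anyone who prefers to set the constraints up from a script, here is a minimal sketch of the steps above using Blender’s Python API. The armature, bone and empty names are assumptions; swap them for the names in your own scene and for the joint names NI mate actually sends:

```python
import bpy

# Assumed names: adjust "Armature", "root", the bone names and the empty
# names to match your own rig and the empties created by the NI mate add-on.
arm = bpy.data.objects["Armature"]          # your rig
hip_empty = bpy.data.objects["Torso"]       # empty receiving the root/hip position

# Step 3: the root bone follows the position of a hip empty.
root = arm.pose.bones["root"]
loc = root.constraints.new(type='COPY_LOCATION')
loc.target = hip_empty

# Step 4: every other bone copies the rotation of its matching empty.
# The mapping below is purely illustrative.
bone_to_empty = {
    "upper_arm.L": "Left_Shoulder",
    "forearm.L": "Left_Elbow",
    # ... and so on for the rest of the skeleton
}

for bone_name, empty_name in bone_to_empty.items():
    bone = arm.pose.bones[bone_name]
    rot = bone.constraints.new(type='COPY_ROTATION')
    rot.target = bpy.data.objects[empty_name]
```

Run it from Blender’s Text Editor or Python console with your scene open; the constraints then follow the incoming data just as if they had been added by hand. The sketch uses a single hip empty for the root; to copy the location of the two hip empties as described above, add a second Copy Location constraint targeting the other empty and set its influence to 0.5 so the two are averaged.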

Would there be any way to do it using the Blender add-on so that my actor could see their performance live (rigged to a mesh) on a monitor?

The simplest way would be to have the Blender viewport display the rigged model in real time as your actor is performing. The constraints follow the data in real time as it arrives from NI mate.

Exporting the example clip seems to require a pro license, which I do not have. So, I’m still working on making an armature for it.