Moving head joints with Kinect 2

I am new to this forum, and don’t see an option to post a new topic. So I’m hoping I can ask my question here for now: How can I make my armature’s head look left and right, without needing to turn my body or shoulders?

I’m using NI mate v2.14 with Blender v2.82 and an Xbox One Kinect. I’m impressed with how well the motion capture works, but I can’t figure out how to make my armature track my head when I look left and right. All other directions work, such as looking up or down and tilting my head side to side; it just doesn’t seem to rotate around the vertical axis. I have watched all the tutorials I can find (none are recent, unfortunately), and as of tonight have searched the NI mate forum, but I cannot find an answer yet. Is this possible, and if so, how?

Thank you,


I moved this into a new topic. Users automatically get more permissions as they browse and post on the forums, but it can take a while.

I wonder if the head joint’s rotation along that axis is not included in the Kinect 2 SDK body tracking. Just to eliminate a few options, you could try loading the example clip in NI mate on your sensor page (from the “source” button) and stream that to Blender, and see if the head is rotated with that data.

Are you using the default joint names with position + orientation mode on the skeleton tracker page?

Hi Jesse. Thank you for replying so quickly. I was able to view the example clip you suggested in Blender (with the head’s axis display turned on). When I played the clip, it appeared that the head does not rotate independently around the Z axis.

For my project, I used a human meta-rig as a base armature (I think it’s from Rigify), renamed all the bones using the NI mate joint names, and used the “Stretch To” bone constraint to match all the joints to the generated empties. I then used a separate armature with “Copy Rotation” constraints to copy the rotation of the bones in the first armature. As a result, the armature works really well, with the exception of the head not looking left or right.
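For anyone wiring up a similar rig, the constraint setup described above can also be scripted. A minimal sketch, assuming the armature object is named "Armature" (a hypothetical name; adjust to your scene) and the bones have already been renamed to match the NI mate empties ("Head", "Neck", and so on):

```python
# Sketch: add a "Stretch To" constraint on each pose bone, targeting the
# NI mate empty with the same name. Assumes bone names already match the
# joint/empty names ("Head", "Neck", etc.) -- adjust names to your scene.
import bpy

arm = bpy.data.objects["Armature"]  # hypothetical armature object name

for pbone in arm.pose.bones:
    # Look up an empty whose name matches this bone's name.
    target = bpy.data.objects.get(pbone.name)
    if target is None:
        continue  # no matching empty; skip this bone
    con = pbone.constraints.new(type='STRETCH_TO')
    con.target = target
```

This only automates the repetitive constraint assignment; it doesn’t change how the head yaw data is (or isn’t) streamed.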

The way I built my project is based largely on a video tutorial from Remington Creative on YouTube. However, none of the tutorials I’ve seen address the issue of the head not looking left or right. The only time I have seen an armature do this successfully, including my own, is with a modified version of NI mate called Kinector. Kinector also allows easy and effective head and face tracking; however, the overall motion capture is delayed and jerky/glitchy in its appearance (especially the legs and feet). Because of this, I keep switching back to NI mate 2.14 for how fluid the motion capture appears in real time. Unfortunately, I’m still struggling to get the head to look left and right, and I have not been able to get face tracking to work either.

I really appreciate your help with this. Is it possible for me to upload my .blend file for you to review?

NI mate (and Z Vector) are currently “as-is” products with no further updates planned and we mostly help people get the programs running, but don’t have the manpower to give individual support of that level. So unfortunately I can’t take a look at your blend file.

You could try making a non-rigify rig and see if you can get the head moving with that, and then build your way up towards a more complicated rig to see where the head stops working. That’s how I would go about it.

I never even paid attention to this. Thanks for posting; now I’m going to check how mine works and will update afterwards. Thanks.

Thanks for the suggestion. I tried starting a new project, switched the NI mate format to Basic + Orientation, and connected a single bone to the “Head” empty (I used an empty cube to see the rotation better). I still had the same issue: the cube would look up and down on the X axis and tilt side to side on the Y axis, but would not rotate left and right around the vertical Z axis. For now I’m going to stop worrying about this minor issue and move on with the rest of my project. If I need my character to look left and right, I’ll try to do it manually after I’ve recorded the motion capture.
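A quick way to check the incoming orientation data directly, without a rig, is to drive a test object from the “Head” empty and watch its yaw. A minimal sketch, assuming NI mate has created an empty named "Head" and the scene contains a test object named "TestCube" (both names are assumptions; adjust to your scene):

```python
# Sketch: copy the "Head" empty's rotation onto a test object, then
# inspect the Z (yaw) component. If Z never changes while you turn your
# head, the yaw is likely missing from the streamed data itself.
import bpy

head = bpy.data.objects["Head"]      # empty created by the NI mate add-on
cube = bpy.data.objects["TestCube"]  # any test object, e.g. a cube

con = cube.constraints.new(type='COPY_ROTATION')
con.target = head

# During playback, print the yaw to the system console to compare values.
print("head yaw (Z):", head.rotation_euler.z)
```

This separates a rig problem from a data problem: if the printed yaw stays constant, no constraint setup in Blender will recover it.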

I’m not sure if the Blender version I’m using could cause this, but I’m currently on v2.82. Do you know if NI mate will work with Blender 2.9? (The alpha version is out, but I haven’t switched to it yet.) Thanks! - Patrick

I don’t think the new Blender version will change the Python API, so it will probably work just fine with the latest NI mate plugin. The head problem might be at either the NI mate or the sensor SDK end - perhaps the data is simply not produced by the SDK in the first place. I’ll look into this when I have a bit of time available.