We don’t currently have a tutorial for face tracking. With Kinect v2, make sure you are using the Microsoft SDK by deselecting “libfreenect2” in the “libfreenect settings” under Preferences, then restarting NI mate. Only the Microsoft SDK outputs face analysis data.
We have some RealSense-specific material available from our 2015 Blender Conference presentation here: Blender Conference 2015 RealSense demos and slides available
The idea is the same with Kinect v2: enable face analysis and output the data to Blender, then use the facial action units to drive shape keys that control the face. Note that to do this you will need the custom receiver plugin that parses the Python expressions output by NI mate (see the conference videos / thread for links).
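To make the idea concrete, here is a minimal Python sketch of the mapping step: parsing a Python-expression payload and converting action-unit values into shape key weights. The payload format, the action-unit names, and the value range are assumptions for illustration, not NI mate's exact output; check the conference material for the real format.

```python
# Hypothetical sketch: turning face action-unit values into Blender
# shape key weights. Names, payload format, and the [-1, 1] value
# range are assumptions, not NI mate's documented output.
import ast

def parse_action_units(message):
    # Parse a Python-expression payload (here, a dict literal).
    # ast.literal_eval only accepts literals, so it is safe to use
    # on untrusted strings, unlike eval().
    return ast.literal_eval(message)

def au_to_shape_key(value, lo=-1.0, hi=1.0):
    # Map an action-unit value in [lo, hi] to a shape key weight,
    # clamped to Blender's default [0, 1] range.
    t = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, t))

# Illustrative payload; the action-unit names are made up.
payload = "{'JawOpen': 0.4, 'LipPucker': -0.2}"
weights = {name: au_to_shape_key(v)
           for name, v in parse_action_units(payload).items()}

# Inside Blender you would then apply each weight to the matching
# shape key on your face mesh, e.g. (names are examples):
# bpy.data.shape_keys['Key'].key_blocks[name].value = weight
```

The `bpy` line at the end is only a pointer to where the weights would go in Blender; the parsing and clamping above run in plain Python.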