## Hardware requirements
Supported OS: Windows 8.1+
Sensors on one computer: 1
USB port: USB 3 (also requires power from a wall socket)
Available from: Microsoft
Kinect for Xbox One requires the separately purchased adapter to work with a PC.
## Specs
Depth resolution: 512 x 424 @ 30 FPS
Depth field of view: 70.6° x 60°
Depth range: 1.4 - 5 m
Color resolution: 1920 x 1080
Color field of view: 84.1° x 53.8°
FPS: 30
## NI mate features
The user component specifies how users are detected and how they are labeled.
[image]
Activation modes
Track single user: Toggles between single user and multi user mode. In single user mode, only the user closest to the sensor is tracked.
Activation time slider: Specifies how many seconds a user needs to stay within the sensor’s view before being considered active.
Activation areas
Enabling activation areas makes the sensor only acti…
One of the most used features in NI mate is the skeleton tracker. This component finds a human skeleton in the live feed and computes its position and orientation. A minimal receiving sketch is shown after the settings below.
[image]
Enable button: If the component is enabled, either with the checkbox in the page tree or by pressing the enable button on the skeleton tracker page, NI mate will begin outputting skeleton tracker data over OSC.
IP address & port override: If enabled and specified, these IP address and port are …
The Face Analysis component sends out face landmark data as well as the head orientation. Emotions are also handled by this component. A receiving sketch is shown after the settings below.
[image]
Enable button: Clicking the checkbox for this component in the page tree on the left side of the UI enables the Face Analysis component, and NI mate starts outputting the relevant OSC messages. Alternatively, clicking the “Enable face analysis” button works as well.
IP address & port override: If enabled, the IP a…
The face shapes are Action Units that can be used to control, for example, shape keys in an external program. The Action Units are computed from the user’s face and displayed on the live feed as graphs. A sketch of mapping these values to shape keys is shown at the end of this section.
https://en.wikipedia.org/wiki/Facial_Action_Coding_System
[image]
Clicking the checkbox next to the component name in the page tree on the left side of the UI enables face shape tracking and starts outputting the data over OSC. Alternatively, clicking the “Enable Face Shapes” button achieves …
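To give an idea of how the Action Unit values could drive shape keys, the sketch below caches the latest value per face shape and clamps it to the 0..1 range shape keys typically expect. The one-float-per-address layout and the value range are assumptions made for illustration.

```python
# Sketch: cache face shape (Action Unit) values for driving shape keys.
# Assumes each face shape arrives as one float on its own OSC address.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

shape_keys = {}  # latest clamped value per face shape name

def on_face_shape(address, value, *_):
    name = address.rsplit("/", 1)[-1]
    shape_keys[name] = max(0.0, min(1.0, value))  # shape keys expect 0..1
    print(name, shape_keys[name])

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_face_shape)

server = BlockingOSCUDPServer(("127.0.0.1", 7000), dispatcher)  # placeholder port
server.serve_forever()
```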
Controller tracking is used to send out OSC or MIDI messages every frame, computing assorted relations between joints and users. A sketch of consuming these values is shown after the settings below.
[image]
Clicking the component’s checkbox in the page tree toggles between the output modes. It’s possible to output OSC, MIDI or both. Alternatively, this can be done by clicking the enable buttons directly on the Controller Tracker page.
IP address & port override: If enabled, these IP address and port are used when outputting OSC data. Otherwise the default …
The trigger component outputs single OSC or MIDI values when the user touches a specific zone in the depth feed. A minimal receiving sketch is shown at the end of this section.
[image]
The triggers can be enabled by clicking the checkbox next to the component name in the page tree on the left side of the UI. Alternatively, the “Enable trigger OSC” or “Enable trigger MIDI” buttons can be used.
The IP address & port override can be used to output trigger OSC values to a specific IP address and port. If left disabled the main …