Does NI mate really work?

Hi, does NI mate really work? Everything I see in the forum is crashes, not opening, etc…

I just installed the free version to test it, because I don’t trust the software enough to pay for it (and I think it is a little expensive for my use, but I may pay for it if I am convinced).

So I installed it on my MacBook Pro 2016 with an AMD GPU, running OS X High Sierra. The Kinect v1 (model 1414) is connected through USB and to power, the infrared emitter is on, and the little green light is flashing.

I start NI mate, but I don’t see anything. Is there something that I am supposed to see, to happen, or to do?

Does NI mate start if you have the sensor unplugged? There’s still an unsolved bug that prevents NI mate from starting on some macOS systems, most likely due to a processor optimization that for some reason doesn’t work completely cross-platform. We’re still not sure what precisely causes this.

If NI mate starts without the sensor, then the problem is possibly GPU texture transfer, which you can disable from Preferences -> General: Why does GPU texture transfer not work on my computer?

I don’t know if it starts or not. I see that it opens and asks me whether to authorize the connection, then I see the NI mate menu in my menu bar, but nothing else…

Sensor plugged in or not. Texture transfer on or off.

But I found a free way to make it work, and I am going to talk about it in the forum, because seriously, if you check your forum, a lot of people on Windows and Mac can’t make it work. So asking money for something that can be done better and for free, well…

OpenNI with Processing or openFrameworks already works better than your paid solution…

NI mate starts minimized to the tray menu by default. Right-click that and choose “Control interface”. The main idea behind this was to make NI mate into something users wouldn’t have to operate beyond opening it and then choosing a preset from the tray menu to get the sensor’s data outputting.

OK, now everything seems to work, except for one thing, and one of the most important, which makes NI mate useless for now… Seriously, do you want to make money with this, or is it just a hobby? If it is the former, you are not going about it the right way.

When I listen to NI mate’s OSC port, the only thing I get is “NI mate sync”; what I need is all the skeleton parts…

You may want to look into the NI mate documentation at Documentation: Table of contents, particularly Documentation: Skeleton tracking. You will have to enable skeleton OSC output.

OK, now I am able to send the skeleton through OSC, but I can’t find a way to tell what is what, or how to separate joints individually; even when I remove everything but one joint, I get the same amount of data… How the f**** did you code that?

Seriously, I am calling you thieves and frauds for asking money for something that’s so shitty to use…

What am I supposed to do with that? I am using OSCeleton; it says that the message name is /joint_“name of the joint” followed by its coordinates, but even when I try to read that info, I get nothing.

The outgoing OSC messages can be formatted in two ways: by using one of the presets on the Skeleton Tracker page, or by using OSC parameter tags.

You can also take a look at the Log page and check “OSC”, which will list all outgoing OSC messages; from this you can verify that the messages you’re expecting are being output.

Unless you have special requirements for your OSC receiver, the “Basic” or “Basic + orientation” presets are the ones you’ll want to use most of the time.
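To make the per-joint messages concrete: OSC messages follow a standard binary layout (a null-padded address string such as “/head”, a type-tag string such as “,fff”, then big-endian float32 values). This is a stdlib-only sketch of that encoding, not NI mate’s actual code; the joint name and coordinates are made-up examples, and the “/<joint> x y z” shape is the general pattern discussed in this thread, so check the Log page for the exact addresses your preset emits.

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_joint(name: str, x: float, y: float, z: float) -> bytes:
    """Build a '/<joint> x y z' OSC message (address, ',fff' tags, 3 floats)."""
    return _pad(("/" + name).encode()) + _pad(b",fff") + struct.pack(">fff", x, y, z)

def decode_joint(packet: bytes):
    """Return (joint_name, (x, y, z)) from a '/<joint> x y z' OSC message."""
    end = packet.index(b"\x00")
    address = packet[:end].decode()
    offset = (end + 1 + 3) // 4 * 4           # skip address padding
    tag_end = packet.index(b"\x00", offset)    # type-tag string, e.g. ",fff"
    offset = (tag_end + 1 + 3) // 4 * 4        # skip type-tag padding
    coords = struct.unpack_from(">fff", packet, offset)
    return address.lstrip("/"), coords

packet = encode_joint("head", 0.0, 1.5, 2.0)
print(decode_joint(packet))  # ('head', (0.0, 1.5, 2.0))
```

Because each joint arrives as its own message with its own address, a receiver separates joints by dispatching on that address rather than by position in one big blob.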

Which sensor are you using? To minimize potential sensor issues, you can use the .nirec recording samples to test the OSC output by changing the “Live” box on the sensor’s main page (where the live feed is shown) to one of the sample recordings. These recordings always output valid OSC data, so you can eliminate any potential sensor-related issues, such as the sensor not getting reliable data for all joints due to non-optimal conditions (sensor position, IR light, distance to the sensor).

Can someone please help me? I have a Kinect One on an iMac running High Sierra. I’ve installed everything and have a sensor that is awake. I have used the Kinect 2 input option, but I don’t see anything when I choose Source = Live (no feed).
I also got the plugin for Cinema 4D, which also shows nothing when I hit “Start receiving” (all on port 7000).

If you see “no feed”, you are using a dummy sensor device and not the real one. See this part of the documentation: Documentation: User interface.

What does the Log window say?