Official Kinect SDK and Unity3d

Does anyone know anything about using Kinect input in Unity3D with the official SDK? I've been assigned a project to integrate the two, but my supervisor doesn't want me to use the OpenKinect stuff. The last news from the Unity site was that the Kinect SDK requires .NET 4.0 while Unity3D only supports .NET 3.5.

Workarounds? Point me toward resources if you know anything about it please.

Poem answered 22/6, 2011 at 20:40

The OpenNI bindings for Unity are probably the best way to go. The NITE skeleton is more stable than the Microsoft Kinect SDK's, but it still requires calibration (PrimeSense has said a calibration-free skeleton is coming soon).

There is a bridge module that binds the Kinect SDK to OpenNI, making the Kinect SDK work like SensorKinect; it also exposes the Kinect SDK's calibration-free skeleton as an OpenNI module:

https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/

Because the Kinect SDK also provides ankles and wrists, and OpenNI already supported them (even though NITE didn't), all the existing OpenNI content, including Unity character rigs that include ankles and wrists, just works, and without calibration. The Kinect SDK bindings for OpenNI also support using NITE's skeleton and hand trackers, with one caveat: NITE's gesture detection doesn't seem to work with the Kinect SDK yet. The workaround when using the Kinect SDK with NITE's HandGenerator is to use skeleton tracking to provide you with a hand point. Unfortunately, you lose the ability to track just the hands when your body isn't visible to the sensor.

Still, NITE's skeleton seems more stable and more responsive than the Kinect SDK's.

Favus answered 2/8, 2011 at 5:10
Amir! Thanks. We've set up with OpenNI/NITE, and I'm following Tomoto Washio's experimental module and the mailing lists on Google Groups. Nice to see you here; all of us interns are big fans of your work, assuming you're the Amir of Unity-wrapper fame. This is a great rundown on the developments since the release of the MS SDK. – Poem

How much of the raw Kinect data do you need? For a constrained problem, like just getting limb articulation, have you thought about using an agnostic communication scheme such as a TcpClient? Just create a simple TCP server in .NET 4.0 that links to the Kinect SDK and pumps out packets with the info you need every 30 ms or so, then write a receiving client in Unity. I had a similar problem with a different SDK. I haven't tried the Kinect, though, so maybe my suggestion is overkill.

If you want real-time depth/color data, you might need something a bit faster, perhaps using pipes?
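To illustrate the TCP-bridge idea: in practice the sender would be a C# .NET 4.0 app using the Kinect SDK and the receiver a Unity script, but the fixed-size-packet pattern is language-agnostic, so here is a minimal Python sketch. The packet layout (20 joints of three little-endian floats each) is an assumption for illustration, not the Kinect SDK's actual wire format.

```python
import socket
import struct
import threading

# Hypothetical packet layout (an assumption, not the Kinect SDK wire format):
# 20 joints x 3 little-endian floats (x, y, z) per skeleton frame.
NUM_JOINTS = 20
PACKET_FMT = "<" + "fff" * NUM_JOINTS
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def send_one_frame(srv, joints):
    """Server side: accept one client and send a single fixed-size frame."""
    conn, _ = srv.accept()
    flat = [coord for joint in joints for coord in joint]
    conn.sendall(struct.pack(PACKET_FMT, *flat))
    conn.close()

def read_frame(host, port):
    """Client side: read exactly PACKET_SIZE bytes, then unpack the joints."""
    cli = socket.create_connection((host, port))
    buf = b""
    while len(buf) < PACKET_SIZE:
        chunk = cli.recv(PACKET_SIZE - len(buf))
        if not chunk:
            raise ConnectionError("server closed before a full frame arrived")
        buf += chunk
    cli.close()
    vals = struct.unpack(PACKET_FMT, buf)
    return [vals[i:i + 3] for i in range(0, len(vals), 3)]

# Demo: placeholder joint data standing in for what the Kinect SDK would supply.
joints = [(0.0, 1.5, 2.0)] * NUM_JOINTS
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=send_one_frame, args=(srv, joints))
t.start()
frame = read_frame("127.0.0.1", port)
t.join()
srv.close()
```

A fixed-size binary packet keeps the Unity-side read loop trivial (read exactly `PACKET_SIZE` bytes per frame, no framing protocol needed); at ~30 frames/second and a few hundred bytes per frame, TCP overhead is negligible for skeleton data.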

Crust answered 1/7, 2011 at 16:27
Thanks. This is something good to investigate. I believe at this point I've convinced my boss to let us do some development with OpenNI, but right now we don't need much from the Kinect, so your thoughts are appreciated. – Poem
