Recently there has been a lot of enthusiasm around Virtual Production and the use of Motion Capture. This article nicely summarizes the great future and fun ahead on that side.
But that article is mainly about Virtual Production in general, and when it comes to Motion Capture it mostly applies to MotionBuilder. Many people are asking: OK, but what about Maya and/or 3ds Max? I am not an expert on 3ds Max, but what I am going to say here probably also applies (in a different way) to 3ds Max.
The main difference between MotionBuilder and Maya is that these products were designed with two completely different visions in mind. MotionBuilder was built for real-time animation, and Motion Capture is just a subset of the power of MotionBuilder. Maya, on the other hand, was designed for complex content creation. Two different visions, two different architectures.
The main issue for Maya is that, whatever you do, it will never be real-time: the Dependency Graph (DG) and the fact that Maya is mostly single-threaded prevent real-time features of any kind. As a side note, true real-time isn't possible anyway on a general-purpose preemptive OS such as Linux, Windows, or OS X. There are ways to get near real-time, though, so it really depends on what someone means by "real-time".
Back to Maya: a long time ago there was an API for devices, which was retired in the 2010 release, I think. Maya 2013 has reintroduced a device API, and it comes with three samples.
There is a randomizer example that generates data and feeds it into Maya, a UDP-based, Linux-only example, and a Windows-only game-controller example.
Internally, the Maya engineering team uses these classes to send character information from MotionBuilder to Maya in our Live Stream feature, which, by the way, also uses the HumanIK character representation library.
The class to look at if you want to write device support in Maya is MPxThreadedDeviceNode: http://docs.autodesk.com/MAYAUL/2013/ENU/Maya-API-Documentation/cpp_ref/class_m_px_threaded_device_node.html. If you take a look at it, you will see that it provides simple primitives for starting and stopping threads, and for passing data from a secondary thread to the primary one (where Maya operates). It is important to understand, as I said in a previous post, that you can only use the Maya API from the main Maya thread unless documented otherwise. So the device thread will only get something done in Maya when Maya is free, not, for example, in the middle of a DG evaluation - so not real-time. It also means that when Maya is busy, you have to cache your device data and wait for Maya.
I'll probably use some free time during the Christmas break to write a Kinect sample, as I bought one recently. I will share the sample when it is ready.