I'm working on a project in which establishing a workflow to suit my needs has been an uphill battle. Below is the outline I'm working from:
1. Kinect skeleton streaming
2. Cloth simulation, set the skeleton as a collider
3. Visualization from four camera angles
So far I've seen how to integrate the Kinect with MotionBuilder using a third-party tool called Brekel Pro Body v2. The second point has only been handled offline: I ran the simulation in 3ds Max with the Cloth modifier. What troubles me is how to drive the cloth simulation from the Kinect position markers in real time. For the visualization, I suspect there is a way to freely drag or tear off viewports from the 3ds Max interface, but I don't know how to do it. No rendering is needed.
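To make the real-time question concrete, here is a minimal sketch of the per-frame loop I have in mind, independent of any particular package: joint positions arrive from the tracker each frame, capsule colliders are fitted to the bone segments, and a simple Verlet particle cloth is stepped and pushed out of the capsules. This is only an illustration of the concept, not 3ds Max or Unity API code; `get_joint_positions` is a placeholder for the live Kinect/Brekel stream, and the joint names and numbers are made up.

```python
import math

GRAVITY = (0.0, -9.8, 0.0)
DT = 1.0 / 30.0  # assume a 30 fps tracker stream

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def add(a, b): return (a[0]+b[0], a[1]+b[1], a[2]+b[2])
def mul(a, s): return (a[0]*s, a[1]*s, a[2]*s)
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def length(a): return math.sqrt(dot(a, a))

def closest_point_on_segment(p, a, b):
    ab = sub(b, a)
    t = dot(sub(p, a), ab) / max(dot(ab, ab), 1e-9)
    t = min(1.0, max(0.0, t))
    return add(a, mul(ab, t))

def push_out_of_capsule(p, a, b, radius):
    """If particle p is inside the capsule (segment a-b, radius), project it to the surface."""
    c = closest_point_on_segment(p, a, b)
    d = sub(p, c)
    dist = length(d)
    if dist < radius:
        if dist < 1e-9:
            d, dist = (0.0, 1.0, 0.0), 1.0
        return add(c, mul(d, radius / dist))
    return p

class Cloth:
    def __init__(self, positions, constraints):
        self.pos = list(positions)
        self.prev = list(positions)
        self.constraints = constraints  # list of (i, j, rest_length)

    def step(self, capsules, iterations=4):
        # Verlet integration: velocity is implicit in (pos - prev)
        for i, p in enumerate(self.pos):
            v = sub(p, self.prev[i])
            self.prev[i] = p
            self.pos[i] = add(add(p, v), mul(GRAVITY, DT * DT))
        # Relax distance constraints, then collide with the skeleton capsules
        for _ in range(iterations):
            for i, j, rest in self.constraints:
                d = sub(self.pos[j], self.pos[i])
                dist = max(length(d), 1e-9)
                corr = mul(d, 0.5 * (dist - rest) / dist)
                self.pos[i] = add(self.pos[i], corr)
                self.pos[j] = sub(self.pos[j], corr)
            for a, b, r in capsules:
                self.pos = [push_out_of_capsule(p, a, b, r) for p in self.pos]

def get_joint_positions(frame):
    """Placeholder for the live Kinect/Brekel joint stream (static pose here)."""
    return {"shoulder": (0.0, 1.5, 0.0), "elbow": (0.0, 1.1, 0.0)}

# Tiny two-particle "cloth" hanging near the upper-arm capsule
cloth = Cloth([(0.05, 1.4, 0.0), (0.05, 1.3, 0.0)], [(0, 1, 0.1)])
for frame in range(30):
    joints = get_joint_positions(frame)
    capsules = [(joints["shoulder"], joints["elbow"], 0.08)]
    cloth.step(capsules)
```

The point of the sketch is the data flow, not the solver: whatever engine is used, the tracker only has to update collider transforms each frame, and the cloth system does the rest. This is essentially what Unity's built-in Cloth component does when capsule colliders are parented to skeleton bones, which is one reason a game engine may be the easier route for the real-time part.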
Would this be easier to solve in a different program, such as Unreal or Unity? If so, how?