Procedurally Generating MetaHuman Face Animations
Hello Everyone!
We are looking to generate a large number of face animations for some of the Unreal MetaHumans to help train a model we are developing.
I'm wondering whether it would be easier to produce these animations by scripting Maya's animation system and then exporting them all to Unreal, rather than trying to script the control rig directly in Unreal. A rough sketch of the Maya-side approach I have in mind is below.
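For context, something like this is what I'm picturing on the Maya side: batch-keying a handful of facial rig attributes and exporting each clip as FBX. The control and attribute names here (CTRL_expressions.jawOpen, etc.) are just placeholders for whatever the exported MetaHuman face rig actually exposes, and the value ranges are guesses.

```python
# Minimal sketch only -- control/attribute names and value ranges are placeholders,
# not the actual MetaHuman face rig controls.
import random
import maya.cmds as cmds

FACE_ATTRS = [
    "CTRL_expressions.jawOpen",      # assumed attribute names on the exported face rig
    "CTRL_expressions.browRaiseInL",
    "CTRL_expressions.mouthSmileL",
]

def key_random_expression(start=1, end=48):
    """Key each face attribute from a neutral pose to a random pose over a short clip."""
    cmds.playbackOptions(min=start, max=end)
    for attr in FACE_ATTRS:
        cmds.setKeyframe(attr, time=start, value=0.0)
        cmds.setKeyframe(attr, time=end, value=random.uniform(0.0, 1.0))

def export_clip(path):
    """Export the keyed animation as FBX for import into Unreal."""
    cmds.loadPlugin("fbxmaya", quiet=True)
    cmds.file(path, force=True, type="FBX export", exportAll=True,
              preserveReferences=True)

key_random_expression()
export_clip("C:/animations/clip_0001.fbx")
```

The idea would be to loop this over a few thousand randomized poses/curves and bring the resulting FBX clips into Unreal in bulk, instead of keying the control rig inside Sequencer.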
Any advice/suggestions you have to offer would be greatly appreciated.
We are using Unreal Engine 4.27 right now because we rely on Microsoft's AirSim plugin to generate the image data we need for training the computer vision model, but I'd also be open to entirely different approaches if anything occurs to you. Ultimately, we just need the rendered images of the facial animations plus a disparity map for each frame to serve as ground truth for the model.
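To clarify the ground-truth side, the rough plan is to grab an RGB frame and a planar depth image from AirSim each frame and convert depth to disparity with the usual d = f * B / Z relation. This is only a sketch under assumed values: the camera name, focal length, and baseline below are placeholders, not our real setup.

```python
# Sketch only: camera name, focal length, and baseline are made-up placeholder values.
import numpy as np
import airsim

FOCAL_PX = 320.0    # focal length in pixels (placeholder)
BASELINE_M = 0.25   # assumed stereo baseline in metres (placeholder)

client = airsim.VehicleClient()
client.confirmConnection()

responses = client.simGetImages([
    airsim.ImageRequest("front_center", airsim.ImageType.Scene, False, False),
    # Older AirSim builds spell this enum DepthPlanner instead of DepthPlanar.
    airsim.ImageRequest("front_center", airsim.ImageType.DepthPlanar, True),
])
rgb_resp, depth_resp = responses

# RGB training frame (uint8, H x W x 3).
rgb = np.frombuffer(rgb_resp.image_data_uint8, dtype=np.uint8)
rgb = rgb.reshape(rgb_resp.height, rgb_resp.width, 3)

# Float depth in metres, clipped to avoid divide-by-zero at the far plane.
depth = np.array(depth_resp.image_data_float, dtype=np.float32)
depth = depth.reshape(depth_resp.height, depth_resp.width)
depth = np.clip(depth, 1e-3, None)

# Ground-truth disparity from depth: d = f * B / Z
disparity = FOCAL_PX * BASELINE_M / depth
np.save("disparity_0001.npy", disparity.astype(np.float32))
```

If there's a cleaner way to get per-frame disparity ground truth straight out of Unreal/AirSim, I'd be happy to hear it.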
Thanks!!!