<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Live stream facial motion capture with iPhone X in MotionBuilder Forum</title>
    <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7584210#M1954</link>
    <description>&lt;P&gt;You're awesome for responding &lt;A href="https://forums.autodesk.com/t5/user/viewprofilepage/user-id/1274550" target="_self"&gt;&lt;SPAN&gt;KikoBarahona&lt;/SPAN&gt;&lt;/A&gt;. &lt;SPAN&gt;Faceshift and Kinect&lt;/SPAN&gt; worked great for us too a couple of years back, but they had a MotionBuilder plugin. I was hoping someone might have made a plugin I could use with the iPhone X or another phone with a depth camera. I don't know how to stream the blendshapes without a plugin, but I can't imagine it would be too hard. I'm also curious whether the MoBu Faceshift plugin would work with a minor tweak, if anybody knows.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Putting a phone on a helmet to stream facial mocap while simultaneously streaming body data with something like Perception Neuron would make me a very happy person!&lt;/P&gt;</description>
    <pubDate>Thu, 30 Nov 2017 00:31:22 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2017-11-30T00:31:22Z</dc:date>
    <item>
      <title>Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7564056#M1952</link>
      <description>&lt;P&gt;What would I need to live stream facial mocap data with the iPhone X depth camera? There is already a blendshape dictionary created within the phone, according to the ARKit docs, but I don't see how one would stream it into MotionBuilder to drive a 3D face. I'd like to use it simultaneously with Perception Neuron streaming. Any suggestions? Can anyone point me in the right direction?&lt;/P&gt;</description>
      <pubDate>Wed, 22 Nov 2017 05:18:52 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7564056#M1952</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-11-22T05:18:52Z</dc:date>
    </item>
    <item>
      <title>Re: Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7567137#M1953</link>
      <description>&lt;P&gt;I'm guessing you would have to map those blendshapes to an Actor/Character face in MotionBuilder.&lt;/P&gt;
&lt;P&gt;We used Character Face to drive a cluster-based face rig in MotionBuilder with Faceshift and a Kinect camera.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Nov 2017 02:10:54 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7567137#M1953</guid>
      <dc:creator>KikoBarahona</dc:creator>
      <dc:date>2017-11-23T02:10:54Z</dc:date>
    </item>
    <item>
      <title>Re: Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7584210#M1954</link>
      <description>&lt;P&gt;You're awesome for responding &lt;A href="https://forums.autodesk.com/t5/user/viewprofilepage/user-id/1274550" target="_self"&gt;&lt;SPAN&gt;KikoBarahona&lt;/SPAN&gt;&lt;/A&gt;. &lt;SPAN&gt;Faceshift and Kinect&lt;/SPAN&gt; worked great for us too a couple of years back, but they had a MotionBuilder plugin. I was hoping someone might have made a plugin I could use with the iPhone X or another phone with a depth camera. I don't know how to stream the blendshapes without a plugin, but I can't imagine it would be too hard. I'm also curious whether the MoBu Faceshift plugin would work with a minor tweak, if anybody knows.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Putting a phone on a helmet to stream facial mocap while simultaneously streaming body data with something like Perception Neuron would make me a very happy person!&lt;/P&gt;</description>
      <pubDate>Thu, 30 Nov 2017 00:31:22 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7584210#M1954</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2017-11-30T00:31:22Z</dc:date>
    </item>
    <item>
      <title>Re: Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7595583#M1955</link>
      <description>&lt;P&gt;There's an &lt;A href="http://www.cartoonbrew.com/tech/can-iphone-xs-true-depth-camera-used-performance-capture-154706.html" target="_blank"&gt;article on Cartoon Brew&lt;/A&gt; about doing facial animation with the iPhone.&lt;/P&gt;
&lt;P&gt;It uses Apple ARKit and Houdini (SideFX ships a free version of Houdini, btw).&lt;/P&gt;
&lt;P&gt;Hope that helps.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;KB&lt;/P&gt;</description>
      <pubDate>Mon, 04 Dec 2017 21:17:52 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/7595583#M1955</guid>
      <dc:creator>KikoBarahona</dc:creator>
      <dc:date>2017-12-04T21:17:52Z</dc:date>
    </item>
    <item>
      <title>Re: Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/12952678#M1957</link>
      <description>&lt;P&gt;To live stream facial motion capture data from the iPhone X's depth camera into MotionBuilder, you'll need a pipeline that captures the data with ARKit and transfers it to your workstation. Here's a step-by-step approach to get you started:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;ARKit Integration&lt;/STRONG&gt;: Use ARKit to capture the facial motion data. You can create a custom iOS app that accesses ARKit's face-tracking features and reads the blendshape coefficients.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Data Streaming&lt;/STRONG&gt;: Implement a mechanism in the iOS app that sends the blendshape data over a network connection, e.g. a socket connection (WebSockets, plain TCP, or UDP) or a custom protocol, to your computer.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Receiver Setup&lt;/STRONG&gt;: On your computer, set up a receiver that listens for incoming data from the iOS app. This could be a custom application or script that parses the received blendshape data.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;MotionBuilder Integration&lt;/STRONG&gt;: Use MotionBuilder's scripting or SDK capabilities (Python via pyfbsdk, or the C++ Open Reality SDK) to interpret the incoming blendshape data and map it to the corresponding facial controls in MotionBuilder.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Simultaneous Streaming&lt;/STRONG&gt;: For simultaneous streaming with Perception Neuron, make sure your setup can handle multiple data streams. You might need a unified interface or middleware that combines data from both sources before sending it to MotionBuilder.&lt;/P&gt;&lt;/LI&gt;&lt;LI&gt;&lt;P&gt;&lt;STRONG&gt;Testing and Calibration&lt;/STRONG&gt;: Test your setup to ensure accurate data transfer and synchronization, and adjust the mapping as needed so the facial motion is correctly reflected in MotionBuilder.&lt;/P&gt;&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;For further guidance, check the ARKit documentation, MotionBuilder's API references, and developer forums. Engaging with communities or experts who have worked on similar integrations can also provide valuable insights. Hope it helps.&lt;/P&gt;</description>
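A minimal sketch of the receiver side (steps 3 and 4 in the list above), assuming the phone app sends one JSON object per line over TCP, with ARKit blendshape names (e.g. `jawOpen`) as keys and 0 to 1 coefficients as values; that wire format, the port, and the MotionBuilder property mapping mentioned in the comments are assumptions for illustration, not a tested pyfbsdk recipe:

```python
import json
import socket

def parse_packet(line):
    """Parse one newline-delimited JSON packet into {blendshape: weight}.

    Weights are clamped to the 0..1 range ARKit uses, so a glitchy
    packet cannot push a morph target past its limits.
    """
    data = json.loads(line)
    return {name: min(max(float(w), 0.0), 1.0) for name, w in data.items()}

def serve(host="0.0.0.0", port=9000, on_frame=print):
    """Listen for a single phone connection and hand each frame to on_frame.

    Inside MotionBuilder, on_frame would write each weight to the matching
    shape property on the face mesh via pyfbsdk, e.g. (names hypothetical):
        model.PropertyList.Find(name).Data = weight * 100.0
    """
    with socket.create_server((host, port)) as srv:
        conn, _addr = srv.accept()
        with conn, conn.makefile("r") as stream:
            for line in stream:
                on_frame(parse_packet(line))
```

For live mocap, UDP is often preferred over TCP since an occasional dropped frame is harmless, while TCP retransmission can add latency spikes; the parsing and mapping logic stays the same either way.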
      <pubDate>Mon, 12 Aug 2024 04:00:13 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/12952678#M1957</guid>
      <dc:creator>taraftarium820</dc:creator>
      <dc:date>2024-08-12T04:00:13Z</dc:date>
    </item>
    <item>
      <title>Re: Live stream facial motion capture with iPhone X</title>
      <link>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/13024578#M1958</link>
      <description>&lt;P&gt;There is a Face Cap app for iPhone, which you can use to record or stream data in real time:&amp;nbsp;&lt;A href="https://apps.apple.com/us/app/face-cap-motion-capture/id1373155478" target="_blank"&gt;https://apps.apple.com/us/app/face-cap-motion-capture/id1373155478&lt;/A&gt;&lt;/P&gt;&lt;P&gt;And there is an open-source device plugin for the Face Cap stream as part of OpenMoBu; you can find it in the Releases section of the GitHub repository:&amp;nbsp;&lt;A href="https://github.com/Neill3d/OpenMoBu/releases/tag/2024" target="_blank"&gt;https://github.com/Neill3d/OpenMoBu/releases/tag/2024&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 16 Sep 2024 21:16:00 GMT</pubDate>
      <guid>https://forums.autodesk.com/t5/motionbuilder-forum/live-stream-facial-motion-capture-with-iphone-x/m-p/13024578#M1958</guid>
      <dc:creator>_neill_</dc:creator>
      <dc:date>2024-09-16T21:16:00Z</dc:date>
    </item>
  </channel>
</rss>

