
iPhone X Facial Capture – Apple blendshapes

May 22, 2018

Project bebyface is a series of tests I’ve been doing to find a fast pipeline for performance capture (face and body) that we can use for our upcoming game Bebylon: Battle Royale. http://bebylon.world

Check out the previous tests (starting with the most recent) for more info on the iPhone X capture process.

Apple ARKit Blendshapes

Along this journey, a super cool developer extracted and sent me Apple’s blendshapes used in ARKit, which I used in Maya (with built-in deformers) to generate a fresh set of blendshapes for our beby character. They weren’t a drastic improvement over the makeshift blendshapes I originally made, but they definitely helped achieve a more natural look, and they act as a perfect starting point for creative augmentation. I wish this were a more full-blown tutorial on that process (hopefully that’s something I can do very soon), but in the meantime I’m attaching a link to the Apple blendshape set that you can use as a reference or to generate blendshapes for your own character.
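The core of that Maya process is delta transfer: each Apple OBJ differs from Apple’s neutral head by a set of per-vertex offsets, and the wrap deformer carries those offsets onto your own mesh. Stripped of the Maya deformers, the math looks something like this sketch in plain Python (the vertex data below is made up for illustration, not the actual OBJ contents):

```python
# Sketch of blendshape delta transfer: each generated target is the
# character's neutral mesh plus the deltas the source shape introduces.
# Vertex lists here are hypothetical stand-ins for the OBJ data.

def shape_deltas(source_neutral, source_shape):
    """Per-vertex offsets a blendshape adds to its neutral mesh."""
    return [
        tuple(s - n for s, n in zip(shape_v, neutral_v))
        for neutral_v, shape_v in zip(source_neutral, source_shape)
    ]

def apply_deltas(char_neutral, deltas, weight=1.0):
    """Generate a new target for the character by re-adding the deltas.
    In Maya the wrap deformer handles vertex correspondence; here we
    assume the meshes already share vertex order."""
    return [
        tuple(c + weight * d for c, d in zip(char_v, delta_v))
        for char_v, delta_v in zip(char_neutral, deltas)
    ]

# Tiny 2-vertex example: the source's jawOpen shape drops vertex 1 by 0.5.
apple_neutral = [(0.0, 1.0, 0.0), (0.0, 0.0, 0.0)]
apple_jaw_open = [(0.0, 1.0, 0.0), (0.0, -0.5, 0.0)]

beby_neutral = [(0.0, 1.2, 0.1), (0.0, 0.1, 0.1)]

deltas = shape_deltas(apple_neutral, apple_jaw_open)
beby_jaw_open = apply_deltas(beby_neutral, deltas)
print(beby_jaw_open)  # vertex 1 inherits the -0.5 drop in Y
```

In Maya the delta mush deformer then smooths those transferred offsets so they blend into the rest of the head instead of ending abruptly at the face boundary.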

The Apple blendshapes are provided as OBJ files:
 Apple blendshapes51 OBJs.zip

32 Comments
  1. Reply

    James Mack

    May 26, 2018

    Hey, I loved your demo of Apple’s motion tracking tech. I have the original Faceshift and I was wondering how Apple was going to develop it. One quick question: will you be releasing the app you are using for the capture and transfer to Maya commercially? I want to develop my own pipeline and this looks promising. Before I invest the time in programming, I would like to know what your future plans are. Perhaps I can wait for you to release what you have.

    • Reply

      Cory Strassburger

      May 28, 2018

      Thanks James! I don’t have any plans to develop a commercial version, partly due to lack of time, and because I mainly developed this for generating data for our game. I think it’s a worthy workflow to build a commercial app around, especially considering how fast, easy, and flexible it is for the quality.

      • Reply

        amine

        July 3, 2018

        Hello Cory!
        Amazing progress!
        I just want to ask: how is the iPhone X performance compared to the old Faceshift software, if you have tried the latter?
        Thank you!

        • Reply

          Cory Strassburger

          July 12, 2018

          Hey amine, I have used Faceshift in the past, and from my experience I’d say the ARKit implementation is on par with the desktop version in terms of overall quality. Faceshift of course had a lot of features that I don’t believe are accessible (or even integrated) in ARKit, but that hasn’t been an issue for me so far.

      • Reply

        Graham

        July 11, 2018

        Wow uhhhh you REALLY should release something only because it would blow the lid off of other motion tracking software. EVERYONE would be flooding your way to create things and you’d START A REVOLUTION!

        … Well, maybe less dramatic than that, but wow, seriously… I can’t believe all of this. It’s something I always wanted when the iPhone X was announced, and I think you’ve got the best bet on making something quality. Keep up the good work; it’s extremely fascinating and I can’t wait to see your insane battle royale game come to life with this!

        • Reply

          Cory Strassburger

          July 12, 2018

          Lol, thanks Graham!!

    • Reply

      Cory Strassburger

      August 10, 2018

      Hey James, I’m obviously a bit lacking in response time! Did you end up jumping into it? I think a few apps have surfaced that accomplish the general task, but I’m not sure how easily that data can get into Maya. In Unreal Engine 4.20 it’s a bit easier to just capture the data live into the engine, but I’m not sure you can export it into Maya yet.

  2. Reply

    J.N

    August 9, 2018

    What a great job! Amazing 😆👍💕 I have a question about 2:28 in the part 4 video, where you merge another head.
    How did you apply the blend shape when you combined the modeling together (face area and back of the head)?
    The vertex values change, don’t they?

    • Reply

      Cory Strassburger

      August 10, 2018

      Thanks J.N! My blendshapes for the beby are actually of the whole head (not just the front of the face). What you’re seeing is the Apple blendshapes, which are just the front of the face, but I only used them to deform my beby character’s face. I basically ‘wrap deform’ the Apple blendshape to my beby character’s head and use ‘delta mush’ to blend it smoothly and maintain the general shape of my beby character. The combination of the two deformers allows the Apple blendshape to deform my beby character’s face with a smooth transition into the rest of its head, if that makes sense? Honestly, there is probably a better way to do it, maybe a plugin that makes the whole process faster with a better transfer of the deltas.

  3. Reply

    Luffy

    October 15, 2018

    Hey Cory, I have been paying attention to your development. Thank you for bringing us so many surprises. I have always had a question: my model is very cartoony with large round eyes. I basically ‘wrap deform’ the Apple blendshapes to my character’s head, but unfortunately the eyes of my model always have problems; I can’t close them. Is there any way to solve it? Looking forward to your reply.

  4. Reply

    LUFFY

    October 15, 2018

    Hey man, my 3D model has big eyes and I tried to use “wrap deform” to make blend shapes according to the 51 blendshapes from ARKit. But the problem is my model cannot close its eyes completely, and I already tried my best but still can’t figure it out. Could you tell me how to fix this please? Thanks.

    • Reply

      Cory Strassburger

      October 16, 2018

      If it’s just the eyes-closed blendshapes, I would use the grab sculpt tool and just manually close them the rest of the way.

  5. Reply

    Nick

    October 16, 2018

    Hey Cory, Is this blendshape generation method what you used for your final product or did you go another route?

    • Reply

      Cory Strassburger

      October 16, 2018

      Hey Nick, that method was the starting point for the final results and then I sculpted on top of a bunch of key shapes after that.

  6. Reply

    pavel

    November 8, 2018

    Hello Cory, I have a question about the mouthClose blendshape.
    It should look ugly, same as in Faceshift, because it’s a corrective pose that brings the lips closed while the jawOpen blendshape is at 100%, or not? For instance, in the scene with the sloth, the mouthClose blendshape acted like a corrective. Sorry for my bad English!

    • Reply

      Cory Strassburger

      November 9, 2018

      Hey Pavel, that’s right: mouthClose should be lips together while jawOpen is at 100%, which looks overblown by itself. Typically the captured value of mouthClose is very, very small.
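
      To make that corrective behavior concrete, here’s a toy numeric sketch (all offsets made up) of why mouthClose looks broken alone but correct when combined with jawOpen:

```python
# Toy illustration of why mouthClose looks "ugly" in isolation:
# it is a corrective shape, authored to cancel jawOpen's lip gap.
# All numbers are made up; we track a single lower-lip Y value.

NEUTRAL_LIP_Y = 0.0
JAW_OPEN_DELTA = -1.0    # jawOpen at 100% drops the lower lip by 1 unit
MOUTH_CLOSE_DELTA = 1.0  # mouthClose at 100% raises it back by 1 unit

def lower_lip_y(jaw_open, mouth_close):
    """Blendshapes are additive: each weight scales its delta."""
    return NEUTRAL_LIP_Y + jaw_open * JAW_OPEN_DELTA + mouth_close * MOUTH_CLOSE_DELTA

# mouthClose alone: lip pushed way above neutral -> the "overblown" look.
print(lower_lip_y(jaw_open=0.0, mouth_close=1.0))   # 1.0

# Both at 100%: the jaw is open but the lips meet again (back to neutral).
print(lower_lip_y(jaw_open=1.0, mouth_close=1.0))   # 0.0

# Typical capture: tiny mouthClose values nudge the lips shut.
print(lower_lip_y(jaw_open=0.2, mouth_close=0.15))
```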

  7. Reply

    Daniel

    November 14, 2018

    Wow, it is amazing what you’ve done. I’m on my way to making an iOS app and am currently working on the model. I was not able to import the face model made in 3ds Max and exported in DAE format; after importing into Xcode, all the blend shape keys are broken. Can you help me understand why this is happening?
    Is it a problem with using 3ds Max, and should I use Maya? Or is there another reason?

    • Reply

      Cory Strassburger

      November 19, 2018

      Thanks Daniel! I’ve been using Unreal Engine for all the coding side and haven’t had to mess with Xcode directly, so I’m not sure what you’re running into. Exporting from Max should be fine. Have you tried exporting to FBX vs. DAE?

  8. Reply

    Nick

    November 15, 2018

    Hey Cory,

    thanks for the response earlier. I’m using your wrap deformer technique, but I’m having a hard time when it comes to the lips because they are very close together. The top lip gets affected by the movement of the bottom lip and vice versa. I was wondering if you had a process for this? The best way I can get it working is by opening the mouth a little on my main character and aligning the inner lips of the apple mask to fit through the slot. I’d prefer if there was a cleaner way of doing this without having to open the mouth on my character.

    Thanks!

    • Reply

      Cory Strassburger

      November 19, 2018

      Sadly Nick, I’m in the same boat. I’ve had to adjust the lips on a couple of the characters for the same reason. I might be re-approaching the process on some new characters coming up, and if that pans out I’ll let you know.

  9. Reply

    Nick

    November 22, 2018

    Hey Cory, sounds awesome, thanks. So far I’ve been readjusting the lips after I’ve applied the blendshapes and it works out fine. I’ve finished one character using only the apple masks deformations on the model (without any extra work done) and it’s turned out pretty good. I’m going to be running the same setup you used for your Siggraph presentation, so I should be able to pump out some content pretty soon.

    Thanks!

  10. Reply

    yilog

    March 24, 2019

    Good job! The expressions are very real.
    I want to know how the developer extracted the blendshapes from the iPhone X.
    Is it possible to extract an individual’s blendshapes when they appear in front of the iPhone X?

    • Reply

      Cory Strassburger

      March 25, 2019

      Thanks! I imagine that should be possible. I’d look into the ARKit framework on Apple’s developer site.

  11. Reply

    Sophia Kyriacou

    May 21, 2019

    Hello Cory,
    I am totally inspired by your workflow! I love your capture test Part 2! It makes me laugh every time!! Love it and can’t wait to see it develop. Really interesting pipeline. Best of luck!

  12. Reply

    jack

    July 15, 2019

    Hi Cory, great work with Beby!
    I’m very curious how you made the connection between the 51 values from the iPhone X and the blend shapes for the Beby.

    Are you connecting each value to its corresponding blend shape, or do you have a mixed/more complex setup?

    How are you smoothing the curves and removing noise in Unreal?

    Any chance you can share a view of the Blueprints involved in the animation process ?

    Thanks !!

    • Reply

      Cory Strassburger

      July 15, 2019

      Hey Jack, have you checked out the Face AR Sample that Epic created? You can download the project from their launcher, and it has everything you need already set up with their KiteBoy character. I’m using the same setup, and it’s more or less a direct connection between the iPhone X values and the blendshapes. For smoothing, they created an averaging node that drops into the anim BP flow. https://docs.unrealengine.com/en-US/Platforms/AR/HandheldAR/FaceARSample/index.html
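
      For anyone rebuilding that smoothing outside of Unreal, the averaging node is essentially a moving-average filter over the incoming per-frame weights. A stand-alone Python sketch (the window size is just a starting point to tune):

```python
from collections import deque

class CurveSmoother:
    """Simple moving average over the last N captured values,
    similar in spirit to the averaging node in Epic's Face AR Sample."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def push(self, value):
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

# Noisy jawOpen capture: the spike at frame 2 gets damped.
smoother = CurveSmoother(window=3)
raw = [0.10, 0.12, 0.90, 0.14, 0.11]
smoothed = [smoother.push(v) for v in raw]
print(smoothed)
```

A bigger window gives smoother curves but adds latency, so it’s worth keeping it small for live capture.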

  13. Reply

    jack

    July 17, 2019

    I’ve looked into their example project. I’ve tried replacing the head with one of my own, but the results are far from good.
    I thought you were doing something more with Beby.
    I’m guessing that most of the work you did was on the morph targets, am I right?
    Some of the values coming from ARKit are quite subtle and will probably need adjustments to the head mesh. For instance, I can’t seem to get a proper closed mouth or a clear sad face.

  14. Reply

    Cory Strassburger

    July 17, 2019

    You’re right, most of the work is in the morph targets. I’m not doing anything beyond what the Unreal sample is doing (at least with the tests I’ve posted so far). I’m sure you’ve downloaded the Apple blendshapes from above; there are a few surprises, like the lower lip that pushes really high towards the nose (I forget the actual shape name). That’s the key to getting a closed mouth with such small values. It looks like a hack on their end?
    I literally used those Apple blendshapes to create my base shapes and maybe exaggerated a couple in ZBrush.
    I also took 10–12 key shapes into ZBrush, sculpted wrinkles and more defined expressions, baked those normals, then connected them to the beby’s material in UE4. They’re driven by the ARKit data. Basic wrinkle map process.
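
    The wrinkle-map hookup is simple in principle: a few ARKit blendshape values drive how strongly each baked normal map blends in. A rough Python sketch of that driver logic (the shape names are real ARKit keys, but the grouping into a “brow” region and the gain are my own illustrative choices):

```python
# Sketch of driving a wrinkle normal map from ARKit blendshape values.
# In UE4 this lives in the material/anim blueprint; here it's plain Python.
# The grouping of shapes per wrinkle region is an illustrative assumption.

BROW_WRINKLE_DRIVERS = ["browInnerUp", "browDownLeft", "browDownRight"]

def wrinkle_weight(blend_values, drivers=BROW_WRINKLE_DRIVERS, gain=1.0):
    """Blend-in strength for one wrinkle map: the strongest driving
    shape, scaled and clamped to [0, 1]."""
    strongest = max(blend_values.get(name, 0.0) for name in drivers)
    return min(max(strongest * gain, 0.0), 1.0)

# One captured frame of ARKit weights (values are made up).
frame = {"browInnerUp": 0.7, "browDownLeft": 0.2, "jawOpen": 0.4}
print(wrinkle_weight(frame))              # 0.7
print(wrinkle_weight(frame, gain=2.0))    # clamped to 1.0
```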

  15. Reply

    Jan

    August 13, 2019

    Hi Cory,

    first off: Amazing work you’ve done here. It’s awesome to see what you are able to do with a relatively cheap Mocap-Setup (compared to others). Aaaaand I freakin’ love your acting – it really pushes your results and makes this whole project even more jawdropping!

    I saw that you adjusted/created all blendshapes for different characters (e.g. the alien Han) to conform with the Apple blendshapes. I’m wondering if you could actually create a “base mesh” to capture your facial animations (e.g. Beby-Face) and change some aspects of the character later on. For example, creating an additional blendshape that drives the character’s weight (let’s say you have a simple character editor to change him up at the beginning of your game), so you could alter the character slightly without having to create blendshapes for every variation? Sure enough, there would probably be some issues with the blendshapes if it gets too extreme (speaking of double chins…).

    I’ve also read about the new Facial Animation Sharing. Might be interesting if you could combine the FaceCapture through the iPhoneX (recording the data in Unreal) and this feature to reuse animations over “variable” characters.

    Cheers to you! Keep up the great work 🙂
    Jan

    • Reply

      Cory Strassburger

      August 27, 2019

      Lol, thanks a lot Jan!!

      What you’re describing, creating various base ‘faces’, works pretty well and is exactly what I’m doing with the bebies. I have one base mesh with all 51 blendshapes plus additional ‘character’ blendshapes. This way I can turn on Character A’s blendshape and utilize the same 51 blendshapes. As you pointed out, if the character faces are too different, that trick starts to fall off. Ultimately, for the short film I’m working on, I’m going to generate specific blendshapes for each hero character just to get the most unique, expressive quality possible.

      I tried the facial animation sharing but found for my setup it actually didn’t have any benefits because all the animation was already shareable (so to speak). Though it will probably come in handy once I start creating the various unique characters.
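
      That base-mesh trick is just additive blendshape layering: the character shape is one more weighted delta stacked on the shared base, so the same 51 capture weights drive every character. A minimal sketch (single-float “vertices” and made-up numbers, purely for illustration):

```python
# Additive blendshape stack: one shared base mesh, a per-character
# shape switched fully on, plus the 51 shared expression shapes.
# A single float stands in for a vertex; real data is per-vertex xyz.

def evaluate(base, character_delta, expression_deltas, weights):
    """Final point = base + character delta + weighted expression deltas."""
    value = base + character_delta
    for name, delta in expression_deltas.items():
        value += weights.get(name, 0.0) * delta
    return value

base = 1.0
char_a = 0.3          # Character A's face, stored as a delta from the base
expressions = {"jawOpen": -0.5, "browInnerUp": 0.1}

# The same capture weights drive any character built on the base mesh.
frame = {"jawOpen": 0.4}
print(evaluate(base, char_a, expressions, frame))
```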

  16. Reply

    Jake

    August 27, 2019

    Hey Cory,

    Really amazing job! I was trying to reproduce your demo. Where did you get your helmet and phone mount?
