
How to track facial movements from one person to another

Posted: Sun Sep 06, 2020 6:18 pm
by edwardsledward

This is probably a basic question, but I couldn't find an answer anywhere.

I have two videos of two people talking (I already have the faces extracted). I want the first person's facial features to move in accordance with the second's, i.e. their face stays the same, but they appear to be saying what the second person was saying (example: https://www.youtube.com/watch?v=cQ54GDm1eL0).

What steps do I need to take to accomplish this? I can follow the step-by-step instructions in the documentation; it just isn't clear which steps you need to do in order to accomplish which goal.

Thanks in advance!


Re: How to track facial movements from one person to another

Posted: Mon Sep 07, 2020 3:43 am
by bryanlyon

What you're looking for isn't what Faceswap does. Faceswap takes one person's face and puts it onto another person. If you're just looking to animate a person with new motions, you could check out First Order Model https://github.com/AliaksandrSiarohin/first-order-model or its Avatarify frontend to reanimate your person.
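
For reference, here's a rough sketch of how that would look with First Order Model, based on its repo README. The checkpoint file name (vox-cpk.pth.tar) and config are the ones the README uses for faces; check the repo for current download links, as these may change:

```shell
# Clone the repo and install its dependencies
git clone https://github.com/AliaksandrSiarohin/first-order-model
cd first-order-model
pip install -r requirements.txt

# You also need a pretrained face checkpoint (vox-cpk.pth.tar);
# download links are listed in the repo README.

# source.png  = a still image of the face you want to keep
# driving.mp4 = the video whose motion you want to transfer
python demo.py --config config/vox-256.yaml \
               --source_image source.png \
               --driving_video driving.mp4 \
               --checkpoint vox-cpk.pth.tar \
               --relative --adapt_scale \
               --result_video result.mp4
```

The output video shows the source face moving (and "speaking") along with the driving video, which sounds like exactly what you described.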