How to track facial movements from one person to another

edwardsledward
Posts: 1
Joined: Sun Sep 06, 2020 6:07 pm

How to track facial movements from one person to another

Post by edwardsledward »

This is probably a basic question, but I couldn't find an answer anywhere.

I have two videos of two people talking (already have the faces extracted). I want the first person's facial features to move in accordance with the second's (i.e. their face looks the same but they're saying what the second person was saying, example: https://www.youtube.com/watch?v=cQ54GDm1eL0).

What steps do I need to take to accomplish this? I can follow the step-by-step instructions in the documentation; it just isn't clear which steps you need to take to accomplish which goal.

Thanks in advance!

bryanlyon
Site Admin
Posts: 473
Joined: Fri Jul 12, 2019 12:49 am
Answers: 39
Location: San Francisco

Re: How to track facial movements from one person to another

Post by bryanlyon »

What you're looking for isn't what Faceswap does. Faceswap takes one person's face and puts it onto another person's. If you just want to animate a person with new motions, you could check out First Order Model https://github.com/AliaksandrSiarohin/first-order-model or its Avatarify frontend to reanimate your person.
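For reference, the First Order Model repo's demo script is typically invoked along these lines (per its README). This is just a sketch: the config name, checkpoint file, and media paths below are placeholders you'd substitute with your own, and you need the repo cloned and a pretrained checkpoint downloaded first.

```shell
# Run from inside the cloned first-order-model repo.
# All paths below are placeholders -- substitute your own files.
# vox-cpk.pth.tar is one of the pretrained checkpoints linked from the README.
python demo.py \
  --config config/vox-256.yaml \
  --checkpoint path/to/vox-cpk.pth.tar \
  --source_image path/to/person_one_face.png \
  --driving_video path/to/person_two_talking.mp4 \
  --result_video result.mp4 \
  --relative --adapt_scale
```

Here the source image is the face you want to keep (your first person) and the driving video provides the motion (your second person talking), which matches what you described.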
