How to track facial movements from one person to another

Discussions about research, Faceswapping and things that don't fit in the other categories here.


Locked
edwardsledward
Posts: 1
Joined: Sun Sep 06, 2020 6:07 pm

How to track facial movements from one person to another

Post by edwardsledward »

This is probably a basic question, but I couldn't find an answer anywhere.

I have two videos of two people talking (I already have the faces extracted). I want the first person's facial features to move in accordance with the second's, i.e. the face still looks like the first person, but it moves and mouths what the second person was saying (example: https://www.youtube.com/watch?v=cQ54GDm1eL0).

What steps do I need to take to accomplish this? I can follow the step-by-step instructions in the documentation; it's just not clear which steps you need in order to accomplish which goal.

Thanks in advance!

bryanlyon
Site Admin
Posts: 793
Joined: Fri Jul 12, 2019 12:49 am
Answers: 44
Location: San Francisco
Has thanked: 4 times
Been thanked: 218 times

Re: How to track facial movements from one person to another

Post by bryanlyon »

What you're looking for isn't what Faceswap does. Faceswap takes one person's face and puts it onto another person's. If you're looking to animate a person with new motions, check out First Order Model (https://github.com/AliaksandrSiarohin/first-order-model) or its Avatarify frontend to reanimate your person.
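For reference, the basic reanimation command from the first-order-model repo's README looks roughly like the sketch below. The file paths and checkpoint name here are placeholders, and the exact flags may change, so check the repo's README for the current usage and the checkpoint download links:

```shell
# Sketch of a first-order-model demo run (assumes the repo is cloned,
# its dependencies installed, and a pretrained checkpoint downloaded).
python demo.py \
  --config config/vox-256.yaml \        # model config for face videos
  --driving_video driving.mp4 \         # the person whose motion you want
  --source_image source.png \           # the face you want to animate
  --checkpoint vox-cpk.pth.tar \        # pretrained checkpoint (see README)
  --relative --adapt_scale              # transfer relative motion, adapt to face scale
```

The driving video supplies the motion (here, the second person talking), and the source image supplies the identity (the first person), which matches what you described.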
