Deforming a lip mesh to match face-tracking points

Game Development Asked by Anubhav Roy on November 2, 2021

I’m creating a lipstick filter. I have a basic mesh for the lip shape (140 vertices) which I am able to render on screen.

I need the mesh because I have my own lighting system and therefore need the normal information. I also don’t want to generate the geometry at runtime: loading an OBJ file gives me the flexibility to render high-quality meshes.

The problem I am now facing is that when a user runs the filter, their face will be moving around in the video frame, and I will be getting the lips’ positions accordingly. So the mesh must be rendered differently in each frame (in one frame the user might be smiling, in the next frowning, and so on), at different places on the screen.

How can I set the OpenGL vertices to match the positions of the tracked feature points, and interpolate the rest? I have only 20-ish detection landmark points for each of the upper and lower lip.
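For illustration, one simple way to interpolate the remaining vertices (not from the original post) is inverse-distance weighting of the landmark displacements: each mesh vertex moves by a blend of how far the nearby control points moved from their rest positions. This is a minimal 2-D sketch; `deform_mesh` and its parameters are hypothetical, and a real filter would work in 3-D, recompute normals, and re-upload the buffer each frame:

```python
import math

def deform_mesh(rest_vertices, rest_landmarks, tracked_landmarks,
                power=2.0, eps=1e-8):
    """Move each mesh vertex by an inverse-distance-weighted blend of
    the landmark displacements (tracked position minus rest position)."""
    # Displacement of each control point from its rest pose.
    deltas = [(tx - rx, ty - ry)
              for (rx, ry), (tx, ty) in zip(rest_landmarks, tracked_landmarks)]
    deformed = []
    for vx, vy in rest_vertices:
        wsum = dx = dy = 0.0
        for (lx, ly), (ddx, ddy) in zip(rest_landmarks, deltas):
            # Closer landmarks get exponentially larger influence.
            w = 1.0 / (math.hypot(vx - lx, vy - ly) ** power + eps)
            wsum += w
            dx += w * ddx
            dy += w * ddy
        deformed.append((vx + dx / wsum, vy + dy / wsum))
    return deformed
```

A vertex that coincides with a landmark follows it almost exactly, while in-between vertices blend their neighbours’ motion; smoother results are usually obtained with radial-basis-function or Laplacian deformation, but the per-frame structure is the same.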
