From 2D to 3D real-time expression transfer for facial animation

Beste Ekmen, Hazım Kemal Ekenel

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we present a three-stage approach that creates realistic facial animations by tracking the expressions of a human face in 2D and transferring them to a human-like 3D model in real time. Our calibration-free method, which is based on an average human face, does not require training. Tracking is performed with a single camera to enable practical applications, for example on tablets and mobile devices, and the expressions are transferred with a joint-based system to improve the quality and persuasiveness of the animations. In the first stage, a joint-based facial rig providing mobility to pseudo-muscles is attached to the 3D model. The second stage covers tracking the 2D positions of facial landmarks from a single camera view and transferring the 3D relative movement data to move the respective joints on the model. The last stage records the animation using a partially automated key-framing technique. Experiments on the extended Cohn-Kanade dataset, using peak frames of frontal-view videos, show that the presented method produces visually satisfying facial animations.
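The second stage transfers landmark displacements, measured relative to a neutral face, onto the joints of the rig. The sketch below illustrates this idea under stated assumptions: it presumes landmarks are already tracked as (x, y) arrays, and the landmark indices, joint names, and normalization are purely illustrative, not the authors' implementation.

import numpy as np

# Hypothetical mapping from landmark indices (68-point convention assumed)
# to rig joint names; the paper's rig and landmark set may differ.
LANDMARK_TO_JOINT = {
    48: "jaw_corner_L",   # left mouth corner
    54: "jaw_corner_R",   # right mouth corner
    19: "brow_L",         # left eyebrow
    24: "brow_R",         # right eyebrow
}

def relative_joint_offsets(neutral, current, left_eye_idx=36, right_eye_idx=45):
    """Convert 2D landmark displacements into normalized joint offsets.

    neutral, current: (N, 2) arrays of tracked landmark positions.
    Displacements are scaled by the interocular distance so the transfer
    is roughly invariant to face size and camera distance.
    """
    interocular = np.linalg.norm(neutral[right_eye_idx] - neutral[left_eye_idx])
    scale = 1.0 / max(interocular, 1e-6)
    offsets = {}
    for idx, joint in LANDMARK_TO_JOINT.items():
        dx, dy = (current[idx] - neutral[idx]) * scale
        # The rig joint is moved by the relative 2D displacement;
        # depth is left to the rig's own constraints in this sketch.
        offsets[joint] = (float(dx), float(dy))
    return offsets

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    neutral = rng.uniform(0, 640, size=(68, 2))          # placeholder neutral landmarks
    current = neutral + rng.normal(0, 2.0, size=(68, 2))  # slightly deformed frame
    print(relative_joint_offsets(neutral, current))

Normalizing by the interocular distance is one common way to keep the transfer stable across face sizes and camera distances; the paper's actual scaling and joint mapping may differ.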

Original language: English (US)
Pages (from-to): 12519-12535
Number of pages: 17
Journal: Multimedia Tools and Applications
Volume: 78
Issue number: 9
DOIs
State: Published - May 1, 2019

Keywords

  • Expression transfer
  • Facial animation
  • Facial tracking
  • Performance-driven animation

ASJC Scopus subject areas

  • Software
  • Media Technology
  • Hardware and Architecture
  • Computer Networks and Communications
