Generative Adversarial Talking Head: Bringing Portraits to Life with a Weakly Supervised Neural Network

Hai X. Pham, Yuting Wang & Vladimir Pavlovic.

Abstract

This paper presents Generative Adversarial Talking Head (GATH), a novel deep generative neural network that enables fully automatic facial expression synthesis of an arbitrary portrait driven by continuous action unit (AU) coefficients. Specifically, our model directly manipulates image pixels to make the unseen subject in a still photo express various emotions controlled by the values of facial AU coefficients, while maintaining her personal characteristics, such as facial geometry, skin color and hair style, as well as the original surrounding background. In contrast to prior work, GATH is purely data-driven and requires neither a statistical face model nor image processing tricks to enact facial deformations. Additionally, our model is trained on unpaired data, where the input image, with its auxiliary identity label taken from an abundance of still photos in the wild, and the target frame come from different persons. To learn such a model effectively, we propose a novel weakly supervised adversarial learning framework that consists of a generator, a discriminator, a classifier and an action unit estimator. Our work gives rise to template-and-target-free expression editing, where still faces can be effortlessly animated with arbitrary AU coefficients provided by the user.

The full paper is available at https://arxiv.org/abs/1803.07716.
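The training framework described in the abstract combines three signals for the generator: an adversarial term from the discriminator, an identity-preservation term from the classifier, and an AU regression term from the action unit estimator. The minimal NumPy sketch below illustrates how such a composite generator objective could be assembled; the functional forms, the loss weights, and the AU vector length are illustrative assumptions, not the exact losses used in the paper.

```python
import numpy as np

def bce(pred, target, eps=1e-8):
    # Binary cross-entropy; used here for the adversarial (real/fake) term.
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def au_regression_loss(estimated_aus, target_aus):
    # L2 loss between AU coefficients estimated from the generated frame
    # and the user-supplied target AU coefficients.
    return float(np.mean((estimated_aus - target_aus) ** 2))

def identity_loss(logits, label):
    # Cross-entropy on the auxiliary identity label, encouraging the
    # generator to preserve the subject's personal characteristics.
    z = logits - logits.max()                      # stabilized softmax
    log_probs = z - np.log(np.exp(z).sum())
    return float(-log_probs[label])

def generator_objective(d_fake, est_aus, tgt_aus, id_logits, id_label,
                        w_au=10.0, w_id=1.0):
    # Weighted sum of the three terms; the weights are hypothetical.
    adv = bce(d_fake, np.ones_like(d_fake))        # try to fool the discriminator
    au = au_regression_loss(est_aus, tgt_aus)      # match the requested expression
    idt = identity_loss(id_logits, id_label)       # keep the subject's identity
    return adv + w_au * au + w_id * idt
```

With a perfect AU match the AU term vanishes, so the objective reduces to the adversarial and identity terms; in practice all three components would be differentiated with respect to the generator's parameters inside a deep learning framework.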

Reference

2018

  • H. X. Pham, Y. Wang, and V. Pavlovic, “Generative Adversarial Talking Head: Bringing Portraits to Life with a Weakly Supervised Neural Network,” CoRR, vol. abs/1803.07716, 2018.
    [BibTeX]
    @article{hai18gath_arxiv,
      Author = {Hai Xuan Pham and Yuting Wang and Vladimir Pavlovic},
      Journal = {CoRR},
      Title = {Generative Adversarial Talking Head: Bringing Portraits to Life with a Weakly Supervised Neural Network},
      Volume = {abs/1803.07716},
      Year = {2018}
    }
