Name: Justus Thies
Position: Postdoctoral Researcher
Phone: +49-89-289-18456
Room No: 02.13.042


Justus Thies pursued his Ph.D. at the University of Erlangen-Nuremberg and recently joined the Visual Computing Lab at the Technical University of Munich (TUM). His research focuses on marker-less motion capture of facial performances. Besides real-time facial tracking approaches, he is also interested in their applications; his work therefore combines methods from both computer vision and computer graphics.



FaceForge: Markerless Non-Rigid Face Multi-Projection Mapping
Christian Siegl, Vanessa Lange, Marc Stamminger, Frank Bauer, Justus Thies
ISMAR 2017
In this paper, we introduce FaceForge, a multi-projection mapping system that is able to alter the appearance of a non-rigidly moving human face in real time.
[bibtex][project page]


Face2Face: Real-time Face Capture and Reenactment of RGB Videos
Justus Thies, Michael Zollhöfer, Marc Stamminger, Christian Theobalt, Matthias Nießner
CVPR 2016 (Oral)
We present a novel approach for real-time facial reenactment of a monocular target video sequence (e.g., a YouTube video). The source sequence is also a monocular video stream, captured live with a commodity webcam. Our goal is to animate the facial expressions of the target video with a source actor and re-render the manipulated output video in a photo-realistic fashion.
[video][bibtex][supplemental][project page]


Real-time Expression Transfer for Facial Reenactment
Justus Thies, Michael Zollhöfer, Matthias Nießner, Levi Valgaerts, Marc Stamminger, Christian Theobalt
ACM Transactions on Graphics 2015 (TOG)
We present a method for the real-time transfer of facial expressions from an actor in a source video to an actor in a target video, enabling ad hoc control of the target actor's facial expressions.
[video][bibtex][project page]