Learning from Synthetic Humans

Gül Varol 1, 2 Javier Romero 3 Xavier Martin 2 Naureen Mahmood 3 Michael Black 3 Ivan Laptev 1 Cordelia Schmid 2
1 WILLOW - Models of visual object recognition and scene understanding
DI-ENS - Département d'informatique de l'École normale supérieure, Inria de Paris
2 Thoth - Learning models from massive data
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
Abstract: Estimating human pose, shape, and motion from images and videos is a fundamental challenge with many applications. Recent advances in 2D human pose estimation use large amounts of manually-labeled training data for learning convolutional neural networks (CNNs). Such data is time-consuming to acquire and difficult to extend. Moreover, manual labeling of 3D pose, depth, and motion is impractical. In this work we present SURREAL (Synthetic hUmans foR REAL tasks): a new large-scale dataset with synthetically-generated but realistic images of people rendered from 3D sequences of human motion capture data. We generate more than 6 million frames together with ground-truth pose, depth maps, and segmentation masks. We show that CNNs trained on our synthetic dataset allow for accurate human depth estimation and human part segmentation in real RGB images. Our results and the new dataset open up new possibilities for advancing person analysis using cheap and large-scale synthetic data.
Document type: Conference paper
2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), Jul 2017, Honolulu, United States. 2017

Contributor: Gül Varol
Submitted on: Tuesday, April 11, 2017 - 16:43:54
Last modified on: Thursday, June 15, 2017 - 09:09:17




  • HAL Id : hal-01505711, version 1
  • ARXIV : 1701.01370



Gül Varol, Javier Romero, Xavier Martin, Naureen Mahmood, Michael Black, et al. Learning from Synthetic Humans. 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), Jul 2017, Honolulu, United States. ⟨hal-01505711⟩


