Joint learning of object and action detectors

Vicky Kalogeiton 1, 2 Philippe Weinzaepfel 3 Vittorio Ferrari 2 Cordelia Schmid 1
1 Thoth - Apprentissage de modèles à partir de données massives
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
2 CALVIN research group [Edinburgh]
IPAB - Institute of Perception, Action and Behaviour
Abstract: While most existing approaches to detection in videos focus on objects or human actions separately, we aim at jointly detecting objects performing actions, such as a cat eating or a dog jumping. We introduce an end-to-end multitask objective that jointly learns object-action relationships. We compare it with different training objectives, validate its effectiveness for detecting object-action pairs in videos, and show that both object detection and action detection benefit from this joint learning. Moreover, the proposed architecture can be used for zero-shot learning of actions: our multitask objective leverages the commonalities of an action performed by different objects, e.g. dog and cat jumping, enabling the detection of actions of an object without training on these object-action pairs. In experiments on the A2D dataset [50], we obtain state-of-the-art results on segmentation of object-action pairs. We finally apply our multitask architecture to detect visual relationships between objects in images of the VRD dataset [24].
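The multitask objective described in the abstract can be viewed as the sum of per-task classification losses over the same candidate regions, so that object and action predictions share features and training signal. A minimal NumPy sketch under that assumption follows; the two-head layout and all function and parameter names here are illustrative, not taken from the paper:

```python
import numpy as np

def softmax(scores):
    # Numerically stable softmax over the last axis.
    scores = scores - scores.max(axis=-1, keepdims=True)
    exp = np.exp(scores)
    return exp / exp.sum(axis=-1, keepdims=True)

def cross_entropy(scores, labels):
    # Mean negative log-likelihood of the ground-truth class per region.
    probs = softmax(scores)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def joint_loss(obj_scores, act_scores, obj_labels, act_labels, alpha=1.0):
    # Multitask objective: object-classification loss plus (weighted)
    # action-classification loss, computed over the same regions.
    return cross_entropy(obj_scores, obj_labels) \
        + alpha * cross_entropy(act_scores, act_labels)
```

Because the action head is trained across all object classes, it can pick up commonalities of an action (e.g. jumping) shared by different objects, which is the intuition behind the zero-shot setting mentioned above.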
Document type: Conference paper
ICCV 2017 - IEEE International Conference on Computer Vision, Oct 2017, Venice, Italy

Cited literature: 52 references
Contributor: Thoth Team
Submitted on: Monday, August 21, 2017 - 16:56:10
Last modified on: Wednesday, April 11, 2018 - 01:57:45


Files produced by the author(s)


  • HAL Id : hal-01575804, version 1


Vicky Kalogeiton, Philippe Weinzaepfel, Vittorio Ferrari, Cordelia Schmid. Joint learning of object and action detectors. ICCV 2017 - IEEE International Conference on Computer Vision, Oct 2017, Venice, Italy. 〈hal-01575804〉


