Extraction of activity patterns on large video recordings

Abstract: Extracting the hidden and useful knowledge embedded within video sequences, and thereby discovering relations between the various elements to support efficient decision-making, is a challenging task. This kind of knowledge discovery and information analysis is possible because of recent advances in object detection and tracking. The authors present how video information is processed with the ultimate aim of achieving knowledge discovery of people's activity, and also of extracting the relationships between people and contextual objects in the scene. First, the objects of interest and their semantic characteristics are derived in real time. The semantic information related to the objects is represented in a format suitable for knowledge discovery. Next, two clustering processes are applied to derive knowledge from the video data: agglomerative hierarchical clustering is used to find the main trajectory patterns of people, and relational analysis clustering is employed to extract the relationships between people, contextual objects and events. Finally, the authors evaluate the proposed activity extraction model using real video sequences from underground metro networks (CARETAKER) and a building hall (CAVIAR).
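To illustrate the trajectory-clustering step, the following is a minimal sketch of agglomerative hierarchical clustering over trajectory summaries. It is not the authors' implementation: the feature choice (entry and exit points of each trajectory) and the single-linkage merge criterion are illustrative assumptions, and real trajectory features would typically be richer (duration, speed, intermediate waypoints).

```python
from itertools import combinations
import math

def trajectory_features(traj):
    # Illustrative assumption: summarise a trajectory (list of (x, y)
    # points) by its entry and exit coordinates only.
    (x0, y0), (x1, y1) = traj[0], traj[-1]
    return (x0, y0, x1, y1)

def dist(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def agglomerative(features, n_clusters):
    # Single-linkage agglomerative clustering: start with one cluster
    # per trajectory, then repeatedly merge the two closest clusters
    # until only n_clusters remain.
    clusters = [[i] for i in range(len(features))]
    while len(clusters) > n_clusters:
        best = None
        for a, b in combinations(range(len(clusters)), 2):
            d = min(dist(features[i], features[j])
                    for i in clusters[a] for j in clusters[b])
            if best is None or d < best[0]:
                best = (d, a, b)
        _, a, b = best
        clusters[a].extend(clusters[b])
        del clusters[b]
    return clusters

# Toy data: two trajectories crossing left-to-right, one right-to-left.
trajs = [
    [(0, 0), (5, 0), (10, 0)],
    [(0, 1), (5, 1), (10, 1)],
    [(10, 5), (5, 5), (0, 5)],
]
feats = [trajectory_features(t) for t in trajs]
print(agglomerative(feats, 2))  # → [[0, 1], [2]]
```

The two left-to-right trajectories are grouped into one pattern and the opposite-direction trajectory forms its own cluster, which is the kind of "main trajectory pattern" the abstract refers to.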
Document type:
Journal article
IET Computer Vision, IET, 2008, Special Issue on Visual Information Engineering, 2 (2), pp. 108-128

Cited literature: [42 references]

Contributor: Jose Luis Patino Vilchis <>
Submitted on: Friday, 16 July 2010 - 08:43:24
Last modified on: Tuesday, 24 July 2018 - 15:48:06
Long-term archiving on: Friday, 22 October 2010 - 12:13:18


Files produced by the author(s)


  • HAL Id: inria-00502826, version 1



Jose Luis Patino Vilchis, Hamid Benhadda, Etienne Corvee, François Bremond, Monique Thonnat. Extraction of activity patterns on large video recordings. IET Computer Vision, IET, 2008, Special Issue on Visual Information Engineering, 2 (2), pp. 108-128. 〈inria-00502826〉


