Human Joint Angle Estimation and Gesture Recognition for Assistive Robotic Vision

Abstract: We explore new directions for automatic human gesture recognition and human joint angle estimation as applied to human-robot interaction, in the context of a challenging real-life assistive living task for elderly subjects. Our contributions include state-of-the-art approaches for both low- and mid-level vision, as well as for higher-level action and gesture recognition. The first direction investigates a deep learning framework for the challenging task of human joint angle estimation on noisy real-world RGB-D images. The second direction employs dense trajectory features for online video processing, enabling automatic gesture recognition with real-time performance. Our approaches are evaluated both qualitatively and quantitatively on a newly acquired dataset constructed around a challenging real-life assistive living scenario with elderly subjects.
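The two directions named in the abstract admit compact illustrations. Below is a minimal, hypothetical sketch (in PyTorch) of regressing human joint angles from a 4-channel RGB-D input with a small convolutional network; the architecture, input size, and number of angles are illustrative assumptions, not the authors' actual model.

```python
# Hypothetical sketch: CNN regression of joint angles from RGB-D input.
# Architecture and hyperparameters are illustrative assumptions only.
import torch
import torch.nn as nn

class JointAngleRegressor(nn.Module):
    """Regress a vector of human joint angles from a 4-channel RGB-D crop."""
    def __init__(self, num_angles: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, num_angles)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 4, H, W) RGB-D input -> (batch, num_angles) angle estimates.
        return self.head(self.features(x).flatten(1))

if __name__ == "__main__":
    model = JointAngleRegressor(num_angles=10)
    rgbd = torch.randn(2, 4, 128, 128)       # dummy RGB-D crops
    target = torch.rand(2, 10) * 3.14        # dummy ground-truth angles (radians)
    loss = nn.functional.mse_loss(model(rgbd), target)
    loss.backward()
    print(float(loss))
```

Similarly, a hedged sketch of the gesture recognition direction: classifying videos from precomputed dense trajectory descriptors via a bag-of-features encoding and a linear SVM (using scikit-learn). The descriptor dimensionality, codebook size, and data here are placeholders; the paper's actual online, real-time pipeline differs.

```python
# Hypothetical sketch: bag-of-features over dense trajectory descriptors + linear SVM.
# Descriptor size, codebook size, and data are placeholders for illustration.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from sklearn.svm import LinearSVC

def encode(descriptors: np.ndarray, codebook: MiniBatchKMeans) -> np.ndarray:
    """Quantize per-trajectory descriptors into a normalized histogram of visual words."""
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / max(hist.sum(), 1.0)

# Dummy data: each video yields a (num_trajectories, descriptor_dim) matrix.
rng = np.random.default_rng(0)
videos = [rng.normal(size=(200, 96)) for _ in range(20)]
labels = rng.integers(0, 4, size=20)          # four gesture classes

codebook = MiniBatchKMeans(n_clusters=64, random_state=0).fit(np.vstack(videos))
X = np.stack([encode(v, codebook) for v in videos])
clf = LinearSVC().fit(X, labels)
print(clf.predict(X[:3]))
```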
Document type: Conference papers


https://hal.inria.fr/hal-01410854
Contributor: Siddhartha Chandra
Submitted on: Thursday, December 8, 2016 - 1:10:31 PM
Last modification on: Thursday, February 7, 2019 - 4:19:35 PM
Long-term archiving on: Tuesday, March 21, 2017 - 3:44:53 PM

File: PaperACVR201654.pdf (files produced by the author(s))

Citation

Alp Guler, Nikolaos Kardaris, Siddhartha Chandra, Vassilis Pitsikalis, Christian Werner, et al. Human Joint Angle Estimation and Gesture Recognition for Assistive Robotic Vision. ACVR, ECCV, Oct 2016, Amsterdam, Netherlands. pp. 415-431, ⟨10.1007/978-3-319-48881-3_29⟩. ⟨hal-01410854⟩
