Journal article: IEEE Transactions on Image Processing, 2017

Depth-Aware Salient Object Detection and Segmentation via Multiscale Discriminative Saliency Fusion and Bootstrap Learning


Abstract

This paper proposes a novel depth-aware salient object detection and segmentation framework via multiscale discriminative saliency fusion (MDSF) and bootstrap learning for RGBD images (RGB color images with corresponding depth maps) and stereoscopic images. By exploiting low-level feature contrasts, mid-level feature weighting factors, and high-level location priors, various saliency measures on four classes of features are calculated based on multiscale region segmentation. A random forest regressor is learned to perform discriminative saliency fusion (DSF) and generate a DSF saliency map at each scale, and the DSF saliency maps across multiple scales are combined to produce the MDSF saliency map. Furthermore, we propose an effective bootstrap learning-based salient object segmentation method, which is bootstrapped with samples selected according to the MDSF saliency map and learns multiple kernel support vector machines. Experimental results on two large datasets show how the various categories of features contribute to saliency detection performance and demonstrate that the proposed framework achieves better performance on both saliency detection and salient object segmentation.
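The fusion stage described above (a random forest regressor mapping per-region feature vectors to saliency, with per-scale predictions averaged into the final MDSF map) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the use of scikit-learn's RandomForestRegressor, and the assumption that each scale's predictions have already been projected onto a common set of pixels or regions are all choices made here for clarity.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def train_dsf_regressor(region_features, region_saliency, n_trees=100, seed=0):
    """Fit a random forest that maps per-region feature vectors to
    ground-truth saliency values (the discriminative saliency fusion step).

    region_features: array of shape (n_regions, n_features)
    region_saliency: array of shape (n_regions,) with values in [0, 1]
    """
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
    rf.fit(region_features, region_saliency)
    return rf


def mdsf_saliency(per_scale_features, rf):
    """Predict a DSF saliency value at each scale, then average across
    scales to form the multiscale (MDSF) saliency estimate.

    per_scale_features: list of arrays, one per segmentation scale,
    each of shape (n_units, n_features) over the same n_units
    (assumed already aligned to a common grid for this sketch).
    """
    dsf_maps = [rf.predict(feats) for feats in per_scale_features]
    return np.mean(dsf_maps, axis=0)
```

In the paper the per-scale DSF maps are pixel-level saliency maps produced from region predictions; here the projection from regions to pixels is assumed to have happened beforehand so the cross-scale average is a simple element-wise mean.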
Attached file: depthaware.jpg (269.11 KB). Format: figure/image. Origin: files produced by the author(s).

Dates and versions

hal-01650409, version 1 (16-04-2018)

Cite

Hangke Song, Zhi Liu, Huan Du, Guangling Sun, Olivier Le Meur, et al.. Depth-Aware Salient Object Detection and Segmentation via Multiscale Discriminative Saliency Fusion and Bootstrap Learning. IEEE Transactions on Image Processing, 2017, 26 (9), pp.4204 - 4216. ⟨10.1109/TIP.2017.2711277⟩. ⟨hal-01650409⟩