Exploiting multiple contexts for saliency detection
Mengnan Du, Xingming Wu, Weihai Chen, Jianhua Wang
Abstract
A salient object detection method is proposed that extensively models contextual information in both the saliency feature extraction and the saliency optimization procedures. First, a sequence of multicontext features is extracted for each segmented image region. This multicontext encoding effectively represents the characteristics of image regions and is mapped to an initial saliency estimate by a nonlinear regressor. Second, contextual information is also used to optimize the initial saliency map by constructing a region-level conditional random field (CRF), so that the quality of the initially coarse saliency map is improved in a principled manner. Third, multiple CRFs, defined over different segmentation scales, are computed and integrated so that different ranges of contextual information contribute to the saliency optimization. The result is a consistent saliency map with uniformly highlighted salient regions and clear boundaries. The proposed method is extensively evaluated on three public benchmark datasets, and the experimental results show that it achieves promising performance compared with state-of-the-art salient object detection approaches.
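The abstract only outlines the optimization stage, so the following is a minimal, hypothetical Python sketch of that step. It assumes region-level initial saliency scores (e.g., produced by the nonlinear regressor) and symmetric appearance-similarity weights between regions, uses a quadratic smoothness energy as a simplified stand-in for the paper's region-level CRF inference, and fuses two segmentation scales by averaging. The function name, parameters, and all numeric values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def optimize_saliency(initial, weights, lam=0.5):
    """Refine region-level saliency scores by minimizing a quadratic energy:
    a unary term tying each region to its initial score plus a pairwise
    smoothness term over the weighted region adjacency graph.
    (A quadratic stand-in for the paper's CRF-based optimization.)"""
    n = initial.size
    degree = np.diag(weights.sum(axis=1))
    laplacian = degree - weights          # graph Laplacian of the similarity weights
    system = np.eye(n) + lam * laplacian  # unary + smoothness terms
    return np.linalg.solve(system, initial)

# Toy example: 4 regions at two segmentation scales (all values hypothetical).
rng = np.random.default_rng(0)
per_scale = []
for _ in range(2):
    initial = rng.random(4)                      # initial scores from the regressor
    w = rng.random((4, 4)); w = (w + w.T) / 2    # symmetric appearance-similarity weights
    np.fill_diagonal(w, 0)
    per_scale.append(optimize_saliency(initial, w))

# Fuse the per-scale refined maps (here a simple average over scales).
fused = np.mean(per_scale, axis=0)
print(fused)
```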
© 2016 SPIE and IS&T 1017-9909/2016/$25.00
Mengnan Du, Xingming Wu, Weihai Chen, and Jianhua Wang "Exploiting multiple contexts for saliency detection," Journal of Electronic Imaging 25(6), 063005 (23 November 2016). https://doi.org/10.1117/1.JEI.25.6.063005
Published: 23 November 2016
CITATIONS
Cited by 4 scholarly publications.
KEYWORDS: Image segmentation, Feature extraction, Principal component analysis, Visualization, Eye, Optimization (mathematics), Binary data
