
Midlevel cues mean shift visual tracking algorithm based on target-background saliency confidence map


Abstract

Owing to the well-known merits of non-parametric estimation and fast mode seeking, the mean shift tracking algorithm has been widely adopted with demonstrated success. However, the traditional mean shift tracker uses a center-weighted color histogram as the reference model, which is susceptible to interference from background pixels and can compromise tracking robustness. To address this problem, we propose a midlevel-cues mean shift visual tracking algorithm based on a saliency-weighted target-background confidence map. A discriminative appearance model built on superpixels is introduced, allowing the tracker to distinguish the target from the background through different weights. The improved tracker computes a target-background saliency confidence map and then performs mean shift iterations to obtain the target position in the next frame. Experimental results demonstrate that the improved mean shift tracker can handle occlusion and recover from tracking drift; furthermore, it facilitates foreground object segmentation during tracking.
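A minimal sketch of one confidence-weighted mean shift step is given below. This is not the authors' implementation: it assumes the per-pixel target-background confidence map has already been computed (for example from a superpixel appearance model), and the kernel choice, function names, and parameters are illustrative only.

```python
import numpy as np

def mean_shift_step(confidence_map, center, bandwidth):
    """One mean shift iteration where each pixel is weighted by its
    target-background confidence value (illustrative sketch only).

    confidence_map : 2-D array, per-pixel foreground likelihood in [0, 1]
    center         : (row, col) current target-center estimate
    bandwidth      : radius of the Epanechnikov-like search window
    """
    h, w = confidence_map.shape
    rows, cols = np.mgrid[0:h, 0:w]
    # Squared distance of every pixel to the current center, scaled by the bandwidth
    d2 = ((rows - center[0]) ** 2 + (cols - center[1]) ** 2) / bandwidth ** 2
    # Epanechnikov-style kernel: positive inside the window, zero outside
    kernel = np.clip(1.0 - d2, 0.0, None)
    weights = kernel * confidence_map          # saliency-confidence weighting
    total = weights.sum()
    if total < 1e-12:                          # degenerate case: no support in the window
        return center
    new_row = (weights * rows).sum() / total
    new_col = (weights * cols).sum() / total
    return (new_row, new_col)

def track_frame(confidence_map, center, bandwidth=20, max_iter=10, tol=0.5):
    """Repeat mean shift steps until the center estimate converges for one frame."""
    for _ in range(max_iter):
        new_center = mean_shift_step(confidence_map, center, bandwidth)
        if np.hypot(new_center[0] - center[0], new_center[1] - center[1]) < tol:
            break
        center = new_center
    return center
```

In this sketch the confidence map plays the role that histogram back-projection weights play in the traditional tracker, so low-confidence background pixels contribute little to the shift vector.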




Author information

Corresponding author

Correspondence to Zhengping Hu.


About this article


Cite this article

Hu, Z., Xie, R., Wang, M. et al. Midlevel cues mean shift visual tracking algorithm based on target-background saliency confidence map. Multimed Tools Appl 76, 21265–21280 (2017). https://doi.org/10.1007/s11042-016-4068-9

