Visual tracking is a challenging problem in computer vision: a tracker's performance can degrade under many adverse conditions in a scene, such as illumination change, deformation, and background clutter, and no single algorithm handles all of them. Recently, it has been shown that correlation filters can be implemented efficiently and, when combined with suitable features and kernel functions, yield very promising tracking results. In this paper, we propose to learn discriminative correlation filters that incorporate information from the variances of the target's appearance features. We evaluate our filters against several recent tracking methods on the OTB benchmark dataset. The results show that the additional feature variances improve the robustness of the correlation filters in complex scenes.
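For context, the efficient Fourier-domain correlation-filter tracking the abstract builds on can be sketched roughly as below. This is a minimal MOSSE-style baseline, not the paper's variance-augmented method: the function names, the regularisation constant `lam`, and the single-channel grayscale setting are all assumptions for illustration, and the proposed feature-variance term is not shown.

```python
import numpy as np

def train_filter(patches, target_response, lam=1e-2):
    """Learn a MOSSE-style correlation filter in the Fourier domain.

    patches: list of 2-D grayscale training patches (all the same shape)
    target_response: desired 2-D response map (e.g. a Gaussian peaked
        at the target centre), same shape as the patches
    lam: regularisation constant (hypothetical default)
    """
    G = np.fft.fft2(target_response)
    A = np.zeros_like(G)
    B = np.zeros_like(G)
    for p in patches:
        F = np.fft.fft2(p)
        A += G * np.conj(F)          # correlation with desired output
        B += F * np.conj(F)          # energy spectrum of the patch
    return A / (B + lam)             # filter, still in the Fourier domain

def detect(H, patch):
    """Apply the learned filter to a search patch; the peak of the
    response map gives the estimated target location (row, col)."""
    response = np.real(np.fft.ifft2(H * np.fft.fft2(patch)))
    return np.unravel_index(np.argmax(response), response.shape)
```

All correlations are element-wise products in the Fourier domain, which is what makes this family of trackers fast enough for real-time use.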
| Name | DICTA 2017 - 2017 International Conference on Digital Image Computing: Techniques and Applications |
| Conference | 2017 International Conference on Digital Image Computing: Techniques and Applications |
| Period | 29/11/17 → 1/12/17 |