Motion-guided Visual Tracking
Graphical Abstract
Abstract
Motion information is a crucial cue for building a robust tracker, especially for handling object occlusion and the fast drift caused by camera and object motion. However, it has not been fully exploited. In this study, we exploit motion cues to guide visual trackers without bells and whistles. First, we decouple motion into two types: camera motion and object motion. We then predict each individually via the proposed camera motion modeling and object trajectory prediction modules. Each module contains a motion detector and a verifier. For camera motion, we apply an off-the-shelf keypoint matching method to detect camera movement and propose a novel self-supervised camera motion verifier to validate its confidence. Given the previous object trajectory, object trajectory prediction estimates the future location of the target and selects a reliable trajectory to handle fast object motion and occlusion. Extensive experiments on several mainstream tracking datasets, including OTB100, DTB70, TC128, UAV123, VOT2018 and GOT10k, demonstrate the effectiveness and generalization ability of our module while running at real-time speed.
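To make the decoupled design concrete, the following is a minimal illustrative sketch, not the paper's actual implementation: camera motion is approximated with OpenCV ORB keypoint matching followed by a RANSAC homography, and object trajectory prediction with a constant-velocity Kalman filter over the target centre. The verifier components are omitted, and all names below are assumptions made for illustration.

    # Illustrative sketch only: off-the-shelf keypoint matching for camera motion
    # and a constant-velocity Kalman filter as a stand-in trajectory predictor.
    import cv2
    import numpy as np

    def estimate_camera_motion(prev_gray, curr_gray):
        """Estimate a global homography between consecutive frames via ORB matching."""
        orb = cv2.ORB_create(500)
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        if des1 is None or des2 is None:
            return None
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        if len(matches) < 4:
            return None
        src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H  # maps previous-frame coordinates into the current frame

    class TrajectoryPredictor:
        """Constant-velocity Kalman filter over the target centre (x, y)."""
        def __init__(self):
            self.kf = cv2.KalmanFilter(4, 2)  # state: [x, y, vx, vy]
            self.kf.transitionMatrix = np.array(
                [[1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0], [0, 0, 0, 1]], np.float32)
            self.kf.measurementMatrix = np.array(
                [[1, 0, 0, 0], [0, 1, 0, 0]], np.float32)
            self.kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
            self.kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
            self.kf.errorCovPost = np.eye(4, dtype=np.float32)

        def update(self, x, y):
            # Feed the latest observed target centre into the filter.
            self.kf.correct(np.array([[x], [y]], np.float32))

        def predict(self):
            # Predict the future centre; useful when the target is occluded.
            state = self.kf.predict()
            return float(state[0, 0]), float(state[1, 0])

In practice, the predicted camera homography and target trajectory would each be passed through the corresponding verifier before being used to re-locate the tracker's search region; that verification step is specific to the paper and is not sketched here.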