Monday, August 30, 2010

Abstract: Video sequences of real-world scenes are often difficult to track with machine vision. Scenes frequently contain visual clutter, repetitive textures and occlusions that complicate online visual feature tracking. If the camera is allowed to shake or moving objects are present, the exponential search-space of potential feature matches rapidly becomes intractable for real-time applications. In this paper we introduce "Jointly Compatible Pair Linking" (JCPL), an algorithm that efficiently and deterministically identifies the most globally consensual set of feature-measurement matches in tracking problems with probabilistic priors. We demonstrate JCPL as part of a two-stage visual tracking algorithm, showing that it correctly resolves significant matching ambiguities in sequences with highly dynamic camera motion while robustly ignoring moving scene objects. In these experiments JCPL and the two-stage tracker evaluate only a fixed number of tests within an exponential search-space. In one experiment JCPL tested less than 1/200th of the total search space and executed 4.6 times faster than the current gold-standard algorithm, "Joint Compatibility Branch and Bound" (JCBB). On highly ambiguous sequences we show JCPL tracking successfully while standard JCBB chooses incorrect matches and fails. Throughout our experiments the number of costly image matching operations is minimised; in a typical sequence only 20.4% of the full image matching operations are required.
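To make the idea of "jointly compatible" matches concrete, the test underlying both JCBB and algorithms like JCPL is usually a chi-squared gate on the stacked Mahalanobis distance of a candidate set of feature-measurement pairings. The sketch below is a minimal, hedged illustration of that gate only; the function name, toy covariance and residual values are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch of a joint compatibility test: a hypothesis (a set of
# feature-measurement pairings) is accepted if its stacked innovation vector
# passes a chi-squared gate. Names and numbers here are illustrative only.
import numpy as np
from scipy.stats import chi2

def jointly_compatible(innovations, S, alpha=0.95):
    """Return True if the stacked residuals pass the chi-squared gate.

    innovations : (d,) stacked measurement residuals z - h(x)
    S           : (d, d) joint innovation covariance (e.g. H P H^T + R)
    alpha       : gate probability for the chi-squared quantile
    """
    nu = np.asarray(innovations, dtype=float)
    D2 = nu @ np.linalg.solve(S, nu)          # squared Mahalanobis distance
    return bool(D2 <= chi2.ppf(alpha, df=nu.size))

# Two candidate pairings, each contributing a 2-D image residual (pixels):
nu = np.array([0.5, -0.3, 0.2, 0.4])
S = np.diag([1.0, 1.0, 1.0, 1.0])             # toy joint covariance
print(jointly_compatible(nu, S))              # small residuals -> True
```

Testing pairings jointly rather than one at a time is what lets correlated priors (e.g. a shared camera-motion estimate) reject sets of matches that are each individually plausible but mutually inconsistent.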