Distortion Invariant Joint-Feature for Visual Tracking in Catadioptric Omnidirectional Vision

Abstract

Central catadioptric omnidirectional images exhibit severe nonlinear distortions caused by the quadric mirrors involved. Conventional visual features, developed under the perspective projection model, struggle to achieve satisfactory performance when applied directly to distorted omnidirectional images. This paper presents a parameterized neighborhood model that efficiently computes the adaptive neighborhood of an object from the measurable radial distance in the image plane. On the basis of this parameterized neighborhood model, a distortion-invariant joint-feature framework, implemented with a contour-color fragment Gaussian mixture model, is proposed for visual tracking in a catadioptric omnidirectional camera system. Under the Gaussian mixture model framework, the feature-matching problem is converted into a feature-clustering problem. A weight-contribution mechanism flexibly weights the fragments according to their responses, so that the tracker remains robustly guided by the limited visible fragments even under severe partial occlusion. Experiments validate the performance of the proposed algorithm.
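The fragment-weighting idea described above can be illustrated with a minimal sketch. The abstract does not give the paper's actual formulas, so the function names, the threshold value, and the response-to-weight mapping below are hypothetical assumptions: fragments whose matching response drops below a threshold (e.g. occluded ones) receive zero weight, and the remaining visible fragments are renormalized to guide the estimate.

```python
import numpy as np

def fragment_weights(responses, occlusion_thresh=0.2):
    """Hypothetical weight-contribution rule: suppress fragments whose
    matching response falls below a threshold (treated as occluded),
    then renormalize so visible fragments share the total weight."""
    r = np.asarray(responses, dtype=float)
    r = np.where(r < occlusion_thresh, 0.0, r)  # zero out occluded fragments
    total = r.sum()
    if total == 0.0:
        # all fragments occluded: fall back to uniform weights
        return np.full_like(r, 1.0 / len(r))
    return r / total

def weighted_estimate(fragment_positions, weights):
    """Fuse per-fragment position estimates into one object position."""
    return np.average(np.asarray(fragment_positions, dtype=float),
                      axis=0, weights=weights)

# Example: four fragments, the third heavily occluded (low response).
responses = [0.9, 0.8, 0.05, 0.7]
w = fragment_weights(responses)
positions = [[10.0, 5.0], [11.0, 5.5], [30.0, 40.0], [10.5, 4.8]]
estimate = weighted_estimate(positions, w)  # outlier fragment is ignored
```

In this toy case the occluded fragment's wildly wrong position estimate contributes nothing, so the fused estimate stays near the consensus of the three visible fragments, mirroring the robustness claim in the abstract.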
