Parameterized Distortion-Invariant Feature for Robust Tracking in Omnidirectional Vision

Abstract

Central catadioptric omnidirectional images exhibit severe nonlinear distortions due to the quadric mirrors involved. Features based on the conventional pinhole model therefore struggle to achieve satisfactory performance when applied directly to distorted omnidirectional images. This paper analyzes the catadioptric geometry to facilitate modeling the nonlinear distortions of omnidirectional images. Unlike the conventional imaging model, prior information about the catadioptric system is taken into account. A parameterized neighborhood mapping model is proposed to efficiently compute the neighborhood of an object from its measurable radial distance in the image plane. On the basis of this parameterized nonlinear model, a distortion-invariant, fragment-based joint-feature Gaussian mixture model is presented for human target tracking in omnidirectional vision. Under the Gaussian mixture model framework, the feature-matching problem is converted into a feature-clustering one: the joint probability distribution of a joint-feature class is modeled by a mixture of Gaussians. A weight contribution mechanism flexibly weights each fragment's contribution according to its response, which yields robust tracking even under severe partial occlusion. Finally, experiments validate the advantages of the proposed algorithm over conventional approaches.
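The fragment weighting described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each fragment yields a feature vector, scores every fragment by its likelihood (response) under a reference Gaussian mixture with diagonal covariances, and normalizes the responses into weights so that poorly matching fragments (e.g. occluded ones) contribute little. All function names and the toy parameters are hypothetical.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Density of a Gaussian with diagonal covariance at feature vector x."""
    d = x.shape[-1]
    diff = x - mean
    norm = np.sqrt((2.0 * np.pi) ** d * np.prod(var))
    return np.exp(-0.5 * np.sum(diff ** 2 / var, axis=-1)) / norm

def mixture_response(x, means, variances, priors):
    """Likelihood of a feature vector under a mixture of Gaussians."""
    return sum(p * gaussian_pdf(x, m, v)
               for p, m, v in zip(priors, means, variances))

def fragment_weights(fragment_features, means, variances, priors):
    """Weight each fragment by its mixture response, normalized to sum to 1.

    Fragments whose features fit the model poorly (low response) receive
    small weights, so an occluded fragment barely affects the combined score.
    """
    responses = np.array([mixture_response(f, means, variances, priors)
                          for f in fragment_features])
    return responses / responses.sum()
```

In a tracker, the weighted sum of fragment similarity scores would then replace a uniform average, which is what makes the combined score degrade gracefully under partial occlusion.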

