A Robust Facial Feature Point Tracker using Graphical Models
Serhan Cosar, Müjdat Çetin, and Aytül Erçil
ISPA2007 - IEEE International Symposium on Image and Signal Processing and Analysis
In recent years, facial feature point tracking has become an active research area with applications in human-computer interaction (HCI), facial expression analysis, and related fields. In this paper, a statistical method for facial feature point tracking is proposed. Feature point tracking is challenging in scenarios involving arbitrary head movements and data made uncertain by noise and/or occlusions: as natural human actions, people move their heads and occlude their faces with their hands or fingers. Motivated by this, a graphical model is built that uses temporal information about feature point movements as well as the spatial relationships between such points, and that is updated over time to handle head pose variations. Based on this model, an algorithm is implemented that tracks feature points through a video observation sequence. In addition, an occlusion detector is proposed to automatically detect occluded points. The proposed method is applied to 2D gray-scale video sequences containing head movements and occlusions, and its superiority over existing techniques is demonstrated.
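To make the idea concrete, the following is a minimal illustrative sketch (not the paper's actual model) of tracking that fuses a temporal prediction with spatial neighbor constraints, and flags a point as occluded when its observation deviates too far from the prediction. All names, weights (`alpha`, `beta`), and the residual threshold are assumptions chosen for illustration.

```python
import numpy as np

def track_step(prev_pts, obs_pts, neighbors, rest_offsets,
               alpha=0.6, beta=0.3, occl_thresh=5.0):
    """One illustrative update step (toy sketch, not the paper's model).

    prev_pts     : (N, 2) array of point positions at the previous frame
    obs_pts      : (N, 2) array of (possibly corrupted) observations
    neighbors    : dict mapping point index -> list of neighbor indices
    rest_offsets : dict mapping (j, i) -> expected offset from point j to i
    """
    new_pts = prev_pts.copy()
    occluded = []
    for i, obs in enumerate(obs_pts):
        # Spatial prediction: where neighbors say this point should be.
        spatial = np.mean([prev_pts[j] + rest_offsets[(j, i)]
                           for j in neighbors[i]], axis=0)
        # Temporal prediction: constant-position model from the last frame.
        temporal = prev_pts[i]
        pred = alpha * temporal + (1 - alpha) * spatial
        # Occlusion test: a large prediction residual marks the
        # observation as unreliable, so the prediction is kept instead.
        if np.linalg.norm(obs - pred) > occl_thresh:
            occluded.append(i)
            new_pts[i] = pred
        else:
            new_pts[i] = (1 - beta) * obs + beta * pred
    return new_pts, occluded

# Usage: three collinear points; the middle observation is wildly off,
# simulating an occlusion, so it should be detected and predicted instead.
prev_pts = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])
obs_pts = np.array([[0.5, 0.0], [50.0, 50.0], [20.5, 0.0]])
neighbors = {0: [1], 1: [0, 2], 2: [1]}
rest_offsets = {(1, 0): np.array([-10.0, 0.0]), (0, 1): np.array([10.0, 0.0]),
                (2, 1): np.array([-10.0, 0.0]), (1, 2): np.array([10.0, 0.0])}
new_pts, occluded = track_step(prev_pts, obs_pts, neighbors, rest_offsets)
print(occluded)  # the occluded middle point is flagged: [1]
```

In the paper's actual framework, the analogous roles are played by a graphical model with temporal and spatial edges, inference over the observation sequence, and a dedicated occlusion detector; this toy version only illustrates the interplay of the three ingredients.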