Computer Vision and Pattern Analysis Laboratory (VPA)
Stereo Based 3D Head Pose Tracking




Project Team
  • Batu Akan
  • Aytul Ercil
  • Mujdat Cetin

Contact: Batu Akan
Project Description
In this project we propose a new stereo-based 3D head-tracking technique, built on scale-invariant feature transform (SIFT) features, that is robust to illumination changes. We also review two established tracking approaches, one based on the normal flow constraint (NFC) and one on 3D registration (iterative closest point, ICP), and compare them against our method. A 3D head tracker is important for many vision applications: the tracker's output parameters can be used to generate a stabilized view of the face, which can then serve as input to existing 2D techniques such as facial expression analysis, lip reading, eye tracking, and face recognition.
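As a concrete illustration of the stereo geometry involved, the sketch below back-projects matched pixel locations to 3D points using a depth image and a pinhole camera model. This is not the project's code; the function name and the intrinsics (fx, fy, cx, cy) are hypothetical placeholders:

```python
import numpy as np

def backproject(points_uv, depth, fx, fy, cx, cy):
    """Map (u, v) pixel coordinates plus per-pixel depth (in meters)
    to 3D points in the camera frame, using a pinhole camera model."""
    pts = []
    for u, v in points_uv:
        z = depth[int(v), int(u)]          # depth at the pixel
        x = (u - cx) * z / fx              # back-project along the ray
        y = (v - cy) * z / fy
        pts.append((x, y, z))
    return np.array(pts)
```

Applying this to the matched SIFT locations in both frames yields the 3D point correspondences from which rigid head motion can be estimated.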

Our system initializes automatically using a simple 2D face detector. Since the detector responds only to frontal faces, the initial head pose can be assumed to be aligned with the camera. SIFT extracts salient points from the intensity images, which are matched between consecutive frames; combining these matches with the depth image yields 3D point correspondences. From these correspondences we recover the 3D motion parameters using the unit-quaternion method.
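The unit-quaternion step can be sketched as follows. This is an illustrative implementation of Horn's closed-form absolute-orientation method (the standard unit-quaternion solution), not the project's own code, and the function name is ours:

```python
import numpy as np

def horn_quaternion(P, Q):
    """Given corresponding 3D points P, Q (N x 3) with Q ~ R @ P + t,
    recover the rotation R and translation t via Horn's quaternion method."""
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    A, B = P - p0, Q - q0
    S = A.T @ B                            # 3x3 cross-covariance matrix
    # Symmetric 4x4 matrix whose top eigenvector is the optimal quaternion.
    N = np.array([
        [S[0,0]+S[1,1]+S[2,2], S[1,2]-S[2,1],       S[2,0]-S[0,2],       S[0,1]-S[1,0]],
        [S[1,2]-S[2,1],        S[0,0]-S[1,1]-S[2,2], S[0,1]+S[1,0],       S[2,0]+S[0,2]],
        [S[2,0]-S[0,2],        S[0,1]+S[1,0],       S[1,1]-S[0,0]-S[2,2], S[1,2]+S[2,1]],
        [S[0,1]-S[1,0],        S[2,0]+S[0,2],       S[1,2]+S[2,1],       S[2,2]-S[0,0]-S[1,1]],
    ])
    w, V = np.linalg.eigh(N)
    qw, qx, qy, qz = V[:, np.argmax(w)]    # eigenvector of the largest eigenvalue
    # Rotation matrix from the unit quaternion (qw, qx, qy, qz).
    R = np.array([
        [1-2*(qy*qy+qz*qz), 2*(qx*qy-qz*qw),   2*(qx*qz+qy*qw)],
        [2*(qx*qy+qz*qw),   1-2*(qx*qx+qz*qz), 2*(qy*qz-qx*qw)],
        [2*(qx*qz-qy*qw),   2*(qy*qz+qx*qw),   1-2*(qx*qx+qy*qy)],
    ])
    t = q0 - R @ p0
    return R, t
```

Because the quaternion enters the rotation matrix quadratically, the sign ambiguity of the eigenvector does not affect the recovered pose.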

