[Back to RI Seminar Schedule]


Warning:
This page is provided for historical and archival purposes only. While the seminar dates are correct, we offer no guarantee of informational accuracy or link validity. Contact information for the speakers, hosts and seminar committee are certainly out of date.


RI SEMINAR -- Alex Waibel and Jie Yang



ABSTRACT

One of our current research efforts in human-computer interaction is to enhance human-computer communication by processing and combining multiple communication modalities known to be helpful in human communicative situations. Among other goals, we seek to derive a better model of where a person is in a room, who he/she might be talking to, and what he/she is saying despite the presence of jamming speakers and other sound sources in the room (the cocktail party effect). All of these applications require locating and continuously tracking human faces and modeling their behavior in real time.

In this talk, we will present our research on user modeling via tracking human faces in real time. We address two important issues: what to track and how to track human faces. We present a stochastic model to characterize the skin colors of human faces. The information provided by the model is sufficient for tracking a human face in various poses and views. The model can be adapted in real time for different people and different lighting conditions while a person is moving. We then present a model-based approach to implementing a real-time face tracker. The system has achieved a tracking rate of up to 30 frames/second using an HP-9000 workstation with a framegrabber and a Canon VC-C1 camera. It can track a person's face while the person moves freely (e.g., walks, jumps, sits and rises) in a room. Three types of models have been employed to track human faces. In addition to the skin-color model used to register the face, a motion model is used to estimate image motion and to predict the search window, and a camera model is used to predict and to compensate for camera motion (panning, tilting, and zooming).
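As a rough illustration of the ideas above, the sketch below models skin color as a Gaussian in normalized (r, g) chromaticity space, adapts that model over time with a running average, and predicts the next search window from a simple constant-velocity motion model. This is a hedged reconstruction for exposition only: the class and function names, the exponential-average adaptation rate, and the window margin are our own assumptions, not the speakers' actual implementation.

```python
import numpy as np

def chromatic(rgb):
    # Normalized (r, g) chromaticity: dividing by R+G+B factors out
    # brightness, leaving a 2-D color descriptor.
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1, keepdims=True).clip(min=1e-6)
    return (rgb / s)[..., :2]

class SkinColorModel:
    """Gaussian skin-color model in chromaticity space (illustrative sketch)."""

    def __init__(self, samples_rgb):
        rg = chromatic(samples_rgb)
        self.mean = rg.mean(axis=0)
        # Small diagonal term keeps the covariance invertible.
        self.cov = np.cov(rg, rowvar=False) + 1e-6 * np.eye(2)
        self.inv = np.linalg.inv(self.cov)

    def likelihood(self, rgb):
        # Unnormalized Gaussian likelihood that a pixel is skin.
        d = chromatic(rgb) - self.mean
        m = np.einsum('...i,ij,...j->...', d, self.inv, d)  # squared Mahalanobis
        return np.exp(-0.5 * m)

    def adapt(self, new_samples_rgb, alpha=0.1):
        # Running update (assumed rate alpha) so the model can follow
        # a new person or changing lighting while tracking.
        rg = chromatic(new_samples_rgb)
        self.mean = (1 - alpha) * self.mean + alpha * rg.mean(axis=0)
        self.cov = (1 - alpha) * self.cov + alpha * (
            np.cov(rg, rowvar=False) + 1e-6 * np.eye(2))
        self.inv = np.linalg.inv(self.cov)

def predict_window(center, velocity, size, margin=1.5):
    # Constant-velocity motion model: shift the last face center by the
    # estimated velocity and search in a box enlarged by `margin`.
    cx, cy = center[0] + velocity[0], center[1] + velocity[1]
    w, h = size[0] * margin, size[1] * margin
    return (cx - w / 2, cy - h / 2, w, h)
```

A tracker loop would score only the pixels inside the predicted window with `likelihood`, re-estimate the face center and velocity from the high-scoring region, and feed confident skin pixels back through `adapt`.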

Several human-computer interaction tasks are explored to see how the face tracker can help make human-computer interaction easier and more natural. We will discuss applications of the face tracker to tele-conferencing, lipreading, sound source localization, and eye/gaze tracking. We will show videos during the presentation and give live demos afterwards.


Christopher Lee | chrislee@ri.cmu.edu
Last modified: Mon Jan 15 17:50:02 1996