Facial expression communicates information about emotions, regulates interpersonal behavior and person perception, indexes physiologic functioning, and is essential to evaluating preverbal infants. Current human-observer methods of facial expression analysis are labor-intensive and difficult to standardize across laboratories and over time. These factors force investigators to use less specific systems whose convergent validity is often unknown. To make more rigorous, quantitative measurement of facial expression feasible in diverse applications, our interdisciplinary research group, with expertise in facial expression and computerized image processing, is developing automated methods of facial expression analysis. In the methods under development, automatic feature extractors detect and track changes in a subject's facial features in a digitized image sequence (30 images per second). From the extracted features, a neural-network-based classifier estimates the intensities of FACS (Facial Action Coding System) action units (AUs) in each video image. A user interface will permit investigators to define facial configurations (per EMFACS, FACS Dictionary, MAX, or their own specifications) and generate time series or summary data files for statistical analysis.
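To make the processing pipeline concrete, the Python sketch below illustrates the overall data flow: per-frame feature measurements are passed to a small neural-network classifier whose outputs form a frame-by-frame time series of AU intensity estimates. This is only an illustrative sketch under assumed simplifications; the fixed landmark positions, the window-difference "tracker," the network architecture, and all names (track_features, AUIntensityClassifier, analyze_sequence) are hypothetical placeholders and do not represent the feature extractors or classifier actually under development.

import numpy as np

# Hypothetical landmark positions (row, col) in a 100x100 grayscale frame;
# the actual system locates and tracks real facial feature points.
LANDMARKS = [(30, 30), (30, 70), (55, 50), (80, 35), (80, 65)]

def track_features(prev_frame, curr_frame, landmarks=LANDMARKS, win=3):
    """Crude stand-in for a feature tracker: mean intensity change in a
    small window around each (fixed) landmark between consecutive frames."""
    deltas = []
    for r, c in landmarks:
        prev_patch = prev_frame[r - win:r + win + 1, c - win:c + win + 1]
        curr_patch = curr_frame[r - win:r + win + 1, c - win:c + win + 1]
        deltas.append(float(curr_patch.mean() - prev_patch.mean()))
    return np.array(deltas)

class AUIntensityClassifier:
    """Sketch of a small feed-forward network mapping a feature vector to
    estimated intensities (0..1) for a fixed set of FACS action units."""

    def __init__(self, n_features, n_aus, hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        # Randomly initialized weights; a working system would train these
        # on manually FACS-coded image sequences.
        self.w1 = rng.normal(scale=0.1, size=(n_features, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, n_aus))

    def predict(self, features):
        hidden = np.tanh(features @ self.w1)
        return 1.0 / (1.0 + np.exp(-(hidden @ self.w2)))

def analyze_sequence(frames, classifier):
    """Return a time series of AU intensity estimates, one row per frame
    transition in a digitized image sequence (e.g., 30 images per second)."""
    rows = []
    for prev_frame, curr_frame in zip(frames, frames[1:]):
        features = track_features(prev_frame, curr_frame)
        rows.append(classifier.predict(features))
    return np.vstack(rows)

# Example: one second of synthetic 100x100 frames at 30 images per second.
frames = [np.random.rand(100, 100) for _ in range(30)]
clf = AUIntensityClassifier(n_features=len(LANDMARKS), n_aus=6)
au_series = analyze_sequence(frames, clf)  # shape (29, 6): frame-by-frame AU estimates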
This work is funded by NIMH Grant R01MH51435.
Cohn, J.F., Zlochower, A., Lien, J., Wu, Y.T., & Kanade, T. (July, 1997). Automated face coding: A computer-vision based method of facial expression analysis. 7th European Conference on Facial Expression, Measurement, and Meaning, Salzburg, Austria.
Cohn, J.F., Kanade, T., Wu, Y.T., Lien, J., & Zlochower, A. (August, 1996). Facial expression analysis: Preliminary results of a new image-processing based method. International Society for Research in Emotion, Toronto.
Wu, Y.T. (September, 1997). Image registration using wavelet-based motion model and its applications. Ph.D. thesis, Department of Electrical Engineering, University of Pittsburgh.
Last updated: September 12th, 1997