Warning:
This page is
provided for historical and archival purposes
only. While the seminar dates are correct, we offer no
guarantee of informational accuracy or link
validity. Contact information for the speakers, hosts, and
seminar committee is certainly out of date.
Department of Computer Science Columbia University New York, NY 10027
The structure of a dynamic scene can only be recovered using a real-time range sensor. Depth from defocus offers an elegant solution to fast and dense range estimation. It is computationally efficient, as it circumvents the correspondence problem faced by stereo and the feature tracking problem involved in structure from motion. However, accurate depth estimation requires theoretical and practical solutions to a variety of problems, including recovery of textureless surfaces, blur estimation, and magnification variations caused by defocusing. Both textured and textureless surfaces are recovered using an optimized illumination pattern that is projected via the same optical path used to acquire images. A prototype focus range sensor has been developed that produces up to 512x480 depth estimates at 30 Hz with an accuracy of 0.3%.
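The core idea can be illustrated with a toy sketch: two images of the same scene are captured under different focus settings, and the per-pixel ratio of local high-frequency energy indicates relative blur, which maps monotonically to depth. This is a minimal illustration of the general principle, not the speaker's actual algorithm; the blur levels, focus measure, and synthetic scene are all assumptions for demonstration.

```python
# Toy depth-from-defocus sketch (NOT the speaker's method; all
# parameters here are illustrative assumptions).
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

rng = np.random.default_rng(0)
# A synthetic textured surface patch.
scene = gaussian_filter(rng.standard_normal((64, 64)), 1.0)

# Simulate two focus settings: the surface sits near the focal plane
# of the first image, so it appears sharp there and blurred in the second.
img_near = gaussian_filter(scene, 0.5)
img_far = gaussian_filter(scene, 2.5)

def focus_measure(img, window=5):
    """Local high-frequency energy: squared Laplacian, locally averaged."""
    energy = laplace(img) ** 2
    return gaussian_filter(energy, window)

m_near = focus_measure(img_near)
m_far = focus_measure(img_far)

# Normalized focus ratio in [0, 1]; values near 1 mean the surface is
# sharp in img_near, i.e. close to that image's focal plane.
ratio = m_near / (m_near + m_far + 1e-12)
print(ratio.mean())
```

A real sensor must additionally handle textureless regions (hence the projected illumination pattern mentioned in the abstract) and correct for the magnification changes that accompany refocusing.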
In addition to the above topic, I'll briefly summarize recent results on diffuse reflectance and visual learning.
Host: Katsushi Ikeuchi, x8-6349