Time-of-flight (ToF) depth sensors have become the technology of choice in diverse applications today, from automotive and aviation to robotics, gaming and consumer electronics. These sensors come in two general flavors: LIDAR-based systems that rely on extremely brief pulses of light to sense depth, and continuous-wave systems that emit a modulated light signal over a much longer duration. LIDAR systems can acquire centimeter-accurate depth maps up to a kilometer away in broad daylight, but they have low measurement rates, and their cost per pixel is orders of magnitude higher than that of continuous-wave ToF (CW-ToF) devices, whose range, outdoor operation and robustness are in turn extremely limited. Since low cost, large-scale production and high measurement rate often trump other considerations, CW-ToF sensors continue to dominate the consumer electronics and low-end robotics space despite these shortcomings.
EpiToF extends the Episcan3D approach to continuous-wave time-of-flight technology. Its robustness to difficult conditions stems from a unique imaging capability: the sensor can capture images in which most light from ambient sources (like the Sun) is blocked out, and in which only direct light paths are recorded, enabling accurate imaging of scenes with scattered light. The same imaging method also enables interference-free imaging with multiple ToF devices and imaging without motion blur.
Our current EpiToF prototype captures 320x240 resolution depth images at 7.5 frames per second, outdoors in sunlight, at ranges up to 15 meters. Increased range and video frame rates are possible with a well-engineered system.
The depth cameras available on the market today only work indoors: outside, their active light sources are overwhelmed by sunlight. EpiToF uses a unique imaging technique that blocks almost all ambient light, which lets its low-power source compete with the Sun and opens up exciting new applications for depth cameras, from outdoor imaging to self-driving cars.
EpiToF can differentiate between single-bounce light that reflects directly off surfaces and more complex multiple-bounce light paths involving interreflection and scattering. Such complex light paths confuse most conventional 3D scanners, but EpiToF is robust to them and produces accurate measurements.
With regular ToF cameras, diffuse interreflections between walls and ceiling cause depths to be overestimated and the corner to be rounded out. With epipolar imaging, the walls appear straight and meet at a sharp right angle. The conference table appears specular at grazing angles, but the EpiToF captures only the first light bounce. In the bathroom scene, the ghosting on the wall due to reflections from the mirror is suppressed by epipolar imaging. The water fountain is particularly challenging because the direct return from its metallic surface is very weak, but the surface reflects a lot of indirect light back to the sensor.
As continuous-wave ToF cameras appear in more and more devices, they must be able to operate without interfering with each other. Devices of a given make and model can avoid mutual interference through synchronization or by varying their modulation frequencies, but robustness against the broader ecosystem of CW-ToF sensors is desirable. EpiToF enables interference-free live 3D imaging even between devices that use the exact same modulation frequency and light source wavelength.
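The intuition can be captured with a deliberately simplified toy model in Python (the uniform-random row timing is our assumption, not an analysis from the paper): since each epipolar device lights and exposes only one scanline at any instant, two free-running devices rarely collide.

```python
import random

# Toy model of why epipolar imaging limits cross-device interference.
# Each device illuminates and exposes only one of `num_rows` scanlines
# at any instant, so an unsynchronized second device can corrupt a row
# only when both happen to be on the same scanline at the same time.
num_rows, trials = 240, 100_000
hits = sum(random.randrange(num_rows) == random.randrange(num_rows)
           for _ in range(trials))
print(f"fraction of rows affected: {hits / trials:.4f} (expected ~1/{num_rows})")
```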
Regular ToF imaging suffers from motion blur and strong artifacts at depth discontinuities when the camera moves during frame capture. With epipolar ToF imaging, motion blur has almost no effect; instead, a depth map with a rolling-shutter-like skew is acquired. This skew can be corrected with a simple image warp computed from the rotation measured by a MEMS gyroscope.
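To make the correction concrete, here is a hedged Python sketch of such a per-row warp. The function, the small-angle constant-velocity rotation model, and the pure-rotation assumption are ours for illustration, not the prototype's exact implementation.

```python
import numpy as np

def deroll_depth(depth, omega, row_time, K):
    """Undo the rolling-shutter-like skew of an epipolar ToF depth map.

    A minimal sketch under assumptions: constant angular velocity
    `omega` (rad/s, from a gyro) over the frame, small rotation angles,
    and pure rotation (no translation). `K` is the 3x3 camera intrinsics
    matrix; `row_time` is the time between consecutive rows. Sign
    conventions depend on how the gyro and camera frames are aligned.
    Nearest-neighbor splatting is used; a real implementation would
    interpolate.
    """
    h, w = depth.shape
    out = np.zeros_like(depth)
    K_inv = np.linalg.inv(K)
    cols = np.stack([np.arange(w), np.zeros(w), np.ones(w)])
    for r in range(h):
        wx, wy, wz = omega * (r * row_time)   # rotation since first row
        R = np.array([[1.0, -wz,  wy],        # small-angle rotation
                      [ wz, 1.0, -wx],
                      [-wy,  wx, 1.0]])
        H = K @ R @ K_inv                     # per-row homography
        cols[1, :] = r
        u, v, s = H @ cols                    # remap pixels of row r
        u = np.round(u / s).astype(int)
        v = np.round(v / s).astype(int)
        ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        out[v[ok], u[ok]] = depth[r, np.arange(w)[ok]]
    return out
```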
Contemporary depth imaging systems such as the Microsoft Kinect, Intel RealSense, and ToF cameras work indoors, where there is little ambient light, but not outdoors in sunlight.
EpiToF's energy-efficient, real-time depth sensing technology works both indoors and outdoors and could power the next generation of outdoor imaging systems.
Today's self-driving cars rely heavily on expensive LIDAR units that produce only sparse point clouds. EpiToF produces denser point clouds for high-resolution object recognition and obstacle avoidance.
The EpiToF prototype is built from an off-the-shelf continuous-wave time-of-flight camera and a custom-built projector.
Performance Specs:
  Resolution: 320x240 pixels
  Frame rate: 7.5 depth frames per second
  Range: up to 15 meters outdoors in sunlight*
*See paper for range details
The modulated light sheet projector consists of a laser, a Powell lens line generator, and a steerable galvomirror. The laser is collimated and passed through the Powell lens to form a light sheet, which is then deflected and steered by the galvomirror in sync with the ToF camera. The laser is amplitude modulated at a fixed frequency so that the projected line carries the modulation signal.
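As a small illustration of the camera-projector synchronization, here is a hypothetical Python helper that maps the camera row currently being exposed to a galvomirror steering command; the linear row-to-angle map and all names are assumptions for illustration.

```python
def galvo_angle_for_row(row, angle_first, angle_last, num_rows=240):
    """Hypothetical mapping from a camera row to a galvomirror command.

    The projector must steer its light sheet to the scanline that is
    epipolar-matched to the camera row being exposed. A linear map
    between two calibrated endpoint angles is an assumption made for
    illustration; a real system would use a per-row calibration table.
    """
    return angle_first + (angle_last - angle_first) * row / (num_rows - 1)
```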
The EpiToF sensor's light sheet projector illuminates the scene one scanline at a time. The ToF camera and projector are aligned so that, by epipolar geometry, each projector scanline corresponds to a single row of pixels in the ToF camera. The region of interest (ROI) on the ToF camera is selected and synchronized so that the exposed row moves in lockstep with the active projector scanline. At each row, the camera captures multiple images of the scene at different phases of the modulated wave; these images are then used to compute the depth map via the phase-shift principle.
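For concreteness, here is a minimal Python sketch of the phase-shift depth computation, using the standard four-measurement ("4-bucket") scheme; the function name, sample layout and sign convention are illustrative, and real sensors also need per-pixel calibration.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_from_phase_images(i0, i90, i180, i270, f_mod):
    """Recover depth via the phase-shift principle.

    A minimal sketch of the standard four-measurement scheme; EpiToF
    acquires these measurements row by row. Sign conventions and
    calibration offsets vary between real sensors.
    """
    phase = np.arctan2(i270 - i90, i0 - i180)   # phase of the return
    phase = np.mod(phase, 2.0 * np.pi)          # wrap into [0, 2*pi)
    # One full phase cycle corresponds to a round trip of C / f_mod,
    # so depth (half the round trip) is:
    return C * phase / (4.0 * np.pi * f_mod)

# e.g. at f_mod = 30 MHz the unambiguous range is C / (2 * f_mod) ~= 5 m
```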
Since the projected scanline and the exposed camera row are synchronized, the exposure can be very short (~100 µs). Such a short exposure integrates very little ambient light while still collecting all the light from the projector. In addition, only light paths that satisfy the epipolar constraint between the projector and camera reach the sensor, which blocks almost all multipath light.
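A back-of-the-envelope calculation, using the prototype's frame rate and some simplifying assumptions of our own, shows how large the ambient advantage can be.

```python
# Rough ambient-light comparison (our assumptions, prototype numbers):
# a conventional ToF camera integrates ambient light at every pixel for
# roughly the whole frame time, while epipolar imaging exposes each row
# only while its scanline is lit.
frame_time   = 1.0 / 7.5   # seconds per depth frame
row_exposure = 100e-6      # seconds each row is exposed (~100 us)

# The projector's contribution is not reduced, because the full source
# power is concentrated on the one active scanline; only ambient light
# is cut. The true factor depends on readout details and the number of
# phase captures per row.
ambient_reduction = frame_time / row_exposure
print(f"ambient light reduced by roughly {ambient_reduction:.0f}x")
```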
For an in-depth description of the technology behind the EpiToF sensor, please refer to our paper and the accompanying video:
Supreeth Achar, Joseph R. Bartels, William L. ‘Red’ Whittaker, Kiriakos N. Kutulakos, Srinivasa G. Narasimhan. "Epipolar Time-of-Flight Imaging", ACM SIGGRAPH 2017
This work is sponsored by the Office of Naval Research (Grant N000141512358, DURIP N000141612906), the National Aeronautics and Space Administration (Grant NNX16AD98G), the Defense Advanced Research Projects Agency (REVEAL Grant HR00111620021) and the Natural Sciences and Engineering Research Council of Canada (NSERC) under the RGPIN and SPG programs. J. Bartels was supported by NASA fellowship NNX14AM53H.