Scientists in the Camera Culture group at MIT's Media Lab have taken time-of-flight imaging to the next level, improving its depth resolution 1,000-fold and bringing us a step closer to practical self-driving cars. The study is published in IEEE Access.
Researchers say that their new approach to time of flight enables accurate distance measurements through fog, something that has been a challenge until now. The time-of-flight approach gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor. At a range of 2 meters, existing time-of-flight systems have a depth resolution of about a centimeter. That's good enough for the assisted-parking and collision-detection systems on today's cars. But for self-driving cars, it isn't enough.
Self-driving cars need to see much farther than 2 meters, and at far greater resolution, if they are to make quick decisions about their route.
With time-of-flight imaging, a short burst of light is fired into a scene, and a camera measures the time it takes to return, which indicates the distance of the object that reflected it. The longer the light burst, the more ambiguous the measurement of how far it’s traveled. So light-burst length is one of the factors that determines system resolution.
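The arithmetic behind this can be sketched in a few lines. This is an illustrative calculation, not the authors' implementation; the function names and the example burst lengths are assumptions:

```python
C = 3.0e8  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance of the reflecting object: the light covers the path twice."""
    return C * round_trip_seconds / 2.0

def depth_ambiguity(burst_seconds: float) -> float:
    """A burst of length T blurs the return time by roughly T,
    so the measured depth is uncertain by roughly c*T/2."""
    return C * burst_seconds / 2.0

# A return after ~13.3 nanoseconds corresponds to an object ~2 m away:
print(tof_distance(13.33e-9))   # ≈ 2.0 m

# A 1 ns burst smears depth by 15 cm; a ~67 ps burst, by about 1 cm:
print(depth_ambiguity(1e-9))    # 0.15 m
print(depth_ambiguity(67e-12))  # ≈ 0.01 m
```

The second function makes the trade-off in the paragraph above concrete: halving the burst length halves the depth ambiguity, which is why shorter bursts mean sharper depth maps.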
The other factor, however, is detection rate. Modulators, which turn a light beam off and on, can switch a billion times a second, but today’s detectors can make only about 100 million measurements a second. Detection rate is what limits existing time-of-flight systems to centimeter-scale resolution.
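One way to see how detection rate caps resolution is through continuous-wave time of flight, where depth is recovered from the phase of the returning modulated light, so depth precision scales inversely with the modulation frequency the detector can follow. The sketch below assumes this phase-based model and an arbitrary phase-noise floor; neither is taken from the published system:

```python
import math

C = 3.0e8  # speed of light, m/s

def depth_precision(mod_freq_hz: float, phase_noise_rad: float) -> float:
    """Depth error for a given modulation frequency and phase noise.
    A full 2*pi of phase corresponds to one round trip of c / (2 * f),
    so depth error = c * phase_noise / (4 * pi * f)."""
    return C * phase_noise_rad / (4.0 * math.pi * mod_freq_hz)

PHASE_NOISE = 0.04  # rad; an assumed detector noise floor, for illustration

# ~100 MHz measurement rate (today's detectors): centimeter scale
print(depth_precision(100e6, PHASE_NOISE))   # ≈ 0.01 m

# 1000x higher effective modulation frequency: 1000x finer depth
print(depth_precision(100e9, PHASE_NOISE))   # ≈ 1e-5 m
```

Under this model, every factor by which the modulation (and hence detection) frequency rises buys the same factor in depth precision, which is the scaling behind the 1,000-fold claim.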
Researchers say their latest time-of-flight technique combines elements of interferometry and LIDAR with principles from acoustics to achieve the 1,000-fold increase in resolution.
Gigahertz optical systems are naturally better at compensating for fog than lower-frequency systems. Fog is problematic for time-of-flight systems because it scatters light: It deflects the returning light signals so that they arrive late and at odd angles. Trying to isolate a true signal in all that noise is too computationally challenging to do on the fly.
With low-frequency systems, scattering causes a slight shift in phase, one that simply muddies the signal that reaches the detector. But with high-frequency systems, the phase shift is much larger relative to the frequency of the signal. Scattered light signals arriving over different paths will actually cancel each other out: The troughs of one wave will align with the crests of another. Theoretical analyses performed at the University of Wisconsin and Columbia University suggest that this cancellation will be widespread enough to make identifying a true signal much easier.