Imagine driving home after a long day at work. Suddenly, a car comes out of a dark side street and turns right in front of you. Luckily, your self-driving car saw this vehicle long before it entered your line of sight and slowed down to avoid an accident. It may sound like magic, but a new technique developed at Caltech could bring it closer to reality.
With the advent of autonomous vehicles, advanced spacecraft, and other technologies that rely on sensors for navigation, there is an ever-increasing need for systems that can scan for obstacles, pedestrians, or other objects. But what if something is hidden behind another object?
In a paper recently published in the journal Nature Photonics, Caltech researchers and their colleagues describe a new method that essentially turns nearby surfaces into lenses that can be used to indirectly image previously obscured objects.
The technology, developed in the laboratory of Changhuei Yang, Thomas G. Myers Professor of Electrical Engineering, Bioengineering, and Medical Engineering, and investigator of the Heritage Medical Research Institute, is a form of non-line-of-sight (NLOS) sensing, that is, sensing that detects an object of interest outside the viewer's line of sight. The new method, dubbed UNCOVER, does this by using nearby flat surfaces, such as walls, as a lens to see the hidden object clearly.
Most current NLOS imaging technologies detect light from a hidden object that is passively reflected off a surface such as a wall. However, because such surfaces primarily scatter light, these techniques do not produce clear images. Computational imaging methods can extract information from the scattered light and improve image clarity, but they cannot generate high-resolution images.
UNCOVER, however, directly counteracts this scattering through its use of wavefront shaping technology. Wavefront shaping was previously not viable for NLOS imaging because it required a guide star, an approximate point source of light that reveals how the scattering surface distorts light and thereby helps infer details of the hidden object.
“We know that a lens images a point onto another point. If you look through a bad ‘lens’ with a matte surface, the image of a dot is now blurry and the light spills everywhere, but you can grind and polish the matte surface to direct the light to the correct position,” explains electrical engineering graduate student Ruizhi Cao, the first author of the Nature Photonics paper. “That’s how a guide star basically helps you: it tells us where the little bumps are, so we know how to polish the surface properly.”
Yang and his colleagues discovered that the hidden object itself could be used as a guide star. The result is an NLOS imaging method that reconstructs scattered light into a clear image of the hidden object.
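The intuition behind wavefront shaping can be illustrated with a toy numerical sketch. This is not the UNCOVER algorithm itself; it is a minimal, hypothetical model in which a scattering wall is represented by a random phase screen, and a guide star's measured field is used to apply the conjugate phase, "polishing" the bad lens so light refocuses to a point:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256  # number of points across the simulated wavefront

# Model a matte wall as a random phase screen that scrambles light
wall_phase = rng.uniform(0, 2 * np.pi, n)

# Light from a point source (the guide star) arrives as a flat
# wavefront and picks up the wall's random phase
scattered = np.exp(1j * wall_phase)

# Wavefront shaping: multiply by the phase conjugate of the measured
# field, cancelling the wall's scrambling
corrected = scattered * np.exp(-1j * np.angle(scattered))

# Model far-field focusing with a Fourier transform and measure how
# much energy lands in the central focal spot
focus_raw = np.abs(np.fft.fft(scattered)) ** 2
focus_corrected = np.abs(np.fft.fft(corrected)) ** 2

print(focus_raw[0] / focus_raw.sum())              # small: blurred speckle
print(focus_corrected[0] / focus_corrected.sum())  # ~1.0: sharp focus
```

With the conjugate phase applied, essentially all of the light converges to a single spot, whereas the uncorrected field spreads its energy across random speckle; this is the sense in which a guide star lets a scattering surface be "polished" into a usable lens.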
According to Cao, the imaging method could be useful for autonomous driving, rescue operations, and other remote-sensing applications. In the case of autonomous driving, Cao says, “We can see all the traffic at the intersection with this method. It could help cars foresee potential dangers that cannot be seen directly.”
Using UNCOVER could allow automobiles not just to see as well as humans do, but to see better than even the best human drivers. While a human driver might be able to spot a jaywalker a few feet away, a self-driving car equipped with UNCOVER technology could potentially spot one on the next block, provided imaging conditions are optimal.
UNCOVER imaging could also prove useful beyond Earth, for example in future robotic missions to explore Mars, Cao says: “We rely on rovers to take images of another planet to help us better understand it. However, for these rovers, some places may be difficult to reach due to limited resources and energy. With the non-line-of-sight imaging technique, we do not need the rover itself to go there. All it takes is finding a place where the light can reach.”