
Depth-sensing imaging system can peer through fog

MIT researchers have developed a system that can produce images of objects shrouded by fog so thick that human vision can’t penetrate it. It can also gauge the objects’ distance.

An inability to handle misty driving conditions has been one of the chief obstacles to the development of autonomous vehicular navigation systems that use visible light, which are preferable to radar-based systems for their high resolution and ability to read road signs and track lane markers. The MIT system could thus be a crucial step toward self-driving cars.

The researchers tested the system using a small tank of water with the vibrating motor from a humidifier immersed in it. In fog so dense that human vision could penetrate only 36 centimeters, the system was able to resolve images of objects and gauge their depth at a range of 57 centimeters.

Fifty-seven centimeters is not a great distance, but the fog produced for the study is far denser than any that a human driver would have to contend with; in the real world, a typical fog might afford a visibility of about 30 to 50 meters. The vital point is that the system performed better than human vision, whereas most imaging systems perform far worse. A navigation system that was even as good as a human driver at driving in fog would be a huge breakthrough.
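To get a feel for that comparison, here is a rough back-of-envelope sketch. It assumes the standard Koschmieder relation between meteorological visibility and the extinction coefficient (an assumption for illustration; the paper does not state how it characterizes fog density) and compares how many attenuation lengths the lab fog and a typical road fog would impose over the system's 57-centimeter range.

```python
import math

# Illustrative estimate only. The Koschmieder constant (-ln(0.02), i.e. a 2%
# contrast threshold) and its use here are assumptions, not part of the MIT study.
KOSCHMIEDER = 3.912


def extinction_from_visibility(visibility_m: float) -> float:
    """Extinction coefficient (1/m) implied by a given visibility distance."""
    return KOSCHMIEDER / visibility_m


def optical_depth(extinction_per_m: float, path_m: float) -> float:
    """Number of attenuation lengths light traverses over a given path."""
    return extinction_per_m * path_m


# Lab fog: human vision penetrates only 0.36 m; the system ranges out to 0.57 m.
lab_sigma = extinction_from_visibility(0.36)
lab_tau = optical_depth(lab_sigma, 0.57)

# Typical road fog: roughly 30 m of visibility, evaluated over the same 0.57 m path.
road_sigma = extinction_from_visibility(30.0)
road_tau = optical_depth(road_sigma, 0.57)

print(f"lab fog:  sigma ~ {lab_sigma:.1f} /m, optical depth over 0.57 m ~ {lab_tau:.1f}")
print(f"road fog: sigma ~ {road_sigma:.2f} /m, optical depth over 0.57 m ~ {road_tau:.3f}")
# Density ratio is roughly the ratio of visibilities: 30 m / 0.36 m ~ 80x.
```

On these assumptions, 57 centimeters in the lab fog corresponds to roughly six attenuation lengths, dozens of times the optical depth a 30-to-50-meter fog would impose over the same path, which is what makes the test fog "far denser" than anything a driver would normally face.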

“I decided to take on the challenge of developing a system that can see through actual fog,” says Guy Satat, a graduate student in the MIT Media Lab, who led the research. “We’re dealing with realistic fog, which is dense, dynamic, and heterogeneous. It is constantly moving and changing, with patches of denser or less-dense fog. Other methods are not designed to cope with such realistic scenarios.”

Satat and his colleagues describe their system in a paper they’ll present at the International Conference on Computational Photography in May. Satat is first author on the paper, and he’s joined by his thesis advisor, associate professor of media arts and sciences Ramesh Raskar, and by Matthew Tancik, who was a graduate student in electrical engineering and computer science when the work was done.
