
Ultrasound & LiDAR—Improving Vision for People & Robots

Posted by Ilena Di Toro | Posted on July 19, 2022

I’m going to state the obvious: our eyes take in a lot of information. Now for the not-so-obvious: how can technology be used both to improve human vision and to help emerging technologies like self-driving cars and robots see?

As you can guess, researchers are looking into (no pun intended) exactly these questions, and they have found that familiar, well-established technologies can be adapted both to improve vision and to give self-driving cars and robots better sight.

One way to improve vision may come from a familiar yet surprising source. Scientists in the Department of Biomedical Engineering at the University of Southern California’s Viterbi School of Engineering are researching the use of ultrasound to help people who are blind see.

That’s right: ultrasound waves, the same ones used to get images of a baby in utero, may be used to restore vision. In this study, the eyes of blind rats were stimulated by the mechanical pressure generated by ultrasound waves, similar to how bright spots and shapes appear when you gently press on your closed eyelids. Neurons in the retina have channels that respond to mechanical stimulation, and the scientists found that these neurons became activated when ultrasound was applied.

They created a small ultrasound device that could be aimed at a specific region of the rat’s eye to send sound waves to the retina. These high-frequency waves can be manipulated and focused so that when they are projected as a pattern, such as a letter of the alphabet, the rat’s brain picks up a similar pattern.
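To make that idea concrete, here is a minimal Python sketch of the general concept: a letter is rasterized into a bitmap, and each “on” pixel becomes a coordinate where the ultrasound would be focused. The grid size, spacing, and mapping here are my own illustrative assumptions, not the USC team’s actual stimulation protocol.

```python
# Hypothetical sketch: turning a letter bitmap into ultrasound focal points.
# The grid spacing and coordinate mapping are assumptions for illustration.

import numpy as np

# A 5x5 binary bitmap of the letter "C" (1 = stimulate, 0 = skip)
letter_c = np.array([
    [0, 1, 1, 1, 0],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
])

PITCH_MM = 0.2  # assumed spacing between focal spots on the retina (mm)

# Map each "on" pixel to an (x, y) focal-point coordinate in millimeters,
# centered on the middle of the grid.
rows, cols = np.nonzero(letter_c)
center = (letter_c.shape[0] - 1) / 2
focal_points = [((c - center) * PITCH_MM, (center - r) * PITCH_MM)
                for r, c in zip(rows, cols)]

for x, y in focal_points:
    print(f"focus ultrasound at x={x:+.1f} mm, y={y:+.1f} mm")
```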

Since rats can’t report what they are seeing, the scientists measured activity in the rats’ visual cortex with a multi-electrode array. Based on the recorded activity, the rats perceived images similar to the ultrasound stimulation pattern projected onto the eye. The scientists plan to continue this work in primates and hope to progress to human trials.
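How might you quantify that a recorded activity map resembles the projected pattern? One simple, generic approach (not necessarily the study’s actual analysis) is a pixel-wise correlation between the stimulus bitmap and the activity map, as in this Python sketch with made-up data:

```python
# Hypothetical sketch: comparing a recorded cortical activity map with the
# projected stimulus pattern via a normalized correlation. The array shape
# and data are illustrative, not the study's actual recordings.

import numpy as np

def pattern_similarity(stimulus: np.ndarray, activity: np.ndarray) -> float:
    """Pearson correlation between a stimulus bitmap and an activity map."""
    s = (stimulus - stimulus.mean()) / stimulus.std()
    a = (activity - activity.mean()) / activity.std()
    return float((s * a).mean())

rng = np.random.default_rng(0)
stimulus = rng.integers(0, 2, size=(8, 8)).astype(float)   # projected pattern
activity = stimulus + 0.5 * rng.normal(size=(8, 8))        # noisy "recording"

print(f"similarity: {pattern_similarity(stimulus, activity):.2f}")
```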

Now for using technology to help robots and self-driving cars see. Robotics companies currently use Light Detection and Ranging (LiDAR) for imaging. LiDAR works like radar, but it sends out short pulses of laser light instead of radio waves. It has its drawbacks: because it relies on detecting weak reflected light signals, other LiDAR systems and ambient sunlight can overwhelm the detector, and it can take a long time to scan a large area, such as a highway.
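The distance math behind this pulsed, time-of-flight approach is simple: a pulse goes out, reflects off a target, comes back, and the distance is half the round-trip time multiplied by the speed of light. Here’s a minimal Python illustration with a made-up timing value:

```python
# A minimal sketch of pulsed time-of-flight ranging, the idea behind
# conventional LiDAR. The pulse timing value is made up for illustration.

C = 299_792_458.0  # speed of light (m/s)

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a target from the pulse's round-trip travel time."""
    return C * t_seconds / 2

# A reflection arriving 200 nanoseconds after the pulse left:
print(f"target at {range_from_round_trip(200e-9):.1f} m")  # ~30.0 m
```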

To overcome these problems, researchers in the lab of Joseph Izatt, a professor of biomedical engineering at Duke University, turned to another form of LiDAR called frequency-modulated continuous wave LiDAR (FMCW LiDAR). This form of LiDAR is similar to optical coherence tomography (OCT), but instead of imaging eye tissue, its high-resolution capabilities are used to measure distance and speed.

Since FMCW LiDAR sends out a laser beam that continuously sweeps between frequencies, the detector can distinguish the returning light by its specific frequency pattern when measuring reflection time. This lets it work in all kinds of lighting conditions, and at very high speed. It can also measure phase shifts relative to the outgoing beam, which determines distance more accurately than current pulsed LiDAR systems.
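Here’s a back-of-the-envelope Python sketch of the FMCW ranging principle: for a linearly swept (chirped) laser, the frequency difference between the outgoing and returning light, the “beat” frequency, is proportional to the round-trip delay. The sweep parameters below are illustrative assumptions, not the Duke system’s specifications:

```python
# A minimal sketch of FMCW ranging: the returning light is mixed with the
# outgoing chirp, and the "beat" frequency encodes the round-trip delay.
# The sweep parameters are assumed values, not the Duke system's.

C = 299_792_458.0   # speed of light (m/s)
BANDWIDTH = 1e9     # assumed chirp bandwidth (Hz)
SWEEP_TIME = 10e-6  # assumed sweep duration (s)

def range_from_beat(f_beat_hz: float) -> float:
    """Target distance from the measured beat frequency of a linear chirp."""
    round_trip_delay = f_beat_hz * SWEEP_TIME / BANDWIDTH
    return C * round_trip_delay / 2

# With these settings, a 20 MHz beat corresponds to a target ~30 m away:
print(f"target at {range_from_beat(20e6):.1f} m")
```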

To scan the laser, the researchers used a diffraction grating that works like a prism: it breaks the laser into a rainbow of frequencies that spread out as they travel from the source. Because the original laser is already sweeping through a range of frequencies, this translates into sweeping the LiDAR beam across a wide area without losing depth or accuracy. As a result, the system has greater imaging range and speed, meaning it can capture the details of moving body parts, such as a nodding head or a clenching hand, in real time.
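You can see the sweeping effect with the standard grating equation: at normal incidence, the first-order diffraction angle satisfies sin θ = λ/d, so as the laser’s wavelength sweeps, the beam angle sweeps with it. Here’s a minimal Python sketch with an assumed grating pitch and wavelength range (not the actual Duke hardware):

```python
# A minimal sketch of grating-based beam steering: the first-order
# diffraction angle depends on wavelength (sin(theta) = wavelength / d
# at normal incidence), so sweeping the laser's wavelength sweeps the
# beam. Grating density and wavelengths are assumed values.

import math

LINES_PER_MM = 600        # assumed grating density
d = 1e-3 / LINES_PER_MM   # grating period (m)

def first_order_angle(wavelength_m: float) -> float:
    """First-order diffraction angle (degrees) at normal incidence."""
    return math.degrees(math.asin(wavelength_m / d))

# Sweep the laser from 1260 nm to 1360 nm and watch the beam steer:
for wl_nm in (1260, 1310, 1360):
    print(f"{wl_nm} nm -> {first_order_angle(wl_nm * 1e-9):.2f} degrees")
```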

“These are exactly the capabilities needed for robots to see and interact with humans safely or even to replace avatars with live 3D video in augmented reality,” said Izatt. “The world around us is 3D, so if we want robots and other automated systems to interact with us naturally and safely, they need to be able to see us as well as we can see them.”

Sources:
https://viterbischool.usc.edu/news/2022/04/ultrasound-gave-us-our-first-baby-pictures-can-it-also-help-the-blind-see/

https://pratt.duke.edu/about/news/oct-for-robots
