If you learned everything you know about self-driving cars from Elon Musk’s Twitter feed, you’re missing a lot. Companies including Waymo (part of Alphabet, Google’s parent company) and startups such as Zoox have unveiled autonomous shuttle concepts, as have traditional manufacturers including Cadillac and Toyota (it was one of Toyota’s shuttles that collided with a visually impaired athlete during the Tokyo Paralympics last month). And Waymo is currently allowing the general public to hail its driverless shuttles in Phoenix. There’s also a world of smaller companies working to build the hardware that will help driverless cars, shuttles, and delivery bots perceive our world. One of those companies is AEye, a California-based lidar (light detection and ranging) firm with a sensor that, when mounted on a car, can detect obstacles, even small ones, from quite a long distance.

We saw the current iteration of AEye’s technology at work during a late-June test at a facility in Michigan. A Ford Fusion with an AEye lidar mounted on top was stationed near the entrance to a tunnel like one you might come across on an urban freeway. The road curved as it entered the tunnel, and AEye reps placed various obstacles in the shade created by the overhang. Two humanoid dummies and a canine one were set 361 feet away. Five large bricks were scattered in the road 33 feet past the dummies.

From where we stood—beneath a tent pitched next to the car—none of the obstacles were clearly visible. AEye had intended to showcase the system’s abilities in both good weather and bad, setting up a rainmaker between the car and the tunnel. But at the time of the test, natural rain was coming down so hard that AEye’s engineers had to raise their voices to be heard over the sound of water pounding on the roof of the tent.

An AEye lidar unit set up atop a car (photo: AEye).

Weather matters here because lidar works by sending out laser pulses; a receiver senses the light reflected from any obstacles those pulses encounter and, aided by a lot of code, uses that information to pinpoint the location and type of any obstacles in a car’s path. Driving rain like what we experienced during the AEye test could theoretically flummox a lidar system: water can absorb some of the light the laser sends out, leaving less light, and therefore less information, to bounce back to the system’s sensors.
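To make the principle concrete, here is a minimal sketch of the time-of-flight arithmetic in Python. The function name and timing figure are illustrative assumptions for this article, not AEye’s actual code.

```python
# Toy time-of-flight calculation: a lidar pulse travels out to an
# obstacle and back, so the one-way distance is half the round trip
# at the speed of light.
C = 299_792_458  # speed of light, m/s

def distance_from_echo(round_trip_seconds: float) -> float:
    """One-way distance to whatever reflected the pulse."""
    return C * round_trip_seconds / 2

# An echo arriving about 733 nanoseconds after the pulse fires implies
# an obstacle roughly 110 meters (about 361 feet) away.
print(distance_from_echo(733e-9))  # ~109.9
```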


AEye’s system handled the rain well, however. When the car’s lidar was turned on, a display outside the car, set up specifically for the demonstration, showed the system’s interpretation of the feedback from its sensors. We could identify the outline of the tunnel walls and a roughly evenly spaced array of dots tracing the road surface. A few rows of closer-spaced dots corresponded to the sheet of rain pouring over the entrance to the tunnel. Beyond that, three clusters of dots showed that the lidar system was registering the dummies placed in its path. Farther still, a few more dot clusters represented the bricks on the ground.
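For a rough sense of how a field of raw returns becomes the distinct clusters of dots we saw on the display, here is a toy sketch of distance-threshold clustering in Python. Real perception stacks use far richer methods; this is not AEye’s algorithm, and the threshold and sample points are invented.

```python
import math

def cluster_returns(points, max_gap=1.0):
    """Toy single-pass clustering of 2-D lidar returns (meters):
    attach each point to the first cluster whose most recent member
    lies within max_gap; otherwise start a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if math.dist(p, c[-1]) <= max_gap:
                c.append(p)   # same object as an earlier return
                break
        else:
            clusters.append([p])  # a new object
    return clusters

# Two tight dot groups about 110 m out and a stray return 10 m beyond
# come back as three separate clusters.
print(len(cluster_returns([(0.0, 110.0), (0.2, 110.1),
                           (3.0, 110.0), (3.1, 110.2),
                           (1.0, 120.0)])))  # -> 3
```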

“I’ve never seen a demo like that one before—in a real-world scenario under poor weather, behind the windshield, while still being able to achieve the distance and detection,” said Sam Abuelsamid, a principal research analyst at Guidehouse Insights who was also present for the AEye demonstration. He added, “What we saw was really impressive.”

Stephen Lambright, AEye’s chief marketing officer, said part of what sets the company apart is its choice to separate the part of the module that sends laser pulses from the part that receives them. Other companies, Lambright said, integrate both functions into a single component, which means the laser can’t send a new pulse until the feedback from the prior one has returned. AEye’s two-part design lets the laser send more pulses in less time, which means more data. AEye has also programmed its system so that when a laser pulse comes back indicating an object in the car’s path, the laser sends several repeat pulses to the same area to fill out the picture of the object in question, as the sketch below illustrates.
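Both ideas can be illustrated with a short, hypothetical Python sketch; the numbers and helper functions below are assumptions for exposition, not AEye firmware. The first function shows the pulse-rate ceiling a wait-for-the-echo design runs into; the second mimics the revisit behavior, queuing extra pulses at angles where a return flagged an object.

```python
from collections import deque

C = 299_792_458  # speed of light, m/s

def max_pulse_rate_if_coupled(max_range_m: float) -> float:
    """If the laser must wait for each echo before firing again, the
    pulse rate is capped by the round-trip time to maximum range."""
    return 1.0 / (2.0 * max_range_m / C)

print(max_pulse_rate_if_coupled(300.0))  # ~500,000 pulses/s at 300 m

def schedule_pulses(scan_angles, flagged_angles, revisits=4):
    """Toy scheduler: emit the regular scan pulse at every angle, plus
    several repeat pulses wherever a prior return flagged an object."""
    queue = deque()
    for angle in scan_angles:
        queue.append(angle)                   # regular raster pulse
        if angle in flagged_angles:
            queue.extend([angle] * revisits)  # densify on the object
    return list(queue)

# Angles 3 and 4 had returns, so they get extra attention on this pass.
print(schedule_pulses(range(6), flagged_angles={3, 4}, revisits=2))
```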

AEye has also diverged from the crowd in its choice of laser. It uses a 1550-nanometer laser as opposed to the cheaper 905-nanometer laser favored by many others in the lidar world (those figures refer to the wavelength of the emitted light). Light at 905 nanometers can cause retinal damage, so those lasers are subject to regulations that limit their power, a necessary safety step that also limits the distance at which they can detect obstacles. The 1550-nanometer lasers AEye uses (Volvo’s lidar partner, Luminar, uses them too) are safe for the eyes and can send out pulses that travel farther than those of the shorter-wavelength lasers.
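The range payoff can be sketched with back-of-the-envelope physics: for a diffuse target that fills the beam, the returned signal falls off roughly with the square of distance, so usable range scales about as the square root of emitted power. The 4x power ratio below is a made-up placeholder, not a regulatory figure.

```python
import math

def relative_range(power_ratio: float) -> float:
    """Rough scaling: if returned signal ~ power / distance**2, the
    range at which a target stays detectable grows with the square
    root of the available (eye-safe) power budget."""
    return math.sqrt(power_ratio)

# If 1550 nm eye-safety rules allowed, say, 4x the power permitted at
# 905 nm, the detection-range advantage would be roughly 2x.
print(relative_range(4.0))  # 2.0
```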

Had the Fusion on display really been driving itself, it would have picked up on the obstacles in the road before a human could have, especially given the heavy rain. But seeing trouble on the horizon is only half of being a good driver; the other half is knowing what to do next. That problem will belong to someone else: AEye designs and engineers lidar components but doesn’t design the self-driving systems they’ll eventually help power, so the burden of programming a car’s driver-assistance systems to avoid obstacles will fall to its partners. Continental will build and license AEye’s systems for the automotive market; other applications include aerospace, construction, mining, and smart-cities projects.

AEye still faces challenges on the path to widespread adoption, cost perhaps chief among them. Lambright says AEye is currently assuming a per-car cost of less than $1000 for its lidar setup and that the company is on a trajectory to sell its modules for around $100 apiece. But AEye’s technology is still in the early stages of development, trajectories can change, and competitor Luminar says it already has a production-ready unit that costs just $500.

There may be one other stumbling block. The federal agencies that oversee autonomous-vehicle testing and transportation safety are still muddling through how best to regulate self-driving cars, and Tesla is now the subject of an investigation that could result in a mass recall of its Autopilot-equipped cars. So don’t expect to see an AEye module in your next new car. But the next time you see Elon Musk bloviating about artificial intelligence on Twitter, remember that he’s not the only one working on self-driving cars.
