Sunday, August 14, 2022 | 08:41 pm

This Camera System Is Better Than Lidar For Depth Perception


So far, almost every autonomous vehicle we’ve encountered uses lidar to determine how far away things are, just as the winners of the DARPA Grand Challenges did back in the early 2000s. But not every AV will use lidar in the future; other sensors are reaching maturity, and some may even do a better job. One sensor that recently caught my eye comes from a company called Light, which developed it from smartphone camera tech.

Light pivoted from its original business as a supplier of smartphone cameras to using its imaging technology for automotive applications such as advanced driver-assistance systems (ADAS) and AVs.

Specifically, Light developed an optical camera system, called Clarity, that can also calculate the distance to every pixel it sees. Knowing the exact distance to objects means there is no need for a separate lidar sensor, and it also means more accurate data for machine-learning algorithms (a billboard of a face wouldn’t be recognized as an actual human by Clarity, for example).

The cameras run at 30 Hz, and “for each frame, we’re computationally putting those camera images together and determining the depth of all the objects in the scene, and we usually have about a million points or so per frame,” explained Light co-founder and CEO Dave Grannan.
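Computing depth from a pair of camera images is, at its core, stereo triangulation. Here is a minimal NumPy sketch of that last step, converting a per-pixel disparity map into a depth map; the focal length and baseline are made-up illustrative values, since Light has not published Clarity's optics:

```python
import numpy as np

# Illustrative optics only -- Clarity's real parameters aren't public.
FOCAL_PX = 1400.0   # focal length, in pixels
BASELINE_M = 0.30   # separation between the two cameras, in meters

def depth_from_disparity(disparity_px):
    """Classic stereo triangulation: Z = f * B / d.

    Pixels with zero or negative disparity (no match found) are
    assigned infinite depth, i.e. beyond this pair's resolving range.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):  # zero disparity divides safely to inf
        return np.where(disparity > 0,
                        FOCAL_PX * BASELINE_M / disparity,
                        np.inf)

# A 1-megapixel disparity map yields about a million depth points per
# frame, the point count Grannan describes.
disparity_map = np.full((1000, 1000), 3.7)  # uniform 3.7-pixel disparity
depth_map = depth_from_disparity(disparity_map)
```

The expensive part in practice is producing the disparity map itself, by matching every pixel between the two images; that matching is the computational overhead mentioned below.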

That system provides a significant advantage over even the most expensive lidar sensors, which return less than a tenth as many points per frame. Computing the depth of each pixel carries a computational overhead, but the resulting depth map is perfectly registered with the camera image, something that isn't possible when data from separate sensors must be fused.

Light compared Clarity against lidar at GoMentum Station, an autonomous-vehicle testing ground at the former Concord Naval Weapons Station in California. Clarity can detect obstacles that lidar misses completely, including a tire on the road at 114 meters.

Clarity uses at least two cameras, although “the advantages you get adding three or four are improvements in range and redundancy,” Grannan told me. “You can have a pair at longer focal length and a pair at shorter [in a four-camera setup]. If you run with three at the same focal length and one gets occluded with dirt or something, you still have two, so you’ve got some fail-over.”
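The fail-over Grannan describes is simple combinatorics: any two unoccluded cameras at the same focal length still form a working stereo pair. A toy sketch, with a hypothetical camera-status model (Clarity's actual fault-handling logic isn't public):

```python
from itertools import combinations

def usable_pairs(status):
    """Return every stereo pair formable from unoccluded cameras.

    `status` maps camera name -> True if the camera is clear.
    """
    clear = [name for name, ok in status.items() if ok]
    return list(combinations(clear, 2))

# Three same-focal-length cameras, one occluded by dirt:
# one pair survives, so depth estimation keeps working.
cameras = {"A": True, "B": False, "C": True}
pairs = usable_pairs(cameras)  # [("A", "C")]
```

With three cameras, any single occlusion still leaves one pair; with all three clear there are three pairs, which also lets the system cross-check depth estimates against each other.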

Light is currently in trials with eight different industry partners, with 11 planned to be underway by year’s end. Some of the companies are even combining Clarity and lidar. “We’ve got partners working within the level 4, class-8 trucking—so autonomous semi trucks—and they’re going to run lidars and cameras because they want the fault-tolerance and redundancy,” Grannan said. “And because they’ve got the lidars, they don’t really care much about the first 150 meters or so. They want only 150 [meters] and beyond.”

Alternatively, a four-camera setup could cover everything, with no need for lidar. “If somebody really wanted to cover 10 centimeters from the vehicle to 1,000 meters, yeah, we would run four cameras. Two pairs—one wider, another narrower field of view,” he told me.
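The reason for pairing a wide and a narrow field of view is that stereo depth error grows with the square of distance and shrinks with focal length. Under the standard stereo error model (dZ = Z² · d_disp / (f · B)), a longer focal length pair keeps far-range error manageable. The numbers below are illustrative assumptions, not Clarity's published specs:

```python
# Standard stereo depth-error model: dZ = Z**2 * d_disp / (f * B).
# Baseline, focal lengths, and matching accuracy are assumed values.
BASELINE_M = 0.30     # camera separation, meters
DISP_ERR_PX = 0.25    # subpixel disparity-matching accuracy

def depth_error_m(z_m, focal_px):
    """Approximate 1-sigma depth error at range z_m for a given focal length."""
    return (z_m ** 2) * DISP_ERR_PX / (focal_px * BASELINE_M)

# Wide pair (short focal length) vs. narrow pair (4x longer focal length):
for z in (150, 500, 1000):
    wide = depth_error_m(z, 700.0)
    narrow = depth_error_m(z, 2800.0)
    print(f"{z:5d} m: wide pair ±{wide:.1f} m, narrow pair ±{narrow:.1f} m")
```

The quadratic growth is why the short-focal-length pair handles the near field while the long-focal-length pair takes over at range.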

If the trials go well, we might start seeing Clarity-enabled vehicles on the road in three to four years.
