


Although lidar enabled autonomous driving in rural and urban environments, the first-place winner of the 2007 DARPA Urban Challenge averaged only 14 mph, limited in part by the lidar's short range (approximately 50 meters at the time) and sparse sampling (only 64 scan lines). Since 2007, lidar has improved slowly in range (to approximately 150 meters) and point density (to approximately 128 scan lines), but it still falls short of the denser sampling and longer range that high-speed highway driving requires. The maximum range of lidar is also unlikely to increase significantly, since transmit laser power is limited by eye-safety concerns.

Thankfully, other technologies have filled this gap. One such technology is the wide-baseline stereo vision camera, which, like lidar, offers reliable depth measurements. Lidar measures distance by directly measuring a physical quantity: the round-trip time of flight of a laser pulse from the vehicle to an object. The distance to the object is proportional to the time of flight, a direct measurement, unlike neural network approaches, which can only infer depth indirectly from 2D image data. Just like lidar, stereo vision directly measures a physical quantity: angles. The angle subtended between an object and two cameras separated by a known baseline distance directly yields the distance to the object, which is exactly the principle of triangulation. But unlike lidar, wide-baseline stereo vision cameras offer longer range and denser sampling, capturing even small obstacles in the road and further improving reliability and safety, even in harsh weather, which has fueled renewed interest in stereo vision cameras for autonomous vehicles.

Another non-obvious benefit of long-range sensors that can see beyond 150 meters is the ability to push computations to the cloud.
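Both direct measurements described above reduce to simple formulas. The sketch below illustrates them: lidar distance from round-trip time of flight, and stereo depth from triangulation (expressed in the standard image-plane form, where the subtended angle appears as pixel disparity). The specific baseline, focal length, and timing values are illustrative assumptions, not figures from the text.

```python
# Illustrative sketch of the two direct range measurements discussed above.
# Numbers are made-up examples, not real sensor specifications.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_time_s: float) -> float:
    """Lidar: distance is proportional to the laser pulse's
    round-trip time of flight (divide by 2 for the one-way trip)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def stereo_depth(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Stereo triangulation: with a known baseline between the two
    cameras, depth Z = baseline * focal_length / disparity. The
    disparity (pixel shift between the two images) encodes the
    subtended angle that the text describes."""
    return baseline_m * focal_px / disparity_px

# A 1-microsecond round trip corresponds to roughly 150 m of range.
print(lidar_distance(1e-6))          # ~149.9 meters
# A 0.9 m baseline, 1000 px focal length, 6 px disparity -> 150 m.
print(stereo_depth(0.9, 1000.0, 6.0))  # 150.0 meters
```

Note how a wider baseline increases the disparity for a given depth, which is why wide-baseline rigs can resolve objects at longer range than narrow ones.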
