How soda cans, bad weather and snakes present challenges to the latest technology
By Laine Higgins, Dylan Moriarty and Jieqian Zhang, The Wall Street Journal, Nov. 15, 2018
Companies from Silicon Valley to Detroit are racing to become the first to bring driverless cars to the mass market. But doing so relies on autonomous vehicles’ sensors properly interpreting inputs from the environment, and that’s not as easy as it seems.
Everyday scenarios that human drivers navigate with ease, such as bumpy roads or snowy weather, can be unintelligible to a car’s sensors. But there is no industry standard for what sensors or backup mechanisms driverless cars must have if primary sensors fail.
Most autonomous vehicles currently under development employ some combination of lidar—a 3-D laser view of the environment—radar and cameras. There are other types of sensor technologies in the early stages of development, but they aren't being used by the industry's biggest players.
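As a rough illustration of how those overlapping data streams might be reconciled in software, the toy Python sketch below flags an obstacle only when at least two sensor types agree. Every class name, field and threshold here is hypothetical, invented for the example rather than drawn from any manufacturer's actual code.

from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "lidar", "radar", or "camera"
    distance_m: float  # estimated range to the object
    confidence: float  # sensor's self-reported certainty, 0.0 to 1.0

def fuse_detections(detections):
    """Report an obstacle only if two or more sensor types agree,
    so one noisy reading cannot trigger emergency braking alone."""
    agreeing = {d.sensor for d in detections if d.confidence > 0.5}
    return len(agreeing) >= 2

readings = [
    Detection("lidar", 12.0, 0.9),
    Detection("radar", 11.5, 0.7),
    Detection("camera", 12.3, 0.4),  # after sundown, the camera is unsure
]
print(fuse_detections(readings))  # True: lidar and radar corroborate

Requiring agreement across sensor types is one common way to keep a single faulty or confused sensor from steering the car wrong, which is part of why developers layer several technologies rather than relying on one.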
Before self-driving cars start cruising down any old block, they'll need to overcome not only physical obstacles but also the developmental hurdles that accompany any nascent technology. Here's a look at the most commonly used sensors, and the roadblocks that can trip them up. ...
LIDAR
Lidar sensors are located on top of the car and rotate 360 degrees, shooting up to 150,000 laser pulses per second to create a 3-D map of the car’s surroundings.
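Each returning pulse reports a distance along a known beam direction, and converting that measurement from angles and range into x, y and z coordinates produces one point of the 3-D map. A minimal sketch of that conversion, with illustrative numbers:

import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Turn one laser return (distance plus the beam's horizontal and
    vertical angles) into an (x, y, z) point in the car's frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# A return from 20 m away, 45 degrees left of straight ahead and
# slightly below the sensor's horizon:
print(lidar_return_to_point(20.0, 45.0, -2.0))

Accumulating up to 150,000 such points every second, over a full rotation, is what yields the dense 3-D picture.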
Curved objects, such as the bottom of a soda can, can skew data from radar sensors by falsely amplifying the radio waves that bounce off the object. The car then thinks it is approaching a much larger object, which can errantly trigger emergency maneuvers such as sudden braking or swerving.
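As a toy illustration of that failure mode, suppose the software naively treated reflected energy as a proxy for object size; both the calibration factor and the braking cutoff below are invented for the example.

def apparent_size(reflected_power_w):
    """Naively infer object size from how much energy bounced back."""
    return reflected_power_w * 10.0  # hypothetical calibration factor

def should_emergency_brake(reflected_power_w):
    return apparent_size(reflected_power_w) > 1.0  # "large object" cutoff

# A dull scrap returns little energy; a soda can's curved, shiny base
# can focus far more energy back at the sensor than its size warrants:
print(should_emergency_brake(0.05))  # False: reads as a small object
print(should_emergency_brake(0.30))  # True: a false alarm, and a hard stop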
Objects that are low to the ground, such as snakes or sharp debris, can easily fail to register with cameras or lidar sensors placed on top of the car. Missed objects can become roadkill or potentially damage the car.
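A back-of-envelope calculation shows why the mounting position matters: a roof-mounted sensor whose lowest beam points only a few degrees below horizontal simply cannot see the road surface for several meters around the car. The height and angle here are hypothetical, chosen only to illustrate the geometry.

import math

def ground_blind_radius(mount_height_m, max_down_angle_deg):
    """Distance around the car within which a roof-mounted sensor
    cannot see the road, given its steepest downward beam angle."""
    return mount_height_m / math.tan(math.radians(max_down_angle_deg))

# A sensor 1.9 m up whose lowest beam points 15 degrees below
# horizontal misses anything flat on the ground within about 7 m:
print(round(ground_blind_radius(1.9, 15.0), 1))  # 7.1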
Cameras aren’t as effective at capturing the environment in low-visibility conditions, including after sundown. Lidar and radar are unaffected by darkness, however, because they sense the environment at electromagnetic wavelengths outside the visible band.
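For a sense of scale, automotive radar commonly operates around 77 gigahertz and many lidar lasers at 905 nanometers, both outside the roughly 400-to-700-nanometer band the eye (and a camera) relies on. The snippet below simply converts those figures using wavelength = speed of light / frequency:

C = 3.0e8  # speed of light in m/s

radar_wavelength_m = C / 77e9  # the common 77 GHz automotive radar band
lidar_wavelength_m = 905e-9    # a typical near-infrared lidar laser

print(f"radar: {radar_wavelength_m * 1e3:.1f} mm")  # ~3.9 mm
print(f"lidar: {lidar_wavelength_m * 1e9:.0f} nm")  # 905 nm, just past red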
Radar sensors work by matching an object’s outline to a catalog of shapes stored in the car’s software. Objects with irregular shapes can confuse the radar, yielding inaccurate readings that may cause the car to needlessly brake or swerve.
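Taking that description at face value, the lookup might work something like the toy sketch below; the stored signatures, the distance metric and the match threshold are all invented for illustration.

CATALOG = {
    "car":        [1.0, 0.8, 0.8, 1.0],  # hypothetical outline signatures,
    "pedestrian": [0.2, 0.9, 0.9, 0.2],  # e.g. widths sampled along the object
    "truck":      [1.0, 1.0, 1.0, 1.0],
}

def classify(signature, threshold=0.3):
    """Return the closest catalog shape, or None when nothing matches,
    which is how an irregular object yields an unreliable reading."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best = min(CATALOG, key=lambda name: dist(signature, CATALOG[name]))
    return best if dist(signature, CATALOG[best]) <= threshold else None

print(classify([1.0, 0.8, 0.9, 1.0]))  # "car": near a stored shape
print(classify([0.1, 1.0, 0.1, 1.0]))  # None: irregular outline, no match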
Driving on uneven surfaces can compromise the calibration of lidar and cause excessive wear on the ball bearings that stabilize the sensor atop the car. The more often a vehicle encounters these conditions, the more frequently the lidar sensor will need to be replaced.
When bugs collide with the lidar sensors or camera lenses while driving, they leave smudges that must be cleaned before the sensors can return to peak functionality.
Snow is a nuisance to cameras, as falling flakes reduce long-range visibility and snow that accumulates on or near lenses can render a camera useless. Plus, it’s hard to remove frost from a spherical lidar sensor with a flat ice scraper.
Many independent researchers, startups and well-funded players like Uber Technologies Inc. and the Cruise unit of General Motors Co. are working to resolve these challenges. Streets around Silicon Valley, Pittsburgh and suburban Phoenix, among other places, have become test tracks for many of these efforts. They’re testing their vehicles both on obstacle-free closed courses without human drivers and on public roadways with human “safety operator” backups.
States and federal regulators are still trying to determine how to best oversee the technology to ensure safety while also encouraging its development. Arizona, once a state with few rules governing self-driving car testing, took Uber off the roads in March after a fatal collision between one of the company’s test vehicles and a pedestrian near Tempe.
But regulators are still signing off on autonomous vehicle projects. Alphabet Inc.’s Waymo subsidiary tests its fully driverless cars on public roads in Chandler, Ariz., and won permission from California in October to test its autonomous vehicles without human backups in a handful of areas near Google’s headquarters in Mountain View.
“Not every sensor is perfect by itself,” says Chuck Price, vice president of product at autonomous trucking company TuSimple. Until the technology becomes road-ready for the mass market, he says, companies developing autonomous vehicles will continue to employ and test multiple types of sensors. And however flawed humans may be, they’re still the best drivers on the road.