On Saturday August 25, a Tesla Model S crashed into a stopped firetruck in San Jose, California. The two occupants inside the Tesla sustained minor injuries, and the 37-year-old driver was arrested on suspicion of driving under the influence of alcohol. According to a police report, he told authorities, “I think I had Autopilot on.” Tesla has not confirmed the semi-autonomous system was in use, but it’s at least the third time this year a Tesla has hit a stopped firetruck at highway speeds. We’ve updated this story, which originally ran on January 25, 2018, about why Autopilot and similar systems have trouble detecting stopped vehicles.
Early Saturday morning, a Tesla Model S driving south on the 101 Freeway slammed into the back of a stopped firetruck in San Jose, California, the latest in a series of crashes that highlight the shortcomings of the increasingly common semi-autonomous systems that let cars drive themselves in limited conditions. A Tesla spokesperson says the automaker has not yet received data from the vehicle (this typically takes a few days), so it can't confirm whether Autopilot was engaged, adding that Tesla is "working to establish the facts of the incident."
Whatever the particulars, there’s a serious sense of déjà vu here. In January, a Tesla Model S drove into the back of a stopped firetruck on the 405 freeway in Los Angeles County. The driver apparently told the fire department the car was in Autopilot mode at the time. In May, a Tesla driver in Utah hit a firetruck at highway speeds; she told reporters Autopilot was engaged and she was looking away from the road at the time.
So this latest surprisingly non-deadly debacle—the San Jose Tesla driver and his passenger sustained minor injuries—also raises a technical question: How is it possible that one of the most advanced driving systems on the planet doesn't see a freaking firetruck, dead ahead?
The car’s manual does warn that the system is ill-equipped to handle this exact sort of situation: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
While working a freeway accident this morning, Engine 42 was struck by a #Tesla traveling at 65 mph. The driver reports the vehicle was on autopilot. Amazingly there were no injuries! Please stay alert while driving! #abc7eyewitness #ktla #CulverCity #distracteddriving pic.twitter.com/RgEmd43tNe
— Culver City Firefighters (@CC_Firefighters) January 22, 2018
Volvo’s semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed,” Volvo’s manual reads, meaning the cruise speed the driver punched in. “The driver must then intervene and apply the brakes.” In other words, your Volvo won’t brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate towards it.
The same is true for any car currently equipped with adaptive cruise control or automated emergency braking. It sounds like a glaring flaw, the kind of horrible mistake engineers race to eliminate. Nope. These systems are designed to ignore static obstacles because otherwise they couldn't work at all.
“You always have to make a balance between braking when it’s not really needed, and not braking when it is needed,” says Erik Coelingh, head of new technologies at Zenuity, a partnership between Volvo and Autoliv formed to develop driver assistance technologies and self-driving cars. He’s talking about false positives. On the highway, slamming the brakes for no reason can be as dangerous as not stopping when you need to.
“The only safe scenario would be don’t move,” says Aaron Ames, from Caltech’s Center for Autonomous Systems and Technologies. That doesn’t exactly work for driving. “You have to make reasonable assumptions about what you care about and what you don’t.”
Raj Rajkumar, who researches autonomous driving at Carnegie Mellon University, thinks those assumptions concern one of Tesla’s key sensors. “The radars they use are apparently meant for detecting moving objects (as typically used in adaptive cruise control systems), and seem to be not very good in detecting stationary objects,” he says.
That’s not nearly as crazy as it may seem. Radar knows the speed of any object it sees, and is also simple, cheap, robust, and easy to build into a front bumper. But it also detects lots of things a car rolling down the highway needn’t worry about, like overhead highway signs, loose hubcaps, or speed limit signs. So engineers make a choice, telling the car to ignore these things and keep its eyes on the other cars on the road: They program the system to focus on the stuff that’s moving.
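The filtering logic described above can be sketched in a few lines. This is a minimal illustration, not Tesla's or any automaker's actual code; all names and the threshold value are assumptions. The key point it shows: radar measures speed relative to your own car, so a stopped vehicle ahead looks, to the tracker, exactly like a stationary road sign, and both get discarded.

```python
# Hypothetical sketch of how an adaptive-cruise tracker might discard
# radar returns from stationary objects, treating them as roadside clutter.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float        # distance to the object, in meters
    rel_speed_mps: float  # Doppler speed relative to our car, m/s (negative = closing)

STATIONARY_THRESHOLD_MPS = 1.0  # below this ground speed, assume clutter (assumed value)

def moving_targets(returns, ego_speed_mps):
    """Keep only returns whose estimated ground speed suggests a moving vehicle.

    ground speed = our speed + relative speed. A stopped firetruck closes
    at exactly -ego_speed, so its ground speed is ~0 and it is filtered
    out, just like an overhead sign.
    """
    kept = []
    for r in returns:
        ground_speed = ego_speed_mps + r.rel_speed_mps
        if abs(ground_speed) > STATIONARY_THRESHOLD_MPS:
            kept.append(r)
    return kept

# A car cruising at 29 m/s (~65 mph):
ego = 29.0
returns = [
    RadarReturn(range_m=60.0, rel_speed_mps=-4.0),    # slower car ahead, ground speed 25
    RadarReturn(range_m=80.0, rel_speed_mps=-29.0),   # stopped firetruck, ground speed 0
    RadarReturn(range_m=120.0, rel_speed_mps=-29.0),  # overhead sign, ground speed 0
]
tracked = moving_targets(returns, ego)
# Only the moving car survives; the stopped truck is discarded along with the sign.
```

The design tradeoff is exactly the one Coelingh describes: lower the threshold and the car brakes for signs and hubcaps; keep it and the car sails past a parked firetruck.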
This unsettling compromise may be better than nothing, given evidence that these systems prevent other kinds of crashes and save lives. And it wouldn't be much of a problem if every human in a semi-autonomous vehicle followed the automakers' explicit, insistent instructions to pay attention at all times and take back control if they see a stationary vehicle up ahead.
The long-term solution is to combine several sensors with different abilities, backed by more computing power. Key among them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hubcap and a cop car. The problem is that, compared to radar, lidar is a young technology. It's still very expensive, and isn't robust enough to survive a life of hitting potholes and getting pelted with rain and snow. Just about everybody working on a fully self-driving system—the kind that doesn't depend on lazy, inattentive humans for support—plans to use lidar, along with radar and cameras.
Except for Elon Musk. The Tesla CEO insists he can make his cars fully autonomous—no supervision necessary—with just radars and cameras. He hasn’t proven his claim just yet, and no one knows if he ever will. Lidar’s price and reliability problems are less of an issue when it comes to a taxi-like service, where a provider can amortize the cost over time and perform regular maintenance. But in today’s cars, meant for average or modestly wealthy consumers, it’s a no-go.
In the meantime, we're stuck with a flawed system, the result of a compromise made to navigate the world at speed. And when even the best systems available can't see a big red firetruck, it's a stark reminder of how long and winding the path to autonomy actually is.
© 2018 Condé Nast. All rights reserved.
Frank’s source: https://www.wired.com/story/tesla-autopilot-why-crash-radar/