3 crashes, 3 deaths raise questions about Tesla's Autopilot

In this July 8, 2018, file photo, clouds are reflected above the company logo on the hood of a Tesla vehicle outside a showroom in Littleton, Colo. The National Highway Traffic Safety Administration is investigating the crash of a speeding Tesla that killed two people in a Los Angeles suburb. AP

DETROIT - Three crashes involving Teslas that killed three people have increased scrutiny of the company's Autopilot driving system just months before CEO Elon Musk plans to put fully self-driving cars on the streets.

On Sunday, a Tesla Model S sedan left a freeway in Gardena, California, at a high speed, ran a red light and struck a Honda Civic, killing two people inside, police said.

On the same day, a Tesla Model 3 hit a parked firetruck on an Indiana freeway, killing a passenger in the Tesla.

And on Dec. 7, yet another Model 3 struck a police cruiser on a Connecticut highway, though no one was hurt.

The special crash investigation unit of the National Highway Traffic Safety Administration is looking into the California crash. The agency hasn't decided whether its special-crash unit will review the crash that occurred Sunday near Terre Haute, Indiana. In both cases, authorities have yet to determine whether Tesla's Autopilot system was being used.

NHTSA also is investigating the Connecticut crash, in which the driver told police that the car was operating on Autopilot, a Tesla system designed to keep a car in its lane and a safe distance from other vehicles. Autopilot also can change lanes on its own.

Tesla has said repeatedly that its Autopilot system is designed only to assist drivers, who must still pay attention and be ready to intervene at all times. The company contends that Teslas with Autopilot are safer than vehicles without it, but cautions that the system does not prevent all crashes.

Even so, experts and safety advocates say a string of Tesla crashes raises serious questions about whether drivers have become too reliant on Tesla's technology and whether the company does enough to ensure that drivers keep paying attention. Some critics have said it's past time for NHTSA to stop investigating and to take action, such as forcing Tesla to make sure drivers pay attention when the system is being used.

NHTSA has started investigations into 13 Tesla crashes dating to at least 2016 in which the agency believes Autopilot was operating. The agency has yet to issue any regulations, though it is studying how it should evaluate similar "advanced driver assist" systems.

"At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?" said Jason Levine, executive director of the nonprofit Center for Auto Safety in Washington.

"In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA."

Levine and others have called on the agency to require Tesla to limit the use of Autopilot to mainly four-lane divided highways without cross traffic. They also want Tesla to install a better system to monitor drivers to make sure they're paying attention all the time. Tesla's system requires drivers to place their hands on the steering wheel. But federal investigators have found that this system lets drivers zone out for too long.

Tesla plans to use the same cameras and radar sensors, though with a more powerful computer, in its fully self-driving vehicles. Critics question whether those cars will be able to drive themselves safely without putting other motorists in danger.

Doubts about Tesla's Autopilot system have long persisted. In September, the National Transportation Safety Board, which investigates transportation accidents, issued a report saying that a design flaw in Autopilot and driver inattention combined to cause a Tesla Model S to slam into a firetruck parked along a Los Angeles-area freeway in January 2018. The board determined that the driver was overly reliant on the system and that Autopilot's design let him disengage from driving for too long.

In addition to the deaths on Sunday night, three U.S. fatal crashes since 2016 (two in Florida and one in Silicon Valley) involved vehicles using Autopilot.

David Friedman, vice president of advocacy for Consumer Reports and a former acting NHTSA administrator, said the agency should have declared Autopilot defective and sought a recall after a 2016 crash in Florida that killed a driver. Neither Tesla's system nor the driver had braked before the car went underneath a semi-trailer that had turned in front of the car.

"We don't need any more people getting hurt for us to know that there is a problem and that Tesla and NHTSA have failed to address it," Friedman said.

In addition to NHTSA, states can regulate autonomous vehicles, though many have decided they want to encourage testing.

In the 2016 crash, NHTSA closed its investigation without seeking a recall. Friedman, who was not at NHTSA at the time, said the agency determined that the problem didn't happen frequently. But he said that argument has since been debunked.

Friedman said it's foreseeable that some drivers will not pay attention to the road while using Autopilot, which makes the system defective.

"The public is owed some explanation for the lack of action," he said. "Simply saying they're continuing to investigate - that line has worn out its usefulness and its credibility."

In a statement, NHTSA said it relies on data to make decisions, and if it finds any vehicle poses an unreasonable safety risk, "the agency will not hesitate to take action." NHTSA also has said it doesn't want to stand in the way of technology given its life-saving potential.

Messages were left Thursday seeking comment from Tesla.

Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, said it's likely that the Tesla in Sunday's California crash was operating on Autopilot, which has become confused in the past by lane lines. He speculated that the lane line was more visible for the exit ramp, so the car took the ramp because it looked like a freeway lane. He also suggested that the driver might not have been paying close attention.

"No normal human being would not slow down in an exit lane," he said.

In April, Musk said he expected to start converting the company's electric cars to fully self-driving vehicles in 2020 to create a network of robotic taxis to compete against Uber and other ride-hailing services.

At the time, experts said that the technology wasn't ready and that Tesla's camera and radar sensors weren't good enough for a self-driving system. Rajkumar and others say additional crashes have proved that to be true.

Many experts say they're not aware of fatal crashes involving similar driver-assist systems from General Motors, Mercedes and other automakers. GM monitors drivers with cameras and will shut down the driving system if they don't watch the road.

"Tesla is nowhere close to that standard," Rajkumar said.

He predicted more deaths involving Teslas if NHTSA fails to take action.

"This is very unfortunate," he said. "Just tragic."
