California prosecutors have filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot who ran a red light, slammed into another car and killed two people in 2019.
The defendant appears to be the first person to be charged with a felony in the United States for a fatal crash involving a motorist who was using a partially automated driving system. Los Angeles County prosecutors filed the charges in October, but they came to light only last week.
The driver, Kevin George Aziz Riad, 27, has pleaded not guilty. Riad, a limousine service driver, is free on bail while the case is pending.
What does the charge mean for automated driving systems?
The misuse of Autopilot, which can control steering, speed and braking, has occurred on numerous occasions and is the subject of investigations by two federal agencies. The filing of charges in the California crash could serve notice to drivers who use systems like Autopilot that they cannot rely on them to control their vehicles.
The criminal charges aren’t the first involving an automated driving system, but they are the first to involve a widely used driver-assistance technology.
Authorities in Arizona filed a charge of negligent homicide in 2020 against a driver Uber had hired to take part in the testing of a fully autonomous vehicle on public roads. The Uber vehicle, an SUV with a human backup driver on board, struck and killed a pedestrian.
By contrast, Autopilot and other driver-assist systems are widely used on roads across the world. An estimated 765,000 Tesla vehicles are equipped with it in the United States alone.
What caused the crash?
In the Tesla crash, police said, a Model S was moving at high speed on December 29, 2019, when it left a freeway in the Los Angeles suburb of Gardena, ran a red light and struck a Honda Civic at an intersection.
Two people who were in the Civic, Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez, died at the scene. Riad and a woman in the Tesla were hospitalised with non-life-threatening injuries.
Criminal charging documents do not mention Autopilot. But the National Highway Traffic Safety Administration, which sent investigators to the crash, confirmed last week that Autopilot was in use in the Tesla at the time of the crash.
Riad’s defense attorney did not respond to requests for comment last week, and the Los Angeles County District Attorney’s Office declined to discuss the case. Riad’s preliminary hearing is scheduled for February 23.
What is the history of crashes involving cars on Autopilot?
NHTSA and the National Transportation Safety Board have been reviewing the widespread misuse of Autopilot by drivers, whose overconfidence and inattention have been blamed for multiple crashes, including fatal ones. In one crash report, the NTSB referred to its misuse as “automation complacency.”
The agency said that in a 2018 crash in Culver City, California, in which a Tesla hit a firetruck, the design of the Autopilot system had “permitted the driver to disengage from the driving task.” No one was hurt in that crash. Last May, a California man was arrested after officers noticed his Tesla moving down a freeway with the man in the back seat and no one behind the steering wheel.
Separately, NHTSA is investigating a dozen crashes in which Teslas on Autopilot ran into parked emergency vehicles. In the crashes under investigation, at least 17 people were injured and one person was killed.
How have the families of Mr Lopez and Ms Nieves-Lopez reacted to the crash?
The families of Mr Lopez and Ms Nieves-Lopez have sued Tesla and Riad in separate lawsuits. They have alleged negligence by Riad and have accused Tesla of selling defective vehicles that can accelerate suddenly and that lack an effective automatic emergency braking system. A joint trial is scheduled for mid-2023.
Mr Lopez’s family, in court documents, alleges that the car “suddenly and unintentionally accelerated to an excessive, unsafe and uncontrollable speed.” Ms Nieves-Lopez’s family further asserts that Riad was an unsafe driver, with multiple moving infractions on his record, and couldn’t handle the high-performance Tesla.
How has Tesla responded to the charge?
Messages seeking comment have been left with Tesla, which has disbanded its media relations department. Since the Autopilot crashes began, Tesla has updated its software to try to make the system harder for drivers to abuse. It has also tried to improve Autopilot’s ability to detect emergency vehicles.
The company has said that Autopilot and a more sophisticated “Full Self-Driving” system cannot drive themselves and that drivers must pay attention and be ready to react at any time. “Full Self-Driving” is being tested by hundreds of Tesla owners on public roads in the US.