On Dec. 29, 2019, a Honda Civic pulled up to the intersection of Artesia Boulevard and Vermont Avenue in Gardena. It was just after midnight. The traffic light was green.
As the car proceeded through the intersection, a 2016 Tesla Model S on Autopilot exited a freeway, ran through a red light and crashed into the Civic. The Civic’s driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.
Nearly two years later, prosecutors in Los Angeles County filed two counts of vehicular manslaughter against the driver of the Tesla, 27-year-old Kevin George Aziz Riad. Experts believe it is the first felony prosecution in the United States of a driver who caused a fatality while using a partially automated driver-assist system.
As such, the case represents a milestone in the increasingly confusing world of automated driving.
“It’s a wake-up call for drivers,” said Alain Kornhauser, director of the self-driving car program at Princeton University. “It certainly makes us, all of a sudden, not become so complacent in the use of these things that we forget about the fact that we’re the ones that are responsible — not only for our own safety but for the safety of others.”
While automated capabilities are intended to assist drivers, systems with names like Autopilot, SuperCruise and ProPilot can mislead consumers into believing the cars are capable of much more than they really are, Kornhauser said.
Yet even as fully autonomous cars are being tested on public roads, automakers, technology companies, organizations that set engineering standards, regulators and legislators have failed to make clear to the public — and in some cases one another — what the technical differences are, or who is subject to legal liability when people are injured or killed.
Riad, a limousine service driver, has pleaded not guilty and is free on bail while the case is pending. His attorney did not respond to a request for comment Tuesday.
Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’” Kornhauser said. “Just like when I was driving a ’55 Chevy — I’m the one that’s responsible for making sure that it stays between the white lines.”
Many legal experts are clear that liability for crashes involving Level 2 systems like Autopilot lies squarely with the driver — not with companies that market technologies in ways that may lead consumers to believe the features are more capable than they are.
But the California Department of Motor Vehicles is struggling with confusion over Tesla’s Full Self-Driving feature, a cutting-edge version of Autopilot intended to eventually do just what the name says: provide full autonomy, to the point where no human at all is needed to drive.
But while other autonomous car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its testing using its own customers, charging car owners $12,000 for the privilege.
And while the other autonomous technology companies are required to report crashes and system failures to the Department of Motor Vehicles under its test-permit system, the agency has been allowing Tesla to opt out of those regulations.
After pressure from state legislators, prompted by scary videos on YouTube and Twitter pointing out Full Self-Driving’s poor performance, the DMV earlier this month said it is “revisiting” its stance on the Tesla technology.
The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving systems — one that bars companies from marketing their cars as autonomous when they are not.
That review began eight months ago; the DMV described it in an email to The Times as “ongoing.”
Amid the confusion over automated cars, what is far less cloudy are the real tragedies that result from crashes.
In 2020, authorities in Arizona filed negligent homicide charges against the driver of an Uber SUV that struck and killed a pedestrian during a test of fully autonomous capabilities. The victim of that collision, Elaine Herzberg, is believed to be the first fatality from a self-driving vehicle.
In Los Angeles, the families of Lopez and Nieves-Lopez have filed lawsuits against Riad and Tesla.
Arsen Sarapinian, an attorney for the Nieves family, said Tuesday that they are closely monitoring the criminal case, awaiting the results of the National Highway Traffic Safety Administration’s investigative report and hoping for justice.
But, Sarapinian said, “neither the pending criminal case nor the civil lawsuit will bring back Ms. Nieves or Mr. Lopez.”
In theory, identifying and avoiding stationary objects marked off by hazard cones or flashing lights ought to be one of the easiest challenges for any autonomous-driving or driver-assist system.
Yet at least 11 times over the last seven years, cars made by Tesla Inc. and running its software have failed this test, slamming into emergency vehicles that were parked on roads and highways. Now the National Highway Traffic Safety Administration wants to know why.
A federal investigation announced Monday involves Tesla cars built between 2014 and 2021, including models S, X, 3 and Y. If the probe results in a recall, as many as 765,000 vehicles could be affected.
The 11 crashes at issue resulted in 17 injuries and one death. Three took place in Southern California.
The new investigation indicates that the safety agency, under President Biden and Transportation Secretary Pete Buttigieg, is paying more attention to automated driving safety than it did under the more laissez-faire Trump administration. In June, the NHTSA ordered automobile manufacturers, including Tesla, to forward data on crashes involving automated systems to the agency.
It’s about time, Kornhauser said. “Teslas are running into stationary objects,” he said. “They shouldn’t be.”
Tesla is also under review by the California Department of Motor Vehicles for its marketing of “Full Self-Driving” technology. That’s a significant enhancement to Autopilot that allows the car to be driven on city streets, with the claimed ability to handle traffic signals and make turns at intersections. The feature costs $10,000, which includes future enhancements, but Tesla has noted that its Full Self-Driving does not make the car self-driving. DMV regulations prevent auto manufacturers from making false claims about automated driving capabilities.
The Times has repeatedly asked to interview DMV officials to clarify the agency’s stance. Those requests have been repeatedly declined.
For all the rush toward self-driving vehicles, the technology isn’t there yet.