Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

1
On Dec. 29, 2019, a Honda Civic pulled up to the intersection of Artesia Boulevard and Vermont Avenue in Gardena. It was just after midnight. The traffic light was green.

As the car proceeded through the intersection, a 2016 Tesla Model S on Autopilot exited a freeway, ran through a red light and crashed into the Civic. The Civic’s driver, Gilberto Alcazar Lopez, and his passenger, Maria Guadalupe Nieves-Lopez, were killed instantly.

Nearly two years later, prosecutors in Los Angeles County filed two counts of vehicular manslaughter against the driver of the Tesla, 27-year-old Kevin George Aziz Riad. Experts believe it is the first felony prosecution in the United States of a driver who caused a fatality while using a partially automated driver-assist system.

As such, the case represents a milestone in the increasingly confusing world of automated driving.

“It’s a wake-up call for drivers,” said Alain Kornhauser, director of the self-driving car program at Princeton University. “It certainly makes us, all of a sudden, not become so complacent in the use of these things that we forget about the fact that we’re the ones that are responsible — not only for our own safety but for the safety of others.”

While automated capabilities are intended to assist drivers, systems with names like Autopilot, SuperCruise and ProPilot can mislead consumers into believing the cars are capable of much more than they really are, Kornhauser said.

Yet even as fully autonomous cars are being tested on public roads, automakers, technology companies, organizations that set engineering standards, regulators and legislators have failed to make clear to the public — and in some cases one another — what the technical differences are, or who is subject to legal liability when people are injured or killed.

Riad, a limousine service driver, has pleaded not guilty and is free on bail while the case is pending. His attorney did not respond to a request for comment Tuesday.

Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’” Kornhauser said. “Just like when I was driving a ’55 Chevy — I’m the one that’s responsible for making sure that it stays between the white lines.”

Many legal experts are clear that liability for Level 2 systems like Autopilot lies squarely with the driver, not with companies that market technologies in ways that may lead consumers to believe the features are more capable than they are.

But the California Department of Motor Vehicles is struggling with confusion over Tesla’s Full Self-Driving feature, a cutting-edge version of Autopilot intended to eventually do just what the name says: provide full autonomy, to the point where no human at all is needed to drive.

But while other autonomous car developers, such as Waymo and Argo, use trained test drivers who follow strict safety rules, Tesla is conducting its testing using its own customers, charging car owners $12,000 for the privilege.

And while the other autonomous technology companies are required to report crashes and system failures to the Department of Motor Vehicles under its test-permit system, the agency has been allowing Tesla to opt out of those regulations.

After pressure from state legislators, prompted by scary videos on YouTube and Twitter pointing out Full Self-Driving’s poor performance, the DMV earlier this month said it is “revisiting” its stance on the Tesla technology.

The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving systems — one that bars companies from marketing their cars as autonomous when they are not.

That review began eight months ago; the DMV described it in an email to The Times as “ongoing.”

Amid the confusion over automated cars, what is less cloudy are the real tragedies that result from accidents.

In 2020, authorities in Arizona filed negligent homicide charges against the driver of an Uber SUV that struck and killed a pedestrian during a test of fully autonomous capabilities. The victim of that collision, Elaine Herzberg, is believed to be the first fatality from a self-driving vehicle.

In Los Angeles, the families of Lopez and Nieves-Lopez have filed lawsuits against Riad and Tesla.

Arsen Sarapinian, an attorney for the Nieves family, said Tuesday that they are closely monitoring the criminal case, awaiting the results of NHTSA’s investigative report and hoping for justice.

But, Sarapinian said, “neither the pending criminal case nor the civil lawsuit will bring back Ms. Nieves or Mr. Lopez.”
https://www.latimes.com/california/stor ... nslaughter

In theory, identifying and avoiding stationary objects marked off by hazard cones or flashing lights ought to be one of the easiest challenges for any autonomous-driving or driver-assist system.

Yet at least 11 times over the last seven years, cars made by Tesla Inc. and running its software have failed this test, slamming into emergency vehicles that were parked on roads and highways. Now the National Highway Traffic Safety Administration wants to know why.

A federal investigation announced Monday involves Tesla cars built between 2014 and 2021, including models S, X, 3 and Y. If the probe results in a recall, as many as 765,000 vehicles could be affected.

The 11 crashes at issue resulted in 17 injuries and one death. Three took place in Southern California.

The new investigation indicates that the safety agency, under President Biden and Transportation Secretary Pete Buttigieg, is paying more attention to automated-driving safety than it did under the more laissez-faire Trump administration. In June, the NHTSA ordered automobile manufacturers, including Tesla, to forward data on crashes involving automated systems to the agency.

It’s about time, said Alain Kornhauser, director of the self-driving car program at Princeton University. “Teslas are running into stationary objects,” he said. “They shouldn’t be.”

Tesla is also under review by the California Department of Motor Vehicles for its marketing of “Full Self-Driving” technology. That is a significant enhancement to Autopilot that allows the car to be driven on city streets, with the claimed ability to handle traffic signals and make turns at intersections. The feature costs $10,000, which includes future enhancements, but Tesla has noted that Full Self-Driving does not make the car self-driving. DMV regulations prevent auto manufacturers from making false claims about automated-driving capabilities.

The Times has repeatedly asked to interview DMV officials to clarify the agency’s stance. Those requests have been repeatedly declined.
https://www.latimes.com/world-nation/st ... lot-system

In the rush to self-driving vehicles, the technology isn't there yet.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

4
Even when I have my Toyota RAV4 on cruise control, I am still in control. I look at these "self-driving" cars the same way pilots look at flying on autopilot: they are still responsible for what the plane does and must monitor its actions. I frequently see people texting or doing other things while going down the road, not paying attention to driving or to what is going on around them. Maybe if Elon Musk were held liable for the accidents created by a faulty product and its advertising, it would change the way people see Tesla vehicles.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

5
sig230 wrote: Wed Jan 19, 2022 11:25 am The sad fact is that with all their current failings, self-driving vehicles are still better than the average driver.
I totally agree. I’ve been watching a lot of stupid car accidents on YouTube, and there’s no doubt in my mind that computers do better in most cases. An autopilot is not perfect, but it doesn’t get tired, angry, or distracted by texting. If all cars were driven by robots, the fatal accident rate would drop tremendously.
Glad that federal government is boring again.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

7
Wino wrote: Wed Jan 19, 2022 8:21 pm I've had three separate computers go tits up in the past 3-4 weeks! Yeah, I'm gonna trust some car's autopilot?? Don't think so. :oops2: :thumbsdown: :roflmao:
I dunno, I have used many computers over the years, personally and professionally, and none have died on me so far. One failed to boot after I left it off gathering dust for years, but that’s probably me not keeping it in a climate-controlled room.

Everything is relative. Computers are not perfect, but humans directly caused over 40,000 deaths on the road in the U.S. in 2020. That’s more than 100 deaths per day, and those are just the deaths; injuries and permanent disabilities are separate numbers.
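The per-day figure in this post is easy to sanity-check. A minimal sketch, assuming only the round 40,000-deaths figure the post itself cites:

```python
# Back-of-the-envelope check of the fatality rate cited in the post.
# Assumption: the round figure of 40,000 U.S. road deaths in 2020.
ANNUAL_ROAD_DEATHS = 40_000
DAYS_PER_YEAR = 365

deaths_per_day = ANNUAL_ROAD_DEATHS / DAYS_PER_YEAR
print(f"{deaths_per_day:.1f} deaths per day")  # roughly 109.6, i.e. "more than 100 a day"
```

So the "more than 100 deaths per day" claim holds up against the annual total quoted.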
Glad that federal government is boring again.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

8
It's all sensors and software, and I expect there will be malfunctions. In biology, viruses occur naturally, but in the computer world humans create viruses and propagate them, along with malware and ransomware. I doubt autonomous vehicles will be exempt unless they aren't connected to the internet, but they'll need updates, and manufacturers will want a lot of data from their vehicles.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

10
sikacz wrote: Thu Jan 20, 2022 10:55 pm Machines break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
Humans break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
To be vintage it must be older than me!
The next gun I buy will be the next to last gun I ever buy. PROMISE!
jim

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

11
sig230 wrote: Fri Jan 21, 2022 8:14 am
sikacz wrote: Thu Jan 20, 2022 10:55 pm Machines break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
Humans break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
Didn’t mean we don’t. My point is that machines don’t guarantee safety and are not necessarily a replacement for human judgment at this point. Both can break down, and both can complement the other, so automation in vehicles shouldn’t come without a human capacity to override. The complexity of machines guarantees that there will be unexpected failures sooner or later. I seem to recall some planes going down and killing everyone on board when automation features failed or people failed to correct them. What I’m saying is that automation in vehicles is not going to eliminate accidents or tragic miscalculations. These systems will only be as fault-proof as their designers can make them. At this point, machines are still designed by us humans. Perhaps at some future point machines will completely design and fix themselves; likewise, we will design machines to fix our flaws, and I’m not sure where that will end up. The future is not here yet, and I’d caution against being too enthusiastic about automation, especially in vehicles, where they interact with more unknowns than in a factory or lab setting.

"Resistance is futile. You will be assimilated!" Loquacious of many. Texas Chapter Chief Cat Herder.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

12
sikacz wrote: Fri Jan 21, 2022 8:51 am
sig230 wrote: Fri Jan 21, 2022 8:14 am
sikacz wrote: Thu Jan 20, 2022 10:55 pm Machines break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
Humans break down and malfunction all the time. A time when they won’t is in a utopian future. At this time it doesn’t exist.
Didn’t mean we don’t. My point is that machines don’t guarantee safety and are not necessarily a replacement for human judgment at this point. Both can break down, and both can complement the other, so automation in vehicles shouldn’t come without a human capacity to override. The complexity of machines guarantees that there will be unexpected failures sooner or later. I seem to recall some planes going down and killing everyone on board when automation features failed or people failed to correct them. What I’m saying is that automation in vehicles is not going to eliminate accidents or tragic miscalculations. These systems will only be as fault-proof as their designers can make them. At this point, machines are still designed by us humans. Perhaps at some future point machines will completely design and fix themselves; likewise, we will design machines to fix our flaws, and I’m not sure where that will end up. The future is not here yet, and I’d caution against being too enthusiastic about automation, especially in vehicles, where they interact with more unknowns than in a factory or lab setting.
All true BUT...

We can look at a lot of evidence that shows design of the infrastructure and vehicles increases safety.

We have improved road safety by changing highway design and construction, improving the design of rails and barriers, adding safety devices like crumple zones and collapsible steering columns and window construction and location of mirrors and materials used in dashboards and head restraints and air bags and seat belt systems and placement and shape of controls and improved braking systems and lighting and exhaust emissions and fuel contents and ... the lists go on and on.

We can also look at the human component, and there we see not just little improvement but rather a major regression, with people talking on phones and watching movies and texting. And when we do look at the causes of injury and death, mechanical failures are by far the smallest number.

The biggest cause of injury and deaths regarding motor vehicles is ... the driver.
To be vintage it must be older than me!
The next gun I buy will be the next to last gun I ever buy. PROMISE!
jim

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

13
Yes, but change is incremental, and automation by itself is not going to eliminate failure. I’m advocating caution, not saying development shouldn’t continue. My worry with “smart cars” is the same as with “smart guns”: both are limited by our development of the technology, and if we are too optimistic, some crafty legislators will mandate their use before they become truly viable. Seat belts, air bags, and other safety features did not remove our input into the vehicle; fully automated vehicles, if mandated and without human oversight, would. The question of liability, as noted, would become clouded at best.

"Resistance is futile. You will be assimilated!" Loquacious of many. Texas Chapter Chief Cat Herder.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

14
Sobering statistics:
https://www.cdc.gov/vitalsigns/motor-ve ... index.html
Major risk factors for crash deaths in the US.
Not using seat belts, car seats, and booster seats contributed to over 9,500 crash deaths.
Drunk driving contributed to more than 10,000 crash deaths.
Speeding contributed to more than 9,500 crash deaths.
Reducing major risk factors could save thousands of lives and hundreds of millions of dollars in direct medical costs each year.

Seat belts saved over 12,500 lives in the US in 2013, yet:
The US had lower-than-average front and back seat belt use compared with other high-income countries.
About half of drivers or passengers who died in crashes in the US weren’t buckled up.
Clearly technology doesn’t guarantee a fix, nor do laws. Seat belts are a pretty rudimentary device and are required by law, yet people don’t wear them. Speeding is against the law, and people do it. Drunk driving is against the law as well. All of those deaths could’ve been reduced by wearing a belt, not driving drunk, and staying within the speed limits. No new technology needed, just people doing the right thing.

Yes, people are the main cause of vehicular accidents, and often the people killed are not even in cars: pedestrians and bicyclists, for example. Could automation also reduce these types of accidents? Some, yes, but the rates would also drop if people just obeyed existing laws. As with deaths related to gun use, if suicides are removed the numbers are significantly lower; if deaths from failure to obey laws were factored out, vehicular death numbers would be significantly lower as well. The takeaway for me is that addressing the underlying causes reduces the numbers significantly in both cases and does not require new technology. Work on new technology, of course, but just like any technology, humans will ignore it or find a way to circumvent it. Technology by itself is not a solution without addressing the human component.

32,000 deaths minus 10,000, minus 9,500, minus 9,500 brings vehicular deaths down to 3,000 per year just by obeying laws. Considering the number of vehicles on the road, that is a pretty low number. I suppose someone could extrapolate from the number of autonomous vehicles on the road and work out the deaths in comparison. There were 284 million vehicles on our roads last year, so 3,000 deaths would be a small fraction of a percent.
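The subtraction above can be written out explicitly. A rough sketch using only the figures quoted in this thread; note it carries over the post's simplifying assumption that the three risk factors don't overlap, when in reality a drunk, speeding, unbelted driver would be counted in all three:

```python
# Reproduces the poster's back-of-the-envelope arithmetic.
# Assumption (the poster's): the three risk factors are disjoint,
# which overstates the achievable reduction.
total_deaths = 32_000
drunk = 10_000
unbelted = 9_500
speeding = 9_500

remaining = total_deaths - drunk - unbelted - speeding
print(remaining)  # 3000 deaths per year under that assumption

# As a share of the 284 million registered vehicles cited in the post:
vehicles = 284_000_000
print(f"{remaining / vehicles:.6%}")  # about 0.001% of vehicles per year
```

Even taking the subtraction at face value, the result is a lower bound on remaining deaths, since the overlapping categories are each subtracted in full.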
Last edited by sikacz on Fri Jan 21, 2022 11:25 am, edited 1 time in total.

"Resistance is futile. You will be assimilated!" Loquacious of many. Texas Chapter Chief Cat Herder.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

15
sikacz wrote: Fri Jan 21, 2022 10:27 am Technology by itself is not a solution without addressing the human component.
Kinda-sorta.

Technology by itself can be a solution without addressing the human component.

Road design and changes to barriers and rails and crumple zones and collapsible steering columns and dashboard padding and window material changes and head restraints all work despite the human component. Taking the lead out of gasoline and adding exhaust converters and electronic fuel-injection monitoring all work despite the human component.

The goal is to find ways that work despite the human component and automation can provide just such a solution.

Despite the airline crashes mentioned above, automated flying and landing of planes has overall proven beyond a shadow of a doubt to save lives.

There is no reason to think the same won't be true on the highways.
To be vintage it must be older than me!
The next gun I buy will be the next to last gun I ever buy. PROMISE!
jim

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

16
Considering there are only about 1,400 autonomous vehicles on our roads, even one pedestrian death makes the per-vehicle death rate for autonomous vehicles greater than for manned vehicles. Pedestrians have been killed by autonomous vehicles, so this is a plausible scenario. Automation promises potential safety gains, but it is not yet there.
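That per-vehicle comparison can be made concrete with the numbers quoted earlier in the thread. A rough sketch, assuming the ~1,400 autonomous-vehicle fleet above, the 284-million-vehicle and 40,000-death figures from prior posts, and ignoring differences in miles driven per vehicle:

```python
# Rough per-vehicle fatality-rate comparison using figures quoted in the thread.
# Assumptions: 1,400 autonomous vehicles; 284 million conventional vehicles;
# 40,000 annual road deaths attributed to the conventional fleet.
AV_FLEET = 1_400
CONVENTIONAL_FLEET = 284_000_000
CONVENTIONAL_DEATHS = 40_000

conventional_rate = CONVENTIONAL_DEATHS / CONVENTIONAL_FLEET  # ~1.4e-4 deaths/vehicle/year
av_rate_one_death = 1 / AV_FLEET                              # ~7.1e-4 deaths/vehicle/year

# A single fatality in so small a fleet already yields the higher per-vehicle rate:
print(av_rate_one_death / conventional_rate)  # roughly 5x the conventional rate
```

This is exactly the small-denominator effect the post describes: with only 1,400 vehicles, one death dominates the rate, so per-vehicle comparisons at this fleet size say little either way.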

"Resistance is futile. You will be assimilated!" Loquacious of many. Texas Chapter Chief Cat Herder.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

17
It might be an age thing, but I’m not trusting a computer to drive my car at a speed that will kill me while I read, etc. I’ve had more than my share of blue screens of death. Lol. The navigation/radio in my 2019 Ram even periodically has a brain fart and resets itself while I’m driving, and I’ve been in Subarus where the lane-following technology literally just cuts out on a stretch of my country road with a 55 mph speed limit.

As far as planes go, unless something has changed in the last few years, the GPS signal is deliberately degraded to be a few feet off for us lowly civilians, and the government can just turn the whole system off without notice (they’ve done it as a test and caught planes in dead zones before). I would guess that cameras can pick up the slack. But in the winter, with the lanes covered in snow while you’re a few feet across from someone barreling down the other lane, and both of you have salted-up windshields? Nope, not yet.

Don’t get me wrong. I think the whole auto-stop so you don’t hit things or the blind spot cameras are awesome, if not strictly necessary. It’s just the whole self-driving thing. And get off of my lawn. ☺️

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

18
sig230 wrote: Fri Jan 21, 2022 9:40 am We can look at a lot of evidence that shows design of the infrastructure and vehicles increases safety.

We have improved road safety by changing highway design and construction, improving the design of rails and barriers, adding safety devices like crumple zones and collapsible steering columns and window construction and location of mirrors and materials used in dashboards and head restraints and air bags and seat belt systems and placement and shape of controls and improved braking systems and lighting and exhaust emissions and fuel contents and ... the lists go on and on.

We can also look at the human component, and there we see not just little improvement but rather a major regression, with people talking on phones and watching movies and texting. And when we do look at the causes of injury and death, mechanical failures are by far the smallest number.

The biggest cause of injury and deaths regarding motor vehicles is ... the driver.
These are things that have helped make the roads safer. But sig230 was correct that the biggest cause of accidents is the driver.

When my Dad was in the Air Force he served on many Accident Review Boards, for both air and ground accidents. He said it was amazing that a pilot would never think about flying a jet without being strapped in and wearing his helmet, but give him two drinks at the Officers Club and he thought he could hop in his sports car with no seatbelt buckled and drive like Mario Andretti until he hit a light pole.

As for the changes in road design, I think they are great if they help prevent accidents. On a neighborhood board there is a long discussion about the roundabout that was recently installed on one of the roads out of my neighborhood. It is amazing how many drivers don't understand how to drive through a roundabout. They also fuss and fume about how it is going to cause more wrecks and how roundabouts should be outlawed as a foreign monstrosity forced on the good people of Texas, even when we show them how to approach and drive one, and that statistics show it reduces accidents and increases traffic flow. But these are the same ones that run stop signs and red lights without any regard for other drivers.

As for the human component, in which many have a major malfunction between the ears, we can look at the BNSF railroad. They have personnel on most trains, but all BNSF trains are monitored in continuous real time, with both video and computer data, from their headquarters center in north Fort Worth. They are not only monitored; controllers there can take over and operate the trains remotely. Think of it as air traffic control with additions. Their rail yards in north Fort Worth/Saginaw have big signs up where the roads cross the tracks: "There is no human aboard the trains moving in the rail yard." They are all controlled from the rail yard tower. These advancements have increased safety and decreased accidents throughout the BNSF system, and many have been adopted by other major railroads.

We could have the safest roads in the world if we adapted some of these ideas to the road system. The idea of autonomous trucks is already being tested, along with driverless cars for public transportation.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

19
My point is automation has potential, but is not there yet. Article published in the American Journal of Preventive Medicine:
https://www.sciencedirect.com/science/a ... 9718320932
Conclusions
Although technologies are being developed for automated vehicles to successfully detect pedestrians in advance of most fatal collisions, the current costs and operating conditions of those technologies substantially decrease the potential for automated vehicles to radically reduce pedestrian fatalities in the short term.
As I said in some utopian future, I doubt I’ll live to see it. Who knows, I could be wrong.

"Resistance is futile. You will be assimilated!" Loquacious of many. Texas Chapter Chief Cat Herder.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

20
I agree. It is interesting how many people react to even the safety devices that are on cars today. I have a 2019 Toyota RAV4 XSE with all the bells and whistles: lane-departure alert, backup alert sound, anti-collision auto braking, hybrid slow-speed alert, etc. On the RAV4 boards, all you read is how to turn these alerts off, even the ones required by the DOT.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

21
sikacz wrote: Fri Jan 21, 2022 4:35 pm My point is automation has potential, but is not there yet. Article published in the American Journal of Preventive Medicine:
https://www.sciencedirect.com/science/a ... 9718320932
Conclusions
Although technologies are being developed for automated vehicles to successfully detect pedestrians in advance of most fatal collisions, the current costs and operating conditions of those technologies substantially decrease the potential for automated vehicles to radically reduce pedestrian fatalities in the short term.
As I said in some utopian future, I doubt I’ll live to see it. Who knows, I could be wrong.
I agree, technology has a long way to go. I've had rental vehicles with collision avoidance and other "safety" features, and I found them annoying. I may be old, but I'm a good driver, and I don't need annoying bells and whistles going off to pay attention to the road. I deliberately didn't get them when I purchased my last vehicle. A lot of them were developed for high-end vehicles like MB and BMW and were later adopted by other manufacturers. I do like lane-drift alerts, though; not all lanes are the same width, especially in road construction areas or if a vehicle is out of alignment.
"Everyone is entitled to their own opinion, but not their own facts." - Daniel Patrick Moynihan

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

22
The only path to some utopian future is through the present.

The evidence that technology is capable of building that future is overwhelming. Many planes today, particularly military fighters, simply could not fly if the pilot were required to make the adjustments needed to keep the plane in the air. Automation has made factory floors safer, is making great strides on railways and highways, and is essential if we are to make roads safer. Automation does not mean that drivers should not be held responsible.

The problem, as is so often the case, is the total lack of any form of oversight when it comes to advertising. The current crop of TV ads touting "Self-Driving" features borders on criminal, IMHO, and can certainly be said to create an attractive nuisance and threat.

But the root cause is that we fail to actually enforce the existing highway safety laws.

That should be the first tack, pursued concurrently.
To be vintage it must be older than me!
The next gun I buy will be the next to last gun I ever buy. PROMISE!
jim

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

23
Technology is not a cure-all. It has never been, and it will never be. I’m not advocating a magical solution, just a statistically significant improvement in safety.

There is overwhelming evidence that removing complete dependency on humans increases safety. Machines don’t get distracted, angry, or drunk.

I’m open to having a self-driving car. It will make long-distance driving safer and less tiring. I would still not trust it 100% and sleep while it runs, but I could choose to watch a movie or read while glancing at the road, ready to take over at any second.

More importantly, there is no law that mandates the use of self-driving technology and bans conventional cars. This is not the same situation as with smart guns.
Glad that federal government is boring again.

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

24
Stiff wrote: Sat Jan 22, 2022 4:25 pm Technology is not a cure-all. It has never been, and it will never be. I’m not advocating a magical solution, just a statistically significant improvement in safety.

There is overwhelming evidence that removing complete dependency on humans increases safety. Machines don’t get distracted, angry, or drunk.

I’m open to having a self-driving car. It will make long-distance driving safer and less tiring. I would still not trust it 100% and sleep while it runs, but I could choose to watch a movie or read while glancing at the road, ready to take over at any second.
It is the seconds, when reading or watching a movie, when you are distracted from what is happening around you, that can cause the accident.
Facts do not cease to exist because they are ignored.-Huxley
"We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can't have both." ~ Louis Brandeis,

Re: Driver using Tesla's "Autopilot" is charged after an accident with two fatalities.

25
TrueTexan wrote: Sat Jan 22, 2022 4:40 pm
Stiff wrote: Sat Jan 22, 2022 4:25 pm Technology is not a cure-all. It has never been, and it will never be. I’m not advocating a magical solution, just a statistically significant improvement in safety.

There is overwhelming evidence that removing complete dependency on humans increases safety. Machines don’t get distracted, angry, or drunk.

I’m open to having a self-driving car. It will make long-distance driving safer and less tiring. I would still not trust it 100% and sleep while it runs, but I could choose to watch a movie or read while glancing at the road, ready to take over at any second.
It is the seconds, when reading or watching a movie, when you are distracted from what is happening around you, that can cause the accident.
And reading or watching a movie or playing hand slap games while behind the steering wheel of a motor vehicle should be treated just like texting or using a cell phone while driving.

The car should be stopped, the driver's license impounded, and locks put on the vehicle wheels until it can be towed away. The driver and occupants should be allowed to use the cell phone to call a cab or uber or a friend, but that car should be off the road until after a trial.
To be vintage it must be older than me!
The next gun I buy will be the next to last gun I ever buy. PROMISE!
jim
