+88 I think self-driving cars of high autonomy, like Level 3 and above, are too dangerous for the streets. amirite?

by Lanky_Serve_5985 1 day ago

They don't have to be safe, just safer than people. Which they already are.

by Anonymous 1 day ago

I would say they need to be significantly safer to be universally accepted, and even then it'll have serious detractors.

by oconnellanabel 1 day ago

They are though? Back when Waymo was still part of Google, the self-driving cars had an incident rate in like the 99.99th percentile of all drivers, with their claim being that all the incidents they had were caused by other drivers or by getting hit while parked.

by Anonymous 1 day ago

Lol, they have no idea how to do roundabouts. They think parked cars are travelling at you and slam the brakes on to avoid imaginary motorbikes, which isn't the best when travelling down the motorway. They also have a massive panic on narrow roads, can't read speed limits when they're slightly obscured, seem to aim for potholes, and don't brake for speed bumps when the paint is faded.

by Christygutkowsk 1 day ago

Not in all ways. Do you support lethal autonomous weapons? That a company or the government should program the value of a life into a machine?

by Lanky_Serve_5985 1 day ago

It's the same situation, an AI being programmed with the value of a life.

by Lanky_Serve_5985 1 day ago

Wow, what a red herring

by Life-Wishbone 1 day ago

That's not a red herring. You probably mean false equivalence.

by Anonymous 1 day ago

Appreciate it! I figured there was something more accurate

by Life-Wishbone 1 day ago

You're welcome.

by Anonymous 1 day ago

It's the same situation, a machine choosing who deserves to die

by Lanky_Serve_5985 1 day ago

I'd prefer the machine killing 1 in every 1000 drivers than humans killing 1 in every 50.

by Disastrous-Big 1 day ago

A self-driving car = autonomous killing war machines. The state of human intelligence, everybody. Sad.

by Wild-Performer 1 day ago

In what scenario is an AI car deciding who deserves to die?

by Anonymous 1 day ago

Swerving or slowing down when someone is in front of it. It's happened to me multiple times.

by Lanky_Serve_5985 1 day ago

How is an AI's decision in a real-life trolley problem any different from a real person's?

by Anonymous 1 day ago

You've killed multiple people???

by Christygutkowsk 1 day ago

They are probably thinking of some rare scenario where the car has to choose between hitting a person or swerving into a car or another person.

by Life-Wishbone 1 day ago

And OP thinks that's analogous to using AI drones?

by Anonymous 1 day ago

I can't wait until most cars on the street are self driving autonomous vehicles. I'd rather have them than texting, drunk, incompetent drivers on the road.

by adrien50 1 day ago

When you live in a society, you need to zoom out of hypotheticals and look at data. If autonomous vehicles cause fewer deaths overall, that's a net win for society and something that should be encouraged. Yes, some people may still die, but statistically fewer than would have otherwise, and if the victim is someone you know, your emotions will override common sense anyway. There's no way to win if we think like that.
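To make the "zoom out and look at data" point concrete, here's a toy expected-deaths comparison in Python. The rates and driver counts are made-up placeholders for illustration, not real crash statistics:

```python
# Toy comparison of expected annual road deaths under two fatality rates.
# All numbers below are hypothetical, chosen only to show the arithmetic.

def expected_deaths(drivers: int, fatal_rate: float) -> float:
    """Expected deaths per year given a per-driver annual fatality rate."""
    return drivers * fatal_rate

DRIVERS = 1_000_000
HUMAN_RATE = 1 / 50_000   # assumed: 1 fatal crash per 50,000 drivers per year
AV_RATE = 1 / 500_000     # assumed: autonomous vehicles 10x safer

human = expected_deaths(DRIVERS, HUMAN_RATE)
av = expected_deaths(DRIVERS, AV_RATE)
print(f"human: {human:.0f}, autonomous: {av:.0f}, net lives saved: {human - av:.0f}")
```

The point isn't the specific numbers, just that the comparison is population-level, not case-by-case.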

by adrien50 1 day ago

It's not a hypothetical that these machines were programmed with the worth of a life by a company.

by Lanky_Serve_5985 1 day ago

I don't see what that has to do with them being objectively less dangerous on the road than humans. I can't wait until human drivers are banned from the roads. Drunk driving causes more harm a year than AI cars ever could.

by Vegetable-Coffee-911 1 day ago

I also think we should have stricter restrictions. One I am in favor of is banning human drivers in favor of objectively less dangerous AI drivers.

by Vegetable-Coffee-911 1 day ago

Why should I accept a thing with deep flaws just because of a few pluses?

by Lanky_Serve_5985 1 day ago

Because you're looking at it the other way around. Why should you deny a thing which will save plenty of lives just because of a few minor flaws?

by Vegetable-Coffee-911 1 day ago

A machine being programmed by a billionaire with the worth of a human life isn't minor.

by Lanky_Serve_5985 1 day ago

I don't want machines to be programmed with some value for a human life and who deserves to live or die, especially not some value chosen by a billionaire.

by Lanky_Serve_5985 1 day ago

License back? The first couple of DUI offenses are misdemeanors in my neck of the woods. They don't lose their license at all. I'd rather have autonomous vehicles over Billy Bob who's on his 14th OWI.

by Anonymous 1 day ago

Did you read the first sentence of my post? I said we also need stricter driving laws.

by Lanky_Serve_5985 1 day ago

Good luck with that. Getting all 50 states to agree on driving rules is never going to happen.

by Anonymous 1 day ago

Because, frankly, the pluses far outweigh the flaws. You can be dragged into the future kicking and screaming but you'll be dragged there just the same.

by Anonymous 1 day ago

Of course not, but they are programmed to swerve or not, and that amounts to a decision about the worth of a human life.

by Lanky_Serve_5985 1 day ago

Incorrect. Again, this isn't an unpopular opinion; it's just an ignorant lack of understanding.

by Billie01 1 day ago

They are programmed with how much risk a swerve is worth.

by Lanky_Serve_5985 1 day ago

I'd rather a machine kill me 0.5% of the time than my random neighbor 5% of the time.

by Anonymous 1 day ago

No one has ever had to choose between slowing down, swerving, or hitting a person? Try living in an urban area, pal. Why should a machine be programmed with the amount of risk to property and others that a life is worth? I don't think there is a precise amount of money a life is worth.

by Lanky_Serve_5985 1 day ago

No one has ever had to choose between slowing down, swerving, or hitting a person? That's not what I said. I said that nobody has ever had to choose which of two different people they should hit. I literally said that nearly all emergencies can be dealt with by braking hard, and a few require turning as well. Like I already said, the cars won't be programmed to assess risk by value. They'll be programmed to avoid getting into emergency situations in the first place.

by No-Cartographer9509 1 day ago

Nobody is against avoiding emergency situations; it's what they do when emergencies inevitably happen that matters.

by Lanky_Serve_5985 1 day ago

You seem to grossly misunderstand how self-driving cars are programmed. They aren't using AI to weigh the value of a human life; the car has a set of sensors and a set of rules, and it follows them to avoid hitting things. It isn't performing cost calculations at any point in the process.

by LongjumpingFox8868 1 day ago

Swerve or not swerve

by Lanky_Serve_5985 1 day ago

That's not using AI to calculate or deliberate about what to do. It receives sensor data and tries to avoid hitting things. If something is going to hit you, it will swerve as long as it won't hit something else. There isn't any opinion involved.
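Rough sketch of the kind of rule-based logic I mean. The rules, action names, and sensor fields here are invented for illustration, not any manufacturer's actual code:

```python
from dataclasses import dataclass

@dataclass
class Sensors:
    obstacle_ahead: bool       # something in the current lane
    can_stop_in_time: bool     # braking alone avoids the collision
    adjacent_lane_clear: bool  # swerving won't hit anything else

def avoidance_action(s: Sensors) -> str:
    """Pick an action from fixed rules -- no valuation of lives anywhere."""
    if not s.obstacle_ahead:
        return "continue"
    if s.can_stop_in_time:
        return "brake"
    if s.adjacent_lane_clear:
        return "brake_and_swerve"
    return "brake"  # swerving isn't safe, so just brake as hard as possible

print(avoidance_action(Sensors(True, False, True)))  # brake_and_swerve
```

Every branch is a fixed condition on sensor input; nowhere does the car compare the "worth" of one person against another.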

by LongjumpingFox8868 1 day ago

Self driving cars are ALREADY vastly safer than human drivers. I use Waymo anytime I'm in a city that has it. And yes, I'd rather have the software make the decisions than a drunk, stoned, or distracted human.

by Miserable-Ad 1 day ago

The first sentence of my post is that we should have stricter driving laws. Do you care about the flaws with self-driving cars and the companies choosing the value of a life?

by Lanky_Serve_5985 1 day ago

Your argument presumes that most people genuinely value the life of random strangers. They don't. We already allow people to drive vehicles that can easily travel at more than twice the highest speed limits in the United States, despite already-strict laws against such behavior. If we really cared about safety, we'd start by dealing with that issue. We won't because we don't.

by enolaleffler 1 day ago

As long as they are safer than the average driver, they are safe enough. Until that point, they are not.

by Anonymous 23 hours ago

All cars are too dangerous for the streets. 1.3 million deaths every year.

by DowntownDisk 23 hours ago