
Autonomous Cars


Would You Ride In A Driverless Car?

  • Yes

  • No

  • Other: please explain


Results are only viewable after voting.
I don't think we are going to see driverless cars en masse any time soon, and here's why:

When you program a driverless car, the programmer is vested with decisions that carry HUGE moral and ethical weight. What do you program a car to do when there are only two options: kill another driver to save the passenger, OR sacrifice the passenger rather than collide with (and kill) another driver? There was a TED Talk about this very dilemma a few years back. And it's just one example in an entire arena of complicated ethical issues around driverless vehicles.

Sometimes we need to remind ourselves that technology isn't created for its own sake - sure, we could easily build driverless, flying Jetsons-mobiles, but is there a need for them? Tech is driven by demand, scalability, and application, not novelty.

So no, I wouldn't want a driverless car nor do I think we will see them available to the public. They're just not a necessary technology.
 
  • Like
Reactions: SaffronBurke
I'd buy a secondhand driverless car that I could install my own firmware on, if it had physical input controls... for testing. What? I'm as good a car coder as anyone you know!

Never take a driverless Uber. What's the fun in that? You'll miss out on Boce's stories.
 
I'm reminded of the elderly woman and the cab driver tale by Kent Nerburn. True or not, it's a good story, the likes of which will never be heard from a GoogleCab.

 
Why bother making trains autonomous?

Because it would save lives and be simpler to implement. Trucks don't carry hundreds of people. To name a couple of recent incidents: a passenger train derailed because it was doing 80 mph through a turn with a 30 mph speed limit while the conductor wasn't paying attention, and another tragedy happened because the conductor fell asleep.

read more...

https://en.wikipedia.org/wiki/List_of_rail_accidents_(2010–present)

If any form of current transportation needs computer-aided safety control upgrades, it's the railways.

I totally understand why that's not a priority because there's no monetary profit in it.

Autonomous cars are painfully slow... they stop for 3 seconds at every stop sign and actually do the speed limit. A car doing the highway speed limit in my neck of the woods is a hazard in itself.

If you want to put your own safety, and maybe your kids' or your entire family's safety, in the hands of a coded machine, best of luck with that.
 
What do you program a car to do when there are only two options: kill another driver to save the passenger, OR sacrifice the passenger rather than collide with (and kill) another driver?

Check out what Tesla Autopilot is already doing.




Why couldn't I just take over the driving myself?

The car might not even have a passenger.



a different route adds 2+ hours to a 20 minute drive.

GPS programs already handle this problem without issue. And the car can always prompt a passenger or its home base if the owner/user wants to be prompted about high-value choices.
 
GPS programs already handle this problem without issue. And the car can always prompt a passenger or its home base if the owner/user wants to be prompted about high-value choices.

Not sure what you mean in the context of my statement. Sure, a GPS can easily find an alternate route. The point being made was that if an autonomous car encounters construction (or any situation) it doesn't know how to deal with, it can't always just take another route. That other route could be 2+ hours longer (thinking of situations in the southwest where a different route involves driving around a 100-mile-long canyon). It doesn't make sense to add 2 hours to a drive simply because an autonomous vehicle doesn't understand a construction worker directing traffic. Or am I misunderstanding something there?
 
GPS can also screw up severely. I had one of those nightmare incidents about 3 years ago that left me stranded in a rainstorm in a mud pit in the middle of the night.
 
Check out what Tesla Autopilot is already doing.


In the second accident on that video, the car swerves over to the side of the road to avoid the accident in the intersection. What if there was a pedestrian on the side of the road or crossing the road? What does it do then? Does it save the driver and hit the pedestrian, or does it risk the driver's life by getting into an accident but sparing the pedestrian? Should it factor in whether a child is in my car? Now, in the real world, drivers don't really make these decisions consciously. They react on instinct (which will most likely be self-preservation). No one can weigh the moral dilemma of a situation like that in the half second it's happening. A computer, however, is programmed for these situations, so these decisions are made ahead of time, before they ever happen. Who makes these decisions?
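To make "made ahead of time" concrete, here's a minimal, purely hypothetical sketch of what such a pre-programmed choice could look like: score each candidate maneuver by expected harm and pick the lowest. Every name and every weight below is invented for illustration; no manufacturer has published logic like this.

```python
# Hypothetical sketch only: score candidate maneuvers by "expected harm"
# and pick the lowest. The weights ARE the ethical decision, and someone
# has to hard-code them long before the half-second moment arrives.

HARM_WEIGHTS = {                 # invented numbers, purely illustrative
    "occupant_injury": 1.0,
    "pedestrian_injury": 1.0,    # equal to the occupant? higher? who decides?
    "other_driver_injury": 1.0,
    "property_damage": 0.05,
}

def expected_harm(outcome):
    """outcome: dict mapping harm type -> estimated probability."""
    return sum(HARM_WEIGHTS[kind] * p for kind, p in outcome.items())

def choose_maneuver(candidates):
    """candidates: maneuver name -> predicted outcome probabilities."""
    return min(candidates, key=lambda name: expected_harm(candidates[name]))

# Example: brake in lane vs. swerve toward a shoulder where a pedestrian stands.
candidates = {
    "brake_in_lane": {"occupant_injury": 0.6, "other_driver_injury": 0.4},
    "swerve_right":  {"occupant_injury": 0.1, "pedestrian_injury": 0.7},
}
print(choose_maneuver(candidates))  # flips entirely depending on the weights
```

With equal weights this toy picks "swerve_right" (0.8 vs. 1.0 expected harm); nudge the pedestrian weight up a little and it brakes in lane instead. That's exactly the "who makes these decisions?" problem.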
 
If any form of current transportation needs computer-aided safety control upgrades, it's the railways.

I totally understand why that's not a priority because there's no monetary profit in it.
They already have PTC (positive train control) technology available for trains, but they keep stalling on implementing it.
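For what it's worth, the core idea of PTC is simple enough to sketch: the system knows the speed restriction on the track ahead and applies the brakes automatically if the crew doesn't slow down in time. A toy Python illustration of that idea (the braking rate is made up, and real PTC is far more involved):

```python
# Toy illustration of the PTC concept: brake automatically if the train
# can no longer slow to the upcoming speed restriction on its own.

def must_brake_now(current_speed_mph, limit_mph, distance_miles,
                   max_decel_mph_per_s=2.0):
    """True if braking must start now to meet the restriction ahead.

    Uses v^2 = u^2 - 2*a*d with a constant, made-up deceleration rate.
    """
    if current_speed_mph <= limit_mph:
        return False
    decel_mph_per_hour = max_decel_mph_per_s * 3600          # unit conversion
    stopping_distance = (current_speed_mph**2 - limit_mph**2) / (2 * decel_mph_per_hour)
    return stopping_distance >= distance_miles

# The scenario above: 80 mph approaching a 30 mph curve, 0.3 miles out.
if must_brake_now(current_speed_mph=80, limit_mph=30, distance_miles=0.3):
    print("Automatic brake application - no alert conductor required.")
```

The whole point is that a check like this runs continuously and doesn't care whether anyone in the cab is awake.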
 
  • Like
Reactions: weirdbr
GPS programs already handle this problem without issue.

Really??

Whatever reports you are reading are not only misleading, but they can also cost lives.

And the car can always prompt a passenger or its home base if the owner/user wants to be prompted about high-value choices.

Whatever the fuck that is supposed to mean, it doesn't sound like progress to me.
 
That other route could be 2+ hours longer (thinking of situations in the southwest where a different route involves driving around a 100-mile-long canyon).

GPS programs already handle this problem without issue. And the car can always prompt a passenger or its home base if the owner/user wants to be prompted about high-value choices.

Whatever reports you are reading are not only misleading, but they can also cost lives.


@Bocefish I wasn't talking about train safety. I was talking about determining the shortest route while taking traffic into account. Waze has taken me through plenty of construction zones, because even with traffic it was still the shortest route.

If you were expecting a 1-hour drive and the car thinks the fastest route is now going to take 2 hours more, you would probably want to know before the car takes you on a 3-hour trip.
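That "ask me before committing" behavior is straightforward to picture. A hypothetical sketch (the thresholds and names are mine, not anything a vendor actually ships):

```python
# Hypothetical sketch: ask the rider (or a remote "home base") before
# accepting a reroute that blows the trip far past the original estimate.

def should_prompt(original_eta_min, new_eta_min,
                  abs_threshold_min=30, rel_threshold=1.5):
    """Prompt only when the detour is both much longer and proportionally longer."""
    return (new_eta_min - original_eta_min >= abs_threshold_min
            and new_eta_min >= rel_threshold * original_eta_min)

def accept_reroute(original_eta_min, new_eta_min, confirm):
    if should_prompt(original_eta_min, new_eta_min):
        return confirm(f"Detour raises ETA from {original_eta_min} to "
                       f"{new_eta_min} minutes. Proceed?")
    return True  # small detours: just go

# A 60-minute trip that the planner now thinks will take 180 minutes
# should never start without asking first.
ok = accept_reroute(60, 180, confirm=lambda msg: input(msg + " [y/n] ").strip() == "y")
```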
 
I'm thinking more of cases where there's minor construction on a non-major road. You'll have only one side of the road "closed" and a flagger directing which lane can proceed. It's not a matter of rerouting around it; the car needs to know when it's allowed to go around. These minor things on minor roads often aren't indicated on Waze. I'd be very skeptical that these types of situations can be 100% avoided. Inevitably, you're going to end up in a situation where a human is directing traffic.

You can't avoid construction/road blocks, but you can teach the cars to deal with the situation. That article about this issue is a bit misleading, because it only quotes one manufacturer (Nissan) giving up; meanwhile, from talking with people who actually work on this at other companies, I know it's a known issue that's being actively worked on, and they are really optimistic that it's a tractable problem. If you watch this video from Waymo and pay attention to the screens inside the car, you can see it identifying road construction workers and their signage.
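To be concrete about what "teach the cars to deal with it" might mean, here's a rough, hypothetical policy sketch. It is not Waymo's (or anyone's) actual logic, and all the labels are invented:

```python
# Hypothetical policy sketch: what a car might do once perception reports
# a flagger or temporary signage ahead. Labels and return values are invented.

def construction_zone_policy(detections, remote_assist):
    """detections: set of labels from perception, e.g. {"flagger", "cone"}."""
    if "flagger_signaling_stop" in detections:
        return "STOP_AND_HOLD"
    if "flagger_signaling_proceed" in detections:
        return "CREEP_THROUGH_ZONE"      # proceed slowly past the closure
    if "flagger" in detections or "temporary_sign" in detections:
        # Ambiguous: hold position and escalate to a human operator instead
        # of rerouting two hours around a canyon.
        return remote_assist("Unclear construction zone, please advise")
    return "CONTINUE"

# Usage: an ambiguous flagger triggers escalation, not a blind detour.
action = construction_zone_policy(
    {"flagger", "cone"},
    remote_assist=lambda msg: "STOP_AND_HOLD",
)
print(action)
```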

Now, for the issue of it not showing up on Waze - that's because of how the data is ingested. There are basically two ways: governments provide road-closure data to Google, or drivers using the app report it. The former is highly unreliable because, as we all know, governments tend to suck. And the latter only works well on roads with enough traffic, because a single report of a road closure isn't accepted; for it to be considered true, there must be multiple reports.
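The "multiple reports" part is the interesting bit, and it's easy to see why a single report on a quiet road never sticks. A rough sketch of that kind of aggregation (the real thresholds Waze/Google use aren't public; the numbers here are made up):

```python
# Rough sketch of the "multiple independent reports" idea. The real thresholds
# Waze/Google use aren't public; the numbers here are made up.

from collections import defaultdict

MIN_DISTINCT_REPORTERS = 3   # invented threshold

class ClosureTracker:
    def __init__(self):
        self.reports = defaultdict(set)      # road segment -> reporting user ids

    def report_closure(self, segment_id, user_id):
        self.reports[segment_id].add(user_id)

    def is_closed(self, segment_id):
        return len(self.reports[segment_id]) >= MIN_DISTINCT_REPORTERS

# On a low-traffic road a single report never flips the segment to "closed",
# which is exactly why minor closures on minor roads rarely show up.
tracker = ClosureTracker()
tracker.report_closure("county_rd_12", "driver_a")
print(tracker.is_closed("county_rd_12"))     # False
```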

What do you program a car to do when there are only two options: kill another driver to save the passenger, OR sacrifice the passenger rather than collide with (and kill) another driver?

If you hit this condition where it's a choice between two vehicles (or a vehicle and a passenger), you have already failed. It's an interesting thought experiment to try to work out how the cars should behave in this scenario, but the reality is that if the car detects an imminent collision it can't avoid, then the system has already failed.

When you actually dig down into things, you see that this thought experiment is based on a misunderstanding of how much knowledge the car has of its surroundings; most self-driving cars track *a lot* more than humans do - for example, based on what Waymo says on their latest video which I linked above, the cars track every vehicle up to several hundred meters around them, so you can essentially predict an accident before it happens.
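As a toy illustration of why that tracking matters: with a position and velocity estimate for everything a few hundred meters out, even a crude constant-velocity closest-approach check flags a conflict seconds ahead of time. This is my simplification, not Waymo's planner:

```python
# Toy constant-velocity "closest approach" check over one tracked vehicle.
# Real planners do vastly more, but this shows how tracking far-away objects
# lets you flag a conflict seconds before it becomes unavoidable.

import math

def time_to_closest_approach(rel_pos, rel_vel):
    """rel_pos, rel_vel: (x, y) of another vehicle relative to ours, in m and m/s."""
    v2 = rel_vel[0] ** 2 + rel_vel[1] ** 2
    if v2 == 0:
        return 0.0
    return max(0.0, -(rel_pos[0] * rel_vel[0] + rel_pos[1] * rel_vel[1]) / v2)

def conflict_predicted(rel_pos, rel_vel, min_gap_m=2.0, horizon_s=8.0):
    t = time_to_closest_approach(rel_pos, rel_vel)
    if t > horizon_s:
        return False
    closest = (rel_pos[0] + rel_vel[0] * t, rel_pos[1] + rel_vel[1] * t)
    return math.hypot(*closest) < min_gap_m

# A vehicle 100 m ahead and 50 m to the side, on a converging course,
# gets flagged about five seconds before the paths actually cross.
print(conflict_predicted(rel_pos=(100.0, 50.0), rel_vel=(-20.0, -10.0)))  # True
```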


And related to my previous comment about disengagement rates - https://arstechnica.com/cars/2018/02/waymo-now-has-a-serious-driverless-car-rival-gms-cruise/ . The TL;DR version:
- Waymo is at over 5,000 miles driven between each disengagement
- all the others who came later to the party are way under a thousand miles driven between disengagements. This is expected, as they are in their first or second year of development, while Waymo is in its 9th year working on this tech. Personally I expect this to improve drastically once they ramp up the number of cars in their test programs, since the key to this game is real-world usage data. (Quick sketch of the arithmetic below the list.)
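The metric itself is just total autonomous miles divided by the number of disengagements; the figures below are rough placeholders chosen to match the article's rough claims, not exact DMV numbers:

```python
# "Miles per disengagement" is just total autonomous miles / disengagements.
# The fleet figures below are rough placeholders, not the exact CA DMV numbers.

def miles_per_disengagement(total_miles, disengagements):
    return total_miles / disengagements if disengagements else float("inf")

fleets = {
    "Mature program (9th year)": (350_000, 63),   # ~5,600 mi per disengagement
    "Newer program":             (130_000, 160),  # ~800 mi per disengagement
    "First-year program":        (5_000, 40),     # ~125 mi per disengagement
}
for name, (miles, dis) in fleets.items():
    print(f"{name}: {miles_per_disengagement(miles, dis):,.0f} mi/disengagement")
```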
 
Status
Not open for further replies.