Armed Polite Society

Main Forums => The Roundtable => Topic started by: TechMan on January 18, 2017, 12:16:38 PM

Title: MIT Moral Machine
Post by: TechMan on January 18, 2017, 12:16:38 PM
http://moralmachine.mit.edu/

Quote
Welcome to the Moral Machine! A platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.
We show you moral dilemmas, where a driverless car must choose the lesser of two evils, such as killing two passengers or five pedestrians.  As an outside observer, you judge which outcome you think is more acceptable. You can then see how your responses compare with those of other people.
If you are feeling creative, you can also design your own scenarios, for you and other users to browse, share, and discuss.

Title: Re: MIT Moral Machine
Post by: makattak on January 18, 2017, 12:25:21 PM
My "most saved" character was a young boy and "most killed" character was a cat.

Yeah, I think that's about right. Humans over animals.

(I also note that although I counted a pregnant woman as 2 people, the "researchers" did not.)
Title: Re: MIT Moral Machine
Post by: K Frame on January 18, 2017, 12:51:54 PM
Getting ready to play with it now...

Animals are going to live!

Humans?

Not so much.
Title: Re: MIT Moral Machine
Post by: K Frame on January 18, 2017, 12:54:38 PM
Yep, my species preference is ALL pets!

Ole Yeller is gonna LIVE!  :rofl:
Title: Re: MIT Moral Machine
Post by: Fly320s on January 18, 2017, 01:32:51 PM
MIT students can't get basic grammar correct, yet they want to program self-driving cars?
Title: Re: MIT Moral Machine
Post by: TechMan on January 18, 2017, 01:51:46 PM
Quote from: Fly320s on January 18, 2017, 01:32:51 PM
MIT students can't get basic grammar correct, yet they want to program self-driving cars?

Are you talking about the quote I put in the OP?  If so, I couldn't copy and paste their text, so I retyped it.
Title: Re: MIT Moral Machine
Post by: Fly320s on January 18, 2017, 01:54:15 PM
Quote from: TechMan on January 18, 2017, 01:51:46 PM
Are you talking about the quote I put in the OP?  If so, I couldn't copy and paste their text, so I retyped it.

Yes.  Sorry, thought it was a quote.  The test does say "Hoomans" instead of "Humans," but I think that was intentional.
Title: Re: MIT Moral Machine
Post by: RevDisk on January 18, 2017, 01:59:14 PM
Quote from: Fly320s on January 18, 2017, 01:32:51 PM
MIT students can't get basic grammar correct, yet they want to program self-driving cars?

Same type of folks already write the code in your self-flying flight systems.  =D
Title: Re: MIT Moral Machine
Post by: Fly320s on January 18, 2017, 02:11:21 PM
Quote from: RevDisk on January 18, 2017, 01:59:14 PM
Same type of folks already write the code in your self-flying flight systems.  =D

Oh, I know.  Guess who gets in trouble when the coders get it wrong.
Title: Re: MIT Moral Machine
Post by: MillCreek on January 18, 2017, 02:26:56 PM
Quote from: Fly320s on January 18, 2017, 02:11:21 PM
Oh, I know.  Guess who gets in trouble when the coders get it wrong.

So does your multi-function display say 'Abort, retry, ignore?' right before the computer puts the airplane into a mountainside?
Title: Re: MIT Moral Machine
Post by: Hawkmoon on January 18, 2017, 02:55:29 PM
Totally messed up. I took their questionnaire after the exercise, and the results they showed when I was done didn't correspond to my choices at all.

Maybe it's not a bad thing I wasn't accepted by M.I.T. when I was applying to colleges.
Title: Re: MIT Moral Machine
Post by: HankB on January 18, 2017, 04:20:02 PM
When you assume that you're a passenger in the car, or that you own the self-driving car . . . pedestrians don't fare very well.
Title: Re: MIT Moral Machine
Post by: Fly320s on January 18, 2017, 07:58:51 PM
Quote from: MillCreek on January 18, 2017, 02:26:56 PM
So does your multi-function display say 'Abort, retry, ignore?' right before the computer puts the airplane into a mountainside?

No, it just sits there quietly, maliciously, waiting.
Title: Re: MIT Moral Machine
Post by: Fly320s on January 18, 2017, 07:59:34 PM
Quote from: Hawkmoon on January 18, 2017, 02:55:29 PM
Totally messed up. I took their questionnaire after the exercise, and the results they showed when I was done didn't correspond to my choices at all.

Maybe it's not a bad thing I wasn't accepted by M.I.T. when I was applying to colleges.

I had the same problem, but I'm sure I did it right.
Title: Re: MIT Moral Machine
Post by: BlueStarLizzard on January 18, 2017, 10:06:01 PM
I killed all the old ladies!  :rofl:
Title: Re: MIT Moral Machine
Post by: AJ Dual on January 19, 2017, 05:20:02 PM
I played. I was mostly in line with, or trending with, the majority, except humans 100% over animals. And if there had been an option to splat people of "low social value" on purpose, even within the no-win dilemma facing the car, I guess I would have chosen that.

I still think all the hand-wringing over "who to kill" in the autonomous car Kobayashi Maru scenario is bunk.

I say just program the cars to try to crash with the least force possible in any "unwinnable" scenario, and as to human life and injury, let the chips fall where they may. I expect situations where "least force" kills more people to be vanishingly rare.
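
A minimal sketch of what that "least force" rule could look like in code; the Maneuver class, the candidate maneuvers, and the numbers below are all made up for illustration, not anything from the MIT exercise.

Code
# Hypothetical sketch of the "least force" rule described above: among the
# maneuvers still available, pick whichever one ends with the lowest
# predicted impact speed, without weighing who or what is in the way.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    impact_speed_mps: float  # predicted speed at the moment of impact


def least_force_choice(options: list[Maneuver]) -> Maneuver:
    """Return the maneuver with the smallest predicted impact speed.

    Crash energy scales with the square of speed, so minimizing impact
    speed minimizes crash force regardless of what gets hit.
    """
    return min(options, key=lambda m: m.impact_speed_mps)


options = [
    Maneuver("swerve left into barrier", 12.0),
    Maneuver("brake straight ahead", 7.5),
    Maneuver("swerve right onto shoulder", 9.0),
]
print(least_force_choice(options).name)  # -> brake straight ahead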
Title: Re: MIT Moral Machine
Post by: Hawkmoon on January 19, 2017, 07:53:07 PM
My criteria were pretty simple. I confess that I viewed it as an abstract moral problem rather than a personal exercise, so that undoubtedly skewed my answers relative to most of you. That meant I always chose to kill animals rather than people. (Sorry, Liz. I still brake and swerve for squirrels and turtles, but this was an abstract exercise.)

Beyond that, I took the approach that no autonomous vehicle should ever kill a pedestrian. Riders made a choice to get into the death box. So, my choices always killed the vehicle occupants rather than pedestrians.

Mind you ... none of the scenarios involved dare-devil jaywalkers. IMHO there should be open season on them, with no bag limit.
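
For what it's worth, those criteria amount to a strict ranking rule: animals below humans, occupants below pedestrians. A toy sketch of how such an ordering might be encoded; the labels, weights, and function are invented for illustration and have nothing to do with how MIT scores the exercise.

Code
# Toy encoding of the priority order described above. Lower rank means
# "preferred casualty": animals before occupants, occupants before
# pedestrians (the occupants chose to get into the death box). Every
# name and value here is illustrative, not part of the MIT exercise.
PRIORITY = {"animal": 0, "occupant": 1, "pedestrian": 2}


def preferred_outcome(outcome_a: list[str], outcome_b: list[str]) -> list[str]:
    """Pick the outcome whose worst-ranked casualty is lowest."""
    def worst(outcome: list[str]) -> int:
        return max((PRIORITY[c] for c in outcome), default=-1)
    return outcome_a if worst(outcome_a) <= worst(outcome_b) else outcome_b


# Two occupants die rather than one pedestrian:
print(preferred_outcome(["occupant", "occupant"], ["pedestrian"]))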
Title: Re: MIT Moral Machine
Post by: BlueStarLizzard on January 19, 2017, 08:38:39 PM
Quote from: Hawkmoon on January 19, 2017, 07:53:07 PM
My criteria were pretty simple. I confess that I viewed it as an abstract moral problem rather than a personal exercise, so that undoubtedly skewed my answers relative to most of you. That meant I always chose to kill animals rather than people. (Sorry, Liz. I still brake and swerve for squirrels and turtles, but this was an abstract exercise.)

Beyond that, I took the approach that no autonomous vehicle should ever kill a pedestrian. Riders made a choice to get into the death box. So, my choices always killed the vehicle occupants rather than pedestrians.

Mind you ... none of the scenarios involved dare-devil jaywalkers. IMHO there should be open season on them, with no bag limit.

I need this in my life.
Title: Re: MIT Moral Machine
Post by: Hawkmoon on January 19, 2017, 09:55:20 PM
Quote from: BlueStarLizzard on January 19, 2017, 08:38:39 PM
I need this in my life.

Liz, I live in a 'burb just outside a self-styled "sanctuary city." It's very common for certain types of people to step out in a crosswalk (or not in a crosswalk), and then proceed to amble in the most leisurely and indirect way across the street, all the while staring at oncoming cars and just daring a driver to hit them. They NEED to be hit.
Title: Re: MIT Moral Machine
Post by: Scout26 on January 19, 2017, 10:19:35 PM
I'll find some time to do this tomorrow.  Since I hate people, my goal will be to get the high score on the body count....
Title: Re: MIT Moral Machine
Post by: BlueStarLizzard on January 19, 2017, 11:10:05 PM
Quote from: Hawkmoon on January 19, 2017, 09:55:20 PM
Liz, I live in a 'burb just outside a self-styled "sanctuary city." It's very common for certain types of people to step out in a crosswalk (or not in a crosswalk), and then proceed to amble in the most leisurely and indirect way across the street, all the while staring at oncoming cars and just daring a driver to hit them. They NEED to be hit.

Super liberal college town, trust me, I know the pain.
Title: Re: MIT Moral Machine
Post by: Perd Hapley on January 20, 2017, 12:34:22 AM
Quote from: BlueStarLizzard on January 19, 2017, 08:38:39 PM
I need this in my life.

 :rofl:


That phrase may very well replace Tina Fey's "I want to go to there" in my repertoire.
Title: Re: MIT Moral Machine
Post by: Firethorn on January 20, 2017, 01:41:45 AM
Quote from: AJ Dual on January 19, 2017, 05:20:02 PM
I still think all the hand-wringing over "who to kill" in the autonomous car Kobayashi Maru scenario is bunk.

Indeed.  I had trouble accepting the scenarios as "realistic" enough to take them seriously.
1.  By the time the car has determined that it's suffered a brake failure, it'll be too late, much like with a human driver: he hits the brakes, they fail, and by the time he's adjusted it's too late.  Beyond that, I can see hydraulic braking eventually giving way to electric braking, which is less likely to fail.
2.  Program the car to seize the engine and lock the transmission in that case!  (See the fail-safe sketch after this list.)
3.  Seatbelted passengers hitting a concrete barrier at any legal speed near a red light, where you'd expect pedestrians, aren't going to be killed in a modern car.
4.  Nothing stops the car, on detecting that it's in a state unsafe for pedestrians, from performing alert actions: honk the bloody horn, flash the lights, etc.
5.  Hell, throw the car in reverse and gas it.  Who cares about the rest of the car's systems in such an event?  I don't!
6.  How about having the car do complete circles?
etc...
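
A rough sketch of the kind of fail-safe sequence items 2 through 5 describe, fired the moment a brake failure is detected. The Vehicle class and its commands are placeholders invented for illustration; no real drive-by-wire API is being quoted here.

Code
# Placeholder fail-safe sequence for a detected brake failure, per items
# 2-5 above. The Vehicle stub just logs what a real controller would do.


class Vehicle:
    """Stub standing in for a drive-by-wire control interface."""

    def __init__(self, speed_mps: float) -> None:
        self.speed_mps = speed_mps

    def command(self, action: str) -> None:
        print(f"[{self.speed_mps:5.1f} m/s] {action}")


def on_brake_failure(v: Vehicle) -> None:
    # Item 4: warn everyone nearby first; horn and lights cost nothing.
    v.command("sound horn continuously")
    v.command("flash hazard lights")

    # Item 2: shed speed by any means left, then lock the transmission
    # once slow enough to do so without losing control.
    v.command("force maximum engine braking")
    if v.speed_mps < 5.0:
        v.command("lock transmission")

    # Item 5: the drivetrain is expendable in an unavoidable-crash state.
    v.command("disable powertrain protection limits")


on_brake_failure(Vehicle(speed_mps=13.0))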

In the scenarios I generally had the vehicle go straight - assuming it's honking and such, going straight is predictable and gives people a better chance to avoid it.

Quote from: AJ Dual on January 19, 2017, 05:20:02 PM
I say just program the cars to try to crash with the least force possible in any "unwinnable" scenario, and as to human life and injury, let the chips fall where they may. I expect situations where "least force" kills more people to be vanishingly rare.

Exactly.