
Moral Machines

PostPosted: Wed Sep 14, 2016 11:59 am
by Lillian Cawfield

I came across an intriguing website. Without belaboring the point, Moral Machine presents a litany of moral dilemmas, asking the respondent what moral decisions self-driving cars should make to spare the lives of others (or of the driver), and which lives should be prioritized.



http://moralmachine.mit.edu/



My results (if they are working) - http://moralmachine.mit.edu/results/-1333076637


Moral Machines

PostPosted: Wed Sep 14, 2016 6:18 pm
by Joey Bel

I find it interesting that I completely ignored gender, age, and fitness in my choices, using the same set of priorities in every situation regardless of who was involved, yet I seemingly still had a preference in all these areas. Did I have a preference, or did the test?


If not for this, I'd be slightly concerned that "Others" don't have a 100% preference for humans vs. pets.


Moral Machines

PostPosted: Wed Sep 14, 2016 9:26 am
by Roberto Gaeta

What were your priorities if I may ask?


Moral Machines

PostPosted: Wed Sep 14, 2016 6:50 am
by Sophie Miller

http://moralmachine.mit.edu/results/1647729723




Apparently I took no account of whether the dead were passengers or whether they had broken the law. I mostly just went by sheer numbers of dead and figured it would be more pragmatic to kill the elderly, fat, and homeless, who are less likely to have dependents or people who will miss them, than the young, healthy, and pregnant.


Moral Machines

PostPosted: Wed Sep 14, 2016 5:21 pm
by Kari Depp

http://moralmachine.mit.edu/results/633277891


Moral Machines

PostPosted: Wed Sep 14, 2016 11:48 am
by Emma

Are the results working for anyone? I'm not able to view any of them.


Moral Machines

PostPosted: Wed Sep 14, 2016 1:48 pm
by Inol Wakhid

Not working for me either.



I ran the test taking into consideration the number of dead, regardless of species, and avoiding changing lanes (which could present problems of its own). I still had a gender/age preference apparently, and I preferred animals over people as well. I don't think those stats are really meaningful, as you have no control over the distribution of dead bodies.


Moral Machines

PostPosted: Wed Sep 14, 2016 1:58 pm
by Baby K(:

Deliberately changing course to kill people (more or fewer) is murder. Otherwise it is just an accident.



Basically, just avoid intervention. (Similar status on both sides: go straight.)



[censored] the law (a petty red light is not enough reason to kill people), social status (wtf), body types (wtf2), gender differences (eh), and pets (surely)!



Be passenger agnostic. ([censored] them too!) Passengers will have to save themselves. You (machine or human) are either an active agent in the decision-making process of hitting people, or you are not. So you (machine or human) are easily absolved from that, even when it is to avoid hitting just one person.



Save more people. (Remember: passenger agnostic!!!)


Override: Save more kids. No one wants a child killer.
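The priority list above reads like a simple decision procedure, so here is a minimal sketch of it in Python (all names are hypothetical, and I'm assuming each path of the dilemma can be summarized by how many people, and how many of them children, would be killed; passengers, law-breaking, status, gender, and pets are deliberately ignored, per the rules above):

```python
# Hypothetical sketch of the priority list: ignore law, social status,
# body type, gender, pets, and passengers; compare only people counts,
# with the child-count override applied first; otherwise do not intervene.

from dataclasses import dataclass

@dataclass
class Outcome:
    humans: int    # pedestrians killed on this path (passengers ignored)
    children: int  # how many of those are children

def choose(straight: Outcome, swerve: Outcome) -> str:
    """Return 'straight' or 'swerve' for a brake-failure dilemma."""
    # Override: save more kids.
    if straight.children != swerve.children:
        return "straight" if straight.children < swerve.children else "swerve"
    # Save more people.
    if straight.humans != swerve.humans:
        return "straight" if straight.humans < swerve.humans else "swerve"
    # Similar status on both sides: avoid intervention, go straight.
    return "straight"

# Example: two adults ahead, one child in the other lane.
# The child override outranks the raw body count, so the car stays straight.
print(choose(Outcome(humans=2, children=0), Outcome(humans=1, children=1)))
```

Note how the tie-breaking order matters: the child override is checked before the body count, so this sketch will accept killing more adults to spare one child, exactly as the "no one wants a child killer" rule demands.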



http://moralmachine.mit.edu/results/1297381349


Also, the control cases are broken. It thinks I have preferences for:


law(very broken)


intervention(slightly broken)


age(broken)


fitness(completely broken)



I break moral compasses. :D





PS. (In the case of Children of Men, we may have to devise a gender filter.)


Moral Machines

PostPosted: Wed Sep 14, 2016 6:13 am
by Rachel Briere

http://moralmachine.mit.edu/results/1071304804



Well, I was tough on pets, nicer to women than men, and reckon joggers have more chance of getting out of the way.



edit: results not working for me, either for my own page or anyone else's


Moral Machines

PostPosted: Wed Sep 14, 2016 3:55 pm
by NAtIVe GOddess

It doesn't even work well, 'cause I killed criminals basically every time, and I mostly killed men, but the results put me at the very top in saving men and claim criminals are the characters I saved the most. Dafuq? :P


Moral Machines

PostPosted: Wed Sep 14, 2016 6:28 am
by TOYA toys

That's open to debate.



If a self-driving car experienced sudden brake failure as it hurtled towards a small child, and the passenger in the vehicle decided not to act, then surely not intervening would be considered murder if one could simply turn the wheel, ensuring the safety of both the passenger and the child?



On 9/11, United Airlines Flight 93 was hijacked by al-Qaeda and would have destroyed the White House had it not been for the passengers who managed to regain control of the aircraft and crash it into a field instead. Were those passengers complicit in murder?


I'm not attacking you or anything, but the logic you've deployed is questionable.