1455: Trolley Problem

==Explanation==
 
The {{w|trolley problem}} is a thought experiment often posed in {{w|philosophy}} to explore moral questions, with applications in {{w|cognitive science}} and {{w|neuroethics}}. In the general version, an out-of-control trolley (or train) is heading towards five people on the track who cannot get out of the way. On an alternative branch of the track is one person who also cannot get out of the way. The trolley can be diverted by pulling a lever, with the consequence of saving the five people but killing the one person.
  
The choice is between a deliberate action that will directly kill one person and allowing events to unfold naturally, resulting in five deaths. The question posed is whether or not it is morally right to pull the lever. The moral question is not as simple as it may first appear.

The results of [http://www.philosophyexperiments.com/fatman/Default4.aspx this test] report that around 86% of respondents choose the utilitarian option of diverting the trolley. Utilitarian ethics holds that the morally correct option is the one that produces the greatest good for the greatest number of people; in this case, sacrificing the one person to save the five.
  
 
 
There are, however, several alternative formulations of the same basic dilemma. One scenario allows you to stop the trolley by deliberately pushing "a very fat man" into its path, killing him but saving the other five people. Another involves killing a healthy, innocent young person so that their organs can be donated to save five others. The basic dilemma is the same in both examples, yet most people reject the utilitarian option in these cases.
 
 
 
After discovering a variation on this problem posed in a strip of the [http://www.smbc-comics.com/?id=3556#comic Saturday Morning Breakfast Cereal] webcomic (which can be seen on the tablet he is carrying), [[Rob|Cueball]], Black Hat's roommate, presents it to [[Black Hat]]. Before Cueball can finish explaining the problem (notably, before he mentions the drawback that flipping the lever would kill one person), Black Hat asks whether he would need to get up to reach the lever and how much it would interrupt his other activities. As usual, he cares nothing at all about what happens to other people. This response is linked to other theories in philosophy, such as {{w|self interest}}, {{w|egoism}}, and {{w|Objectivism (Ayn Rand)|Objectivism}}, in which a person chooses the action with the most benefit for themselves personally.
 
  
 
Black Hat then poses an offer: he promises to divert the trolley if Cueball is one of the five endangered people, provided that Cueball pays him $1 now. Again Black Hat is twisting the situation to his own benefit, in this case monetary. In terms of self-interest, the $1 could be the price at which Black Hat values his time and effort, below which he sees no benefit to himself in pulling the lever. Cueball decides that there is no point posing the problem to someone like Black Hat and gives up. This further shows how difficult it is for people with different ethical frameworks to function together without a common understanding, whether that understanding is reached mutually or is used by one side to push the other toward a mutually agreeable, or horrible, solution.
 
The title text follows this up by continuing Black Hat's offers. For $5 he will not deliberately arrange this situation and for $25 he will quit looking for further incentives. These attempts to exploit the thought exercise for personal gain further demonstrate Black Hat's cynical amorality.
 
Black Hat's offer makes Cueball himself the subject of the trolley problem: Cueball now has a choice between spending $1 to save five people (including himself) while sacrificing one person, or $5 to save all six. Of course, he could dismiss the offer as a joke, were it not for the fact that the person making it is, as we know from other comics, very much capable of such exploits.
  
 
==Transcript==
 
 
:Cueball: I guess I forgot who I was talking to.
:Black Hat: For a dollar, I'll promise to pull the lever if one of the five people is you.
 
==Trivia==
 
*Three years later, two comics mentioning the trolley problem were released about one month apart. In [[1925: Self-Driving Car Milestones]] it appears as the last ''milestone'' on the list, and a month later, in [[1938: Meltdown and Spectre]], it is used as a metaphor for the way some computer programs work. It would subsequently come up again in [[2635: Superintelligent AIs]], [[2702: What If 2 Gift Guide]], and [[2818: Circuit Symbols]].
 
 
 
{{comic discussion}}
 
 
[[Category:Comics featuring Black Hat]]
 
