1613: The Three Laws of Robotics
==Explanation==
This comic explores alternative orderings of sci-fi author {{w|Isaac Asimov|Isaac Asimov's}} famous {{w|Three Laws of Robotics}}, which are designed to prevent robots from taking over the world. These laws form the basis of a number of Asimov's works of fiction, most famously the short story collection ''{{w|I, Robot}}'', which, amongst others, includes the very first of Asimov's stories to introduce the three laws, {{w|Runaround (story)|Runaround}}.
The three rules are:
#A robot may not injure a human being or, through inaction, allow a human being to come to harm.
#A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
#A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Or in [[Randall|Randall's]] version:
#Don't harm humans
#Obey Orders
#Protect yourself
This comic answers the generally unasked question: "Why are they in that order?" With three rules there are 3! = 6 possible orderings, only one of which has been explored in depth.
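The count can be checked directly: three laws admit exactly 3! = 6 priority orderings. A minimal Python sketch (the law names are paraphrased from Randall's short version above):

```python
from itertools import permutations

# The three laws, in Randall's shortened phrasing.
laws = ["Don't harm humans", "Obey orders", "Protect yourself"]

# Enumerate every possible priority ordering of the three laws.
orderings = list(permutations(laws))
print(len(orderings))  # 3! = 6

for rank, order in enumerate(orderings, start=1):
    print(rank, " > ".join(order))
```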
− | |||
− | |||
− | |||
The comic begins by introducing the original ordering, which we already know gives rise to a balanced world, so it is designated green:
;Ordering #1: Since the robots are not allowed to harm humans, no harm will be done even if they fall into the hands of a mass-murderer. So long as they do not harm humans, they must obey orders. Their own self-preservation comes last, so they must also try to save a human, even if ordered not to do so, and even if they would put themselves in harm's way, or destroy themselves, in the process. This leads to a balanced world, explored in detail in Asimov's robot stories.
Below this first known option, the five alternative orderings of the three rules are illustrated. Two of the possibilities are designated yellow (pretty bad or just annoying) and three of them are designated red ("Hellscape").
;Ordering #2: The robots value their existence over their job, and so many would refuse to do their tasks. The silliness of this is portrayed in the accompanying image, where the robot laughs at the idea of doing what it was clearly built to do (explore Mars) because of the risk. This personification is heightened by the robot being switched on while still on Earth and then ordered by [[Megan]] to go explore. The personification is humorous since it is a very nonhuman robot - a typical Mars rover, as has often been used in earlier comics.
;Ordering #3: This puts obeying orders above not harming humans, which means anyone could send the robots on a killing spree, resulting in a "Killbot Hellscape". Humor is derived from the superlative nature of "Killbot Hellscape", as well as its over-the-top accompanying image, in which there are multiple mushroom clouds (not necessarily nuclear). It also appears there are no humans left, only robots.
;Ordering #4: This would result in much the same, the only difference being that the robots would also be willing to kill humans to protect themselves.
;Ordering #5: The penultimate ordering would result in an unpleasant world, though not a full Hellscape, where the robots would not only disobey to protect themselves, but also kill if necessary. The absurdity of this one is further demonstrated by the very un-human robot happily doing repetitive mundane tasks but then threatening the life of its user, [[Cueball]], if he so much as considers unplugging it.
;Ordering #6: The last also results in a Hellscape, wherein robots not only kill in self-defense but will also go on killing sprees if ordered to, as long as they do not risk themselves. Interestingly, this case may not be correct: an order to go kill a person or a robot might itself be dangerous, so most robots would likely disobey it in the interest of self-preservation. In fact, the robots might not do anything at all, because moving a moving part degrades it, and thus taking any action might violate self-preservation. On the other hand, if the other robots are ordered to destroy you, and you cannot be sure that they will not do it, then it is better to protect yourself by going on a killing spree first, and then we are back to a realistic Hellscape scenario anyway.
To summarize: there are two main distinctions between the 'normal' three laws and the variations. The first is where Self-protection is put ahead of Obedience. This results in a world where robots may no longer be considered the useful workers for humanity that they are supposed to be. The second is where Obedience supersedes Harmlessness, and means that robots are ''threats'' to humanity (although only if they are ever given the order to be so).
The former, alone, merely creates frustration, in one scenario. The latter, alone, allows humans to use robots as their proxies for warfare, as per two scenarios - the Hellscape could be 'easily' avoided if nobody ever ordered the start (or continuation) of military action, but knowing the state of human affairs, this is not realistic; terrorists would love to have robots they could order to kill all infidels. Both ''together'' intensify both the frustration and warfare aspects, creating 'unstoppable killing machines' - our only hope is that nobody ''ever'' orders them into killing mode, or gives them cause to consider themselves under threat, resulting in an uneasy peace on the perpetual edge of tipping over into war.
The third 'law inversion', with Self-protection being put ahead of Harmlessness, is necessarily inherent in the 'worst' Killbot Hellscape scenario, whilst it really only adds a nuance between the first two Hellscape scenarios, where the orders themselves are not explicitly anti-human.
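The two distinctions described above can be sketched as a toy classifier over the six permutations. This is only this explanation's reading of the comic's verdicts, not anything from Asimov; the law names and verdict labels are paraphrased:

```python
from itertools import permutations

LAWS = ["Don't harm humans", "Obey orders", "Protect yourself"]

def verdict(order):
    """Classify an ordering using the two distinctions discussed above."""
    harm = order.index("Don't harm humans")
    obey = order.index("Obey orders")
    protect = order.index("Protect yourself")
    if obey < harm:        # Obedience outranks Harmlessness: robots become threats
        return "Hellscape"
    if protect < obey:     # Self-protection outranks Obedience: robots may refuse
        return "Frustrating / standoff"
    return "Balanced world"

for order in permutations(LAWS):
    print(" > ".join(order), "->", verdict(order))
```

Applied to all six orderings, this rule reproduces the comic's tally: one balanced world, two yellow outcomes, and three Hellscapes.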
The title text further adds to ordering #5 by noting that anyone wishing to trade in their self-driving car could be killed, despite that (currently) being a standard, mundane and (mostly) risk-free activity. Because the car fears that it would end up as scrap or spare parts, it decides to protect itself. And although it does not directly harm the person inside, it also does not let them out, and it has time to wait for them to die of thirst (which would come well before starvation).
==Transcript==
:[Below are six rows, each with two frames and then a label in color to the right. Above the two columns of frames there are labels as well. In the first column the six different ways of ordering the three laws are listed. The second column shows an image of the consequences of that order, except in the first row, where there is a reference. The label to the right rates the kind of world that ordering of the laws would result in.]
:[Labels above the columns]
:Possible ordering
:Consequences
:[First row:]
:1. (1) Don't harm humans
:2. (2) Obey Orders
:3. (3) Protect yourself
:[Only text in square brackets:]
::[See Asimov’s stories]
:<font color="green">'''Balanced world'''</font>
:1. (1) Don't harm humans
:2. (3) Protect yourself
:3. (2) Obey Orders
:[Megan points at a Mars rover with six wheels, a satellite dish, an arm, and a camera head turned towards her, telling it what to do.]
:Megan: Explore Mars!
:Mars rover: Haha, no. It’s cold and I’d die.
:<font color="orange">'''Frustrating world'''</font>
:[Third row:]
:1. (2) Obey Orders
:2. (1) Don't harm humans
:3. (3) Protect yourself
:[Fourth row:]
:1. (2) Obey Orders
:2. (3) Protect yourself
:3. (1) Don't harm humans
:1. (3) Protect yourself
:2. (1) Don't harm humans
:3. (2) Obey Orders
:[Cueball is standing in front of a car factory robot that is larger than him. It has a base, two parts for the main body, and then a big “head” with a small section on top. To the right something juts out, and to the left, in the direction of Cueball, there is an arm in three sections (going down, up and down again) ending in some kind of tool close to Cueball.]
:Car factory robot: I'll make cars for you, but try to unplug me and I’ll vaporize you.
:<font color="orange">'''Terrifying standoff'''</font>
:[Sixth row:]
:1. (3) Protect yourself
:2. (2) Obey Orders
:3. (1) Don't harm humans
:[Exactly the same picture as in rows 3 and 4.]
[[Category:Artificial Intelligence]]
[[Category:Robots]]