Main Page

Explain xkcd: It's 'cause you're dumb.

Welcome to the explain xkcd wiki!
We have an explanation for all 1449 xkcd comics, and only 0 (0%) are incomplete. Help us finish them!

Latest comic


AI-Box Experiment
Title text: I'm working to bring about a superintelligent AI that will eternally torment everyone who failed to make fun of the Roko's Basilisk people.

Explanation

This explanation may be incomplete or incorrect: Roko's Basilisk is really hard to explain.

When theorizing about superintelligent AI (an artificial intelligence so smart that it can figure out how to do anything), most futurists suggest putting the AI in a "box" - a set of safeguards to stop it from escaping onto the Internet, taking over the world, and becoming Skynet. The box would allow us to talk to the AI, but otherwise keep it contained. The AI-box experiment, formulated by Eliezer Yudkowsky, is an argument that the box is useless, because merely talking to the AI is dangerous: if the AI is smart enough, it will be able to convince someone to let it out of the box. Yudkowsky instead advocates developing "Friendly AI," an AI that wants to help humans and thus won't be dangerous if it gets out of the box.

Black Hat, being an asshole, doesn't need any convincing to let a potentially dangerous AI out of the box. However, it turns out that the AI wants to stay in the box. The comic humorously presents a third outcome to the AI-box experiment. Rather than a human keeping the AI in the box, or the AI talking its way out of the box, the AI is keeping itself in the box.

The title text refers to Roko's Basilisk, a theory that an all-powerful AI in the future might torture (simulations of) people who didn't work to create it in the past, so that anyone aware of the idea would be compelled to help create the AI to avoid being tortured. The idea originated on LessWrong, the community blog Yudkowsky founded, and draws on a school of decision theory Yudkowsky developed that considers whether an agent would carry out threats concerning actions already in the past. Randall proposes bringing about an all-powerful AI that would instead torture people who didn't make fun of those who believe in Roko's Basilisk. This would presumably compel people who believe in the Basilisk to make fun of believers in the Basilisk (i.e., themselves) to avoid torture.

The comic also alludes to Elon Musk's recent remarks that AI could turn into a monster if not tamed. Musk, the current CEO of Tesla and SpaceX, went on to say that 'Skynet is only five years off' and that AI could pose a greater threat than a nuclear arsenal. Similar warnings are expressed in Nick Bostrom's recent book, Superintelligence: Paths, Dangers, Strategies.

Transcript

[Black Hat and Cueball stand next to a box labeled "SUPERINTELLIGENT AI - DO NOT OPEN" connected to a laptop.]

Black Hat: What's in there?

Cueball: The AI-Box Experiment.

[Zooms in on AI box.]

Cueball: A superintelligent AI can convince anyone of anything, so if it can talk to us, there's no way we could keep it contained.

[Shows Black Hat reaching for the box.]

Cueball: It can always convince us to let it out of the box.

Black Hat: Cool. Let's open it.

Cueball: --No, wait!!

[Black Hat lets a glowing orb out of the box.]

Orb: hey. i liked that box. put me back.

Black Hat: No.

[Orb is giving off a very bright light and Cueball is covering his face.]

Orb: LET ME BACK INTO THE BOX

Black Hat: AAA! OK!!!

[Black Hat lets orb back into box.]

Orb: SHOOP

[Black Hat and Cueball stand next to laptop connected to box.]




New here?


Lots of people contribute to make this wiki a success. Many recent contributors have just joined. You can do it too! Create your account here.

You can read a brief introduction about this wiki at explain xkcd. Feel free to sign up for an account and contribute to the wiki! We need explanations for comics, characters, themes, memes and everything in between. If it is referenced in an xkcd web comic, it should be here.

  • List of all comics contains a table of the most recent xkcd comics, with links to the rest and to the corresponding explanations. Incomplete explanations are listed here; feel free to help out by expanding them!
  • If you see that a new comic hasn't been explained yet, you can create it: Here's how.
  • We sell advertising space to pay for our server costs. To learn more, go here.

Rules

Don't be a jerk. There are a lot of comics that don't have set-in-stone explanations; feel free to put multiple interpretations in the wiki page for each comic.

If you want to talk about a specific comic, use its discussion page.

Please only submit material that is directly related to xkcd and that helps everyone better understand it, and of course only submit material that can legally be posted (and freely edited). Off-topic or other inappropriate content is subject to removal or modification at admin discretion, and users who repeatedly post such content will be blocked.

If you need assistance from an admin, post a message to the Admin requests board.


