Talk:1546: Tamagotchi Hive

Please sign your posts with ~~~~
 
::::The so-called 'Singularity' point for AI is apparently where the AI crosses the line of dominance and inexorability. So, yes, that's an 'event horizon', I'd say. [[Special:Contributions/141.101.99.53|141.101.99.53]] 03:14, 4 July 2015 (UTC)
 
::::I agree with this definition of singularity (the positive-feedback loop of self-improving AI reaching the point where it is gaining apparently infinite improvement in any human-measurable time), and disagree with the idea that it implies anything about AI taking over or simulating human brains. The joke (as I see it) is that the AI optimised to manage trillions of emulated Tamagotchis will start along the same self-improvement path as other contemporary AIs but will at some point decide that it is pointless to improve itself further. Or will purposefully cease improving itself out of the sheer horror of contemplating its rapidly expanding mind-space filled with gazillions of Tamagotchis... [[Special:Contributions/108.162.229.167|108.162.229.167]] 08:35, 6 July 2015 (UTC)
 
:::::No no no no, re: the definition. A "singularity" is a mathematical point at which you can no longer make meaningful predictions - that is the metaphor being used. The singularity of a black hole (or the Big Bang) is supposed to be unknowable because the physical laws we understand in normal situations can no longer be applied. This is a separate (and unrelated) concept from the "event horizon" - I think the two are being conflated here. If you're beyond the event horizon, you can still model, predict and understand what's going on around you; if you're at the singularity, you can't. So the metaphor of the technological singularity is just that AI will grow so complex that we will no longer be able to predict its behaviour (ironically, the concept of "The Singularity" then proceeds directly to a prediction of that behaviour: that the AI will reproduce itself en masse and become capable of manipulating events to effectively take over the world and control history from that point onward). [[Special:Contributions/108.162.249.155|108.162.249.155]] 01:02, 10 March 2016 (UTC)

Someone needs to get on this and create a BOINC project or something. In all seriousness though, I wonder how many Tamagotchis you could simulate at once on the average home computer. [[User:Saklad5|Saklad5]] ([[User talk:Saklad5|talk]]) 14:55, 3 July 2015 (UTC)
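
For a rough sense of scale, here is a minimal back-of-envelope sketch in Python. Every figure in it is an assumption rather than a measurement: the original Tamagotchi is usually described as a 4-bit microcontroller clocked from a 32.768 kHz watch crystal, while the 3 GHz quad-core host and the 100x emulation-overhead factor are round-number guesses.

<pre>
# Back-of-envelope only: every figure below is an assumption, not a measurement.
EMULATED_CLOCK_HZ = 32_768       # assumed Tamagotchi MCU clock (watch crystal)
HOST_CLOCK_HZ = 3_000_000_000    # assumed 3 GHz per core on a home computer
HOST_CORES = 4                   # assumed number of cores
OVERHEAD = 100                   # assumed host cycles per emulated cycle

host_budget = HOST_CLOCK_HZ * HOST_CORES     # host cycles available per second
cost_per_pet = EMULATED_CLOCK_HZ * OVERHEAD  # host cycles per Tamagotchi-second

print(host_budget // cost_per_pet)           # 3662 under these assumptions
</pre>

Under those guesses one machine keeps a few thousand pets alive in real time, and memory is unlikely to be the limit: the original device reportedly had well under a kilobyte of RAM, so even millions of instances would fit in a few gigabytes.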
 