1838: Machine Learning

==Explanation==

{{incomplete|Work in progress. <s>This explanation is an attempt at {{w|design by committee|machine learning by committee}}.</s>}}

{{w|Machine learning}} is a method employed in the automation of complex tasks. It usually involves the creation of algorithms that perform statistical analysis of data and pattern recognition to generate output. The validity/accuracy of that output can then be fed back to make changes to the system, usually making future results statistically better.
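
As a rough illustration of that feedback loop, the following is a hypothetical sketch in Python with NumPy (the data and numbers are made up; it is not anything from the comic or a specific library): a very simple learner nudges its internal numbers whenever its output turns out to be wrong.

<pre>
# A minimal "generate output, check it, adjust" loop:
# a perceptron-style learner on made-up example data.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 2))                        # made-up inputs
answers = (data[:, 0] + data[:, 1] > 0).astype(float)   # made-up "right" answers

weights = np.zeros(2)
bias = 0.0
for _ in range(10):                              # a few passes over the data
    for x, y in zip(data, answers):
        guess = float(weights @ x + bias > 0)    # generate an output
        error = y - guess                        # feedback: was it right?
        weights += 0.1 * error * x               # adjust the system a little
        bias += 0.1 * error

accuracy = np.mean((data @ weights + bias > 0) == answers)
print(f"accuracy after feedback: {accuracy:.2f}")
</pre>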
 
Cueball stands next to what looks like a pile of garbage (or compost), with a Cueball-like friend standing atop it. The pile has a funnel (labelled "data") at one end and a box labelled "answers" at the other. Here and there mathematical matrices stick out of the pile. As the friend explains to the incredulous Cueball, data enters through the funnel, undergoes an incomprehensible process of {{w|linear algebra}}, and comes out as answers. The friend appears to be a functional part of this system himself, as he stands atop the pile stirring it with a paddle. His machine learning system is probably very inefficient, as he is integral to both the mechanical part (repeated stirring) and the learning part (making the answers look "right").
  
The main joke is that, although this description is far too vague to give any intuition about or details of the system, it is close to the level of understanding most machine learning experts have of many of the techniques in machine learning. 'Machine learning' algorithms that can be reasonably described as pouring data into linear algebra and stirring until the output looks right include {{w|support vector machine|support vector machines}}, {{w|linear regression|linear regressors}}, {{w|logistic regression|logistic regressors}}, and {{w|neural network|neural networks}}. Major recent advances in machine learning often amount to 'stacking' the linear algebra up differently, or varying the stirring technique applied to the compost.
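
For example, ordinary linear regression really is little more than "pour data in, do linear algebra, collect answers". The following is a hypothetical Python/NumPy sketch with made-up numbers, not code from the comic:

<pre>
# Linear regression as literal linear algebra:
# pour data in, solve one matrix equation, answers come out.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))                     # made-up input data, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)  # noisy "right answers"

# Least-squares fit: solve (X^T X) w = X^T y
w = np.linalg.solve(X.T @ X, X.T @ y)
print("recovered weights:", np.round(w, 2))      # should be close to [2, -1, 0.5]

new_data = rng.normal(size=(5, 3))               # previously unseen data
print("answers:", new_data @ w)                  # answers fall out of a matrix product
</pre>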
  
====Composting====
 
 
This comic compares a machine learning system to a compost pile. {{w|Composting}} is the process of taking organic matter, such as food and yard waste, and allowing it to decompose into a form that serves as fertilizer. A common method of composting is to mound the organic matter in a pile with a certain amount of moisture, then "stirring" the pile occasionally to move the less-decomposed material from the top to the interior of the pile, where it will decompose faster.  
 
In large-scale composting operations, the raw organic matter added to the pile is referred to as "input". This cartoon plays on the term "input", comparing a compost input to a data input.
One of the most popular paradigms of machine learning is {{w|supervised learning}}, in which a function mapping inputs to outputs is learned from many input-output pairs, e.g. a function mapping images of faces to people's names, learned from a dataset of labelled images. Classic machine learning techniques such as linear or logistic regression have interpretable parameters and well-understood algorithms, but require significant engineering in the pre-processing step and do not perform very well on data like images or natural text. Deep learning techniques, on the other hand, require very little pre-processing, but run the data through several steps of linear algebra: in each step, the output of the previous step is essentially multiplied by a matrix and sent on to the next step. This multi-step process has proven very successful for image and text data, but the structure of the parameters, arranged as a matrix for each step, allows for very little interpretation, and can fairly be described as "data going through a pile of linear algebra".
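
A hypothetical sketch of such a "pile" (Python with NumPy, and a made-up network shape), where the output of each step is multiplied by a matrix and handed to the next step:

<pre>
# A "big pile of linear algebra": each step multiplies the previous
# step's output by a matrix and passes it on (sizes are made up).
import numpy as np

rng = np.random.default_rng(2)
layer_sizes = [8, 16, 16, 4]                    # made-up network shape
matrices = [rng.normal(size=(m, n))
            for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def pile_of_linear_algebra(data):
    x = data
    for W in matrices:
        x = np.tanh(x @ W)      # multiply by a matrix, squash, hand to the next step
    return x                    # the "answers"

print(pile_of_linear_algebra(rng.normal(size=(1, 8))))
</pre>
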
Such deep neural networks are typically trained via {{w|gradient descent}}, which can be viewed as "stirring the pile of linear algebra until the answers start looking right".
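
A hypothetical sketch of that "stirring" (again Python/NumPy with made-up data): the parameters are nudged a little at a time in whichever direction makes the answers look more right.

<pre>
# Gradient descent: repeatedly "stir" the parameters in whichever
# direction makes the answers look a bit more right.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))                    # made-up data
target_w = np.array([1.0, 2.0, -1.0, 0.0, 0.5])
y = X @ target_w                                 # the answers we want to reproduce

w = np.zeros(5)                                  # the pile starts "unstirred"
learning_rate = 0.1
for step in range(200):
    error = X @ w - y                            # how wrong are the answers?
    gradient = X.T @ error / len(y)              # which way is "more right"?
    w -= learning_rate * gradient                # stir a little in that direction

print("learned weights:", np.round(w, 2))        # should end up close to target_w
</pre>
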
This comic satirizes machine learning, more specifically neural networks. In its most basic form, a neural network takes data and results and strengthens connections that give the right answer and weakens ones that don't, until the results "look right". Neural networks are extremely data-dependent, and make remarkably few guarantees when compared to most other computing techniques, thus the joke.
  
====Title text====
Recently, other forms of neural networks, such as {{w|long short-term memory|LSTMs}}, feed old sequence data back into the network with some delay, making the network recurrent. The title text calls this the pile "getting mushy". The title text may also be a pun on how the friend stirs the data: instead of a shovel, he uses a canoe paddle. Canoes are used on rivers, and rivers by definition have currents, so "recurrent" data could, in this situation, be data treated as if it were part of a river.
  
A {{w|recurrent neural network}} is a neural network in which the nodes affect one another in cycles, creating feedback loops that allow the network to change over time. To put it another way, the network has 'state': the results of previous inputs affect how each successive input is processed, which makes recurrent networks useful for sequence data such as speech or text. In the title text, [[Randall]] is saying that the machine learning system is technically recurrent because it "changes" (i.e. gets mushy) over time.
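
A hypothetical sketch of that 'state' (Python/NumPy, with made-up weights): a hidden vector is carried from step to step, so each output depends on the leftovers of all previous inputs as well as the current one.

<pre>
# A minimal recurrent step: a hidden state carries information forward,
# so each output depends on earlier inputs as well as the current one.
import numpy as np

rng = np.random.default_rng(4)
W_in = rng.normal(size=(3, 8)) * 0.5    # made-up weights: input -> hidden
W_rec = rng.normal(size=(8, 8)) * 0.3   # hidden -> hidden (the feedback loop)
W_out = rng.normal(size=(8, 2)) * 0.5   # hidden -> output

state = np.zeros(8)                               # the network's "memory"
for t, x in enumerate(rng.normal(size=(5, 3))):   # a short made-up input sequence
    state = np.tanh(x @ W_in + state @ W_rec)     # new state mixes old state with new input
    output = state @ W_out
    print(f"step {t}: output = {np.round(output, 2)}")
</pre>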
  
 
==Transcript==
 
  
[Cueball Prime holds a canoe paddle at his side and stands on top of a "big pile of linear algebra" containing a funnel labeled "data" and a box labeled "answers". Cueball II stands to the left side of the panel.]
  
 
Cueball II: <i>This</i> is your machine learning system?
 
  
 
{{comic discussion}}
 
 
[[Category:Multiple Cueballs]]
 
[[Category:Artificial Intelligence]]
 
[[Category:Math]]
 
[[Category:Comics featuring Cueball]]
 
