1838: Machine Learning
==Explanation==
{{incomplete|Work in progress. Current explanation probably contains too much jargon, making it more complex than the comic it is trying to explain.}}

This comic compares machine learning to composting.

Machine learning is a method used to automate complex tasks. It usually involves creating algorithms that apply statistical analysis and pattern recognition to data in order to generate output. The validity or accuracy of that output can then be fed back to adjust the system, usually making future results statistically better.

Composting is the process of taking organic matter, such as food and yard waste, and allowing it to decompose into a form that serves as fertilizer. A common method of composting is to mound the organic matter in a pile with a certain amount of moisture, and then "stir" the pile occasionally to move the less-decomposed material from the top to the interior of the pile, where it will decompose faster.

In this comic, Cueball explains his machine learning system to a Cueball-like guy. It consists of a pile of mathematical functions with an input funnel (labelled "data") at one end and an output box (labelled "answers") at the other. Cueball himself appears to be a functional part of this system, as he stands atop the pile stirring it with a paddle.

Data is poured into the funnel, passes through a mess of linear algebra, and comes out as answers. The main joke is that, despite this description being vague and giving no intuition about or details of the system, it is close to the level of understanding most machine learning experts have of the most popular class of techniques in machine learning, namely deep learning with neural networks. ''(Why the reference to neural networks here? They are non-linear. A better example is support vector machines.)''

''One of the most popular paradigms of machine learning is supervised learning, where a function mapping an input to an output is learned from several input-output pairs, e.g. a function mapping images of faces to people's names, learned from a dataset of static labelled images. Classic machine learning techniques like regression, or logistic regression, have understandable parameters and provable algorithms, but require significant engineering in the pre-processing step and don't perform very well on data like images or natural text. Deep learning techniques, on the other hand, require very little pre-processing, but require the data to be run through several steps of linear algebra, where essentially in each step the output of the previous step is multiplied by a matrix and sent to the next step. This multi-step process has proven very successful for image and text data, but the structure of the parameters, arranged as a matrix for each step, allows for very little interpretation, and can only be described as "data going through a pile of linear algebra".''

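The repeated matrix-multiplication step described above can be sketched in a few lines of plain Python. This is only an illustration of the idea: the two 2×2 "layers" and the input vector are arbitrary made-up values, not trained weights.

```python
# A minimal sketch of the "pile of linear algebra": each step multiplies
# the previous step's output by a matrix. The matrices are arbitrary
# illustrative values, not trained weights.

def matvec(matrix, vector):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

def forward(layers, data):
    """Send the data through each layer of the pile in turn."""
    for matrix in layers:
        data = matvec(matrix, data)
    return data

# Two 2x2 "layers" and a 2-element input vector.
pile = [
    [[1.0, 0.0], [0.0, 1.0]],  # identity layer: leaves the data unchanged
    [[2.0, 0.0], [0.0, 2.0]],  # scaling layer: doubles each component
]
print(forward(pile, [3.0, 4.0]))  # -> [6.0, 8.0]
```

A real deep network also applies a non-linear function between the matrix steps; that part is omitted here for brevity.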
The method of training such deep neural networks is via gradient descent, which can be viewed as "stirring the pile of linear algebra until the answers start looking right".

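The "stirring" can be illustrated with a toy gradient-descent loop in plain Python. The function name `stir`, the learning rate, and the data below are all made up for the example; it fits a single parameter w in the model y = w·x.

```python
# Toy gradient descent: "stir" a single parameter w until the answers
# start looking right. The data follows y = 3 * x, so w should approach 3.

def stir(pairs, w=0.0, learning_rate=0.01, steps=1000):
    for _ in range(steps):
        # Gradient of the squared error sum((w*x - y)**2) with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in pairs)
        w -= learning_rate * grad  # nudge w against the gradient
    return w

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # pairs generated by y = 3 * x
print(round(stir(data), 3))  # -> 3.0
```

Each pass nudges w in the direction that reduces the error; unlike this one-parameter toy, a deep network stirs millions of matrix entries at once.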
The title text refers to recurrent neural networks, which are a useful class of deep neural networks for dealing with sequence data like speech or text.

This comic satirizes machine learning, more specifically neural networks. In its most basic form, a neural network takes data and results and strengthens connections that give the right answer and weakens ones that don't, until the results "look right". Neural networks are extremely data-dependent, and make remarkably few guarantees when compared to most other computing techniques, thus the joke.

Cueball's machine learning system is probably very inefficient, as he is integral to both the mechanical part (repeated stirring) and the learning part (making the answers look "right").

''Recently, other forms of neural networks, such as LSTMs, feed old sequence data back into the network with some delay, making the network recurrent. The title text calls this the pile "getting mushy". The title text may also be a pun based on how Cueball is going through the data: instead of using a shovel, he is using a canoe paddle. Canoes can be used on rivers, and rivers by definition have currents. Thus, recurrent data could, in this situation, mean data treated as if it were part of a river.''

==Transcript==
[Cueball, holding a canoe paddle at his side, is standing on top of a "big pile of linear algebra" containing a funnel labeled "data" and a box labeled "answers" while talking to a Cueball-like person to the left (from the reader's perspective).]
Guy: <i>This</i> is your machine learning system?
Cueball: Yup! You pour the data into this big pile of linear algebra, then collect the answers on the other side.
Guy: What if the answers are wrong?
Cueball: Just stir the pile until they start looking right.
{{comic discussion}}