Talk:2318: Dynamic Entropy

Revision as of 09:08, 11 June 2020 by IByte (talk | contribs) (Entropy again)

Can confirm, have never lost an argument. Dynamic Entropy (talk) 00:45, 11 June 2020 (UTC)

Allegrini, P., Douglas, J. F., & Glotzer, S. C. (1999). Dynamic entropy as a measure of caging and persistent particle motion in supercooled liquids. Physical Review E, 60(5), 5714, doi: 10.1103/PhysRevE.60.5714.
Asadi, M., Ebrahimi, N., Hamedani, G., & Soofi, E. (2004). Maximum Dynamic Entropy Models. Journal of Applied Probability, 41(2), 379-390. Retrieved June 11, 2020, from
Green, J. R., Costa, A. B., Grzybowski, B. A., & Szleifer, I. (2013). Relationship between dynamical entropy and energy dissipation far from thermodynamic equilibrium. Proceedings of the National Academy of Sciences, 110(41), 16339-16343.
S. Satpathy et al., "An All-Digital Unified Static/Dynamic Entropy Generator Featuring Self-Calibrating Hierarchical Von Neumann Extraction for Secure Privacy-Preserving Mutual Authentication in IoT Mote Platforms," 2018 IEEE Symposium on VLSI Circuits, Honolulu, HI, 2018, pp. 169-170, doi: 10.1109/VLSIC.2018.8502369.
Bugstomper (talk) 01:28, 11 June 2020 (UTC)
Can someone with knowledge of the reference system in a wiki make the references appear above the discussion, maybe in a section named References? --Kynde (talk) 07:06, 11 June 2020 (UTC)

Well bugger me (METAPHOR! METAPHOR!), but my current Master's thesis in Computer Science could use that term without much shoehorning. (tl;dr: Binary search trees that adapt, =dynamic, can serve a query series faster than static ones, and the gain depends on the structure of the query series, =entropy. I prefer the good old "instance optimality", though...) 08:58, 11 June 2020 (UTC)
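The idea above can be sketched without a full splay tree. This is not from the thesis; it is a minimal illustration (my own) using a self-adjusting move-to-front list instead of a tree: a "dynamic" structure beats a fixed one exactly when the query series is skewed (low entropy), while on a uniform series there is little to gain.

```python
# Minimal sketch (not from the thesis): a self-adjusting ("dynamic")
# move-to-front list versus a fixed ("static") list. The adaptive
# structure wins on a skewed, low-entropy query series.

def static_cost(order, queries):
    """Total comparisons when the list order never changes."""
    pos = {key: i + 1 for i, key in enumerate(order)}
    return sum(pos[q] for q in queries)

def mtf_cost(order, queries):
    """Total comparisons when each accessed key moves to the front."""
    lst = list(order)
    total = 0
    for q in queries:
        i = lst.index(q)
        total += i + 1
        lst.insert(0, lst.pop(i))
    return total

keys = list(range(10))
skewed = [9] * 90 + [0] * 10           # low-entropy query series
print(static_cost(keys, skewed))       # 910: key 9 always costs 10
print(mtf_cost(keys, skewed))          # 110: key 9 soon costs 1
```

On a uniformly random (high-entropy) series, both strategies average about the same cost, which is the "gain depends on the entropy of the query series" point.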

This seems to tie in with the recent comic 2315: Eventual Consistency, which is also about entropy (in a thermodynamic sense), but I guess that like the rest of the world I don't know what entropy really is, because if entropy is a measure of how "surprising" a variable is, why is everything being flat and spread out evenly called a state of maximum entropy? Everything being the same doesn't sound very surprising to me... --IByte (talk) 09:08, 11 June 2020 (UTC)
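The puzzle above has a concrete answer in the information-theoretic sense: Shannon entropy is the *average* surprisal, -Σ p·log₂(p). A flat distribution has no single surprising outcome, but every outcome is equally unpredictable, so the average surprise is maximal; a peaked distribution almost always produces the expected outcome, so its average surprise is tiny. A small check (mine, not from the comic):

```python
# Shannon entropy = expected surprisal -sum(p * log2(p)).
# Flat distributions maximize it; peaked ones minimize it.
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4                 # flat, spread out evenly
peaked = [0.97, 0.01, 0.01, 0.01]    # almost always the same outcome

print(entropy(uniform))  # 2.0 bits: the maximum for 4 outcomes
print(entropy(peaked))   # ~0.24 bits: you are hardly ever surprised
```

So "flat" is maximally entropic not because any one outcome is surprising, but because nothing about the next outcome can be predicted.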