30 Matching Annotations
  1. Dec 2023
  2. Sep 2023
  3. May 2023
    1. Entropy is not a property of the string you got, but of the strings you could have obtained instead. In other words, it qualifies the process by which the string was generated.
  4. Apr 2023
    1. the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes
  5. Feb 2023
    1. Suppose that we were asked to arrange the following in two categories—distance, mass, electric force, entropy, beauty, melody. I think there are the strongest grounds for placing entropy alongside beauty and melody and not with the first three.
  6. Jun 2022
    1. I very much appreciate your commitment to growth and learning. I also think it's nice to have colorful posts here vs. a ghost town. My feedback would be to gear your posts towards how to use an Antinet to produce written output. Specifically what main note you created, with pictures of the main note, and then elaborate on what they actually mean, and share a written post about the idea. You've done several of these posts, and I'd say lean even more towards sharing the most powerful thought/the most powerful maincard you've developed all week. For frequency, I'd say one post a week on this would be great.

      My main point is this: the primary use of Luhmann's Antinet was written output. The thoughts he shared were deep and developed because of the Antinet process. We're not in the PKM space, we're in the AKD space. Analog Knowledge Development, focusing on written output. The paradox is, when changing your mindset to written output, you actually become more of a learning machine.

      One of the toughest parts about these systems is that while they're relatively easy to outline (evidence: the thousands of 500-1000 word blog posts about zettelkasten in the last 3 years), they're tougher to practice and many people have slight variations on the idea (from Eminem's "Stacking Ammo" to Luhmann's (still incomplete) digital collection). Far fewer people are sticking with it beyond a few weeks or doing it for crazy reasons (I call it #ProductivityPorn, while Scott has the colorful phrase "bubble graph boys").

      For those who visit here, seeing discrete cards and ideas, videos, or examples of how others have done this practice can be immensely helpful. While it can be boring to watch a video of someone reading and taking notes by hand, it can also be incredibly useful to see exactly what they're doing and how they're doing it (though God bless you for speeding them up 😅).

      This is also part of why I share examples of how others have practiced these techniques too. Seeing discrete examples to imitate is far easier than trying to innovate your way into these methods, particularly when it's difficult to see the acceleration effects of serendipity that come several months or years into the process. Plus it's fun to see how Vladimir Nabokov, Anne Lamott, Gottfried Wilhelm Leibniz, Bob Hope, Michael Ende, Twyla Tharp, Roland Barthes, Kate Grenville, Marcel Mauss, Claude Lévi-Strauss, Joan Rivers, Umberto Eco, Georg Christoph Lichtenberg, Raymond Llull, George Carlin, John Locke, and Eminem all did variations of this for themselves. (This last sentence has so much entropy in it, I'm certain that it's never been written before in the history of humanity.)

      And isn't everyone tired to death of Luhmann, Luhmann, Luhmann? You'd think that no one had ever thought to take note of anything before?!

      While my own approach is a hybrid of online and offline techniques, I've gotten long emails from people following my Hypothes.is feed of notes and annotations saying what a useful extended example it is. Of course they don't see the follow up that entails revision of the notes or additional linking, tagging, and indexing that may go on, but it's at least enough of an idea that they understand the start of the practice.

      (Incidentally, I wrote most of this using a few cards from my own system. 🗃️✂️🖋️)

    2. William James’s self-assessment: “I am no lover of disorder, but fear to lose truth by the pretension to possess it entirely.”
  7. Mar 2022
  8. Nov 2021
    1. "In the Zettelkasten, there is a note that contains the argument that disproves all assertions on all other notes. But this note disappears once you open the Zettelkasten. That is, it changes its number and relocates itself, making it impossible to find. A joker."

      Ha! A great meta card to have in one's system!

    2. Now that we're digitizing the Zettelkasten we often find dated notes that say things like "note 60,7B3 is missing". This note replaces the original note at this position. We often find that the original note is maybe only 20, 30 notes away, put back in the wrong position. But Luhmann did not start looking, because where should he look? How far would he have to go to maybe find it again? So, instead he adds the "note is missing"-note. Should he bump into the original note by chance, then he could put it back in its original position. Or else, not.

      Niklas Luhmann had a simple way of dealing with lost cards: creating empty replacements which could be swapped out if the originals were found later. It's not too dissimilar to doing inventory in a book store where mischievous customers pick up books, move them, or even hide them sections away. Going through the system occasionally, or even regularly and systematically, would eventually turn up lost or misfiled cards unless they had been removed from the system entirely (similar to stolen books).

  9. Oct 2021
    1. "My expectation is that we will hear many, many nice speeches, we will hear many pledges that - if you really look into the details - are more or less meaningless but they just say them in order to have something to say, in order for media to have something to report about," she said. "And then I expect things to continue to remain the same. ... The COPs as they are now will not lead to anything unless there is big, massive pressure from the outside."

      Greta Thunberg on COP26

      In which Greta calls bullshit on the capitalist entropy machine’s attempts to spin the culture of learned helplessness, trained incapacities, and bureaucratic intransigence that is designed to maintain the status quo while pretending to be the world’s saviours through philanthropy, social entrepreneurship, and greenwashing.

      via Twitter

  10. Feb 2021
    1. Person - definition of entropy

      • Clausius - entropy from Greek "Transformation"
      • Leon Cooper (1968) - lost heat
      • dictionaries - unavailable energy
      • Arieh Ben-Naim - missing information (or uncertainty)
    2. The main purpose of this book is to go one step forward, not only to use the principle of maximum entropy in predicting probability distributions, but to replace altogether the concept of entropy with the more suitable concept of information, or better yet, the missing information (MI).

      The purpose of this textbook

    3. Levine, R. D. and Tribus, M. (eds.) (1979), The Maximum Entropy Principle, MIT Press, Cambridge, MA.

      Book on statistical thermodynamics that uses information theory, mentioned in Chapter 1.

  11. Oct 2020
    1. Passive transport

      The process of moving ions and other atomic/molecular substances across cell membranes without an input of energy. Instead, it relies on the system's tendency toward increasing entropy: substances simply diffuse down their concentration gradients.
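      The equalizing tendency can be sketched with a toy simulation (my own illustration, not from the annotated source): particles hop randomly across a membrane, and concentrations even out without any energy input, simply because each particle is equally likely to be the next one to move.

```python
import random

def diffuse(left, right, steps, seed=0):
    """Move one randomly chosen particle across the membrane each step."""
    rng = random.Random(seed)
    for _ in range(steps):
        # A particle on the crowded side is more likely to be picked,
        # so net flow runs down the concentration gradient for free.
        if rng.randrange(left + right) < left:
            left, right = left - 1, right + 1
        else:
            left, right = left + 1, right - 1
    return left, right

print(diffuse(1000, 0, 20000))  # roughly (500, 500): the gradient dissipates
```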

  12. Aug 2020
    1. Bernard Stiegler analyzes the relationship between knowledge and technology, starting from Derrida and Heidegger. Much of it strikes me as a kind of parallel undertaking to Latour, in whose work I have so far never found a reference to Stiegler. Stiegler, too, is concerned with explaining why states and economies fail to respond to the climate crisis. If I see it correctly, he connects the concept of différance with a kind of bio-economic approach. Also noteworthy are his critique of marketing and media and, more generally, his attempt to relate habits of thought to the mechanisms of neoliberal capitalism.

  13. Jun 2020
    1. To get a feel for how much pseudo-random data is available in your entropy pool, you can run this command:

      $ cat /proc/sys/kernel/random/entropy_avail
      2684

      The number shown appears to represent the number of bits of entropy that have been collected. Even 2,684 might not seem like much in a world in which we routinely speak in terms of terabytes, but numbers above 100 are said to be a good sign.
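      The same check can be scripted; here is a small Python sketch of the command above (my own illustration; the path and its meaning are as described in the quoted passage, and the file exists only on Linux):

```python
def parse_entropy(text):
    """Parse the single decimal count the kernel file contains."""
    return int(text.strip())

def read_entropy_avail(path="/proc/sys/kernel/random/entropy_avail"):
    """Return the kernel's current entropy estimate in bits, or None where the file is absent."""
    try:
        with open(path) as f:
            return parse_entropy(f.read())
    except (OSError, ValueError):
        return None

bits = read_entropy_avail()
if bits is not None:
    print(f"entropy pool: {bits} bits")  # e.g. 2684, as in the quoted output
```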
  14. Feb 2019
    1. Maximum Entropy Generators for Energy-Based Models

      [The GAN model from an energy perspective] This article is directly inspired by the Bengio team's new paper "Maximum Entropy Generators for Energy-Based Models". The author gives a clear and intuitive energy-based picture of GANs/WGANs, discusses how the discriminator (the energy function) trains and what strategies help, and points out a very elegant and intuitive energy interpretation of the gradient penalty. The article also discusses the choice of optimizer for GANs. http://t.cn/EcBIwqJ

  15. Dec 2018
    1. A Probe into Understanding GAN and VAE models

      The paper proposes a VAE-GAN model, though, as the authors themselves note, likely because of limited GPU resources, the image quality is not very satisfactory, and it uses an FCN rather than a CNN; entropy is mainly used to quantitatively evaluate the generated output.

    2. In essence, for a change to occur, you must apply more energy to the system than is extracted by the system.
  16. Oct 2018
    1. entropic

      This is what Orrin Edgar Klapp meant when he wrote in his 1986 Overload and Boredom: Essays on the Quality of Life in the Information Society that “meaning and interest are found mostly in the mid-range between extremes of redundancy and variety-these extremes being called, respectively, banality and noise” (). Redundancy is repetition of the same, which creates a condition of insufficient difference, while noise is the chaos of non-referentiality, or entropy. In a way, these extremes collapse into each other, in that both can be viewed “as a loss of potential … for a certain line of action at least” ().

      There is perhaps something of "the real" here, as well. Volker Woltersdorff (2012, 134) writes that: The law of increasing entropy is a concept of energy in the natural sciences that assumes the tendency of all systems to eventually reach their lowest level of energy. Organic systems therefore tend toward inertia … Freud identifies the death drive with entropy … within his theory, the economy of the death drive is to release tension."

      Adam Phillips clarifies the death drive: “People are not, Freud seems to be saying, the saboteurs of their own lives, acting against their own best interests; they are simply dying in their own fashion (to describe someone as self-destructive is to assume a knowledge of what is good for them, an omniscient knowledge of the ‘real’ logic of their lives)” (2000, 81, cf. 77).

  17. Sep 2018
    1. The concept of “extropy” was used to encapsulate the core values and goals of transhumanism. Intended not as a technical term opposed to entropy but instead as a metaphor, extropy was defined as “the extent of a living or organizational system's intelligence, functional order, vitality, and capacity and drive for improvement.”

      It's interesting that the author emphasizes that extropy is not meant as a technical opposite of entropy but as a metaphor. Wouldn't extropy be the opposite of entropy metaphorically as well, though? Scientific definition aside, entropy is popularly described as the universe's tendency toward chaos in every sense of the word: constant expansion and disorder. Extropy would then be the universe's tendency toward a 'singularity' (discussed in "The Technological Singularity"), essentially the exact opposite: the universe's tendency to follow "intelligence, functional order", etc. toward a single "posthuman" point where we've gone beyond human capability?

  18. Jun 2018
  19. Feb 2018
    1. It is made up of galaxies, celestial bodies, suns; in other words, it develops through organization at the same time as it is produced through disorganization. The biological world is a world that evolves.

      In fact, some studies suggest that organization is a good way to accelerate disorganization, particularly in the case of life (https://www.quantamagazine.org/a-new-thermodynamics-theory-of-the-origin-of-life-20140122/).

  20. Dec 2017
    1. Life conforms to neither of these conditions. Take self-reliance. Living systems exist far from the state known as thermodynamic equilibrium – instead of their energy spreading itself out over the widest possible space, it’s concentrated in specific areas and flows along defined pathways, such as the cardiovascular or nervous system. Such phenomena are very improbable, as far as fundamental physics is concerned. Maintaining this unusual arrangement requires constant activity, or metabolism, which in turn demands that organisms extract energy from their environment via eating, breathing, photosynthesis, and so on. That belies any pretensions of independence.

      Might this be the reason why life "goes against" entropy?

  21. Jan 2014
    1. An effective data management program would enable a user 20 years or longer in the future to discover, access, understand, and use particular data [3]. This primer summarizes the elements of a data management program that would satisfy this 20-year rule and are necessary to prevent data entropy.

      Who cares most about the 20-year rule? This is an ideal that appeals to some, but in practice even the most zealous adherents can't picture what this looks like in some concrete way, except in the most traditional ways: physical paper journals in libraries are tangible examples of the 20-year rule.

      Until we have a digital equivalent for data, I don't blame people looking for tenure or jobs for not caring about this ideal if we can't provide a clear picture of how to achieve it widely at an institutional level. For digital materials, I think the picture people have in their minds is of tape backup. Maybe this is generational? Perhaps only when new generations, not widely exposed to the cassette tapes, DVDs, and other physical media that "old people" remember, come along will it be possible to have a new ideal that people can see in their mind's eye.

    2. A key component of data management is the comprehensive description of the data and contextual information that future researchers need to understand and use the data. This description is particularly important because the natural tendency is for the information content of a data set or database to undergo entropy over time (i.e. data entropy), ultimately becoming meaningless to scientists and others [2].

      I agree with the key component mentioned here, but I feel the term data entropy is an unhelpful crutch.

    3. data entropy: Normal degradation in information content associated with data and metadata over time (paraphrased from [2]).

      I'm not sure what this really means, and I don't think data entropy is a helpful term. Poor practices certainly lead to disorganized collections of data, but I think this notion comes from a time when people were very concerned about the degradation of the physical media on which data is stored. That is, of course, still a concern, but the term data entropy lends itself as an excuse for people who don't use good practices to manage data. It masks the real problem, which is a kind of data illiteracy, much the way we also face widespread computational illiteracy in the sciences. Managing data really is hard, but let's not hide that behind fanciful notions like data entropy.