45 Matching Annotations
  1. Last 7 days
    1. Here by "learning" is meant understanding more, not remem­bering more information that has the same degree of intelli­gibility as other information you already possess.

      A definition of learning here. Is this the thing that's missing from my note above?

    2. The first sense is the one in which we speak of ourselves as reading newspapers, magazines, or anything else that, according to our skill and talents, is at once thoroughly intelligible to us. Such things may increase our store of information, but they cannot improve our understanding, for our understanding was equal to them before we started. Otherwise, we would have felt the shock of puzzlement and perplexity that comes from getting in over our depth; that is, if we were both alert and honest.

      Here they're comparing reading for information and reading for understanding.

      How do these two modes relate to Claude Shannon's notions of information (surprise) and semantics (the communication itself)? Are there other pieces we're not tacitly including here? It feels like there's another piece we're overlooking.

  2. Jul 2021
    1. These criteria – surprise, serendipity, information and inner complexity

      These criteria – surprise, serendipity, information and inner complexity – are the criteria any communication has to meet.

      An interesting thesis about communication. Note that Luhmann worked in general systems theory; I'm curious whether he worked in cybernetics as well.

    2. Irritation: basically, without surprise or disappointment there’s no information. Both partners have to be surprised in some way to say communication takes place.

      This is a basic tenet of information theory. Interesting to see it appear in a work on writing.
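
      A minimal sketch of that tenet, in code: Shannon's self-information of an outcome with probability p is −log₂ p bits, so an outcome that was certain in advance (no surprise) carries no information at all.

      ```python
      import math

      def surprisal(p: float) -> float:
          """Self-information of an outcome with probability p, in bits."""
          return -math.log2(p)

      print(surprisal(1.0))   # 0.0 bits: a certain outcome carries no information
      print(surprisal(0.5))   # 1.0 bit: a fair coin flip
      print(surprisal(0.01))  # ~6.64 bits: rare outcomes are the informative ones
      ```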

  3. May 2021
    1. These “Songline” stories are ancient, exhibit little variation over long periods of time, and are carefully learned and guarded by the Elders who are its custodians [7].

      What is the best way we could test and explore error correction and overwriting in such a system from an information theoretic standpoint?
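
      One toy way to start probing that question (all parameters here are hypothetical): model each retelling as a noisy channel, and reconciliation among several custodians as a repetition code with majority voting, then compare fidelity with and without that error correction.

      ```python
      import random

      random.seed(42)

      def retell(story, error_rate):
          """One generation of retelling, modeled as a binary symmetric channel."""
          return [1 - bit if random.random() < error_rate else bit for bit in story]

      def consensus(copies):
          """Majority vote among custodians -- in effect a repetition code."""
          return [round(sum(bits) / len(bits)) for bits in zip(*copies)]

      story = [random.randint(0, 1) for _ in range(1000)]  # stand-in for story content

      # A lone storyteller: errors accumulate generation after generation.
      solo = story
      for _ in range(100):
          solo = retell(solo, error_rate=0.01)

      # Three custodians who reconcile after every retelling.
      guarded = story
      for _ in range(100):
          guarded = consensus([retell(guarded, error_rate=0.01) for _ in range(3)])

      print(sum(a != b for a, b in zip(story, solo)))     # heavy drift
      print(sum(a != b for a, b in zip(story, guarded)))  # far better preserved
      ```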

  4. Apr 2021
    1. A reproduction of Carroll's notes on his number alphabet will be found in Warren Weaver's article "Lewis Carroll: Mathematician," in Scientific American for April 1956.)

      I need to track down this reference and would love to see what Weaver has to say about the matter.

      Certainly Weaver would have spoken of this with Claude Shannon (or Shannon would at least have read it).

  5. Mar 2021
    1. He introduces the idea of the apophatic: what we can't put into words, but is important and vaguely understood. This term comes from Orthodox theology, where people defined god by saying what it was not.

      Too often as humans we're focused on what is immediately in front of us and not what is missing.

      The same thing plagues our science: we publish only positive results, not negative ones.

      From an information-theoretic perspective, we're throwing away half (or more?) of the information we're generating, as the rough sketch below suggests. We might be able to go much further, much faster if we kept and published all of our results in better fashion.

      Is there a better word for this negative information? #openquestions
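
      A loose, back-of-the-envelope version of that claim, with purely illustrative numbers: if half of all well-run experiments come out negative, each honestly reported outcome carries a full bit, and publishing only the positives means half the generated bits never reach the literature.

      ```python
      import math

      def binary_entropy(p: float) -> float:
          """Bits of information carried by reporting one binary outcome."""
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      # Illustrative assumption: half of all well-run experiments come out negative.
      p_positive = 0.5
      bits_per_experiment = binary_entropy(p_positive)  # 1.0 bit at p = 0.5

      experiments = 1000
      published = experiments * p_positive              # only positives are written up
      lost = (experiments - published) * bits_per_experiment
      total = experiments * bits_per_experiment
      print(f"{lost:.0f} of {total:.0f} generated bits never reach the literature")
      ```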

  6. Feb 2021
    1. The main purpose of this book is to go one step forward, not only to use the principle of maximum entropy in predicting probability distributions, but to replace altogether the concept of entropy with the more suitable concept of information, or better yet, the missing information (MI).

      The purpose of this textbook
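
      To make that program concrete, here is a small sketch of the classic maximum-entropy exercise (Jaynes's Brandeis dice; the target mean of 4.5 is the traditional illustrative choice): among all distributions consistent with a constraint, pick the one that maximizes the missing information.

      ```python
      import math

      def maxent_die(target_mean, faces=range(1, 7)):
          """Maximum-entropy distribution over die faces with a fixed mean.

          The MaxEnt solution has the form p_i ∝ exp(lam * i); we find the
          Lagrange multiplier lam by bisection on the resulting mean."""
          def mean(lam):
              w = [math.exp(lam * f) for f in faces]
              return sum(f * wi for f, wi in zip(faces, w)) / sum(w)

          lo, hi = -10.0, 10.0
          for _ in range(100):
              mid = (lo + hi) / 2
              lo, hi = (mid, hi) if mean(mid) < target_mean else (lo, mid)
          w = [math.exp(lo * f) for f in faces]
          return [wi / sum(w) for wi in w]

      p = maxent_die(4.5)
      mi = -sum(pi * math.log2(pi) for pi in p)  # the missing information, in bits
      print([round(pi, 4) for pi in p])
      print(f"MI = {mi:.3f} bits (an unconstrained die would give {math.log2(6):.3f})")
      ```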

    2. There are also a few books on statistical thermodynamics that use information theory, such as those by Jaynes, Katz, and Tribus.

      Books on statistical thermodynamics that use information theory.

      Which textbook of Jaynes is he referring to?

    3. Levine, R. D. and Tribus, M. (eds) (1979), The Maximum Entropy Principle, MIT Press, Cambridge, MA.

      A book on statistical thermodynamics that uses information theory, mentioned in Chapter 1.

    4. Katz, A. (1967), Principles of Statistical Mechanics: The Information Theory Approach, W. H. Freeman, London.

      A book on statistical thermodynamics that uses information theory.

  7. Jan 2021
  8. Nov 2020
  9. Oct 2020
    1. The notion that counting more shapes in the sky will reveal more details of the Big Bang is implied in a central principle of quantum physics known as “unitarity.” Unitarity dictates that the probabilities of all possible quantum states of the universe must add up to one, now and forever; thus, information, which is stored in quantum states, can never be lost — only scrambled. This means that all information about the birth of the cosmos remains encoded in its present state, and the more precisely cosmologists know the latter, the more they can learn about the former.
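
      A minimal numerical illustration of the unitarity claim (a toy 8-state system standing in for the universe): a unitary map preserves total probability, and because it is invertible, the initial state can always be recovered from the present one; information is scrambled, never destroyed.

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # A random unitary acting on a toy 8-state system (QR of a complex matrix).
      m = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
      u, _ = np.linalg.qr(m)

      psi = rng.normal(size=8) + 1j * rng.normal(size=8)
      psi /= np.linalg.norm(psi)           # probabilities sum to one

      evolved = u @ psi
      print(np.sum(np.abs(evolved) ** 2))  # still 1 (up to floating point)

      recovered = u.conj().T @ evolved     # U is invertible: nothing is lost,
      print(np.allclose(recovered, psi))   # only scrambled -- True
      ```
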
    2. Social scientists, on the other hand, have focused on what ties are more likely to bring in new information, which are primarily weak ties (Granovetter 1973), and on why weak ties bring new information (because they bridge structural holes (Burt 2001), (Burt 2005)).
    3. Found reference to this in a review of Henry Quastler's book Information Theory in Biology.

      A more serious thing, in the reviewer's opinion, is the complete absence of contributions dealing with information theory and the central nervous system, which may be the field par excellence for the use of such a theory. Although no explicit reference to information theory is made in the well-known paper of W. McCulloch and W. Pitts (1943), the connection is quite obvious. This is made explicit in the systematic elaboration of the McCulloch-Pitts approach by J. von Neumann (1952). In his interesting book J. T. Culbertson (1950) discussed possible neural mechanisms for recognition of visual patterns, and particularly investigated the problems of how greatly a pattern may be deformed without ceasing to be recognizable. The connection between this problem and the problem of distortion in the theory of information is obvious. The work of Anatol Rapoport and his associates on random nets, and especially on their applications to rumor spread (see the series of papers which appeared in this Journal during the past four years), is also closely connected with problems of information theory.

      Electronic copy available at: http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf
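
      Since the review leans on McCulloch and Pitts (1943), a minimal sketch of their threshold unit may make the connection concrete: each neuron fires if and only if a weighted sum of its binary inputs reaches a threshold, which already suffices for Boolean logic.

      ```python
      def mcp_neuron(inputs, weights, threshold):
          """McCulloch-Pitts unit: fires (1) iff the weighted input sum
          reaches the threshold."""
          return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

      # Logic gates as single threshold units, in the spirit of the 1943 paper:
      AND = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=2)
      OR  = lambda a, b: mcp_neuron([a, b], [1, 1], threshold=1)
      NOT = lambda a:    mcp_neuron([a],    [-1],   threshold=0)

      for a in (0, 1):
          for b in (0, 1):
              print(a, b, AND(a, b), OR(a, b), NOT(a))
      ```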

    4. Similar to my recent musings (link coming soon) on the dualism of matter vs. information, I find that the real beauty may lie precisely in the complexity of their combination.

      There's a kernel of an idea hiding in here that I want to come back and revisit at a future date.

  10. Aug 2020
  11. Jul 2020
  12. Jun 2020
  13. May 2020
  14. Apr 2020
  15. Nov 2019
    1. The most interesting examples have been the weird ones (cf. HI7), where the language model has been trained on narrower, more colorful sets of texts, and then sparked with creative prompts. Archaeologist Shawn Graham, who is working on a book I’d like to preorder right now, An Enchantment of Digital Archaeology: Raising the Dead with Agent Based Models, Archaeogaming, and Artificial Intelligence, fed GPT-2 the works of the English Egyptologist Flinders Petrie (1853-1942) and then resurrected him at the command line for a conversation about his work. Robin Sloan had similar good fun this summer with a focus on fantasy quests, and helpfully documented how he did it.

      Circle back around and read this when it comes out.

      Similarly, these other references should be an interesting read as well.

  16. Sep 2019
    1. He is now intending to collaborate with Bourne on a series of articles about the find. “Having these annotations might allow us to identify further books that have been annotated by Milton,” he said. “This is evidence of how digital technology and the opening up of libraries [could] transform our knowledge of this period.”
    2. “Not only does this hand look like Milton’s, but it behaves like Milton’s writing elsewhere does, doing exactly the things Milton does when he annotates books, and using exactly the same marks,” said Dr Will Poole at New College Oxford.

      The discussion of the information-theoretic idea of "hand" is interesting here, particularly as it relates to the "hand" of annotation and how it was done in other settings by the same person.

  17. Apr 2019
    1. Digital sociology needs more big theory as well as testable theory.

      Here I might posit that César Hidalgo's book Why Information Grows (MIT, 2015) has some interesting theses about links between people and companies which could be extrapolated up to "societies of linked companies". What could we predict about how those will interact based on the underlying pieces? Is it possible that we would see other emergent complex behaviors?

  18. Mar 2019
    1. Engelbart insisted that effective intellectual augmentation was always realized within a system, and that any intervention intended to accelerate intellectual augmentation must be understood as an intervention in a system. And while at many points the 1962 report emphasizes the individual knowledge worker, there is also the idea of sharing the context of one’s work (an idea Vannevar Bush had also described in “As We May Think”), the foundation of Engelbart’s lifelong view that a crucial way to accelerate intellectual augmentation was to think together more comprehensively and effectively. One might even rewrite Engelbart’s words above to say, “We do not speak of isolated clever individuals with knowledge of particular domains. We refer to a way of life in an integrated society where poets, musicians, dreamers, and visionaries usefully co-exist with engineers, scientists, executives, and governmental leaders.” Make your own list.
  19. Jan 2019
    1. By examining information as a product of people’s contingent choices, rather than as an impartial recording of unchanging truths, the critically information-literate student develops an outlook toward information characterized by a robust sense of agency and a heightened concern for justice.

      It seems like there's still a transfer problem here, though. There seems to be an assertion that criticality will be inherently cross-domain, but I'm not clear why that should be true. Why would the critical outlook not remain domain-specific? (To say "if it does, then it isn't critical" seems like a tautology.)

  20. Nov 2018
    1. I had begun to think of social movements’ abilities in terms of “capacities”—like the muscles one develops while exercising but could be used for other purposes like carrying groceries or walking long distances—and their repertoire of protest, like marches, rallies, and occupations as “signals” of those capacities.

      I find it interesting that she's using words from information theory like "capacities" and "signals" here. It reminds me of the thesis of César Hidalgo's Why Information Grows and his ideas about links. While new modes of communication may make links within the social milieu easier to break, what most protesters won't grasp, or won't have the time and patience for, is the recreation of new links to build new institutions for rule. As seen in many war-torn countries, this is the most difficult part. Campaigning is easy; governing is much harder.

      As an example: the US government's breaking of the links within Iraq's military and police forces after the war made the recovery process far more difficult, because all those links within the social hierarchy and political landscape proved harder to reconstruct.

    2. Understanding Individual Neuron Importance Using Information Theory

      I also came across this paper a few days ago and immediately bookmarked and downloaded it!

      Working within information theory, it discusses mutual information, classification efficiency, and related quantities and their effects inside the network.

    3. Understanding Convolutional Neural Network Training with Information Theory

      A paper to read carefully: understanding CNN training from an information-theoretic point of view.
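
      A rough sketch of the general approach in both papers as I read them (synthetic data and a crude histogram estimator here; the papers use more careful machinery): score a neuron's importance by the mutual information between its activation and the class label.

      ```python
      import numpy as np

      def mutual_information(activations, labels, bins=10):
          """Histogram estimate of I(activation; label) in bits."""
          a = np.digitize(activations, np.histogram_bin_edges(activations, bins))
          joint = np.zeros((bins + 2, int(labels.max()) + 1))
          for ai, li in zip(a, labels):
              joint[ai, li] += 1
          joint /= joint.sum()
          pa = joint.sum(axis=1, keepdims=True)   # marginal over activations
          pl = joint.sum(axis=0, keepdims=True)   # marginal over labels
          nz = joint > 0
          return float((joint[nz] * np.log2(joint[nz] / (pa @ pl)[nz])).sum())

      # Hypothetical data: a neuron whose activation tracks the label is "important".
      rng = np.random.default_rng(1)
      labels = rng.integers(0, 2, 5000)
      informative = labels + rng.normal(0, 0.5, 5000)  # correlates with the label
      noise = rng.normal(0, 1, 5000)                   # carries no label information
      print(mutual_information(informative, labels))   # well above zero
      print(mutual_information(noise, labels))         # near zero
      ```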

  21. Sep 2018