24 Matching Annotations
  1. Last 7 days
    1. From this perspective, GPT-2 says less about artificial intelligence and more about how human intelligence is constantly looking for, and accepting of, stereotypical narrative genres, and how our mind always wants to make sense of any text it encounters, no matter how odd. Reflecting on that process can be the source of helpful self-awareness—about our past and present views and inclinations—and also, some significant enjoyment as our minds spin stories well beyond the thrown-together words on a page or screen.

And it's not just happening with text; it also happens with speech, as I've written before in Complexity isn’t a Vice: 10 Word Answers and Doubletalk in Election 2016. In that case, looking at the transcripts actually helps reveal that the emperor had no clothes: so much is missing from the speech that the text can't fill in the gaps the way the live delivery did.

    2. The most interesting examples have been the weird ones (cf. HI7), where the language model has been trained on narrower, more colorful sets of texts, and then sparked with creative prompts. Archaeologist Shawn Graham, who is working on a book I’d like to preorder right now, An Enchantment of Digital Archaeology: Raising the Dead with Agent Based Models, Archaeogaming, and Artificial Intelligence, fed GPT-2 the works of the English Egyptologist Flinders Petrie (1853-1942) and then resurrected him at the command line for a conversation about his work. Robin Sloan had similar good fun this summer with a focus on fantasy quests, and helpfully documented how he did it.

      Circle back around and read this when it comes out.

These other references should be an interesting read as well.

    3. For those not familiar with GPT-2, it is, according to its creators OpenAI (a socially conscious artificial intelligence lab overseen by a nonprofit entity), “a large-scale unsupervised language model which generates coherent paragraphs of text.” Think of it as a computer that has consumed so much text that it’s very good at figuring out which words are likely to follow other words, and when strung together, these words create fairly coherent sentences and paragraphs that are plausible continuations of any initial (or “seed”) text.

This isn't a very difficult problem, and its underpinnings are well laid out by John R. Pierce in An Introduction to Information Theory: Symbols, Signals and Noise. In it, he offers a lot of interesting tidbits about language and structure from an engineering perspective, including the reason why crossword puzzles work. A toy next-word model along these lines is sketched below.

      close reading, distant reading, corpus linguistics
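A minimal sketch of the next-word idea, assuming nothing more than a plain-text corpus: a toy bigram (Markov chain) generator in the spirit of Pierce's word-by-word approximations of English, not GPT-2 itself. The file name and function names are mine for illustration.

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Count, for each word, which words follow it and how often."""
    words = text.split()
    following = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def generate(model, seed, length=20):
    """Continue a seed word by repeatedly sampling a likely next word."""
    word = seed
    output = [word]
    for _ in range(length):
        candidates = model.get(word)
        if not candidates:
            break
        choices, weights = zip(*candidates.items())
        word = random.choices(choices, weights=weights)[0]
        output.append(word)
    return " ".join(output)

# Feed it any plain-text corpus (hypothetical file name), then "seed" it.
corpus = open("corpus.txt").read()
model = build_bigram_model(corpus)
print(generate(model, seed="the"))
```

GPT-2 replaces these raw counts with a large neural network conditioned on much longer contexts, but the basic loop is the same: predict a distribution over next words, pick one, and repeat.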

  2. Sep 2019
    1. He is now intending to collaborate with Bourne on a series of articles about the find. “Having these annotations might allow us to identify further books that have been annotated by Milton,” he said. “This is evidence of how digital technology and the opening up of libraries [could] transform our knowledge of this period.”
  3. Aug 2019
  4. languagedev.wikispaces.com
    1. Loban's

      kids who are in a kindergarten program that provides good language input have better later vocabularies, more complex sentences, higher reading and writing competencies

    2. Children who have had frequent storybook interactions with a wide variety of types of texts (or genres) will develop an awareness of how language is used in each type of discourse

      importance of reading to kids so that they are exposed to more than just one type of language input

    3. meaningful phrases.

      syntactic knowledge = being able to create meaningful sentences/phrases

    4. Prosodic features in a language represent the way something is said

      intonation/inflection involved here

    5. Phonological knowledge refers to knowledge about sound-symbol relations in a language. A phoneme is the smallest linguistic unit of sound, which is combined with other phonemes to form words. Phonemes consist of sounds that are considered to be a single perceptual unit by a listener, such as th

      phonological knowledge = different from phonetics.

    6. It forms the foundation of our perceptions, communication, and daily interactions.

      Interesting question: does language shape reality or does reality shape language?

    7. attention to language as communication rather than a focus on speech production and the development of articulation. This approach recognizes that language is a medium of communication with others

      this is important, and I relate this to my studies as a linguistics minor: language/studying language is not necessarily about speaking correctly, but about how we communicate naturally

  5. Jul 2019
  6. Apr 2019
    1. Digital sociology needs more big theory as well as testable theory.

      I can't help but think here about the application of digital technology to large bodies of literature in the creation of the field of corpus linguistics.

      If traditional sociology means anything, then a digital incarnation of it should create physical, trackable traces that can potentially be studied more easily as a result. In the same way that Mark Dredze has been able to look at Twitter data to analyze public health questions like influenza, we should be able to more easily quantify sociological phenomena in aggregate by looking at larger and richer data sets of online interactions.

      There's also likely some value in studying the quantities of digital exhaust that companies like Google, Amazon, Facebook, etc. are using for surveillance capitalism.

  7. Nov 2018
  8. Oct 2018
    1. It was the schoolteacher and writer Anne Fisher whose English primer of 1745 began the notion that it's somehow bad to use they in the plural and that he stands for both men and women.
  9. Aug 2018
    1. He wants to iron out differences, not protect them. He suggests measures like a mandatory national-service requirement and a more meaningful path to citizenship for immigrants.

      What if we look at the shrinking number of languages as a microcosm of identity? Are people forced to lose language? Do they not care? What are the other similarities and differences?

      Cross reference: https://boffosocko.com/2015/06/08/a-world-of-languages-and-how-many-speak-them-infographic/

  10. Feb 2018
  11. Sep 2017
    1. “It is also important to note that what we are doing now is in some ways fulfilling a number of longstanding principles that other presidents have always talked about.”

      Neomi Rao, the newly confirmed administrator of the White House Office of Information and Regulatory Affairs, attempts here to renounce personal ownership of the deregulation effort, instead framing the current move as the continuation of an existing motion present in previous leadership. She attempts to ensure the rational salience of deregulation through this logic of a theoretical continuum.

  12. Mar 2017
    1. This implies that there is no such thing as a code – organon of iterability – which could be structurally secret.

      This is interesting in light of current examples of undeciphered writing, such as the Voynich Manuscript and the Beale cipher, since it implies they're all crackable so long as they are not nonsense. The following sentence feels important to that point: that languages are constituted as an iterable network, a sustained internal logic.

  13. Feb 2017
    1. which they may be most readily put together

      It's interesting how natively some of these things come to us, even though the order of language is based mostly on arbitrary cues, as demonstrated by how other languages don't follow these rules. It reminds me of Blair, who concluded that there had to be some sense of taste because he "knew," and his audience "knew," that they had to have some means that made them better than the foreigners.

  14. Jan 2017
    1. Many people implicitly or explicitly use this cognitive outsourcing model to think about augmentation. It's commonly used in press accounts, for instance. It is also, I believe, a common way for programmers to think about augmentation. In this essay, we've seen a different way of thinking about augmentation. Rather than just solving problems expressed in terms we already understand, the goal is to change the thoughts we can think:

      Good distinctions here. Cf. also what happens when one begins to master the heptapod language in "Story of Your Life." It's Whorf-Sapir, but a "soft" Whorf-Sapir. So I'd say, anyhow. Relevant too that Engelbart discusses Whorf-Sapir.

  15. Dec 2016
    1. The team on Google Translate has developed a neural network that can translate language pairs for which it has not been directly trained. "For example, if the neural network has been taught to translate between English and Japanese, and English and Korean, it can also translate between Japanese and Korean without first going through English."
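      This isn't Google's internal system, but the same no-pivot idea can be tried with an openly released many-to-many model. Below is a rough sketch, assuming the Hugging Face transformers library and the facebook/m2m100_418M checkpoint (my choices for illustration), that translates Japanese to Korean directly without going through English. M2M100 is trained on many language directions at once, so it isn't strictly the zero-shot case described in the article, but it shows the same direct-pair behavior.

```python
# Rough illustration only: an open many-to-many model (M2M100), not the
# Google Translate system described above. Translates ja -> ko directly,
# with no English pivot step.
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

japanese_text = "今日は天気がいいですね。"  # "The weather is nice today."

tokenizer.src_lang = "ja"  # tell the tokenizer the source language
encoded = tokenizer(japanese_text, return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("ko"),  # force Korean output
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```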

  16. Dec 2015
    1. The Book of Human Emotions, Tiffany Watt Smith

      Emotions are not just biological, but cultural. Different societies have unique concepts for combinations of feelings in particular circumstances.

      If you know a word for an emotion, you can more easily recognize it, control it -- and perhaps feel it more intensely.

      Emotions, and how they are valued, also vary across time as well as space. Sadness was valued in Renaissance Europe: people felt it made you closer to God. Today we value happiness, and we may value it too much. Emodiversity is the idea that feeling a wide range of emotions is good for you mentally and physically.

  17. Nov 2013