  1. Oct 2020
    1. Complexity, interconnectivity, novelty, and creation are beyond any single entity's ability to forecast effectively.
  2. Sep 2020
    1. Blokland, I. V. van, Lanting, P., Ori, A. P., Vonk, J. M., Warmerdam, R. C., Herkert, J. C., Boulogne, F., Claringbould, A., Lopera-Maya, E. A., Bartels, M., Hottenga, J.-J., Ganna, A., Karjalainen, J., Lifelines COVID-19 Cohort Study, The COVID-19 Host Genetics Initiative, Hayward, C., Fawns-Ritchie, C., Campbell, A., Porteous, D., … Franke, L. H. (2020). Using symptom-based case predictions to identify host genetic factors that contribute to COVID-19 susceptibility. medRxiv, 2020.08.21.20177246. https://doi.org/10.1101/2020.08.21.20177246

  3. Aug 2020
    1. Menni, C., Valdes, A. M., Freidin, M. B., Sudre, C. H., Nguyen, L. H., Drew, D. A., … & Visconti, A. (2020). Real-time tracking of self-reported symptoms to predict potential COVID-19. Nature Medicine, 1–4.

    1. Wittgenstein writes: “The limits of my language mean the limits of my world”. Maybe he was trying to make a restrictive statement, one about how we can’t know the world beyond our language. But the reverse is also true; language and the world have the same boundaries. Learn language really well, and you understand reality. God is One, and His Name is One, and God is One with His Name. “Become good at predicting language” sounds like the same sort of innocent task as “become good at Go” or “become good at Starcraft”. But learning about language involves learning about reality, and prediction is the golden key. “Become good at predicting language” turns out to be a blank check, a license to learn every pattern it can.

      Because language is an isomorphic mapping of the world, learning to predict language means learning to predict the patterns that occur in the world.
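
      One standard way to cash this out (Shannon's source-coding identity, noted here for context rather than taken from the annotated text): a model $q$ can encode text drawn from the true distribution $p$ at an average cost of

      $$H(p, q) = H(p) + D_{\mathrm{KL}}(p \parallel q)$$

      bits per token, so getting better at prediction is exactly shrinking the $D_{\mathrm{KL}}$ gap: every pattern of the world that shows up in text and that the model misses is paid for in extra bits.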

    1. In “the eggplant is a fruit,” probably what is meant is that all eggplants are fruits. In “the dog is a Samoyed,” probably what is meant is that some dog is a Samoyed. We can reasonably assume these meanings from our background understanding of their topics. This knowledge is nowhere in the sentence. The meaning depends on its parts—but not only on them.

      It's common in speech coding (e.g., in a vocoder) to rely on a decoder that reconstructs the 'meaning' of a signal by predicting its 'full' representation from a compact, lossy one.

      This act of prediction is then also a form of compression: by predicting the full representation from its lossy analogue, you need less bandwidth to transmit messages, just as if you'd used a non-stochastic compression technique.
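
      A minimal sketch of that predict-and-send-the-error idea (a fixed first-order linear predictor on a toy signal; the coefficient, names, and signal are illustrative, not any particular codec):

      ```python
      import numpy as np

      A = 0.95  # illustrative predictor coefficient, not from any real codec

      def encode(signal, a=A):
          """Transmit only the error of a first-order linear predictor."""
          residual = np.empty_like(signal)
          prev = 0.0
          for n, x in enumerate(signal):
              residual[n] = x - a * prev  # error vs. the prediction a * x[n-1]
              prev = x
          return residual

      def decode(residual, a=A):
          """Run the same predictor and add the error back: exact reconstruction."""
          signal = np.empty_like(residual)
          prev = 0.0
          for n, e in enumerate(residual):
              signal[n] = e + a * prev
              prev = signal[n]
          return signal

      # A strongly autocorrelated, 'speech-like' toy signal: easy to predict.
      t = np.linspace(0.0, 1.0, 8000)
      x = np.sin(2 * np.pi * 5.0 * t)

      e = encode(x)
      assert np.allclose(decode(e), x)  # lossless round trip
      print(f"signal var {x.var():.4f} vs residual var {e.var():.6f}")
      ```

      Because both ends run the same predictor, only the low-variance residual has to cross the channel; a stochastic predictor (e.g. a language model feeding an arithmetic coder) plays the same role, with each token's surprisal as the 'error' being paid for.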

  4. Jul 2020
  5. Jun 2020
    1. Saltelli, A., Bammer, G., Bruno, I., Charters, E., Di Fiore, M., Didier, E., Nelson Espeland, W., Kay, J., Lo Piano, S., Mayo, D., Pielke Jr, R., Portaluri, T., Porter, T. M., Puy, A., Rafols, I., Ravetz, J. R., Reinert, E., Sarewitz, D., Stark, P. B., … Vineis, P. (2020). Five ways to ensure that models serve society: A manifesto. Nature, 582(7813), 482–484. https://doi.org/10.1038/d41586-020-01812-9