91 Matching Annotations
  1. Apr 2024
    1. reproducibility, which emphasizes transparency of data analysis and the logical path to scientific conclusions

      Patil et al. (2016) state that "everyone agrees that scientific studies should be reproducible and replicable. The problem is almost no one agrees upon what those terms mean. A major initiative in psychology used the term 'reproducibility' to refer to completely re-doing experiments including data collection (1). In cancer biology 'reproducibility' has been used to refer to the re-calculation of results using a fixed set of data and code." (p. 1)

      A possible approach to statistical reproducibility is to re-do experiments repeatedly, gathering all the available data to reach the findings, while emphasising transparency in the analysis so that the conclusions are accurate.

      References

      Patil, P., Peng, R. D., & Leek, J. T. (2016). A statistical definition for reproducibility and replicability. bioRxiv. https://www.biorxiv.org/content/10.1101/066803v1.full.pdf

    1. an organisation, i.e. a set of specialised systems coordinated to work collectively with the same end in view.

      organization: specialized systems carrying out concerted actions toward a stated goal or purpose

      Note that even if some members of the organization create statistical noise in their actions, the larger organization may still move in a useful direction.

  2. Oct 2023
    1. Just by changing something, the desire often gets fulfilled.
    2. New ideas can come along during the process, too. And a film isn’t finished until it’s finished, so you’re always on guard. Sometimes those happy accidents occur. They may even be the last pieces of the puzzle that allow it all to come together. And you feel so thankful: How in the world did this happen?
  3. Sep 2023
    1. Recent work has revealed several new and significant aspects of the dynamics of theory change. First, statistical information, information about the probabilistic contingencies between events, plays a particularly important role in theory-formation both in science and in childhood. In the last fifteen years we’ve discovered the power of early statistical learning.

      The data of the past are congruent with the psychological trends facing today's education system. Developmentalists have charted how children construct and revise intuitive theories. In turn, a variety of theories have developed because of the greater use of statistical information about probabilistic contingencies, which better informs us of causal models and their distinctive cognitive functions. These studies investigate the physical, psychological, and social domains. In the case of intuitive psychology, or "theory of mind," developmentalists have traced a progression from an early understanding of emotion and action, to an understanding of intentions and simple aspects of perception, to an understanding of knowledge vs. ignorance, and finally to a representational and then an interpretive theory of mind.

      The mechanisms by which life evolved—from chemical beginnings to cognizing human beings—are central to understanding the psychological basis of learning. We are the product of an evolutionary process and it is the mechanisms inherent in this process that offer the most probable explanations to how we think and learn.

      Bada, S. O. (2015). Constructivism Learning Theory: A Paradigm for Teaching and Learning.

  4. Jul 2023
    1. weakly informative approach to Bayesian analysis

      In [[Richard McElreath]]'s [[Statistical Rethinking]], he defines [[weakly informative priors]] (aka [[regularizing priors]]) as

      priors that gently nudge the machine [which] usually improve inference. Such priors are sometimes called regularizing or weakly informative priors. They are so useful that non-Bayesian statistical procedures have adopted a mathematically equivalent approach, [[penalized likelihood]]. (p. 35, 1st ed.)

    1. Science is not described by the falsification standard, as Popper recognized and argued.⁴ In fact, deductive falsification is impossible in nearly every scientific context. In this section, I review two reasons for this impossibility. (1) Hypotheses are not models. The relations among hypotheses and different kinds of models are complex. Many models correspond to the same hypothesis, and many hypotheses correspond to a single model. This makes strict falsification impossible. (2) Measurement matters. Even when we think the data falsify a model, another observer will debate our methods and measures. They don’t trust the data. Sometimes they are right. For both of these reasons, deductive falsification never works. The scientific method cannot be reduced to a statistical procedure, and so our statistical methods should not pretend.

      Seems consistent with how Popper used the terms [[falsification]] and [[falsifiability]] noted here

    2. Statistical Rethinking: A Bayesian Course with Examples in R and Stan, by Richard McElreath

      A companion book to [[Richard McElreath]]'s phenomenal lecture course [[Statistical Rethinking]] which he made freely available here.

      Note that this is the 1st ed. of the book (2015).

      source

    3. Statisticians, for their part, can derive pleasure from scolding scientists, which just makes the psychological battle worse.

      Note to self: don't do this.

    4. So where do priors come from? They are engineering assumptions, chosen to help the machine learn. The flat prior in Figure 2.5 is very common, but it is hardly ever the best prior. You’ll see later in the book that priors that gently nudge the machine usually improve inference. Such priors are sometimes called regularizing or weakly informative priors. They are so useful that non-Bayesian statistical procedures have adopted a mathematically equivalent approach, penalized likelihood. These priors are conservative, in that they tend to guard against inferring strong associations between variables.

      p. 35 where [[Richard McElreath]] defines [[weakly informative priors]] aka [[regularizing priors]] in [[Bayesian statistics]]. Notes that non-Bayesian methods have a mathematically equivalent approach called [[penalized likelihood]].
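
      A minimal sketch of the equivalence McElreath alludes to, assuming a single coefficient \(\beta\) with a Gaussian prior (the symbols \(\sigma\) and \(\lambda\) here are illustrative, not from the book): the MAP estimate under \(\beta \sim \mathcal{N}(0, \sigma^2)\) maximizes

      \[ \log p(y \mid \beta) - \frac{\beta^2}{2\sigma^2} + \text{const}, \]

      which is the same optimization as minimizing the penalized negative log-likelihood

      \[ -\log p(y \mid \beta) + \lambda \beta^2, \qquad \lambda = \frac{1}{2\sigma^2}. \]

      A tighter (more regularizing) prior means a smaller \(\sigma\) and hence a larger penalty \(\lambda\); with a Gaussian likelihood this is exactly ridge regression.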

    5. The other imagines instead that population size fluctuates through time, which can be true even when there is no selective difference among alleles.

      McElreath is referring to \(\text{P}_{0\text{B}}\) (process model zero-B).

    6. one assumes the population size and structure have been constant long enough for the distribution of alleles to reach a steady state

      The population size & structure being "constant" is what [[Richard McElreath]] means by "equilibrium" in \(\text{P}_{0\text{A}}\) (process model zero-A), which corresponds to the null hypothesis

      \(\text{H}_0: \text{"Evolution is neutral"}\)

    7. Andrew Gelman’s

      Per Andrew Gelman's wiki:

      Andrew Eric Gelman (born February 11, 1965) is an American statistician and professor of statistics and political science at Columbia University.

      Gelman received bachelor of science degrees in mathematics and in physics from MIT, where he was a National Merit Scholar, in 1986. He then received a master of science in 1987 and a doctor of philosophy in 1990, both in statistics from Harvard University, under the supervision of Donald Rubin.[1][2][3]

  5. Mar 2023
    1. Detailed descriptions, assumptions, limitations and test cases of many popular statistical methods for ecological research can be found in the GUSTAME server (Buttigieg and Ramette, 2014), and in the review by Paliy and Shankar (2016).
  6. Feb 2023
    1. First, I am a big fan of Chris’ posts. He is our best historian. Second, I did not challenge his ideas but asked for clarification about some terms which I believe are of general interest. Chris is well-positioned to answer my questions. Third, statistical mechanics is more about microscopic systems that do not evolve. As we know, ideas (from concepts to theories) evolve and generally emerge from previous ideas. Emergence is the key concept here. I suggested Phenomics as a potential metaphor because it represents well the emergence of some systems (phenotypes) from pre-existing ones (genotypes).

      reply to u/New-Investigator-623 at https://www.reddit.com/r/antinet/comments/10r6uwp/comment/j6wy4mf/?utm_source=reddit&utm_medium=web2x&context=3

      Ideas, concepts, propositions, et al. in this context are just the nebulous dictionary definitions. Their roots and modern usage have so much baggage now that attempting to separate them into more technical meanings is difficult unless you've got a solid reason to do so. I certainly don't here. If you want to go down some of the rabbit hole on the differences, you might appreciate Winston Perez' work on concept modeling which he outlines with respect to innovation and creativity here: https://www.youtube.com/watch?v=gGQ-dW7yfPc.

      I debated on a more basic framing of chemistry or microbiology versus statistical mechanics or even the closely related statistical thermodynamics, but for the analogy here, I think it works even if it may scare some off as "too hard". With about 20 linear feet of books in my library dedicated to biology, physics, math, and engineering, with a lot of direct focus on evolutionary theory, complexity theory, and information theory, I would suggest that the underlying physics of statistical mechanics and related thermodynamics is precisely what allows the conditions for systems to evolve and emerge, for this is exactly what biological (and other) systems have done. For those intrigued, perhaps Stuart Kauffman's Origins of Order (if you're technically minded) or At Home in the Universe (if you're less technically oriented) are interesting with respect to complexity and emergence. There's also an interesting similar analogy to be made between a zettelkasten system and the systems described in Peter Hoffmann's book Life's Ratchet. I think that if carefully circumscribed, one could define a zettelkasten to be "alive". That's a bigger thesis for another time.

      I was also trying to stay away from the broad idea of "atomic" and drawing attention to "atomic notes" as a concept. I'm still waiting for some bright physicist to talk about sub-atomic notes and what that might mean...

      I see where you're going with phenomics, but chemistry and statistical mechanics were already further afield than the intended audience, who already have issues with "The Two Cultures". Getting into phenomics was just a bridge too far... not to mention, vastly more difficult to attempt to draw(!!!). 😉 Besides, I didn't want Carol Greider dropping into my DMs asking me why I didn't include telomeres, or chancing an uncomfortable LAX-BWI flight and a train/cab ride into Baltimore with Peter Agre, who's popped up next to me on more than one occasion.

      Honestly, I was much less satisfied with the nebulousness of "solution of life"... fortunately no one seems to be complaining about that or their inability to grapple with catalysis. 🤷🏼

  7. Jan 2023
    1. When I create a new note, I write and link it as usual. Then I call up a saved search in The Archive via shortcut. I then go through the notes of my favorites and see if the fresh note is usable for one of my favorites. In doing so, I make an effort to find a connection. This effort trains my divergent thinking.

      Sascha Fast juxtaposes his new notes with his own favorite problems to see if they have any connections with respect to improving on or solving them.

      This practice is somewhat similar to Marshall Kirkpatrick's conceptualization of triangle thinking, but rather than being randomly generated with respect to each other, the new things are always generated toward important questions he's actively working on or toward.

      This helps to increase the chances of forward progress in specific areas rather than undirected, random progress.

  8. Dec 2022
    1. Aleatoric music (also aleatory music or chance music; from the Latin word alea, meaning "dice") is music in which some element of the composition is left to chance, and/or some primary element of a composed work's realization is left to the determination of its performer(s). The term is most often associated with procedures in which the chance element involves a relatively limited number of possibilities.
    1. Even here, though, schools’ performance is mediocre and unlikely to meaningfully improve. Schools have been trying to overcome reading, writing and math deficits among underperforming students for decades.

      I think the argument would improve greatly if statistics were included here. Since the claim that school performance is mediocre is a premise of the argument, data supporting it would strengthen the case considerably.

    2. My work focuses on tests of adult knowledge — what adults retain after graduation. The general pattern is that grown-ups have shockingly little academic knowledge. College graduates know about what you’d expect high school graduates to know; high school graduates know about what you’d expect dropouts to know; dropouts know next to nothing

      Week 11: In this paragraph the author mentions that their work consists of collecting data, and makes a claim based on that data, but does not provide the readers with it. I believe that the argument would be much stronger if the supporting data were shown.

    1. What greater education and skills allow an individual to do is to move further up in the overall queue of people looking to find a well-paying and rewarding job. However, because of the limited number of such jobs, only a set amount of people will be able to land such jobs. Consequently, one’s position in the queue can change as a result of human capital, but the same amount of people will still be stuck at the end of the line if the overall opportunities remain the same.

      There is a direct analogy to statistical mechanics and thermodynamics to be drawn here.

    2. One of the clear signs that the bottleneck to low-income adults working more results from their lack of opportunities is provided by looking at their hours of work over the business cycle. When the economy is strong and jobs are plentiful, low-income workers are more likely to find work, find work with higher pay, and be able to secure more hours of work than when the economy is weak. In 2000, when the economy was close to genuine full employment, the unemployment rate averaged 4.0 percent and the poverty rate was 11.3 percent; but in 2010, in the aftermath of the Great Recession, the unemployment rate averaged 9.6 percent and the poverty rate was almost 15.1 percent. What changed in those years was not poor families’ attitudes toward work but simply the availability of jobs. Among the bottom one-fifth of nonelderly households, hours worked per household were about 40 percent higher in the tight labor market of 2000 than in recession-plagued 2010. Given the opportunity for work or additional work hours, low-income Americans work more. A full-employment agenda that increases opportunities in the labor market, alongside stronger labor standards such as a higher minimum wage, reduces poverty.

      How can we frame the science of poverty with respect to the model of statistical mechanics?

      Unemployment numbers have very little to do with levels of poverty. They definitely don't seem to be correlated with poverty levels, in fact perhaps inversely so. Many would say that people are lazy and don't want to work when the general reality is that they do want to work (for a variety of reasons including identity and self-esteem), but the amount of work they can find and the pay they receive for it are the bigger problems.

  9. Oct 2022
    1. He argued that God gazes over history in its totality and finds all periods equal.

      Leopold von Ranke's argument that God gazes over history and finds all periods equal is very similar to a framing of history from the viewpoint of statistical thermodynamics: it's all the same material floating around, it just takes different states at different times.

      link to: https://hyp.is/jqug2tNlEeyg2JfEczmepw/3stages.org/c/gq_title.cgi?list=1045&ti=Foucault%27s%20Pendulum%20(Eco)

  10. Sep 2022
    1. For millions of Americans who are living paycheck to paycheck and precariously close to the poverty line, normal life events like the birth of a child or temporary loss of a job can send them below the poverty line. But poverty spells tend to be short, and they are caused by the risk associated with normal events that happen to most of us across the life course. They are just more catastrophic for some than for others.

      Can poverty be modeled after a statistical thermodynamic framework? How might we move the set point for poverty up significantly to prevent the ill effects of regular, repeated poverty?

      What does the complexity of poverty indicate? Within the web of potential indicators, what might be done to vastly mitigate the movement of people in and out of poverty? What sorts of additional resiliency can be built into the system?

  11. Aug 2022
  12. May 2022
    1. scanned for solutions to long-standing problems in his reading, conversations, and everyday life. When he found one, he could make a connection that looked to others like a flash of unparalleled brilliance

      Feynman’s approach encouraged him to follow his interests wherever they might lead. He posed questions and constantly scanned for solutions to long-standing problems, as the highlighted passage describes.

      Creating strong and clever connections between disparate areas of knowledge can appear to others to be a flash of genius, in part because they didn't have the prior knowledge, nor did they put in the work of collecting, remembering, or juxtaposing it.

      This method may be one of the primary (or only) underpinnings supporting the lone genius myth. This is particularly the case when the underlying ideas were not ones fully developed by the originator. As an example, if Einstein had fully developed the ideas of space and time by himself and then put the two together as spacetime, he would have independently built two separate layers; but in reality, he cleverly juxtaposed two broadly pre-existing ideas and combined them in an intriguing new framing to come up with something new. Because he did this a few times over his life, he's viewed as an even bigger genius, but when we think about what he did and how, is it really genius or simply an underlying method that may have shaken out anyway by means of the statistical thermodynamics of people thinking, reading, communicating, and writing?

      Are there other techniques that also masquerade as genius like this, or is this one of the few/only?

      Link this to Feynman's mention that his writing is the actual thinking that appears on the pages of his notes. "It's the actual thinking."

    1. Her opposition to police and prison starts with the experiences of marginalized people, who have to deal with police and carceral violence every day.

      Although I know this from first-hand experience and the experience of family members, I think statistics on incarcerated people would benefit the article.

      If someone who has never researched this topic reads this, they wouldn't truly understand the gap between how often marginalized people are incarcerated compared with everyone else.

      Another useful statistic would be the number of people recently released early, or released because certain crimes were decriminalized, and which groups they belong to.

    2. Black people, 32 percent of the population in Chicago, account for 72 percent of police stops, according to ACLU of Illinois data.

      I believe this is a great use of statistics in an argument like this. It directly responds to the preceding statement and shows that even though Black people make up about a third of Chicago's population, they account for nearly three-quarters of police stops.

  13. Apr 2022
    1. Kai Kupferschmidt. (2021, December 1). @DirkBrockmann But these kinds of models do help put into context what it means when certain countries do or do not find the the variant. You can find a full explanation and a break-down of import risk in Europe by airport (and the people who did the work) here: Https://covid-19-mobility.org/reports/importrisk_omicron/ https://t.co/JXsYdmTnNP [Tweet]. @kakape. https://twitter.com/kakape/status/1466109304423993348

  14. Mar 2022
    1. In 1925, Ronald Fisher advanced the idea of statistical hypothesis testing, which he called "tests of significance", in his publication Statistical Methods for Research Workers.[28][29][30] Fisher suggested a probability of one in twenty (0.05) as a convenient cutoff level to reject the null hypothesis.[31] In a 1933 paper, Jerzy Neyman and Egon Pearson called this cutoff the significance level, which they named α. They recommended that α be set ahead of time, prior to any data collection.[31][32] Despite his initial suggestion of 0.05 as a significance level, Fisher did not intend this cutoff value to be fixed. In his 1956 publication Statistical Methods and Scientific Inference, he recommended that significance levels be set according to specific circumstances.[31]

      The lofty p = 0.05 is utter bullshit. It was just an arbitrary, made-up value with no real evidence behind it.
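
      For concreteness, a minimal sketch of how α works as a pre-registered cutoff in the Neyman–Pearson framing described above; the data, group sizes, and effect size are entirely made up for illustration:

      ```python
      import numpy as np
      from scipy.stats import ttest_ind

      alpha = 0.05  # chosen before any data are collected, per Neyman & Pearson

      # Hypothetical measurements for two groups (purely illustrative numbers).
      rng = np.random.default_rng(42)
      control = rng.normal(loc=0.0, scale=1.0, size=50)
      treatment = rng.normal(loc=0.3, scale=1.0, size=50)

      # Two-sample t-test; the decision rule only compares p to the pre-set alpha.
      result = ttest_ind(treatment, control)
      print(f"p = {result.pvalue:.3f}")
      print("reject H0" if result.pvalue < alpha else "fail to reject H0")
      ```

      Nothing in the arithmetic singles out 0.05 itself; any other pre-registered α would work the same way.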

    1. only 2.5 sigma

      That's about a 99.38% (one-tailed) confidence level, yet it's considered "weak". Would that we could do that in medicine or the social sciences.
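
      A quick sketch of where that figure comes from, assuming the usual one-tailed Gaussian tail-probability convention used in particle physics:

      ```python
      from scipy.stats import norm

      # One-tailed tail probability of a 2.5 sigma excess.
      p = norm.sf(2.5)
      print(f"p ≈ {p:.4f}")          # ≈ 0.0062
      print(f"1 - p ≈ {1 - p:.4f}")  # ≈ 0.9938, the 99.38% figure above

      # For comparison, the conventional 5 sigma "discovery" threshold.
      print(f"5 sigma: p ≈ {norm.sf(5.0):.2e}")  # ≈ 2.9e-07
      ```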

  15. Feb 2022
  16. Jan 2022
  17. Sep 2021
  18. Aug 2021
  19. May 2021
  20. Apr 2021
  21. Mar 2021
    1. There's an interesting suggestion associated with this, that periodic fasting causes autophagy, which Taleb claims is an evolutionary process by which the weaker proteins are broken down first. If this is true, then always having a full stomach is another way of subsidizing the unfit and weakening the organism.

      This will depend on a very specific and narrow definition of fitness--perhaps one from a very individualistic and libertarian perspective.

      There is fitness at the level of the gene, the organ, the individual, and the group, and even possibly larger groupings above that.

      What if, by starving out and leaving behind "uneducated" people like Srinivasa Ramanujan, for example, who surely was marginalized in his time, society is left without them? While on an individual level Ramanujan may have been less fit on some levels than G.H. Hardy and may have otherwise dwindled and disappeared, Hardy adopted him and made both mathematicians better while also making dramatic strides for mankind.

      From a statistical mechanics perspective, within some reasonable limits, we should be focusing on improving ourselves as well as the larger group(s) because the end results for humanity and life in general may be dramatically improved. (Though what we mean by improved here may be called into question from a definitional perspective.)

      Compare this with [[Malcolm Gladwell]]'s argument in My Little Hundred Million.

      On a nationalistic level within human politics, Republicans should be less reticent to help out marginalized Americans because it may be from this pool of potential that we find life-saving improvements or even protection from other polities (i.e., in our competition with, or threats from, countries like China, Iran, and North Korea). Consider how different things may have been had the U.S. not taken in Jewish or other foreign nationals like Albert Einstein, John von Neumann, etc. in the early to mid-1900s. Now consider which life-changing geniuses we may be preventing from reaching their potential with our current immigration policies, or our current educational policies.

  22. Feb 2021
    1. There are also a few books on statistical thermodynamics that use information theory such as those by Jaynes, Katz, and Tribus.

      Books on statistical thermodynamics that use information theory.

      Which textbook of Jaynes is he referring to?

    2. Levine, R. D. and Tribus, M. (eds) (1979), The Maximum Entropy Principle, MIT Press, Cambridge, MA.

      Book on statistical thermodynamics that uses information theory, mentioned in Chapter 1.

    3. Katz, A. (1967), Principles of Statistical Mechanics: The Informational Theory Approach, W. H. Freeman, London.

      Book on statistical thermodynamics that uses information theory.

  23. Jan 2021
  24. Oct 2020
    1. Come on, harvest me! I’ll just change your world some more.

      I wonder a bit here what, in a meme, might have a substrate-like effect that decreases the overall energy of the process and helps it take off.

    1. Problems of disorganized complexity are problems that can be described using averages and distributions, and that do not depend on the identity of the elements involved in a system, or their precise patterns of interactions. A classic example of a problem of disorganized complexity is the statistical mechanics of Ludwig Boltzmann, James-Clerk Maxwell, and Willard Gibbs, which focuses on the properties of gases.
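
      A toy illustration of the "averages and distributions, not identities" point; this is a hypothetical ideal-gas-style simulation in reduced units, not something from the quoted text:

      ```python
      import numpy as np

      rng = np.random.default_rng(0)

      # 10,000 "particles" with i.i.d. Gaussian velocity components (the
      # Maxwell–Boltzmann picture in reduced units where kT/m = 1).
      velocities = rng.normal(loc=0.0, scale=1.0, size=(10_000, 3))
      speeds = np.linalg.norm(velocities, axis=1)

      # The ensemble is summarized by averages and distributions ...
      print(f"mean speed ≈ {speeds.mean():.3f}")                # ~ sqrt(8/pi) ≈ 1.596
      print(f"rms speed  ≈ {np.sqrt((speeds**2).mean()):.3f}")  # ~ sqrt(3)    ≈ 1.732

      # ... and does not depend on which particle is which: permuting the
      # particles leaves every summary statistic unchanged.
      shuffled = rng.permutation(speeds)
      print(np.isclose(speeds.mean(), shuffled.mean()))  # True
      ```
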
  25. Aug 2020
  26. Jul 2020
    1. unless the model is embedded in a suitable structure that permits extrapolation, no useful inference is possible, either Bayesian or non-Bayesian.
  27. Jun 2020
  28. May 2020
  29. Mar 2020
    1. Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. —Steve Jobs (via lifehacker and Zettel no. 201308301352)

      In other words, it's just statistical thermodynamics. Eventually small pieces will float by each other and stick together in new and hopefully interesting ways. The more particles you've got and the more you can potentially connect or link things, the better off you'll be.

  30. Jan 2020
  31. Dec 2019
    1. Many people luck out like me, accidentally. We recognize what particular path to mastery we’re on, long after we actually get on it.

      Far too many people luck out this way, and we all perceive them as magically talented when in reality they're no better than the rest of us; they just had better circumstances or were in the right place at the right time.

  32. Nov 2018
    1. One instructor's use of Slack, comparing and contrasting it with other LMSs (he used Canvas); a good basic breakdown of the conversational tools and samples of how they can be used. This is a great primer on Slack's use in the classroom. (5/5)

  33. Oct 2018
  34. www.projectinfolit.org
    1. telephone interviews with 37 participants

      I have to wonder at telephone samples of this age group given the propensity of youth to not communicate via voice phone.

  35. May 2018
  36. Nov 2017
    1. pairwise overlaps using Fisher’s test and mutual exclusion (Leiserson, M.D.M., Reyna, M.A., and Raphael, B.J. (2016). A weighted exact test for mutually exclusive mutations in cancer. Bioinformatics 32: i736–i745)
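
      Leiserson et al.'s method is a weighted exact test; as a rough sketch of the simpler, unweighted version of the idea, here is Fisher's exact test applied to a made-up 2×2 co-mutation table (the counts are purely illustrative):

      ```python
      import numpy as np
      from scipy.stats import fisher_exact

      # Hypothetical contingency table for two genes across a cohort:
      # rows = gene A mutated / not mutated, columns = gene B mutated / not mutated.
      table = np.array([[12,  38],
                        [40, 210]])

      # Pairwise overlap (co-occurrence): more co-mutations than expected.
      _, p_co = fisher_exact(table, alternative="greater")

      # Mutual exclusivity: fewer co-mutations than expected.
      _, p_ex = fisher_exact(table, alternative="less")

      print(f"co-occurrence      p = {p_co:.3f}")
      print(f"mutual exclusivity p = {p_ex:.3f}")
      ```
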
  37. Sep 2017