897 Matching Annotations
  1. May 2023
    1. United States biomedical researchers and pharmaceutical companies are conducting, and paying African doctors to conduct, unethical and illegal testing on human subjects. Nonconsensual research on human subjects is an atrocity that was carried out in Tuskegee, Alabama, and in Guatemala for over forty years. Once such research was outlawed in the U.S., medical researchers began experimenting on thousands of human research subjects without their consent in Cameroon, Ghana, Namibia, Nigeria, Uganda, South Africa, Zimbabwe, and other African countries.

  2. Apr 2023
    1. Content that is borderline makes it into a designated Spam folder, where masochists can read through it if they choose. And legitimate companies that use spammy email marketing tactics are penalized, so they’re incentivized to be on their best behavior.

      I believe that seeing false news in the same light as spam is a better way to look at identifying the problem. This may help decrease some of the damage posed by false news for consumers who aren't well-versed in spotting disinformation on the internet. Most individuals can read an email in their inbox claiming that they won the million-dollar lottery and still classify it as spam because they recognize the all too familiar ploy. If we applied this approach to disinformation on social media, I believe it would help people become more familiar with the lies spread on the internet.
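      The spam-filter approach described above is, at its core, statistical text classification. Below is a minimal sketch of the classic word-frequency (naive Bayes) technique that early spam filters used; the training corpus and test messages are invented for illustration:

```python
import math
from collections import Counter

def train(messages):
    """Count word frequencies per class from (label, text) pairs."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for label, text in messages:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Naive Bayes with add-one smoothing; returns the more likely label."""
    vocab = set(counts["spam"]) | set(counts["ham"])
    scores = {}
    for label in ("spam", "ham"):
        score = math.log(totals[label] / sum(totals.values()))  # log prior
        n = sum(counts[label].values())
        for w in text.lower().split():
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Toy corpus (all messages here are made up)
corpus = [
    ("spam", "you won the million dollar lottery claim your prize"),
    ("spam", "claim your free prize now winner"),
    ("ham", "meeting moved to tuesday please confirm"),
    ("ham", "here are the notes from the call"),
]
counts, totals = train(corpus)
print(classify("congratulations you won a lottery prize", counts, totals))  # → spam
```

      The same machinery could in principle score posts instead of emails, which is what the "treat false news like spam" framing suggests.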

    1. Lying press (German: Lügenpresse, lit. 'press of lies') is a pejorative and disparaging political term used largely for the printed press and the mass media at large. It is used as an essential part of propaganda and is thus usually dishonest or at least not based on careful research.
    1. Now, I've made a number of documentaries about fake news. And what interests me is the first person to use the phrase mainstream media was Joseph Goebbels. And he, in one of his propaganda sheets, said “It's very important that you don't read the mainstream media because they'll tell you lies.” You must read the truth by the ramblings of his boss and his associated work. And you do have to watch this. This is a very, very well-established technique of fascists, is to tell you, don't read this stuff, read our stuff.<br /> —Ian Hislop, Editor, Private Eye Magazine 00:16:00, Satire in the Age of Murdoch and Trump, The Problem with Jon Stewart Podcast

    1. He said one upside for publishers was that audiences might soon find it harder to know what information to trust on the web, so “they’ll have to go to trusted sources.”

      That seems somewhat comically optimistic. Misinformation has spread rampantly online without the accelerant of AI.

  3. Mar 2023
    1. The sheer volume exceeds the possibilities of book publication; the complex, multidimensional structure of a networked information base cannot be reproduced in print; and, finally, the dynamics of a constantly growing and constantly-to-be-corrected body of material do not fit the rigid rhythm of book production, in which every expanded and corrected new edition entails enormous effort. A book publication could only ever offer a snapshot of such a database, reduced to a particular perspective. Even that can be very useful now and then, but it does not solve the problem of publishing the material as a whole.

      link to https://hypothes.is/a/U95jEs0eEe20EUesAtKcuA

      Is this phenomenon of "complex narratives" related to misinformation spread within the larger and more complex social network/online network? At small, local scales, people know how to handle data and information which is locally contextualized for them. On larger internet-scale communication social platforms this sort of contextualization breaks down.

      For lack of a better word, let's temporarily refer to it as "complex narratives" to get a handle on it.

    1. Title: Fox News producer files explosive lawsuits against the network, alleging she was coerced into providing misleading Dominion testimony

      // Notes:
      - This is an example of how big media corporations can deceive the public and compromise the truth.
      - It helps create a nation of misinformed people, which destabilizes political governance.
      - The workplace sounds toxic.
      - The undertone of this story: the pathological transformation of media brought about by capitalism.
      - It is the need for ratings, the indicator of profit in the marketing world, that has corrupted the responsibility to report truthfully.
      - Making money becomes the consumerist dream at the expense of all else of intrinsic value within a culture.
      - Knowledge is what enables culture to exist; modernity is based on cumulative cultural evolution.
      - This is an example of non-conscious, or pathological, cumulative cultural evolution.

  4. Feb 2023
      • Title: Faster than expected
      • Subtitle: Why most climate scientists can’t tell the truth (in public)
      • Author: Jackson Damien

      • This is a good article written from a psychotherapist's perspective,

      • examining the psychology behind why published, mainstream, peer reviewed climate change research is always dangerously lagging behind current research,
      • and recommending what interventions could be taken to remedy this
      • This sort of scientific misinformation, coming from scientists themselves,
      • gives minimizers and denialists the very ammunition they need to legitimise delay of the urgently needed system change.
      • What climate scientists say in public is far from what they believe in private.
      • For instance, many climate scientists don't believe 1.5 Deg. C target is plausible anymore, but don't say so in public.
      • That reticence is due to fear of violating accepted scientific social norms,
      • being labeled alarmist, and risking their jobs.
      • That creates a collective cognitive dissonance that acts as a feedback signal
      • for society to implement change at a dangerously slow pace
      • and to not spend the necessary resources to prepare for the harm already baked in.
      • The result of this cognitive dissonance is that
      • there is no collective sense of an emergency or a global wartime mobilisation scale of collective behaviour.
      • Our actions are not commensurate to the permanent emergency state we are now in.
      • The appropriate response that is suggested is for the entire climate science community to form a coalition that creates a new kind of peer reviewed publishing and reporting
      • that publicly responds to the current and live knowledge that is being discovered every day.
      • This is done from a planetary and permanent emergency perspective in order to eliminate the dangerous delays that create the wrong human collective behavioural responses.
    1. People know it’s bad but not how bad. This gap in understanding remains wide enough for denialists and minimisers to legitimise inadequate action under the camouflage of empty eco-jargon and false optimism. This gap allows nations, corporations and individuals to remain distracted by short-term crises, which, however serious, pale into insignificance compared with the unprecedented threat of climate change.
      • it is the conservative nature of science
      • to spend years validating claims.
      • Unfortunately, in a global emergency such as the one we now find ourselves in, we don’t have the luxury of a few years.
      • In the case of this wicked problem, we need to find a way to make major decisions based on uncertain but plausible data
      • The misinformation has the effect of causing society to set the wrong priorities, making things worse
    1. And it constitutes an important but overlooked signpost in the 20th-century history of information, as ‘facts’ fell out of fashion but big data became big business.

      Of course, the hardest problem in big data has turned out to be cleaning up messy and misleading data!

    2. Deutsch’s index and his ‘facts’, then, seemed to his students to embody a moral value in addition to epistemological utility.

      Beyond their epistemological utility do zettelkasten also "embody a moral value"? Jason Lustig argues that they may have in the teaching context of Gotthard Deutsch where the reliance on facts was of extreme importance for historical research.


      Some of this is also seen in Scott Scheper's religious framing of zettelkasten method though here the aim has a different focus.

    3. He tried to show that this ‘favorite topic’ of his, ‘insistence on exactness in chronological dates’, amounted to more than a trifling (Deutsch, 1915, 1905a). Deutsch compared such historical accuracy to that of a bookkeeper who might recall his ledger by memory. ‘People would look upon such an achievement’, he reflected, ‘as a freak, harmless, but of no particular value, in fact rather a waste of mental energy’ (Deutsch, 1916). However, he sought to show that these details mattered, no different from how ‘a difference in a ledger of one cent remains just as grievous as if it were a matter of $100,000’ (Deutsch, 1904a: 3).

      Interesting statement about how much memory matters, though it's missing some gravitas somehow.

      Is there more in the original source?

    4. ‘Anecdotes’, he concluded, ‘have their historic value, if properly tested’ – reflecting both his interest in details and also the need to ascertain whether they were true (Deutsch, 1905b).
    1. “It makes me feel like I need a disclaimer because I feel like it makes you seem unprofessional to have these weirdly spelled words in your captions,” she said, “especially for content that's supposed to be serious and medically inclined.”

      Where's the balance for professionalism with respect to dodging the algorithmic filters for serious health-related conversations online?

      link to: https://hypothes.is/a/uBq9HKqWEe22Jp_rjJ5tjQ

    1. One of the most well-documented shortcomings of large language models is that they can hallucinate. Because these models have no direct knowledge of the physical world, they're prone to conjuring up facts out of thin air. They often completely invent details about a subject, even when provided a great deal of context.
    1. Moran said the codes themselves may end up limiting the reach of misinformation. As they get more cryptic, they become harder to understand. If people are baffled by a unicorn emoji in a post about COVID-19, they might miss or dismiss the misinformation.
    2. "The coded language is effective in that it creates this sense of community," said Rachel Moran, a researcher who studies COVID-19 misinformation at the University of Washington. People who grasp that a unicorn emoji means "vaccination" and that "swimmers" are vaccinated people are part of an "in" group. They might identify with or trust misinformation more, said Moran, because it’s coming from someone who is also in that "in" group.

      A shared language and even more specifically a coded shared language can be used to create a sense of community or define an in group identity.
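      Mechanically, this coded language is a simple substitution over vocabulary, which is also why keyword-based moderation misses it until the codebook is known. A toy sketch of decoding such posts; the "swimmers" entry comes from the quoted article, while the rest of the mapping is hypothetical:

```python
# Hypothetical codebook of the kind described above; only "swimmers"
# (vaccinated people) is attested in the quoted article.
CODEBOOK = {
    "🦄": "vaccination",
    "swimmers": "vaccinated people",
    "dance": "vaccine",  # invented entry for illustration
}

def decode(post: str) -> str:
    """Replace known code words with their plain-language meaning."""
    for coded, plain in CODEBOOK.items():
        post = post.replace(coded, plain)
    return post

print(decode("the swimmers in my family had no issues"))
# → "the vaccinated people in my family had no issues"
```

      The catch, as Moran notes, is that the codebook keeps drifting, so any fixed mapping like this one goes stale quickly.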

  5. Jan 2023
    1. The uptake of mis- and disinformation is intertwined with the way our minds work. The large body of research on the psychological aspects of information manipulation explains why.

      In an article for Nature Review Psychology, Ullrich K. H. Ecker et al looked at the cognitive, social, and affective factors that lead people to form or even endorse misinformed views. Ironically enough, false beliefs generally arise through the same mechanisms that establish accurate beliefs. It is a mix of cognitive drivers like intuitive thinking and socio-affective drivers. When deciding what is true, people are often biased to believe in the validity of information and to trust their intuition instead of deliberating. Also, repetition increases belief in both misleading information and facts.

      Ecker, U.K.H., Lewandowsky, S., Cook, J. et al. (2022). The psychological drivers of misinformation belief and its resistance to correction.

      Going a step further, Álex Escolà-Gascón et al investigated the psychopathological profiles that characterise people prone to consuming misleading information. After running a number of tests on more than 1,400 volunteers, they concluded that people with high scores in schizotypy (a condition not too dissimilar from schizophrenia), paranoia, and histrionism (more commonly known as dramatic personality disorder) are more vulnerable to the negative effects of misleading information. People who do not detect misleading information also tend to be more anxious, suggestible, and vulnerable to strong emotions.

  6. Dec 2022
    1. Furthermore, our results add to the growing body of literature documenting—at least at this historical moment—the link between extreme right-wing ideology and misinformation8,14,24 (although, of course, factors other than ideology are also associated with misinformation sharing, such as polarization25 and inattention17,37).

      Misinformation exposure and extreme right-wing ideology appear associated in this report. Others find that it is partisanship that predicts susceptibility.

    2. We also find evidence of “falsehood echo chambers”, where users that are more often exposed to misinformation are more likely to follow a similar set of accounts and share from a similar set of domains. These results are interesting in the context of evidence that political echo chambers are not prevalent, as typically imagined.
    3. Aligned with prior work finding that people who identify as conservative consume15, believe24, and share more misinformation8,14,25, we also found a positive correlation between users’ misinformation-exposure scores and the extent to which they are estimated to be conservative ideologically (Fig. 2c; b = 0.747, 95% CI = [0.727,0.767] SE = 0.010, t (4332) = 73.855, p < 0.001), such that users estimated to be more conservative are more likely to follow the Twitter accounts of elites with higher fact-checking falsity scores. Critically, the relationship between misinformation-exposure score and quality of content shared is robust controlling for estimated ideology
    1. highlights the need for public health officials to disseminate information via multiple media channels to increase the chances of accessing vaccine resistant or hesitant individuals.
    2. Across the Irish and UK samples, similarities and differences emerged regarding those in the population who were more likely to be hesitant about, or resistant to, a vaccine for COVID-19. Three demographic factors were significantly associated with vaccine hesitance or resistance in both countries: sex, age, and income level. Compared to respondents accepting of a COVID-19 vaccine, women were more likely to be vaccine hesitant, a finding consistent with a number of studies identifying sex and gender-related differences in vaccine uptake and acceptance37,38. Younger age was also related to vaccine hesitance and resistance.
    3. There were no significant differences in levels of consumption and trust between the vaccine accepting and vaccine hesitant groups in the Irish sample. Compared to vaccine hesitant responders, vaccine resistant individuals consumed significantly less information about the pandemic from television and radio, and had significantly less trust in information disseminated from newspapers, television broadcasts, radio broadcasts, their doctor, other health care professionals, and government agencies.
    1. The Gish gallop /ˈɡɪʃ ˈɡæləp/ is a rhetorical technique in which a person in a debate attempts to overwhelm their opponent by providing an excessive number of arguments with no regard for the accuracy or strength of those arguments. In essence, it is prioritizing quantity of one's arguments at the expense of quality of said arguments. The term was coined in 1994 by anthropologist Eugenie Scott, who named it after American creationist Duane Gish and argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to another debating method called spreading, in which one person speaks extremely fast in an attempt to cause their opponent to fail to respond to all the arguments that have been raised.

      I'd always known this was a thing, but didn't have a word for it.

  7. Nov 2022
    1. Our familiarity with these elements makes the overall story seem plausible, even—or perhaps especially—when facts and evidence are in short supply.

      Storytelling tropes play into our System 1 heuristics and cognitive biases by riding on the coattails of familiar story plotlines we've come to know and trust.

      What are the ways out of this trap? Creating lists of tropes which should trigger our System 1 reactions to switch into System 2 thinking patterns? Can we train ourselves away from these types of misinformation?

    2. As part of the Election Integrity Partnership, my team at the Stanford Internet Observatory studies online rumors, and how they spread across the internet in real time.
    3. Something similar! Here it is: https://t.co/x1DPx9dm0P

      — Renee DiResta (@noUpside) November 26, 2022
    1. Lilienfeld, S. O., Sauvigné, K. C., Lynn, S. J., Cautin, R. L., Latzman, R. D., & Waldman, I. D. (2015). Fifty psychological and psychiatric terms to avoid: a list of inaccurate, misleading, misused, ambiguous, and logically confused words and phrases. Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2015.01100

    1. Trope, trope, trope, strung into a Gish Gallop.

      One of the issues we see in the Sunday morning news analysis shows (Meet the Press, Face the Nation, et al.) is that there is usually a large amount of context collapse mixed with lack of general knowledge about the topics at hand compounded with large doses of Gish Gallop and F.U.D. (fear, uncertainty, and doubt).

    2. a more nuanced view of context.

      Almost every new technology goes through a moral panic phase where the unknown is used to spawn potential backlashes against it. Generally these disappear with time and familiarity with the technology.

      Bicycles cause insanity, for example...

      Why do medicine and vaccines not follow more of this pattern? Is it a lack of science literacy in general which prevents them from becoming familiar for some?

    3. What if instead of addressing individual pieces of misinformation reactively, we instead discussed the underpinnings — preemptively?

      Perhaps we might more profitably undermine misinformation by dismantling the underlying tropes that underpin it?

    1. And that, perhaps, is what we might get to via prebunking. Not so much attempts to counter or fact-check misinfo on the internet, but defanging the tropes that underpin the most recurringly manipulative claims so that the public sees, recognizes, & thinks: 😬

      — Renee DiResta (@noUpside) June 19, 2021
    1. suspect evaded Colorado’s red flag gun law

      If you read lower in the article you'll see that the headline is a blatant lie.

      The Gov failed to prosecute a violent person, so AP spins it as if this guy "evaded" (which is an action).

      One can't evade a law that is never applied against them.

    1. Under President Joe Biden, the shifting focus on disinformation has continued. In January 2021, CISA replaced the Countering Foreign Influence Task force with the “Misinformation, Disinformation and Malinformation” team, which was created “to promote more flexibility to focus on general MDM.” By now, the scope of the effort had expanded beyond disinformation produced by foreign governments to include domestic versions. The MDM team, according to one CISA official quoted in the IG report, “counters all types of disinformation, to be responsive to current events.” Jen Easterly, Biden’s appointed director of CISA, swiftly made it clear that she would continue to shift resources in the agency to combat the spread of dangerous forms of information on social media.

      MDM == Misinformation, Disinformation, and Malinformation.

      These definitions from earlier in the article:

      * misinformation (false information spread unintentionally)
      * disinformation (false information spread intentionally)
      * malinformation (factual information shared, typically out of context, with harmful intent)

  8. Oct 2022
    1. Today, the people in politics who most often invoke the name of Jesus for their political causes tend to be the most merciless and judgmental, the most consumed by rage and fear and vengeance. They hate their enemies, and they seem to want to make more of them. They claim allegiance to the truth and yet they have embraced, even unwittingly, lies. They have inverted biblical ethics in the name of biblical ethics.
    1. The Internet offers voters in any country an easily accessible and streamlined way to obtain election information, news, and updates. On the other side of that coin lies the opportunity for anti-democratic actors to grow and professionalize digital manipulation campaigns. In the Philippines, a 2020 Oxford Internet Institute survey found that government agencies, politicians, CSOs, and political parties had all personally conducted or hired private firms to conduct digital manipulation campaigns.

      The fact that the Internet is accessible to virtually anyone today makes it one of the most exploitable tools for politicians. However, that in itself does not explain why it became a pivotal element of the recent elections. Misinformation is believed by many Filipinos because of their inability to fact-check information on the Internet. Normally, fact-checking should be something individuals can do as a byproduct of critical thinking skills honed through formal education. Unfortunately, the implementation of Philippine education is not conducive to the practice of critical thinking for a number of reasons, such as the lack of teachers and learning facilities. Moreover, socioeconomic factors prevent people from focusing on their education. Substandard education and an unstable economy may explain why, in addition to falling for misinformation, people choose to propagate it in exchange for financial stability, thus increasing the number of perpetrators that those with ulterior motives can use for their own ends.

    1. Edgerly noted that disinformation spreads through two ways: the use of technology and human nature. Click-based advertising, news aggregation, the process of viral spreading and the ease of creating and altering websites are factors considered under technology. “Facebook and Google prioritize giving people what they ‘want’ to see; advertising revenue (are) based on clicks, not quality,” Edgerly said. She noted that people have the tendency to share news and website links without even reading its content, only its headline. According to her, this perpetuates a phenomenon of viral spreading or easy sharing. There is also the case of human nature involved, where people are “most likely to believe” information that supports their identities and viewpoints, Edgerly cited. “Vivid, emotional information grabs attention (and) leads to more responses (such as) likes, comments, shares. Negative information grabs more attention than (the) positive and is better remembered,” she said. Edgerly added that people tend to believe in information that they see on a regular basis and those shared by their immediate families and friends.

      Spreading misinformation and disinformation is really easy in this day and age because of how accessible information is and how much of it there is on the web. This is explained precisely by Edgerly. Noted in this part of the article, there is a business for the spread of disinformation, particularly in our country. There are people who pay what we call online trolls to spread disinformation and capitalize on how “chronically online” Filipinos are, among many other factors (i.e., most Filipinos’ information illiteracy due to poverty and lack of educational attainment, how easy it is to interact with content we see online regardless of its authenticity, etc.). Disinformation also leads to misinformation through word of mouth. As stated by Edgerly in this article, “people tend to believe in information… shared by their immediate families and friends”; because of people’s human nature to trust the information shared by their loved ones, if one is not information literate, they will not question their newly received information. Lastly, it most certainly does not help that social media algorithms nowadays rely on what users interact with; the more a user interacts with certain information, the more social media platforms will feed them that information. It does not help because not all social media websites have fact-checkers, and users can freely spread disinformation if they choose to.

    1. "In 2013, we spread fake news in one of the provinces I was handling," he says, describing how he set up his client's opponent. "We got the top politician's cell phone number and photo-shopped it, then sent out a text message pretending to be him, saying he was looking for a mistress. Eventually, my client won."

      This statement from a man who claims to work for politicians as an internet troll and propagator of fake news was really striking, because it shows how fabricating something out of the blue can have a profound impact on elections, which are supposed to be a democratic process. Now more than ever, mudslinging in popular information spaces like social media can easily sway public opinion (or confirm it). We saw this during the election season, when Leni Robredo bore the brunt of outrageous rumors; one rumor I remember well was that Leni had apparently married an NPA member before and had a kid with him. It is tragic that misinformation and disinformation are not just a mere phenomenon anymore, but a fully blown industry that has a tight clutch on the decisions people make for the country, while also deeply affecting their values and beliefs.

    1. Trolls, in this context, are humans who hold accounts on social media platforms, more or less for one purpose: To generate comments that argue with people, insult and name-call other users and public figures, try to undermine the credibility of ideas they don’t like, and to intimidate individuals who post those ideas. And they support and advocate for fake news stories that they’re ideologically aligned with. They’re often pretty nasty in their comments. And that gets other, normal users, to be nasty, too.

      Not only are programmed accounts created but also troll accounts that propagate disinformation and spread fake news with the intent to cause havoc among people. In short, once they start with a malicious comment, some people will engage with it, which leads to more rage comments and disagreements. That is what they do: they trigger people to engage with their comments so that the comments spread further and produce more fake news. These troll accounts are usually prominent during elections; in the Philippines, some speculate that certain candidates ran troll farms just to spread fake news all over social media, which some people engage with.

    2. So, bots are computer algorithms (set of logic steps to complete a specific task) that work in online social network sites to execute tasks autonomously and repetitively. They simulate the behavior of human beings in a social network, interacting with other users, and sharing information and messages [1]–[3]. Because of the algorithms behind bots’ logic, bots can learn from reaction patterns how to respond to certain situations. That is, they possess artificial intelligence (AI). 

      In all honesty, since I don't usually dwell on technology and coding, I thought that when you say "bot" it is controlled by another user, like a real person; I never knew that it was programmed and created to learn the usual posting patterns of people, be it on Twitter, Facebook, or other social media platforms. I think it is important to properly understand how bots work to avoid misinformation and disinformation, most importantly during this time of prominent social media use.
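      The quoted definition, an algorithm that posts autonomously and repetitively, can be illustrated with a minimal offline sketch. There is no real platform API here: `post` just appends to a list standing in for a feed, and the templates are invented:

```python
import random

class ToyBot:
    """Simulates the repetitive, templated posting the article describes."""

    def __init__(self, templates, seed=0):
        self.templates = templates
        self.rng = random.Random(seed)
        self.feed = []  # stands in for a real platform's posting API

    def post(self, message):
        self.feed.append(message)

    def run(self, topics, n_posts):
        # Autonomously and repetitively fill templates with topics and post.
        for _ in range(n_posts):
            template = self.rng.choice(self.templates)
            topic = self.rng.choice(topics)
            self.post(template.format(topic=topic))

bot = ToyBot(["Wake up! The truth about {topic}!",
              "They are lying about {topic}..."])
bot.run(["the election", "vaccines"], n_posts=5)
for message in bot.feed:
    print(message)
```

      Real bots add the "learn from reaction patterns" layer the article mentions, but the skeleton is this same post loop running unattended.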

  9. Sep 2022
    1. On a volume basis, hydrogen is one of the least energy dense fuels. One liter of hydrogen contains only 25% of the energy of one liter of gasoline and only 20% of the energy of one liter of diesel fuel.

      It's the vapors of gasoline and diesel fuel that burn, so measuring fuel density based on the volume of the liquid for fuels that burn in vapor form, but not doing the same for hydrogen, is simply dishonest.

      They all burn as a vapor and store and transport well as a liquid. If you compare them all based on the gas volume when they are burned, Hydrogen is still by far the most energy dense.

      It's the capacity to store hydrogen that makes the other fuels fuel at all, so it's hard to improve on 100% hydrogen for doing that.
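      The quoted volumetric figures can be sanity-checked against standard lower-heating-value numbers. The MJ values below are approximate literature figures (and hydrogen "by volume" here means liquid hydrogen), so treat this as a rough check rather than authoritative data:

```python
# Approximate lower heating values per liter of liquid fuel (MJ/L)
ENERGY_PER_LITER = {
    "gasoline": 34.2,
    "diesel": 38.6,
    "liquid hydrogen": 8.5,
}
# Approximate lower heating values per kilogram (MJ/kg)
ENERGY_PER_KG = {
    "gasoline": 44.4,
    "diesel": 45.6,
    "hydrogen": 120.0,
}

h2 = ENERGY_PER_LITER["liquid hydrogen"]
for fuel in ("gasoline", "diesel"):
    ratio = h2 / ENERGY_PER_LITER[fuel]
    print(f"H2 vs {fuel} by liquid volume: {ratio:.0%}")
# Roughly 25% and 22%, close to the article's 25% / 20% figures.
print(f"H2 vs gasoline by mass: "
      f"{ENERGY_PER_KG['hydrogen'] / ENERGY_PER_KG['gasoline']:.1f}x")
```

      So both framings are defensible: by liquid volume hydrogen is the least dense of the three, while by mass it is by far the most dense, which is the point of contention in the note above.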

    1. Could the maintenance of these myths actually be useful for particularly powerful constituencies? Does the continuation of these myths serve a purpose or function for other segments of the American population? If so, who and what might that be?
    1. https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

      Good overview article of some of the psychology research behind misinformation in social media spaces including bots, AI, and the effects of cognitive bias.

      Probably worth mining the story for the journal articles and collecting/reading them.

    2. In a recent laboratory study, Robert Jagiello, also at Warwick, found that socially shared information not only bolsters our biases but also becomes more resilient to correction.
    3. Even our ability to detect online manipulation is affected by our political bias, though not symmetrically: Republican users are more likely to mistake bots promoting conservative ideas for humans, whereas Democrats are more likely to mistake conservative human users for bots.
    4. “Limited individual attention and online virality of low-quality information,” By Xiaoyan Qiu et al., in Nature Human Behaviour, Vol. 1, June 2017

      The upshot of this paper seems to be "information overload alone can explain why fake news can become viral."
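      That "information overload" mechanism can be sketched as a toy agent-based simulation: agents hold only a small screen of recent memes and either post something new or reshare from what little they can attend to. This is an invented illustration of the limited-attention idea, not the cited authors' actual model or parameters:

```python
import random
from collections import Counter

def simulate(n_agents=200, memory=5, steps=2000, p_new=0.3, seed=1):
    """Toy limited-attention model: agents hold only `memory` recent memes.
    Each step an agent either posts a brand-new meme of random quality or
    reshares one from its own screen; the share lands on a random follower."""
    rng = random.Random(seed)
    screens = [[] for _ in range(n_agents)]  # each agent's finite attention
    quality = {}                             # meme id -> quality in [0, 1]
    shares = Counter()                       # popularity (times posted/shared)
    next_id = 0
    for _ in range(steps):
        agent = rng.randrange(n_agents)
        if not screens[agent] or rng.random() < p_new:
            meme = next_id
            next_id += 1
            quality[meme] = rng.random()
        else:
            # Reshare with only a mild preference for higher-quality memes.
            memes = screens[agent]
            weights = [quality[m] + 0.1 for m in memes]
            meme = rng.choices(memes, weights)[0]
        shares[meme] += 1
        follower = rng.randrange(n_agents)
        # Overload: new arrivals push old memes off the finite screen.
        screens[follower] = (screens[follower] + [meme])[-memory:]
    return quality, shares

quality, shares = simulate()
print("most-shared memes (id, shares, quality):")
for meme, count in shares.most_common(5):
    print(meme, count, round(quality[meme], 2))
```

      Even with the mild quality preference, the finite screen means which memes go viral is largely a matter of timing and luck, which is the paper's "overload alone" point in miniature.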

  10. www.justine-haupt.com
    1. vaccines

      Vaccines are "administered primarily to prevent disease." https://www.britannica.com/science/vaccine

      Thus, a "vaccine" that does not actually prevent disease is, by definition, not a vaccine but marketing spin aimed at enriching the manufacturer's shareholders.

      We know that "vaccines" are not effective as vaccines because it is so common for people who are "fully vaxxed and boosted" to announce that they have become infected with the pathogen the "vaccine" is claimed to protect against.

      The highly scientific reaction is to respond "Well, just imagine how bad it could have been if they weren't vaccinated" which is an unfalsifiable unscientific claim.

  11. Aug 2022
    1. Kahne and Bowyer (2017) exposed thousands of young people in California to some true messages and some false ones, similar to the memes they may see on social media
    2. Many U.S.educators believe that increasing political polarization combine with the hazards ofmisinformation and disinformation in ways that underscore the need for learners to acquire theknowledge and skills required to navigate a changing media landscape (Hamilton et al. 2020a)
  12. Jul 2022
    1. An

      Find common ground. Clear away the kindling. Provide context...don't de-platform.

    2. You have three options:

       1. Continue fighting fires with hordes of firefighters (in this analogy, fact-checkers).
       2. Focus on the arsonists (the people spreading the misinformation) by alerting the town they're the ones starting the fire (banning or labeling them).
       3. Clear the kindling and dry brush (teach people to spot lies, think critically, and ask questions).

       Right now, we do a lot of #1. We do a little bit of #2. We do almost none of #3, which is probably the most important and the most difficult. I'd propose three strategies for addressing misinformation by teaching people to ask questions and spot lies.
    3. Simply put, the threat of "misinformation" being spread at scale is not novel or unique to our generation—and trying to slow the advances of information sharing is futile and counter-productive.
    4. It’s worth reiterating precisely why: The very premise of science is to create a hypothesis, put that hypothesis up to scrutiny, pit good ideas against bad ones, and continue to test what you think you know over and over and over. That’s how we discovered tectonic plates and germs and key features of the universe. And oftentimes, it’s how we learn from great public experiments, like realizing that maybe paper or reusable bags are preferable to plastic.

      develop a hypothesis, and pit different ideas against one another

    5. All of these approaches tend to be built on an assumption that misinformation is something that can and should be censored. On the contrary, misinformation is a troubling but necessary part of our political discourse. Attempts to eliminate it carry far greater risks than attempts to navigate it, and trying to weed out what some committee or authority considers "misinformation" would almost certainly restrict our continued understanding of the world around us.
    6. To start, it is worth defining “misinformation”: Simply put, misinformation is “incorrect or misleading information.” This is slightly different from “disinformation,” which is “false information deliberately and often covertly spread (by the planting of rumors) in order to influence public opinion or obscure the truth.” The notable difference is that disinformation is always deliberate.
  13. Apr 2022
    1. Michael Armstrong [@ArmstrongGN]. (2021, September 29). NBA player says he doesn’t need vaccine… 40-thousand likes and 1.4 million views. Scientist/doctor corrects NBA player… 4-thousand likes. We’re so screwed… [Tweet]. Twitter. https://twitter.com/ArmstrongGN/status/1443052037160251392

    1. Dr. Deepti Gurdasani [@dgurdasani1]. (2021, October 30). A very disturbing read on the recent JCVI minutes released. They seem to consider immunity through infection in children advantageous, discussing children as live “booster” vaccines for adults. I would expect this from anti-vaxx groups, not a scientific committee. [Tweet]. Twitter. https://twitter.com/dgurdasani1/status/1454383106555842563

    1. Kolina Koltai, PhD [@KolinaKoltai]. (2021, September 27). When you search ‘Covid-19’ on Amazon, the number 1 product is from known antivaxxer Dr. Mercola. 4 out of the top 8 items are either vaccine opposed/linked to conspiratorial narratives about covid. Amazon continues to be a venue for vaccine misinformation. Https://t.co/rWHhZS8nPl [Tweet]. Twitter. https://twitter.com/KolinaKoltai/status/1442545052954202121

    1. SmartDevelopmentFund [@SmartDevFund]. (2021, November 2). A kit that enables users to disable misinformation: The #DigitalEnquirerKit empowers #journalists, civil society #activists and human rights defenders at the #COVID19 information front-line. Find out more: Http://sdf.d4dhub.eu #smartdevelopmentfund #innovation #Infopowered https://t.co/YZVooirtU9 [Tweet]. Twitter. https://twitter.com/SmartDevFund/status/1455549507949801472

    1. Katherine Ognyanova. (2022, February 15). Americans who believe COVID vaccine misinformation tend to be more vaccine-resistant. They are also more likely to distrust the government, media, science, and medicine. That pattern is reversed with regard to trust in Fox News and Donald Trump. Https://osf.io/9ua2x/ (5/7) https://t.co/f6jTRWhmdF [Tweet]. @Ognyanova. https://twitter.com/Ognyanova/status/1493596109926768645

    1. Dr. Jonathan N. Stea. (2021, January 25). Covid-19 misinformation? We’re over it. Pseudoscience? Over it. Conspiracies? Over it. Want to do your part to amplify scientific expertise and evidence-based health information? Join us. 🇨🇦 Follow us @ScienceUpFirst. #ScienceUpFirst https://t.co/81iPxXXn4q. Https://t.co/mIcyJEsPXe [Tweet]. @jonathanstea. https://twitter.com/jonathanstea/status/1353705111671869440

    1. Kit Yates. (2021, September 27). This is absolutely despicable. This bogus “consent form” is being sent to schools and some are unquestioningly sending it out with the real consent form when arranging for vaccination their pupils. Please spread the message and warn other parents to ignore this disinformation. Https://t.co/lHUvraA6Ez [Tweet]. @Kit_Yates_Maths. https://twitter.com/Kit_Yates_Maths/status/1442571448112013319

    1. ReconfigBehSci. (2021, February 17). The Covid-19 pandemic has accelerated the erosion of trust around the world: Significant drop in trust in the two largest economies: The U.S. (40%) and Chinese (30%) governments are deeply distrusted by respondents from the 26 other markets surveyed. 1/2 https://t.co/C86chd3bb4 [Tweet]. @SciBeh. https://twitter.com/SciBeh/status/1362021569476894726

    1. James Neill [@jneill]. (n.d.). The domain sending that fake NHS vaccine consent hoax form to schools has been suspended. Excellent work by @martincampbell2 and fast co-operation by @kualo 👍 FYI @fascinatorfun @Kit_Yates_Maths @dgurdasani1 @AThankless https://t.co/pbAgNfkbEs [Tweet]. Twitter. Retrieved November 22, 2021, from https://twitter.com/jneill/status/1442784873014566913

    1. Mike Caulfield. (2021, March 10). One of the drivers of Twitter daily topics is that topics must be participatory to trend, which means one must be able to form a firm opinion on a given subject in the absence of previous knowledge. And, it turns out, this is a bit of a flaw. [Tweet]. @holden. https://twitter.com/holden/status/1369551099489779714

    1. Amy Maxmen, PhD. (2020, August 26). 🙄The CDC’s only substantial communication with the public in the pandemic is through its MMW Reports. But the irrelevant & erroneous 1st line of this latest report suggests political meddling to me. (The WHO doesn’t declare pandemics. They declare PHEICs, which they did Jan 30) https://t.co/Y1NlHbQIYQ [Tweet]. @amymaxmen. https://twitter.com/amymaxmen/status/1298660729080356864

    1. Adam Kucharski. (2020, December 13). I’ve turned down a lot of COVID-related interviews/events this year because topic was outside my main expertise and/or I thought there were others who were better placed to comment. Science communication isn’t just about what you take part in – it’s also about what you decline. [Tweet]. @AdamJKucharski. https://twitter.com/AdamJKucharski/status/1338079300097077250