876 Matching Annotations
  1. Jan 2023
    1. The uptake of mis- and disinformation is intertwined with the way our minds work. The large body of research on the psychological aspects of information manipulation explains why.

      In an article for Nature Reviews Psychology, Ullrich K. H. Ecker et al. looked at the cognitive, social, and affective factors that lead people to form or even endorse misinformed views. Ironically enough, false beliefs generally arise through the same mechanisms that establish accurate beliefs: a mix of cognitive drivers, like intuitive thinking, and socio-affective drivers. When deciding what is true, people are often biased to believe in the validity of information and to trust their intuition instead of deliberating. Repetition also increases belief in both misleading information and facts.

      Ecker, U. K. H., Lewandowsky, S., Cook, J., et al. (2022). The psychological drivers of misinformation belief and its resistance to correction. Nature Reviews Psychology, 1, 13–29.

      Going a step further, Álex Escolà-Gascón et al. investigated the psychopathological profiles that characterise people prone to consuming misleading information. After running a number of tests on more than 1,400 volunteers, they concluded that people with high scores in schizotypy (a condition not too dissimilar from schizophrenia), paranoia, and histrionism (akin to histrionic personality disorder) are more vulnerable to the negative effects of misleading information. People who do not detect misleading information also tend to be more anxious, suggestible, and vulnerable to strong emotions.

  2. Dec 2022
    1. Furthermore, our results add to the growing body of literature documenting—at least at this historical moment—the link between extreme right-wing ideology and misinformation [8,14,24] (although, of course, factors other than ideology are also associated with misinformation sharing, such as polarization [25] and inattention [17,37]).

      Misinformation exposure and extreme right-wing ideology appear associated in this report. Others find that it is partisanship that predicts susceptibility.

    2. We also find evidence of “falsehood echo chambers”, where users that are more often exposed to misinformation are more likely to follow a similar set of accounts and share from a similar set of domains. These results are interesting in the context of evidence that political echo chambers are not prevalent, as typically imagined.
    3. Aligned with prior work finding that people who identify as conservative consume [15], believe [24], and share more misinformation [8,14,25], we also found a positive correlation between users’ misinformation-exposure scores and the extent to which they are estimated to be conservative ideologically (Fig. 2c; b = 0.747, 95% CI = [0.727, 0.767], SE = 0.010, t(4332) = 73.855, p < 0.001), such that users estimated to be more conservative are more likely to follow the Twitter accounts of elites with higher fact-checking falsity scores. Critically, the relationship between misinformation-exposure score and quality of content shared is robust controlling for estimated ideology.
    1. highlights the need for public health officials to disseminate information via multiple media channels to increase the chances of accessing vaccine resistant or hesitant individuals.
    2. Across the Irish and UK samples, similarities and differences emerged regarding those in the population who were more likely to be hesitant about, or resistant to, a vaccine for COVID-19. Three demographic factors were significantly associated with vaccine hesitance or resistance in both countries: sex, age, and income level. Compared to respondents accepting of a COVID-19 vaccine, women were more likely to be vaccine hesitant, a finding consistent with a number of studies identifying sex and gender-related differences in vaccine uptake and acceptance37,38. Younger age was also related to vaccine hesitance and resistance.
    3. There were no significant differences in levels of consumption and trust between the vaccine accepting and vaccine hesitant groups in the Irish sample. Compared to vaccine hesitant responders, vaccine resistant individuals consumed significantly less information about the pandemic from television and radio, and had significantly less trust in information disseminated from newspapers, television broadcasts, radio broadcasts, their doctor, other health care professionals, and government agencies.
    1. The Gish gallop /ˈɡɪʃ ˈɡæləp/ is a rhetorical technique in which a person in a debate attempts to overwhelm their opponent by providing an excessive number of arguments with no regard for the accuracy or strength of those arguments. In essence, it is prioritizing quantity of one's arguments at the expense of quality of said arguments. The term was coined in 1994 by anthropologist Eugenie Scott, who named it after American creationist Duane Gish and argued that Gish used the technique frequently when challenging the scientific fact of evolution.[1][2] It is similar to another debating method called spreading, in which one person speaks extremely fast in an attempt to cause their opponent to fail to respond to all the arguments that have been raised.

      I'd always known this was a thing, but didn't have a word for it.

  3. Nov 2022
    1. Our familiarity with these elements makes the overall story seem plausible, even—or perhaps especially—when facts and evidence are in short supply.

      Storytelling tropes play into our System 1 heuristics and cognitive biases by riding on the coattails of familiar story plotlines we've come to know and trust.

      What are the ways out of this trap? Creating lists of tropes that should trigger our System 1 reactions to switch into System 2 thinking? Can we train ourselves away from these types of misinformation?

    2. As part of the Election Integrity Partnership, my team at the Stanford Internet Observatory studies online rumors, and how they spread across the internet in real time.
    3. Something similar! Here it is: https://t.co/x1DPx9dm0P

      — Renee DiResta (@noUpside) November 26, 2022
    1. Lilienfeld, S. O., Sauvigné, K. C., Lynn, S. J., Cautin, R. L., Latzman, R. D., & Waldman, I. D. (2015). Fifty psychological and psychiatric terms to avoid: a list of inaccurate, misleading, misused, ambiguous, and logically confused words and phrases. Frontiers in Psychology, 6, 1100. https://doi.org/10.3389/fpsyg.2015.01100

    1. Trope, trope, trope, strung into a Gish Gallop.

      One of the issues we see in the Sunday morning news analysis shows (Meet the Press, Face the Nation, et al.) is that there is usually a large amount of context collapse mixed with a lack of general knowledge about the topics at hand, compounded by large doses of Gish gallop and F.U.D. (fear, uncertainty, and doubt).

    2. a more nuanced view of context.

      Almost every new technology goes through a moral panic phase in which fear of the unknown spawns potential backlashes against it. Generally these panics fade with time and familiarity with the technology.

      Bicycles cause insanity, for example...

      Why do medicine and vaccines not follow more of this pattern? Is it a general lack of science literacy that prevents them from becoming familiar to some?

      What if, instead of addressing individual pieces of misinformation reactively, we discussed the underpinnings preemptively?

      Perhaps we might more profitably undermine misinformation by dismantling the underlying tropes that underpin it?

    1. And that, perhaps, is what we might get to via prebunking. Not so much attempts to counter or fact-check misinfo on the internet, but defanging the tropes that underpin the most recurringly manipulative claims so that the public sees, recognizes, & thinks: 😬

      — Renee DiResta (@noUpside) June 19, 2021
    1. suspect evaded Colorado’s red flag gun law

      If you read further down in the article, you'll see that the headline is a blatant lie.

      The Gov failed to prosecute a violent person, so AP spins it as if this guy "evaded" (which is an action).

      One can't evade a law that is never applied against them.

    1. Under President Joe Biden, the shifting focus on disinformation has continued. In January 2021, CISA replaced the Countering Foreign Influence Task force with the “Misinformation, Disinformation and Malinformation” team, which was created “to promote more flexibility to focus on general MDM.” By now, the scope of the effort had expanded beyond disinformation produced by foreign governments to include domestic versions. The MDM team, according to one CISA official quoted in the IG report, “counters all types of disinformation, to be responsive to current events.” Jen Easterly, Biden’s appointed director of CISA, swiftly made it clear that she would continue to shift resources in the agency to combat the spread of dangerous forms of information on social media.

      MDM == Misinformation, Disinformation, and Malinformation.

      These definitions from earlier in the article:
      * misinformation (false information spread unintentionally)
      * disinformation (false information spread intentionally)
      * malinformation (factual information shared, typically out of context, with harmful intent)

  4. Oct 2022
    1. Today, the people in politics who most often invoke the name of Jesus for their political causes tend to be the most merciless and judgmental, the most consumed by rage and fear and vengeance. They hate their enemies, and they seem to want to make more of them. They claim allegiance to the truth and yet they have embraced, even unwittingly, lies. They have inverted biblical ethics in the name of biblical ethics.
    1. The Internet offers voters in any country an easily accessible and streamlined way to obtain election information, news, and updates. On the other side of that coin lies the opportunity for anti-democratic actors to grow and professionalize digital manipulation campaigns. In the Philippines, a 2020 Oxford Internet Institute survey found that government agencies, politicians, CSOs, and political parties had all personally conducted or hired private firms to conduct digital manipulation campaigns.

      The fact that the Internet is accessible to virtually anyone today makes it one of the most easily exploited tools for politicians. However, that in itself does not explain why it became a pivotal element of the recent elections. Many Filipinos believe misinformation because they are unable to fact-check information on the Internet. Normally, fact-checking should be a skill individuals develop as a byproduct of the critical thinking honed through formal education. Unfortunately, the implementation of Philippine education is not conducive to the practice of critical thinking for a number of reasons, such as the lack of teachers and learning facilities. Moreover, socioeconomic factors prevent people from focusing on their education. Substandard education and an unstable economy may explain why, in addition to falling for misinformation, people choose to propagate it in exchange for financial stability, thus increasing the number of perpetrators that those with ulterior motives can use for their own ends.

    1. Edgerly noted that disinformation spreads through two ways: The use of technology and human nature. Click-based advertising, news aggregation, the process of viral spreading and the ease of creating and altering websites are factors considered under technology. “Facebook and Google prioritize giving people what they ‘want’ to see; advertising revenue (are) based on clicks, not quality,” Edgerly said. She noted that people have the tendency to share news and website links without even reading its content, only its headline. According to her, this perpetuates a phenomenon of viral spreading or easy sharing. There is also the case of human nature involved, where people are “most likely to believe” information that supports their identities and viewpoints, Edgerly cited. “Vivid, emotional information grabs attention (and) leads to more responses (such as) likes, comments, shares. Negative information grabs more attention than (the) positive and is better remembered,” she said. Edgerly added that people tend to believe in information that they see on a regular basis and those shared by their immediate families and friends.

      Spreading misinformation and disinformation is really easy in this day and age because of how accessible information is and how much of it there is on the web. This is explained precisely by Edgerly. As noted in this part of the article, there is a business built on the spread of disinformation, particularly in our country. There are people who pay what we call online trolls to spread disinformation and capitalize on how “chronically online” Filipinos are, among many other factors (i.e., most Filipinos’ information illiteracy due to poverty and lack of educational attainment, how easy it is to interact with content we see online regardless of its authenticity, etc.). Disinformation also leads to misinformation through word of mouth. As stated by Edgerly in this article, “people tend to believe in information… shared by their immediate families and friends”; because it is human nature to trust information shared by loved ones, someone who is not information literate will not question newly received information. Lastly, it certainly does not help that social media algorithms nowadays rely on what users interact with; the more a user interacts with certain information, the more the platforms will feed them that information. Not all social media sites have fact-checkers, and users can freely spread disinformation if they choose to.
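
      A minimal sketch of the feedback loop described above, under the assumption of a purely hypothetical ranking function (no platform publishes its real algorithm): posts are ordered by raw engagement and boosted by the topics a user already interacts with, so accuracy never enters the score.

      from collections import Counter
      from dataclasses import dataclass

      @dataclass
      class Post:
          post_id: str
          topic: str
          engagements: int  # likes + comments + shares, regardless of accuracy

      def rank_feed(posts, interaction_history):
          """Order posts by global engagement, boosted by the user's past topic interactions."""
          topic_counts = Counter(interaction_history)        # e.g. ["politics", "politics", "sports"]
          def score(post):
              personal_boost = 1 + topic_counts[post.topic]  # more past clicks on a topic -> bigger boost
              return post.engagements * personal_boost       # note: accuracy appears nowhere in this formula
          return sorted(posts, key=score, reverse=True)

      feed = rank_feed(
          [Post("a", "politics", 900), Post("b", "health", 400), Post("c", "sports", 1200)],
          interaction_history=["politics", "politics", "politics"],
      )
      print([p.post_id for p in feed])  # the topic the user already engages with floats to the top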

    1. "In 2013, we spread fake news in one of the provinces I was handling," he says, describing how he set up his client's opponent. "We got the top politician's cell phone number and photo-shopped it, then sent out a text message pretending to be him, saying he was looking for a mistress. Eventually, my client won."

      This statement from a man who claims to work for politicians as an internet troll and propagator of fake news was really striking, because it shows how fabricating something out of the blue can have a profound impact on elections, which are supposed to be a democratic process. Now more than ever, mudslinging in popular information spaces like social media can easily sway public opinion (or confirm it). We saw this during the election season, when Leni Robredo bore the brunt of outrageous rumors; one rumor I remember well was that Leni had supposedly married an NPA member and had a kid with him. It is tragic that misinformation and disinformation are no longer a mere phenomenon but a full-blown industry. It has a tight grip on the decisions people make for the country, while also deeply affecting their values and beliefs.

    1. Trolls, in this context, are humans who hold accounts on social media platforms, more or less for one purpose: To generate comments that argue with people, insult and name-call other users and public figures, try to undermine the credibility of ideas they don’t like, and to intimidate individuals who post those ideas. And they support and advocate for fake news stories that they’re ideologically aligned with. They’re often pretty nasty in their comments. And that gets other, normal users, to be nasty, too.

      Not only are programmed accounts created, but also troll accounts that propagate disinformation and spread fake news with the intent to cause havoc. In short, once they post a malicious comment, some people engage with it, which leads to more rage comments and disagreements. That is what they do: they provoke people into engaging with their comments so the content spreads further and produces more fake news. These troll accounts are usually prominent during elections; in the Philippines, some speculate that certain candidates set up troll farms just to spread fake news all over social media, which some people then engage with.

    2. So, bots are computer algorithms (set of logic steps to complete a specific task) that work in online social network sites to execute tasks autonomously and repetitively. They simulate the behavior of human beings in a social network, interacting with other users, and sharing information and messages [1]–[3]. Because of the algorithms behind bots’ logic, bots can learn from reaction patterns how to respond to certain situations. That is, they possess artificial intelligence (AI). 

      In all honesty, since I don't usually dwell on technology, coding, and such, I thought that when you say “bot” it is controlled by another user, like a legit person. I never knew that it is programmed and created to learn the usual posting patterns of people, be it on Twitter, Facebook, or other social media platforms. I think it is important to properly understand how bots work to avoid misinformation and disinformation, most importantly during this time of prominent social media use.
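
      To make the quoted definition concrete, here is a deliberately minimal sketch of what "autonomous and repetitive" looks like in code. The client object, its search and reply calls, and the canned talking points are hypothetical stand-ins rather than any real platform's API; the point is only that no human sits behind each individual action once the script is running.

      import random
      import time

      TALKING_POINTS = [
          "Share this before it gets taken down!",
          "The mainstream media won't tell you this...",
      ]

      class StubClient:
          """Hypothetical stand-in for a platform client, so the sketch runs end to end."""
          def search(self, keywords):
              return [{"id": "123", "text": "a post mentioning " + keywords[0]}]
          def reply(self, post_id, text):
              print(f"replying to {post_id}: {text}")

      def run_bot(client, keywords, rounds=3, interval_seconds=1):
          """Repeatedly search for keywords and reply with canned talking points."""
          for _ in range(rounds):                  # a real bot would simply loop forever
              for post in client.search(keywords):
                  client.reply(post_id=post["id"], text=random.choice(TALKING_POINTS))
              time.sleep(interval_seconds)         # wait a bit, then do the same thing again

      run_bot(StubClient(), keywords=["vaccine"])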

  5. Sep 2022
    1. On a volume basis, hydrogen is one of the least energy dense fuels. One liter of hydrogen contains only 25% of the energy of one liter of gasoline and only 20% of the energy of one liter of diesel fuel.

      It's the vapors of gasoline and diesel fuel that burn, so measuring energy density by the volume of the liquid for fuels that burn in vapor form, while not doing the same for hydrogen, is simply dishonest.

      They all burn as a vapor and store and transport well as a liquid. If you compare them per unit mass when they are burned, hydrogen is still by far the most energy dense.

      It's their capacity to store hydrogen that makes the other fuels work as fuels at all, so it's hard to improve on 100% hydrogen for doing that.
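
      A quick back-of-the-envelope check of the quoted figures, using approximate lower-heating-value numbers commonly cited in the literature (every constant below is a ballpark assumption, not a precise measurement). By liquid volume, hydrogen carries roughly a quarter of gasoline's energy, which is where the 25% and 20% come from; by mass, the comparison flips strongly in hydrogen's favour.

      # Approximate lower heating values (assumed ballpark figures).
      fuels = {
          #                  MJ per litre (liquid), MJ per kg
          "gasoline":        (34.0,  44.0),
          "diesel":          (38.0,  43.0),
          "hydrogen (liq.)": ( 8.5, 120.0),
      }

      h2_per_l, h2_per_kg = fuels["hydrogen (liq.)"]
      print(f"H2 vs gasoline, per litre of liquid: {h2_per_l / fuels['gasoline'][0]:.0%}")    # ~25%
      print(f"H2 vs diesel,   per litre of liquid: {h2_per_l / fuels['diesel'][0]:.0%}")      # ~22%
      print(f"H2 vs gasoline, per kilogram:        {h2_per_kg / fuels['gasoline'][1]:.1f}x")  # ~2.7x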

    1. Could the maintenance of these myths actually be useful for particularly powerful constituencies? Does the continuation of these myths serve a purpose or function for other segments of the American population? If so, who and what might that be?
    1. https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

      Good overview article of some of the psychology research behind misinformation in social media spaces including bots, AI, and the effects of cognitive bias.

      Probably worth mining the story for the journal articles and collecting/reading them.

    2. In a recent laboratory study, Robert Jagiello, also at Warwick, found that socially shared information not only bolsters our biases but also becomes more resilient to correction.
    3. Even our ability to detect online manipulation is affected by our political bias, though not symmetrically: Republican users are more likely to mistake bots promoting conservative ideas for humans, whereas Democrats are more likely to mistake conservative human users for bots.
    4. “Limited individual attention and online virality of low-quality information,” By Xiaoyan Qiu et al., in Nature Human Behaviour, Vol. 1, June 2017

      The upshot of this paper seems to be "information overload alone can explain why fake news can become viral."
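
      A toy illustration of that upshot (this is not the model Qiu et al. actually analyse, just a sketch under assumed parameters): if users can only attend to the top few items of an overloaded, popularity-sorted feed, whatever happens to surface early gets locked in, and the item that goes viral need not be anywhere near the best-quality item available.

      import random

      random.seed(1)

      def simulate(num_items=200, num_users=5000, attention_span=3):
          quality = [random.random() for _ in range(num_items)]   # intrinsic quality in [0, 1]
          shares = [1] * num_items                                # every item starts barely visible
          for _ in range(num_users):
              feed = sorted(range(num_items), key=lambda i: shares[i], reverse=True)
              visible = feed[:attention_span]                     # limited attention: top of the feed only
              chosen = max(visible, key=lambda i: quality[i])     # user shares the best of what they SEE
              shares[chosen] += 1                                 # ...which makes it even more visible
          viral = max(range(num_items), key=lambda i: shares[i])
          print(f"quality of the most-shared item: {quality[viral]:.2f}")
          print(f"best quality in the whole pool:  {max(quality):.2f}")

      simulate()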

  6. www.justine-haupt.com www.justine-haupt.com
    1. vaccines

      Vaccines are "administered primarily to prevent disease." https://www.britannica.com/science/vaccine

      Thus, a "vaccine" that does not actually prevent disease is, by definition, not a vaccine but marketing spin aimed at enriching the manufacturer's shareholders.

      We know that "vaccines" are not effective as vaccines because it is so common for people who are "fully vaxxed and boosted" to announce that they have become infected with the pathogen the "vaccine" is claimed to protect against.

      The highly scientific reaction is to respond "Well, just imagine how bad it could have been if they weren't vaccinated," which is an unfalsifiable, unscientific claim.

  7. Aug 2022
    1. Kahne and Bowyer (2017) exposed thousands of young people in California to some true messages and some false ones, similar to the memes they may see on social media
    2. Many U.S. educators believe that increasing political polarization combines with the hazards of misinformation and disinformation in ways that underscore the need for learners to acquire the knowledge and skills required to navigate a changing media landscape (Hamilton et al. 2020a)
  8. Jul 2022
    1. An

      Find common ground. Clear away the kindling. Provide context...don't de-platform.

    2. You have three options:
      1. Continue fighting fires with hordes of firefighters (in this analogy, fact-checkers).
      2. Focus on the arsonists (the people spreading the misinformation) by alerting the town they're the ones starting the fire (banning or labeling them).
      3. Clear the kindling and dry brush (teach people to spot lies, think critically, and ask questions).
      Right now, we do a lot of #1. We do a little bit of #2. We do almost none of #3, which is probably the most important and the most difficult. I’d propose three strategies for addressing misinformation by teaching people to ask questions and spot lies.
    3. Simply put, the threat of "misinformation" being spread at scale is not novel or unique to our generation—and trying to slow the advances of information sharing is futile and counter-productive.
    4. It’s worth reiterating precisely why: The very premise of science is to create a hypothesis, put that hypothesis up to scrutiny, pit good ideas against bad ones, and continue to test what you think you know over and over and over. That’s how we discovered tectonic plates and germs and key features of the universe. And oftentimes, it’s how we learn from great public experiments, like realizing that maybe paper or reusable bags are preferable to plastic.

      develop a hypothesis, and pit different ideas against one another

    5. All of these approaches tend to be built on an assumption that misinformation is something that can and should be censored. On the contrary, misinformation is a troubling but necessary part of our political discourse. Attempts to eliminate it carry far greater risks than attempts to navigate it, and trying to weed out what some committee or authority considers "misinformation" would almost certainly restrict our continued understanding of the world around us.
    6. To start, it is worth defining “misinformation”: Simply put, misinformation is “incorrect or misleading information.” This is slightly different from “disinformation,” which is “false information deliberately and often covertly spread (by the planting of rumors) in order to influence public opinion or obscure the truth.” The notable difference is that disinformation is always deliberate.
  9. Apr 2022
    1. Michael Armstrong [@ArmstrongGN]. (2021, September 29). NBA player says he doesn’t need vaccine… 40-thousand likes and 1.4 million views. Scientist/doctor corrects NBA player… 4-thousand likes. We’re so screwed… [Tweet]. Twitter. https://twitter.com/ArmstrongGN/status/1443052037160251392

    1. Dr. Deepti Gurdasani [@dgurdasani1]. (2021, October 30). A very disturbing read on the recent JCVI minutes released. They seem to consider immunity through infection in children advantageous, discussing children as live “booster” vaccines for adults. I would expect this from anti-vaxx groups, not a scientific committee. [Tweet]. Twitter. https://twitter.com/dgurdasani1/status/1454383106555842563

    1. Kolina Koltai, PhD [@KolinaKoltai]. (2021, September 27). When you search ‘Covid-19’ on Amazon, the number 1 product is from known antivaxxer Dr. Mercola. 4 out of the top 8 items are either vaccine opposed/linked to conspiratorial narratives about covid. Amazon continues to be a venue for vaccine misinformation. Https://t.co/rWHhZS8nPl [Tweet]. Twitter. https://twitter.com/KolinaKoltai/status/1442545052954202121

    1. SmartDevelopmentFund [@SmartDevFund]. (2021, November 2). A kit that enables users to disable misinformation: The #DigitalEnquirerKit empowers #journalists, civil society #activists and human rights defenders at the #COVID19 information front-line. Find out more: Http://sdf.d4dhub.eu #smartdevelopmentfund #innovation #Infopowered https://t.co/YZVooirtU9 [Tweet]. Twitter. https://twitter.com/SmartDevFund/status/1455549507949801472

    1. Katherine Ognyanova. (2022, February 15). Americans who believe COVID vaccine misinformation tend to be more vaccine-resistant. They are also more likely to distrust the government, media, science, and medicine. That pattern is reversed with regard to trust in Fox News and Donald Trump. Https://osf.io/9ua2x/ (5/7) https://t.co/f6jTRWhmdF [Tweet]. @Ognyanova. https://twitter.com/Ognyanova/status/1493596109926768645

    1. Dr. Jonathan N. Stea. (2021, January 25). Covid-19 misinformation? We’re over it. Pseudoscience? Over it. Conspiracies? Over it. Want to do your part to amplify scientific expertise and evidence-based health information? Join us. 🇨🇦 Follow us @ScienceUpFirst. #ScienceUpFirst https://t.co/81iPxXXn4q. Https://t.co/mIcyJEsPXe [Tweet]. @jonathanstea. https://twitter.com/jonathanstea/status/1353705111671869440

    1. Kit Yates. (2021, September 27). This is absolutely despicable. This bogus “consent form” is being sent to schools and some are unquestioningly sending it out with the real consent form when arranging for vaccination their pupils. Please spread the message and warn other parents to ignore this disinformation. Https://t.co/lHUvraA6Ez [Tweet]. @Kit_Yates_Maths. https://twitter.com/Kit_Yates_Maths/status/1442571448112013319

    1. ReconfigBehSci. (2021, February 17). The Covid-19 pandemic has accelerated the erosion of trust around the world: Significant drop in trust in the two largest economies: The U.S. (40%) and Chinese (30%) governments are deeply distrusted by respondents from the 26 other markets surveyed. 1/2 https://t.co/C86chd3bb4 [Tweet]. @SciBeh. https://twitter.com/SciBeh/status/1362021569476894726

    1. James 💙 Neill—😷 🇪🇺🇮🇪🇬🇧🔶 on Twitter: “The domain sending that fake NHS vaccine consent hoax form to schools has been suspended. Excellent work by @martincampbell2 and fast co-operation by @kualo 👍 FYI @fascinatorfun @Kit_Yates_Maths @dgurdasani1 @AThankless https://t.co/pbAgNfkbEs” / Twitter. (n.d.). Retrieved November 22, 2021, from https://twitter.com/jneill/status/1442784873014566913

    1. Mike Caulfield. (2021, March 10). One of the drivers of Twitter daily topics is that topics must be participatory to trend, which means one must be able to form a firm opinion on a given subject in the absence of previous knowledge. And, it turns out, this is a bit of a flaw. [Tweet]. @holden. https://twitter.com/holden/status/1369551099489779714

    1. Amy Maxmen, PhD. (2020, August 26). 🙄The CDC’s only substantial communication with the public in the pandemic is through its MMW Reports. But the irrelevant & erroneous 1st line of this latest report suggests political meddling to me. (The WHO doesn’t declare pandemics. They declare PHEICs, which they did Jan 30) https://t.co/Y1NlHbQIYQ [Tweet]. @amymaxmen. https://twitter.com/amymaxmen/status/1298660729080356864

    1. Adam Kucharski. (2020, December 13). I’ve turned down a lot of COVID-related interviews/events this year because topic was outside my main expertise and/or I thought there were others who were better placed to comment. Science communication isn’t just about what you take part in – it’s also about what you decline. [Tweet]. @AdamJKucharski. https://twitter.com/AdamJKucharski/status/1338079300097077250

    1. ReconfigBehSci. (2021, April 23). I’m starting the critical examination of the success of behavioural science in rising to the pandemic challenge over the last year with the topic of misinformation comments and thoughts here and/or on our reddits 1/2 https://t.co/sK7r3f7mtf [Tweet]. @SciBeh. https://twitter.com/SciBeh/status/1385631665175896070

    1. Youyang Gu. (2021, May 25). Is containing COVID-19 a requirement for preserving the economy? My analysis suggests: Probably not. In the US, there is no correlation between Covid deaths & changes in unemployment rates. However, blue states are much more likely to have higher increases in unemployment. 🧵 https://t.co/JrikBtawEb [Tweet]. @youyanggu. https://twitter.com/youyanggu/status/1397230156301930497

    1. Dr. Syra Madad. (2021, February 7). What we hear most often “talk to your health care provider if you have any questions/concerns on COVID19 vaccines” Vs Where many are actually turning to for COVID19 vaccine info ⬇️ This is also why it’s so important for the media to report responsibly based on science/evidence [Tweet]. @syramadad. https://twitter.com/syramadad/status/1358509900398272517

    1. ReconfigBehSci on Twitter. (n.d.). Twitter. Retrieved 8 November 2021, from https://twitter.com/SciBeh/status/1444360973750444032

    2. Multiple issues with @scotgov assessment of vaccine passports. 1. No evidence that passports will decrease cases at venues (just an infographic!). This is a complex modelling issue that must also account for waning immunity and possibility of more unvaxxed in other settings
    1. ReconfigBehSci on Twitter: ‘@Holdmypint @ollysmithtravel @AllysonPollock Omicron might be changing things- the measure has to be evaluated relative to the situation in Austria at the time, not Ireland 3 months later with a different variant’ / Twitter. (n.d.). Retrieved 25 March 2022, from https://twitter.com/SciBeh/status/1487130621696741388

    1. ReconfigBehSci [@SciBeh]. (2022, January 28). @ollysmithtravel @AllysonPollock that is a policy alternative one could consider- whether it’s more or less effective, more or less equitable, or even implementable in the current Austrian health care framework would need careful consideration.... None of that saves the argument in the initial tweet [Tweet]. Twitter. https://twitter.com/SciBeh/status/1487043954654818316

  10. Mar 2022
    1. ReconfigBehSci on Twitter: ‘this really is now a disinformation account. I retweeted posts earlier in the pandemic as part of a balanced spread of opinion. But this will be the last one...’ / Twitter. (n.d.). Retrieved 29 March 2022, from https://twitter.com/SciBeh/status/1478485258395951108