18 Matching Annotations
  1. May 2025
  2. Nov 2024
  3. May 2020
  4. Aug 2018
    1. Instead of trying to force their messages into the mainstream, these adversaries target polarized communities and “embed” fake accounts within them. The false personas engage with real people in those communities to build credibility. Once their influence has been established, they can introduce new viewpoints and amplify divisive and inflammatory narratives that are already circulating. It’s the digital equivalent of moving to an isolated and tight-knit community, using its own language quirks and catering to its obsessions, running for mayor, and then using that position to influence national politics.
  5. Jul 2018
    1. "Putin was never very fond of the internet even in the early 2000s," said Andrei Soldatov, a Russian investigative journalist who specializes in security services and cyber issues. "When he was forced to think about the internet during the protests, he became very suspicious, especially about social networks. He thinks there's a plot, a Western conspiracy against him. He believes there is a very dangerous thing for him and he needs to put this thing under control."
    1. RuNet Echo has previously written about the efforts of the Russian “Troll Army” to inject pro-Kremlin rhetoric into social networks and online media websites. Twitter is no exception, and multiple users have observed Twitter accounts tweeting similar statements during and around key breaking news and events. Increasingly active throughout Russia's interventions in Ukraine, these “bots” have been designed to look like real Twitter users, complete with avatars.
    1. We’ve built an information ecosystem where information can fly through social networks (both technical and personal). Folks keep looking to the architects of technical networks to solve the problem. I’m confident that these companies can do a lot to curb some of the groups who have capitalized on what’s happening to seek financial gain. But the battles over ideology and attention are going to be far trickier. What’s at stake isn’t “fake news.” What’s at stake is the increasing capacity of those committed to a form of isolationist and hate-driven tribalism that has been around for a very long time. They have evolved with the information landscape, becoming sophisticated in leveraging whatever tools are available to achieve power, status, and attention. And those seeking a progressive and inclusive agenda, those seeking to combat tribalism to form a more perfect union — they haven’t kept up.
    2. As I wrote in “Hacking the Attention Economy,” manipulating the media for profit, ideology, and lulz has evolved over time. The strategies that hackers, hoaxers, and haters have taken have become more sophisticated. The campaigns have gotten more intense. And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.
    3. How many years did it take for the US military to learn that a war with tribal networks couldn’t be fought with traditional military strategies? How long will it take for the news media to wake up and recognize that they’re being played? And how long after that will it take for editors and publishers to start evolving their strategies?
    1. When messaging is coordinated and consistent, it easily fools our brains, which are already exhausted and increasingly reliant on heuristics (simple psychological shortcuts) due to the overwhelming amount of information flashing before our eyes every day. When we see multiple messages about the same topic, our brains use that repetition as a shortcut to credibility. It must be true, we say — I’ve seen that same claim several times today.
    2. I saw Eliot Higgins present in Paris in early January, and he listed four ‘Ps’ which helped explain the different motivations. I’ve been thinking about these a great deal and, using Eliot’s original list, have identified four additional motivations for the creation of this type of content: Poor Journalism, Parody, to Provoke or ‘Punk’, Passion, Partisanship, Profit, Political Influence or Power, and Propaganda. This is a work in progress, but once you start breaking these categories down and mapping them against one another, you begin to see distinct patterns in terms of the types of content created for specific purposes.
    3. Back in November, I wrote about the different types of problematic information I saw circulate during the US election. Since then, I’ve been trying to refine a typology (and thank you to Global Voices for helping me to develop my definitions even further). I would argue there are seven distinct types of problematic content that sit within our information ecosystem. They sit on a scale, one that loosely measures the intent to deceive.
    4. As Danah Boyd outlined in a recent piece, we are at war. An information war. We certainly should worry about people (including journalists) unwittingly sharing misinformation, but far more concerning are the systematic disinformation campaigns.