16 Matching Annotations
  1. May 2020
  2. Aug 2018
    1. Instead of trying to force their messages into the mainstream, these adversaries target polarized communities and “embed” fake accounts within them. The false personas engage with real people in those communities to build credibility. Once their influence has been established, they can introduce new viewpoints and amplify divisive and inflammatory narratives that are already circulating. It’s the digital equivalent of moving to an isolated and tight-knit community, using its own language quirks and catering to its obsessions, running for mayor, and then using that position to influence national politics.
    2. However, as the following diagrams will show, the middle is a lot weaker than it looks, and this makes public discourse vulnerable both to extremists at home and to manipulation by outside actors such as Russia.
  3. Jul 2018
    1. "The internet has become the main threat — a sphere that isn't controlled by the Kremlin," said Pavel Chikov, a member of Russia's presidential human rights council. "That's why they're going after it. Its very existence as we know it is being undermined by these measures."
    2. "Putin was never very fond of the internet even in the early 2000s," said Andrei Soldatov, a Russian investigative journalist who specializes in security services and cyber issues. "When he was forced to think about the internet during the protests, he became very suspicious, especially about social networks. He thinks there's a plot, a Western conspiracy against him. He believes there is a very dangerous thing for him and he needs to put this thing under control."
    1. creating a new international news operation called Sputnik to “provide an alternative viewpoint on world events.” More and more, though, the Kremlin is manipulating the information sphere in more insidious ways.
    1. RuNet Echo has previously written about the efforts of the Russian “Troll Army” to inject the social networks and online media websites with pro-Kremlin rhetoric. Twitter is no exception, and multiple users have observed Twitter accounts tweeting similar statements during and around key breaking news and events. Increasingly active throughout Russia's interventions in Ukraine, these “bots” have been designed to look like real Twitter users, complete with avatars.
    1. We’ve built an information ecosystem where information can fly through social networks (both technical and personal). Folks keep looking to the architects of technical networks to solve the problem. I’m confident that these companies can do a lot to curb some of the groups who have capitalized on what’s happening to seek financial gain. But the battles over ideology and attention are going to be far trickier. What’s at stake isn’t “fake news.” What’s at stake is the increasing capacity of those committed to a form of isolationist and hate-driven tribalism that has been around for a very long time. They have evolved with the information landscape, becoming sophisticated in leveraging whatever tools are available to achieve power, status, and attention. And those seeking a progressive and inclusive agenda, those seeking to combat tribalism to form a more perfect union — they haven’t kept up.
    2. As I wrote in “Hacking the Attention Economy,” manipulating the media for profit, ideology, and lulz has evolved over time. The strategies that hackers, hoaxers, and haters have taken have become more sophisticated. The campaigns have gotten more intense. And now many of the actors most set on undermining institutionalized information intermediaries are in the most powerful office in the land. They are waging war on the media and the media doesn’t know what to do other than to report on it.
    3. How many years did it take for the US military to learn that waging war with tribal networks couldn’t be fought with traditional military strategies? How long will it take for the news media to wake up and recognize that they’re being played? And how long after that will it take for editors and publishers to start evolving their strategies?
    4. there’s no cost to the administration to be helpful to the media because the people the Trump Administration cares about don’t trust the media anyhow.
    5. News agencies, long trained to focus on reporting information and maintaining a conceptual model of standards, are ill-equipped to understand that they may have a role in this war, that their actions and decisions are shaping the way the war plays out.
    1. When messaging is coordinated and consistent, it easily fools our brains, already exhausted and increasingly reliant on heuristics (simple psychological shortcuts) due to the overwhelming amount of information flashing before our eyes every day. When we see multiple messages about the same topic, our brains use that as a shortcut to credibility. It must be true, we say — I’ve seen that same claim several times today.
    2. I saw Eliot Higgins present in Paris in early January, and he listed four ‘Ps’ which helped explain the different motivations. I’ve been thinking about these a great deal and, using Eliot’s original list, have identified four additional motivations for the creation of this type of content: Poor Journalism, Parody, to Provoke or ‘Punk’, Passion, Partisanship, Profit, Political Influence or Power, and Propaganda. This is a work in progress, but once you start breaking these categories down and mapping them against one another you begin to see distinct patterns in terms of the types of content created for specific purposes.
    3. Back in November, I wrote about the different types of problematic information I saw circulate during the US election. Since then, I’ve been trying to refine a typology (and thank you to Global Voices for helping me to develop my definitions even further). I would argue there are seven distinct types of problematic content that sit within our information ecosystem. They sit on a scale, one that loosely measures the intent to deceive.
    4. As Danah Boyd outlined in a recent piece, we are at war. An information war. We certainly should worry about people (including journalists) unwittingly sharing misinformation, but far more concerning are the systematic disinformation campaigns.