25 Matching Annotations
  1. Nov 2018
    1. What matters is that they remain masters of the process - and develop a vision for the new machine age.

      It doesn't really look to me as though we were ever the "masters of the process". And that is also what Marx is about, I think.

  2. Oct 2017
  3. Sep 2017
  4. Jul 2017
    1. The backfire effect is getting turbocharged online. I think we’re getting more angry and convinced about everything, not because we’re surrounded by like-minded people, but by people who disagree with us. Social media allows you to find the worst examples of your opponents. It’s not a place to have your own views corroborated, but rather where your worst suspicions about the other lot can be quickly and easily confirmed.

  5. Apr 2017
    1. Obviously, in this situation whoever controls the algorithms has great power. Decisions like what is promoted to the top of a news feed can swing elections. Small changes in UI can drive big changes in user behavior. There are no democratic checks or controls on this power, and the people who exercise it are trying to pretend it doesn’t exist

    2. On Facebook, social dynamics and the algorithms’ taste for drama reinforce each other. Facebook selects from stories that your friends have shared to find the links you’re most likely to click on. This is a potent mix, because what you read and post on Facebook is not just an expression of your interests, but part of a performative group identity.

      So without explicitly coding for this behavior, we already have a dynamic where people are pulled to the extremes. Things get worse when third parties are allowed to use these algorithms to target a specific audience.

    3. any system trying to maximize engagement will try to push users towards the fringes. You can prove this to yourself by opening YouTube in an incognito browser (so that you start with a blank slate), and clicking recommended links on any video with political content.

      ...

      This pull to the fringes doesn’t happen if you click on a cute animal story. In that case, you just get more cute animals (an experiment I also recommend trying). But the algorithms have learned that users interested in politics respond more if they’re provoked more, so they provoke. Nobody programmed the behavior into the algorithm; it made a correct observation about human nature and acted on it.
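
      That emergent pull can be sketched in a few lines. Below is a minimal toy simulation (not any platform's actual code; the click model, the items' "provocation" scores, and the epsilon-greedy loop are all assumptions for illustration): a recommender that only maximizes observed click-through rate, with no notion of "extremeness" anywhere in it, still ends up serving the most provocative items if provoked users click more.

      ```python
      # Toy sketch (illustrative assumptions only, not YouTube's or Facebook's code):
      # a greedy engagement-maximizing recommender drifts toward provocative items
      # when, as the quoted passage claims, provoked users respond more.
      import random

      # Hypothetical catalogue: provocation score 0.0 (cute animals) .. 1.0 (fringe)
      ITEMS = [i / 10 for i in range(11)]

      def user_clicks(provocation: float) -> bool:
          """Assumed click model: click probability rises with provocation."""
          return random.random() < 0.2 + 0.6 * provocation

      def recommend(rounds: int = 5000) -> float:
          clicks = {p: 1.0 for p in ITEMS}   # smoothed click counts per item
          shows = {p: 2.0 for p in ITEMS}    # smoothed impression counts per item
          served = []
          for _ in range(rounds):
              # epsilon-greedy: mostly serve whatever has the best click-through rate
              if random.random() < 0.1:
                  item = random.choice(ITEMS)
              else:
                  item = max(ITEMS, key=lambda p: clicks[p] / shows[p])
              shows[item] += 1
              clicks[item] += user_clicks(item)
              served.append(item)
          # average provocation of what was served in the last 1000 rounds
          return sum(served[-1000:]) / 1000

      if __name__ == "__main__":
          # Tends toward 1.0: the algorithm "learned" to provoke without being told to.
          print("avg provocation of served items:", recommend())
      ```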

  6. Dec 2016
    1. http://digipo.io/doku.php

      The Digital Polarization Initiative: "The primary purpose of this wiki is to provide a place for students to fact-check, annotate, and provide context to the different news stories that show up in their Twitter and Facebook feeds. It's like a student-driven Snopes, but with a broader focus: we don't aim to just investigate myths, but to provide context and sanity to all the news – from the article about voter fraud to the health piece on a new cancer treatment."

    2. The Web has become an insidious propaganda tool. To fight it, digital literacy education must rise beyond technical proficiency to include wisdom.

      • Double-check every claim before you share.
      • Be wary of casual scrolling. Everything you see affects your attitudes.
      • Don't automatically disbelieve the surreal (or unpleasant).
      • Do not exaggerate your own claims.
      • Be prepared to repeat the truth over and over.
      • Curate good resources, and share updates to them.
        • It will reinforce the previous information.
        • It will boost search engine rankings of the collection.
    3. A survey of voters asked if they remembered seeing a headline, and if so, whether they believed it was true.

      It may come as no surprise that high percentages of Trump voters believed stories that favored Trump or demonized Clinton. But the percentage of Clinton voters who believed the fake stories was also fairly high!

      familiarity equals truth: when we recognize something as true, we are most often judging if this is something we’ve heard more often than not from people we trust.

      ...

      if you want to be well-informed it’s not enough to read the truth — you also must avoid reading lies.

    4. This is our internet. Not Google’s. Not Facebook’s. Not rightwing propagandists’. And we’re the only ones who can reclaim it.

      This is our nation, and our world. It is up to us to reclaim it.

  7. Nov 2016
    1. Interview with a man who has run several fake news sites since 2013.

      Well, this isn't just a Trump-supporter problem. This is a right-wing issue.

      ...

      We've tried to do similar things to liberals. It just has never worked, it never takes off. You'll get debunked within the first two comments and then the whole thing just kind of fizzles out.

    2. Journalism faces an 'existential crisis' in the Trump era, says Christiane Amanpour.

      As all the international journalists we honor in this room tonight and every year know only too well: First the media is accused of inciting, then sympathizing, then associating -- until they suddenly find themselves accused of being full-fledged terrorists and subversives. Then they end up in handcuffs, in cages, in kangaroo courts, in prison

      ...

      First, like many people watching where I was overseas, I admit I was shocked by the exceptionally high bar put before one candidate and the exceptionally low bar put before the other candidate.

      It appeared much of the media got itself into knots trying to differentiate between balance, objectivity, neutrality, and crucially, truth.

      ...

      The winning candidate did a savvy end run around us and used it to go straight to the people. Combined with the most incredible development ever -- the tsunami of fake news sites -- aka lies -- that somehow people could not, would not, recognize, fact check, or disregard.

      ...

      The conservative radio host who may be the next White House press secretary says mainstream media is hostile to traditional values.

      I would say it's just the opposite. And have you read about the "heil, victory" meeting in Washington, DC this past weekend? Why aren't there more stories about the dangerous rise of the far right here and in Europe? Since when did anti-Semitism stop being a litmus test in this country?

    3. Paul Horner publishes fake news that is often shared widely. He claims that his stories are intended to be taken as satire, like The Onion.

      Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary. I’ve never seen anything like it.

      My sites were picked up by Trump supporters all the time. I think Trump is in the White House because of me. His followers don’t fact-check anything — they’ll post everything, believe anything. His campaign manager posted my story about a protester getting paid $3,500 as fact. Like, I made that up. I posted a fake ad on Craigslist.

    4. But as managing editor of the fact-checking site Snopes, Brooke Binkowski believes Facebook’s perpetuation of phony news is not to blame for our epidemic of misinformation. “It’s not social media that’s the problem,” she says emphatically. “People are looking for somebody to pick on. The alt-rights have been empowered and that’s not going to go away anytime soon. But they also have always been around.”

      The misinformation crisis, according to Binkowski, stems from something more pernicious. In the past, the sources of accurate information were recognizable enough that phony news was relatively easy for a discerning reader to identify and discredit. The problem, Binkowski believes, is that the public has lost faith in the media broadly — therefore no media outlet is considered credible any longer. The reasons are familiar: as the business of news has grown tougher, many outlets have been stripped of the resources they need for journalists to do their jobs correctly.

      The problem is not JUST social media and fake news. But most of the false stories do not come from mainstream media. The greatest evils of mainstream media are sensationalism and a willingness to spin stories the way their sources want them spun.

    5. But a former employee, Antonio Garcia-Martinez, disagrees and says his old boss is being "more than a little disingenuous here."

      ...

      "There's an entire political team and a massive office in D.C. that tries to convince political advertisers that Facebook can convince users to vote one way or the other," Garcia-Martinez says. "Then Zuck gets up and says, 'Oh, by the way, Facebook content couldn't possibly influence the election.' It's contradictory on the face of it."

    6. Mike Caulfield says Facebook's feed algorithms are far from its only problem. The entire site design encourages sharing of items that users haven't inspected beyond reading the headline.

    7. Facebook hasn’t told the public very much about how its algorithm works. But we know that one of the company’s top priorities for the news feed is “engagement.” The company tries to choose posts that people are likely to read, like, and share with their friends. Which, they hope, will induce people to return to the site over and over again.

      This would be a reasonable way to do things if Facebook were just a way of finding your friends’ cutest baby pictures. But it’s more troubling as a way of choosing the news stories people read. Essentially, Facebook is using the same criteria as a supermarket tabloid: giving people the most attention-grabbing headlines without worrying about whether articles are fair, accurate, or important.
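
      The criterion described above can be made concrete with a small, hypothetical sketch (the field names, weights, and probabilities are invented for illustration; Facebook has not published its actual formula). Note what never enters the score: whether an article is fair, accurate, or important.

      ```python
      # Hypothetical engagement-only feed ranking, for illustration.
      from dataclasses import dataclass

      @dataclass
      class Post:
          headline: str
          p_click: float   # predicted probability the user opens the post
          p_like: float    # predicted probability of a like
          p_share: float   # predicted probability of a share
          accuracy: float  # fairness/accuracy/importance, 0..1

      def engagement_score(post: Post) -> float:
          # Only predicted engagement counts; post.accuracy is never consulted.
          return 1.0 * post.p_click + 2.0 * post.p_like + 4.0 * post.p_share

      feed = [
          Post("Careful report on budget negotiations", 0.05, 0.02, 0.01, 0.95),
          Post("You won't BELIEVE what this politician said", 0.40, 0.20, 0.15, 0.20),
      ]

      # The tabloid-style headline ranks first, as the quoted passage describes.
      for post in sorted(feed, key=engagement_score, reverse=True):
          print(f"{engagement_score(post):.2f}  {post.headline}")
      ```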

  8. Oct 2016
    1. “Among millennials, especially,” [Ross] Douthat argues, “there’s a growing constituency for whom rightwing ideas are so alien or triggering, leftwing orthodoxy so pervasive and unquestioned, that supporting a candidate like Hillary Clinton looks like a needless form of compromise.”

      ...

      “I don’t see sufficient evidence to buy the argument about siloing and confirmation bias,” Jeff Jarvis, a professor at the City University of New York’s graduate school of journalism, said. “That is a presumption about the platforms – because we in media think we do this better. More important, such presumptions fundamentally insult young people. For too long, old media has assumed that young people don’t care about the world.”

      “Newspapers, remember, came from the perspective of very few people: one editor, really,” Jarvis said. “Facebook comes with many perspectives and gives many; as Zuckerberg points out, no two people on Earth see the same Facebook.”

  9. Jun 2016
    1. Automated posts from social media accounts pretending to be real individuals are being used to influence public opinion. (The Chinese government uses regular employees to post "real" messages at strategic times.)

  10. May 2016
  11. Apr 2016
    1. Jon Udell on productive social discourse.

      changeable minds: What’s something you believed deeply, for a long time, and then changed your mind about?

      David Gray's Liminal Thinking points out that we all have beliefs that are built on hidden foundations. We need to carefully examine our own beliefs and their origins. And we need to avoid judgment as we consider the beliefs of others and their origins.

      Wael Ghonim asks us to design social media that encourages civility, thoughtfulness, and open minds rather than self-promotion, click-bait, and echo chambers.

  12. Feb 2016
    1. At some dark day in the future, when considered versus the Google Caliphate, the NSA may even come to be seen by some as the “public option.” “At least it is accountable in principle to some parliamentary limits,” they will say, “rather than merely stockholder avarice and flimsy user agreements.”

      In the last few years I've come to understand that my tolerance for most forms of surveillance should be considered in terms of my confidence in the judiciary.