13 Matching Annotations
  1. Mar 2023
    1. A different concern with decentralized moderation is that it will lead to “filter bubbles” and “echo chambers” by which instance members will choose to only interact with like-minded users.

      Risk of filter bubbles

  2. Dec 2022
    1. despite Twitter’s self-styled reputation as a public town square — where everyone gathers to see the same messages — in practice, the pandemic showed how users segregate to follow mostly those with similar views, argues information scientist Oliver Johnson at the University of Bristol, UK. For instance, those who believed that COVID-19 was a fiction would tend to follow others who agreed, he says, whereas others who argued that the way to deal with the pandemic was to lock down for a ‘zero COVID’ approach were in their own bubble.

      Digital town square meets filter bubble effect

      During the COVID-19 pandemic, Twitter gave voice to researchers, but the platform’s algorithms allowed users to self-sort into groups based on what they wanted to hear.

  3. Oct 2022
    1. TikTok starts studying its users from the moment they first open the app. It shows them a single, full-screen, infinitely looping video, then gauges how they react: a second of viewing or hesitation indicates interest; a swipe suggests a desire for something else. With every data point, TikTok’s algorithm narrows from a shapeless mass of content to a refined, irresistible feed. It is the ultimate video channel, and this is its one program. The “For You” algorithm, as TikTok calls it, gradually builds profiles of users’ tastes not from what they choose but from how they behave. While Facebook and other social networks rely on their users to define themselves by typing in their interests or following famous people, TikTok watches and learns, tapping into trends and desires their users might not identify.

      TikTok uses user-interaction signals, not stated preferences or friend relationships, in its recommendation algorithm

      The article describes how users are "surprised and unsettled" by the algorithm's choices for next videos. The system rewards interaction by serving up videos that are more desirable to users—a kind of virtuous cycle of surprise and delight.
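
      Since the quote spells out the mechanism (watch time and hesitation as positive signals, a swipe as a negative one), here is a minimal sketch of behavior-driven interest profiling. All names here (InterestProfile, record_view, the decay constant) are invented for illustration; TikTok's actual "For You" system is proprietary.

      ```python
      # Hypothetical sketch: interest is inferred from watch time and swipes,
      # never from stated preferences or follow relationships.
      from collections import defaultdict

      DECAY = 0.95  # older signals fade so the profile keeps adapting

      class InterestProfile:
          def __init__(self):
              self.weights = defaultdict(float)  # topic -> inferred interest

          def record_view(self, topic, watch_seconds, video_seconds, swiped_early):
              """Update interest from behavior alone: completed viewing is a
              positive signal, an early swipe a negative one."""
              for t in self.weights:
                  self.weights[t] *= DECAY
              completion = min(watch_seconds / video_seconds, 1.0)
              self.weights[topic] += -0.5 if swiped_early else completion

          def rank(self, candidates):
              """Order candidate videos by inferred interest."""
              return sorted(candidates,
                            key=lambda v: self.weights[v["topic"]],
                            reverse=True)

      profile = InterestProfile()
      profile.record_view("cooking", watch_seconds=14, video_seconds=15, swiped_early=False)
      profile.record_view("sports", watch_seconds=2, video_seconds=30, swiped_early=True)
      feed = profile.rank([{"id": 1, "topic": "sports"}, {"id": 2, "topic": "cooking"}])
      print([v["id"] for v in feed])  # the cooking video now ranks first: [2, 1]
      ```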

    2. How TikTok ate the internet. The world’s most popular app has pioneered a new age of instant attention. Can we trust it? By Drew Harwell, Oct. 14
  4. Jul 2022
    1. In her confusion, Peter wrote an e-mail seeking advice from Rachel Tashjian, a fashion critic who writes a popular newsletter called “Opulent Tips.” “I’ve been on the internet for the last 10 years and I don’t know if I like what I like or what an algorithm wants me to like,” Peter wrote. She’d come to see social networks’ algorithmic recommendations as a kind of psychic intrusion, surreptitiously reshaping what she’s shown online and, thus, her understanding of her own inclinations and tastes. “I want things I truly like not what is being lowkey marketed to me,” her letter continued.

      Recommendations based on your actions or on what the algorithm wants you to see

    2. While Brave Search does not have editorial biases, all search engines have some level of intrinsic bias due to data and algorithmic choices. Goggles allows users to counter any intrinsic biases in the algorithm.
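
      The quote describes Goggles as user-supplied re-ranking applied on top of the engine's base results. Below is a minimal sketch of that idea, assuming an invented rule format (host mapped to a score multiplier); Brave's real Goggles syntax and scoring differ.

      ```python
      # Hypothetical Goggles-style re-ranking: user rules adjust the engine's
      # base scores, letting the user counter biases baked into the ranking.
      from urllib.parse import urlparse

      def apply_rules(results, rules):
          """results: list of (url, base_score) pairs.
          rules: host -> multiplier; >1 boosts, <1 downranks, 0 discards."""
          rescored = [(url, score * rules.get(urlparse(url).netloc, 1.0))
                      for url, score in results]
          return sorted(rescored, key=lambda r: r[1], reverse=True)

      results = [("https://bigpublisher.com/story", 0.9),
                 ("https://indieblog.net/post", 0.7)]
      rules = {"bigpublisher.com": 0.5, "indieblog.net": 2.0}
      print(apply_rules(results, rules))  # the indie blog now ranks first
      ```
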
  5. Feb 2021
    1. There is an additional civic value here, one that goes beyond simply preserving professional journalism. For about ten years now, a few of us have been waging a sometimes lonely battle against the premise that the internet leads to political echo chambers, where like-minded partisans reinforce their beliefs by filtering out dissenting views, an argument associated with the legal scholar and now Obama administration official Cass Sunstein. This is Sunstein’s description of the phenomenon:

      If Republicans are talking only with Republicans, if Democrats are talking primarily with Democrats, if members of the religious right speak mostly to each other, and if radical feminists talk largely to radical feminists, there is a potential for the development of different forms of extremism, and for profound mutual misunderstandings with individuals outside the group.

      This is an early reference to the idea of a "filter bubble" dating back to 2004 that predates the 2010 coining of the word by Eli Pariser.

  6. Jul 2020
    1. Beware online "filter bubbles"

      - Relevance is right in front of you
      - The Internet means different things to different people
      - Algorithms edit the web based on what you have looked at in the past (see the sketch after this list)
      - "There is no standard Google anymore"
      - News and search results are personalized to each user
      - "The Internet is showing us what it thinks we need to see, not necessarily what we need to see"
      - "Filter bubble": the information ecosystem you live in online; you don't decide what gets in, but you definitely don't see what gets left out
      - Mainly looking at what you click on first
      - Information junk food instead of an information balanced diet
      - Gatekeepers found a new way to gatekeep: through algorithms
      - What does this do to democracy?
      - What sort of internet/web ethics need to be developed to get us through to the next thing?
      - Algorithms need to be transparent and give us some control; a kind of civic responsibility is needed
      - The Internet needs to be a tool of democracy and access for ALL
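
      The "algorithms edit the web" point is essentially a feedback loop. A minimal sketch of how ranking purely by past clicks narrows a feed, with all names invented for illustration rather than taken from any real platform:

      ```python
      # Hypothetical filter-bubble feedback loop: each session the user
      # clicks the top story, and the ranker learns to show more of the same.
      def personalized_feed(stories, click_history, k=3):
          """Rank stories by how often the user clicked that topic before."""
          counts = {t: click_history.count(t) for t in {s["topic"] for s in stories}}
          return sorted(stories, key=lambda s: counts[s["topic"]], reverse=True)[:k]

      stories = [{"id": i, "topic": t} for i, t in
                 enumerate(["politics", "politics", "science", "sports", "science"])]
      history = ["politics"]
      for _ in range(5):  # simulate five sessions of clicking the top story
          history.append(personalized_feed(stories, history)[0]["topic"])
      print(history)  # one topic crowds out the rest: a bubble forms
      ```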

  7. Oct 2018
    1. Like all experts, academics are used to speaking to a specialized audience. That’s true no matter their discipline, from sociology to geotechnical engineering to classics. When you speak to a niche audience among peers, a lot of understanding comes for free. You can use technical language, make presumptions about prior knowledge, and assume common goals or contexts. When speaking to a general audience, you can’t take those circumstances as a given.
  8. Nov 2017
  9. Mar 2017
    1. “Design it so that Google is crucial to creating a response rather than finding one,”

      With "Google" becoming generic for "search" today, it is critical that students understand that Google, a commercial entity, will present different results in search to different people based on previous searches. Eli Pariser's work on the filter bubble is helpful for demonstrating this.

  10. Feb 2015
    1. I am a PhD candidate in the Human-Computer Interaction (HCI) group of the Computer Science Department at the University of Illinois at Urbana-Champaign. I work in the CASCAD Lab, advised by Prof. Wai-Tat Fu. I also work closely with Prof. Bruce Schatz. My research interests broadly lie in the fields of human-computer interaction (HCI), social computing, health informatics and cognitive science. Please see bio and projects for more details.
    2. The researchers examined social media patterns for 1.2 million Facebook users and found that nearly 92 percent of those who engage with Italian conspiracy theory pages interact almost exclusively with conspiracy theory pages.

      Oh, no. No. Noooooo.