4 Matching Annotations
  1. Sep 2023
    1. Trending Topicsgate was a tempest that occurred in May 2016, when a content moderator who worked for Facebook’s Trending Topics feature styled himself a whistle-blower, opened up to Gizmodo, and said that Facebook employees were suppressing conservative news.

      To elaborate on my first annotation: I think what happened with "Trending Topicsgate" in May 2016 was damaging for a few reasons. First, it made people worry about how much power social media platforms like Facebook hold. If Facebook workers really were keeping conservative news from being seen, that could shape what people think and believe. Second, it raised questions of fairness. Suppressing some news while letting other news through means the platform may be picking sides, and that is not what people expect from something as large as Facebook. Lastly, it fueled more fighting over politics. Incidents like this make people with different views trust each other less, which makes it harder to talk to one another. So "Trending Topicsgate" was harmful because it made people question how fair and honest social media is, and it made political arguments even worse.

    2. But we’ve known this for a while. Over the past two years, journalists and researchers have assembled an entire lexicon for describing these problems: misinformation, disinformation, computational propaganda. We started having congressional hearings about how algorithms are changing society. And we’ve talked often about filter bubbles on Google, conversational health metrics on Twitter, radicalization on YouTube, and “coordinated inauthentic activity” on Facebook.

      I think it's a good idea to stop false information on the internet, but we have to be careful not to stop people from speaking their minds. As the passage says, we've known about these problems for a while: over the past two years, journalists and researchers have built a whole vocabulary for them, like misinformation, disinformation, and computational propaganda, and Congress has held hearings about how algorithms are changing society. We've also talked a lot about filter bubbles on Google, conversational health on Twitter, radicalization on YouTube, and coordinated inauthentic activity on Facebook. So it's good to fight false information, but we have to make sure we don't take away people's freedom to say what they think. Finding that balance is a challenge we've been wrestling with for a while now.

  2. Aug 2023
    1. And yet, as more news outlets and formats compete for audiences’ time and attention, we continue to see longer-term falls in interest and trust in news across age groups and markets – particularly among younger audiences. Under 35s are the lowest-trusting age groups, with only a third (37%) of both 18–24s and 25–34s across all markets saying they trust most news most of the time, compared with nearly half of those 55 and older (47%). Young people also increasingly choose to avoid the news, with substantial rises in avoidance among social natives since we last asked this question in 2019. Across all markets, around four in ten under 35s often or sometimes avoid the news now, compared with a third (36%) of those 35 and older.

      I find this statistic highly interesting and have a few theories about why younger people do not trust the news as much as older adults do. For starters, television and radio used to be the most convenient sources of live, up-to-date coverage, including breaking news. As technology progressed, social media and the internet gave younger generations an outlet to voice their opinions and, more importantly, to fact-check what they hear, see, and read.

    2. The social media landscape continues to evolve dramatically, with new social networks like TikTok entering the field as well as existing platforms like Instagram and Telegram gaining markedly in popularity among young audiences. As social natives shift their attention away from Facebook (or in many cases never really start using it), more visually focused platforms such as Instagram, TikTok, and YouTube have become increasingly popular for news among this group. Use of TikTok for news has increased fivefold among 18–24s across all markets over just three years, from 3% in 2020 to 15% in 2022, while YouTube is increasingly popular among young people in Eastern Europe, Asia-Pacific, and Latin America.

      I remember when YouTube had to respond to false information after the events of January 6th; it introduced rules to take down videos containing false claims, which is good because we want accurate information. TikTok, by contrast, gives users a lot of creative freedom and freedom of speech: we can make our own videos and say what we think. But that can also be a problem. Since anyone can post anything, some of it might not be true, whether it is gossip about celebrities or claims about politics. TikTok does not always check whether something is real before it spreads, although I do sometimes see a warning on a video if trying it at home could cause bodily harm.