14 Matching Annotations
  1. Jun 2025
  2. social-media-ethics-automation.github.io
    1. Luddite

      This source talks about how Luddites were 19th-century English workers who resisted new textile machinery because they believed that it was replacing their skilled labor and lowering their wages. The movement symbolized opposition to rapid technological change. Rather than opposing technology itself, their resistance targeted its use to undermine workers' rights and economic stability.

    1. But even people who thought they were doing something good regretted the consequences of their creations

      This reminds me of a source we read earlier this quarter. It talked about how the inventor of the infinite scroll feature behind "doom-scrolling" regretted his creation because he knew how much harm it would cause. It's honestly really sad that it turned out this way, but it's also a good reminder that humans can get so caught up in what we can do that we don't stop to think about whether it will even benefit us.

  3. social-media-ethics-automation.github.io
    1. Olivia Solon. 'It's digital colonialism': how Facebook's free internet service has failed its users.

      This article is about how Facebook offers a free, limited version of the internet to people around the world. The service is criticized as digital colonialism because it only provides certain information and restricts its users' access. Meta already has a huge hold on the digital market, and this is just another way of exploiting people who can't afford to know better.

    1. What if social media sites were governed by their users instead of by shareholders

      Honestly, I don't know if this would be any better. The users with the loudest voices often have the worst intentions. For example, a user created 8chan because he wanted to dictate how the site was run, and it became even worse than 4chan. When users have total freedom over what they do on a site, that opens up so much harm they could cause, and there would be no safeguard to stop that behavior.

  4. social-media-ethics-automation.github.io
    1. Jorgen Harris. Do wages fall when women enter an occupation?

      This article highlights that increased female representation in an occupation lowers wages for both men and women. The wage decline is unlikely to result from unobserved changes in occupations, and it is not accounted for by changes in labor supply or returns to skills. The evidence suggests the declines are driven by greater hours flexibility and lower prestige.

    1. What if social media sites were publicly funded or crowd-funded (like NPR for radio, and PBS for TV, Wikipedia, Archive of Our Own for fan fiction)? Note: Mastodon is trying to do this.

      This sounds like a good idea, but there is so much room for harm. Take Wikipedia, for example. Because it's publicly funded and publicly edited, it is easy to spread and post misinformation. The good people at Wikipedia have tried hard to combat this by raising money and dedicating lots of time to editing and making the site more reputable. While this is really admirable and the site is very useful, they are running out of money. I really like the idea, and I even use Wikipedia sometimes, but I feel like it doesn't work (or at least isn't working right now) in the long term.

  5. May 2025
  6. social-media-ethics-automation.github.io
    1. Meg van Achterberg. Jimmy Kimmel’s Halloween prank can scar children. Why are we laughing?

      The article talks about Jimmy Kimmel’s yearly Halloween prank where parents tell their kids they ate all their candy, film the reactions, and send them in for laughs. While it might seem funny to adults, psychologist Seth Meyers says it can actually mess with younger kids, especially under 10, because they don’t fully get the joke and can feel really hurt or betrayed. It’s not exactly traumatic, but it can still do some emotional damage and maybe isn’t the best move if you’re trying to build trust with your kid.

    1. What do you consider to be the most important factors in making an instance of public shaming bad?

      The most important factor that makes public shaming bad, I think, is the fact that it's public. There's a saying, "correct in private," that I think is really important. Not many people learn when they are simply humiliated in public.

  7. social-media-ethics-automation.github.io
    1. Mike Gavin. Canucks' staffer uses social media to find fan who saved his life.

      I saw this a while ago when it initially came out. At a game, a fan pointed out a mole on the neck of a Canucks staff member and saved his life. He went to the doctor to get it checked out, and it was cancer; the doctor told him he would've only had 4-5 years left if it had gone unchecked. The staff member ended up finding the fan through social media after his story went viral.

    1. In what ways do you think you’ve participated in any crowdsourcing online?

      I've probably participated in crowdsourcing more times than I realize. One example would be leaving reviews. I leave reviews on many things I buy, like food at restaurants and items on eBay or Depop, and I even rate professors on RateMyProfessor.

  8. social-media-ethics-automation.github.io
    1. David Gilbert. Facebook Is Ignoring Moderators’ Trauma: ‘They Suggest Karaoke and Painting’.

      This article was sad. Content moderators for Facebook are paid a fraction of what other full-time employees earn, and they face traumatic content daily, including self-harm, child sexual abuse, and death. They are required to sign an NDA, which stops them from speaking up about the harmful content they're seeing, and they are penalized for taking down the wrong thing or for not taking down the right one. No support is offered to these employees despite the graphic content they work with.

    1. Do you think there are ways to moderate well that involve less traumatizing of moderators or taking advantage of poor people?

      I think this is one of the good uses for AI. Instead of subjecting people to such traumatic content, we could use AI to filter and block that type of content. This could be far more efficient and could take down much more content in less time.

  9. social-media-ethics-automation.github.io
    1. Karen Hao. How Facebook got addicted to spreading misinformation. MIT Technology Review

      This article was about Facebook's largest publicity scandal, where Russian hackers influenced the 2016 election by spreading hate speech and fake news. Employees left in protest, millions of users deleted the app, and Facebook's market value plunged. Mark Zuckerberg apologized for the mistakes, and the heads at Facebook began damage control such as civil rights audits.

    1. Social media sites also might run into legal concerns with allowing some content to be left up on their sites,

      This is interesting because I feel like so much unwanted content is still left up on a lot of sites. For example, now that X is owned by Elon Musk, I'm sure far less is taken down, or at least different things are being monitored now. This is dangerous because he has the power to shape people's minds on social media by only showing them one side of the news.