36 Matching Annotations
  1. Jun 2025
  2. social-media-ethics-automation.github.io
    1. Ted Chiang. Will A.I. Become the New McKinsey? The New Yorker, May 2023. URL:

      The article “Will A.I. Become the New McKinsey?” discusses how artificial intelligence tools are starting to take on roles traditionally held by consulting firms—like offering strategic advice to businesses. One detail that stood out to me was how companies may begin to rely on AI not just for efficiency, but also for decision-making. This raises concerns similar to those mentioned in the social media chapter—like data use, bias, and accountability. It made me think about how AI could shape not only our personal experiences online but also big-picture decisions in society.

    1. how your data gets used or abused

      I found the point about how our data gets used or abused especially important. On social media, we often share things without thinking, but companies can collect that data and use it in ways we don’t expect. It made me reflect on how little control we really have over our online information and how careful we need to be when posting or clicking on things.

  3. social-media-ethics-automation.github.io
    1. James Chen. Corner A Market: What it is, How it Works, Legality. Investopedia, April 2022. URL:

      The Investopedia article on “Corner a Market” explains how a company or investor tries to gain enough control over a particular product or stock to manipulate its price. I found it interesting that this strategy isn’t always illegal, but it can be unethical or lead to market instability. One example mentioned is the Hunt brothers' attempt to corner the silver market, which collapsed in 1980 and led to major financial fallout. This source helped me better understand how power dynamics can play out in both financial and digital markets, connecting to how big social media platforms dominate attention and limit smaller competitors’ chances.

    1. Other social media sites have used more unique features to distinguish themselves from Facebook and get a foothold, such as Twitter with its character limit (forcing short messages, so you can see lots of posts in quick succession), Vine and then TikTok based on short videos, etc.

      This sentence made me think about how competition leads to innovation. Instead of trying to copy Facebook, platforms like TikTok and Twitter found success by doing something different. It reminds me of how in school or life, just following what others do doesn't always work—you need to find your own strength or unique style. I think this shows how being different can actually be an advantage, not a weakness.

  4. May 2025
  5. social-media-ethics-automation.github.io
    1. C. Thi Nguyen. Twitter, the Intimacy Machine. The Raven Magazine, December 2021. URL:

      In “Twitter: The Intimacy Machine,” the author discusses how Twitter creates a strange sense of closeness between strangers, especially when people share personal stories or emotions. One interesting detail was how public shaming can feel intimate because it happens in such a personal tone, even though it’s seen by thousands. This connects with the chapter’s idea that reconciliation and shame are tied together—we’re often hurt most deeply by people or reactions that feel personal, even when they come from the crowd. The article made me realize that technology like Twitter blurs the line between public and private, which complicates the process of repair and reconciliation.

    1. Are there wounds too big to be repaired? Are there evils too great to be forgiven? Is anyone ever totally beyond the pale of possible reconciliation? Is there a point of no return?

      It reminded me of a time when someone deeply betrayed my trust. Even though they apologized, I found it hard to forgive, and it made me question whether some relationships can ever truly be repaired. This part of the reading helped me see that reconciliation isn’t always possible, and that’s okay too. It depends on the situation and the people involved.

  6. social-media-ethics-automation.github.io
    1. Index on Censorship. Interview with a troll. Index on Censorship, September 2011. URL: https://www.indexoncensorship.org/2011/09/interview-with-a-troll/

      The interview with the anonymous troll really stood out to me. What surprised me most was how casually he talked about causing distress online—almost like it was a game. He admitted to targeting people not out of deep personal hatred, but just to provoke a reaction or gain attention. This made me think about how anonymity can remove a sense of responsibility, and how moderation online has to deal with behavior that’s intentionally disruptive but not always illegal.

    1. Violence in this case refers to the way that individual Natural Rights and freedoms are violated by external social constraints.

      I never thought of rules or restrictions as a kind of violence before, but this made me see it differently. Sometimes when someone or some system puts limits on your freedom “for your own good,” it still feels like something is being taken from you. I’ve experienced this when school rules were enforced in ways that felt unfair—even if they were meant to help us, they didn’t always feel right or respectful.

  7. social-media-ethics-automation.github.io
    1. Catherine M. Vera-Burgos and Donyale R. Griffin Padgett. Using Twitter for crisis communications in a natural disaster: Hurricane Harvey. Heliyon, 6(9):e04804, September 2020. URL:

      This article reviews how crowdsourcing has been used in the field of pathology, especially for tasks like labeling medical images. One interesting detail is that crowdsourced workers (even non-experts) were often able to produce results with accuracy close to trained pathologists, especially when their answers were aggregated. This shows that crowdsourcing can be useful not just in games like Foldit, but also in serious areas like healthcare.

    1. Researchers analyzed the best players’ results for their research and were able to publish scientific discoveries based on the contributions of players.

      I think it's really interesting that a video game like Foldit helped scientists make real discoveries. It shows how powerful crowdsourcing can be, even for serious scientific problems. I’ve never thought of games being useful in science before, but now I wonder if more scientific research could be turned into fun challenges for regular people to help with.

  8. social-media-ethics-automation.github.io
    1. Alex Heath. Facebook to end special treatment for politicians after Trump ban. The Verge, June 2021. URL:

      This article explains how Facebook decided to end its policy of exempting political figures from normal content moderation rules. Before this change, politicians could say things on the platform that regular users would be penalized for. I think this shift shows that platforms are starting to realize the dangers of giving special treatment to powerful users, especially when it comes to spreading misinformation or hate speech. It connects to the chapter’s point that moderation should not rely only on individual users, but must come from platform-level responsibility.

    1. Sometimes individuals are given very little control over content moderation or defense from the platform,

      I think it’s frustrating that users are often told to just “not read the comments” instead of being given real tools to protect themselves. It feels like the responsibility is shifted away from the platform and onto the person being targeted. I agree that platforms should do more—it shouldn’t just be up to individual users to block or mute others after the damage is already done.

  9. social-media-ethics-automation.github.io
    1. Rhitu Chatterjee. The new 988 mental health hotline is live. Here's what to know. NPR, July 2022.

      The NPR article explains the launch of the new 988 hotline, which is meant to be an easy-to-remember number for people experiencing a mental health crisis. I think this is a really important step because it makes getting help faster and less intimidating. It also connects people to trained crisis counselors, rather than law enforcement, which could make people feel safer reaching out. This contrasts with the book’s discussion of tech companies trying to detect suicidal behavior through algorithms—it feels more human and supportive.

    1. your employer might detect that you are unhappy [m34], and consider firing you since they think you might not be fully committed to the job

      I think it’s really concerning that an employer could use emotion detection to decide whether someone is committed to their job. People have bad days or go through tough times, and it doesn't always mean they’re not trying. Using algorithms to judge someone's mental state feels like an invasion of privacy and could easily lead to unfair treatment.

  10. social-media-ethics-automation.github.io
    1. Murder of George Floyd. December 2023. Page Version ID: 1188546892. URL:

      One detail from the Wikipedia article that stood out to me was how quickly the video of George Floyd's death spread on social media, leading to global protests. This directly connects to the chapter’s discussion of replication, because the video was shared, reposted, and quoted with different captions and contexts, which helped turn it into a powerful symbol of racial injustice. It’s a strong example of how digital replication can drive real-world action.

    1. Actions like quote tweeting, or the TikTok Duet feature let people see the original content, but modified with new context.

      I think the example of quote tweeting and TikTok Duets is a great way to show how replication can add new meaning. I’ve seen TikTok Duets where the second video completely changes the mood or message of the original—sometimes turning something serious into a joke. It made me realize how easily content can be reshaped and interpreted in totally different ways online.

  11. social-media-ethics-automation.github.io
    1. Kurt Wagner. Inside Twitter’s ambitious plan to clean up its platform. Vox, March 2019. URL:

      In the Vox article, Twitter researchers found that people were often encouraged to “dunk” on others—mock or criticize—in order to go viral. This behavior was unintentionally promoted by the platform’s design, which rewards popularity over respectful conversation. I think this connects directly to the chapter’s idea of filter bubbles, because when negativity gets rewarded, people stay in spaces where their views dominate and go unchallenged.

    1. These echo chambers allow people in the groups to freely have conversations among themselves without external challenge.

      This made me think about how social media keeps showing me the same kind of videos or posts I already agree with. I noticed that when I keep liking certain content, the algorithm just gives me more of the same thing, and I don’t see many different opinions. It feels comfortable, but I realize now it might also limit how much I learn or grow.

  12. Apr 2025
  13. social-media-ethics-automation.github.io
    1. The Lies and Dangers of "Conversion Therapy".

      This source explains how “reparative therapy,” which claims to change a person’s sexual orientation, has been widely discredited by medical and psychological organizations. One detail that stood out to me was that the American Psychological Association warns this kind of therapy can lead to depression, anxiety, and self-destructive behavior.

    1. This way of managing disabilities puts the burden fully on disabled people to manage their disability in a world that was not designed for them, trying to fit in with “normal” people.

      I really connected with the idea that society puts the full burden on disabled people to adapt to a world not made for them. It made me think about how often systems assume “normal” as the default and expect others to quietly adjust. I wonder what changes would happen if society instead focused on removing barriers instead of making individuals responsible for hiding their struggles.

    1. Emma Bowman. After Data Breach Exposes 530 Million, Facebook Says It Will Not Notify Users. NPR, April 2021. URL:

      The NPR article about Facebook’s 2021 data breach highlights a troubling issue: even when users' personal information is exposed, companies may choose not to alert them. Facebook’s decision not to notify affected users suggests that legal obligations are sometimes valued more than ethical responsibility. It made me think about how data privacy regulations still leave gaps that companies can exploit, and how users are often left unaware and unprotected after major breaches.

  14. social-media-ethics-automation.github.io
    1. Therefore if someone had access to the database, the only way to figure out the right password is to use “brute force,” that is, keep guessing passwords until they guess the right one

      I was surprised that even large companies sometimes store sensitive information like passwords insecurely. It made me realize that we often trust technology too much without questioning how safe it really is. Just because a company is big doesn’t mean it always handles user data properly. This section reminded me to be more critical and cautious when using online services.
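The quoted passage's "brute force" idea can be made concrete with a short sketch. This is a hypothetical toy example, not any real company's storage scheme: it assumes passwords are stored as plain, unsalted SHA-256 hashes, and shows how anyone holding the database could simply guess candidates until one hashes to a match.

```python
import hashlib
import itertools
import string

def brute_force(target_hash, max_len=4):
    """Guess every lowercase string up to max_len until one hashes to target."""
    for length in range(1, max_len + 1):
        for letters in itertools.product(string.ascii_lowercase, repeat=length):
            candidate = "".join(letters)
            # Hash each guess the same way the database did and compare.
            if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None

# A weak 3-letter password stored as a bare hash falls almost instantly.
stored_hash = hashlib.sha256(b"cat").hexdigest()
print(brute_force(stored_hash))  # prints "cat"
```

This is also why real systems use salted, deliberately slow hash functions: they don't prevent guessing, but they make each guess far more expensive.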

  15. social-media-ethics-automation.github.io
    1. Greg Miller. Researchers are tracking another pandemic, too—of coronavirus misinformation. Science, March 2020. URL:

      The article talks about how false information about COVID-19 spreads like a virus. I learned that scientists are using tools usually used to study diseases to see how misinformation moves through social media. One part that stood out to me was how this false information can actually hurt people, for example by making them scared of vaccines or leading them to ignore health guidance. I think this article is important because it shows that fake news isn’t just annoying: it can be dangerous, especially during a health crisis.

    1. By looking at enough data in enough different ways, you can find evidence for pretty much any conclusion you want.

      It reminded me of a time in high school when I was doing a group project on health trends. We found a strong correlation between energy drink sales and student stress levels, and at first, we were convinced one caused the other. But after talking to our teacher, we realized that both could actually be caused by something else—like exams. That experience taught me how easy it is to misread data. When I saw the chart about margarine consumption and divorce rates, it seemed silly at first, but it actually made a serious point. It showed how easy it is to find patterns that look convincing but have no real meaning. It made me think more critically about how data is used to support arguments, and how important it is to ask whether a connection truly makes sense.
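The quoted point, that enough data looked at in enough ways yields evidence for any conclusion, can be shown with a small simulation. This sketch is invented for illustration (it is not from the article): it generates many completely unrelated random data series and finds that some correlate strongly with a target series purely by chance.

```python
import random

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
target = [random.random() for _ in range(10)]

# Compare 1000 unrelated random series against the target; the best
# match looks "convincing" even though it is pure coincidence.
best = max(
    correlation(target, [random.random() for _ in range(10)])
    for _ in range(1000)
)
print(f"strongest chance correlation: {best:.2f}")  # typically above 0.7
```

This is essentially how the margarine-and-divorce chart works: with enough candidate series, a striking pattern is nearly guaranteed to appear somewhere.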

  16. social-media-ethics-automation.github.io
    1. Quinn Norton. Anonymous 101: Introduction to the Lulz. Wired, November 2011. URL:

      The Wired article Anonymous 101 provides a detailed look into the origins and inner workings of the Anonymous group, which is often linked with trolling culture. One detail I found especially interesting is how their use of humor, memes, and chaos isn't just for laughs—it also reflects a kind of protest or activism. This connects to the chapter’s point about trolling evolving from simple pranks to something more organized and community-based, like on 4chan.

    1. These were the precursors to more modern Massively multiplayer online role-playing games (MMORPGS [g15]).

      Reading about how MUDs evolved into MMORPGs made me think about how much online gaming has changed over time. I remember playing games like World of Warcraft when I was younger, and it’s interesting to realize that those games came from such simple text-based beginnings. It’s kind of mind-blowing how far we've come in terms of game design and online interaction.

  17. social-media-ethics-automation.github.io
    1. Zoe Schiffer. She created a fake Twitter persona — then she killed it with COVID-19. The Verge, September 2020.

      This article tells the story of how a Twitter account pretending to be a female physician of color gained trust online by posting supportive messages about healthcare and science. Eventually, it was revealed the person behind the account was not who they claimed to be. One important detail was that the account had even faked a COVID death to gain sympathy. This shows how online “authenticity” can be manipulated, and how hard it is to verify who someone really is online—even when they seem supportive or professional.

  18. social-media-ethics-automation.github.io
    1. Authenticity in connection requires honesty about who we are and what we’re doing

      This quote made me think about how I sometimes change how I act just to be liked. For example, I might pretend everything’s fine when I’m actually not, or act interested in something just to fit in. But doing that is tiring, and it doesn’t lead to real friendships. This quote reminded me that people can only connect with the real me if I’m willing to show who I really am. If I always hide, then the connections I make won’t be real either.

  19. social-media-ethics-automation.github.io
    1. Nicholas Jackson and Alexis C. Madrigal. The Rise and Fall of Myspace.

      The author explains how MySpace’s early success came from giving users a high level of customization and creative freedom, but those same features later became part of its downfall. A detail that stood out to me was how MySpace’s pages became messy and hard to navigate, which made users eventually switch to cleaner platforms like Facebook. This source helps me understand how user experience and design play a major role in whether a social media platform survives.

    1. With these blog hosting sites, it was much simpler to type up and publish a new blog entry, and others visiting your blog could subscribe to get updates whenever you posted a new post, and they could leave a comment on any of the posts.

      This sentence reminds me of how we now use platforms like Canvas to write discussion posts and share reflections. As a non-native English speaker, I really appreciate the chance to write and revise my thoughts before sharing. It gives me more confidence to participate. Blog-style platforms made it easier for people like me to join conversations in public spaces.

  20. social-media-ethics-automation.github.io
    1. Manuela López Restrepo. How the porn bots took over Twitter. NPR, March 2024. URL:

      I read the NPR article about bots on Twitter, and honestly I wasn’t too surprised that so many accounts aren’t real. I’ve noticed weird replies and fake-looking profiles before, but I didn’t realize how hard it is to actually tell what’s real anymore. It’s kind of frustrating, especially when people use Twitter to get news or serious info. Makes me think twice about trusting things I see online, even if the account looks “official.”

    1. What country are you from?

      This section made me realize how often we oversimplify complex real-life situations just to make data easier to handle. For example, the question “What country are you from?” really resonated with me. I’m from Shenzhen, China, but I’ve also spent time living in different places, and sometimes it feels hard to give a single answer. Are we talking about where I was born, where I grew up, or where I currently live? Real life is more complicated than what a data form can capture.
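The mismatch between a single form field and a complicated life can be sketched in code. The field names below are invented for illustration; the point is only that a richer (though still simplified) data model at least makes the ambiguity explicit instead of hiding it.

```python
from dataclasses import dataclass, field

# One required field forces one answer onto a complicated history.
@dataclass
class SimpleProfile:
    country: str  # born there? grew up there? lives there now?

# A slightly richer model separates the questions the single field conflates.
@dataclass
class RicherProfile:
    birth_country: str
    countries_lived_in: list = field(default_factory=list)
    current_country: str = ""

profile = RicherProfile(
    birth_country="China",
    countries_lived_in=["China", "United States"],
    current_country="United States",
)
# The "right" single answer to "What country are you from?" is ambiguous:
print(profile.birth_country != profile.current_country)  # prints True
```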

  21. social-media-ethics-automation.github.io
    1. Sarah Jeong. How to Make a Bot That Isn't Racist. Vice, March 2016. URL: https://www.vice.com/en/article/mg7g3y/how-to-make-a-not-racist-bot (visited on 2023-12-02).

      This article explains how developers could’ve avoided what happened with Tay by using tools that already existed, like slur filters made by other bot creators. One thing that really stood out to me was how Microsoft ignored those tools, even though they were free and easy to access. It made me realize that sometimes the problem isn’t that we don’t know how to prevent these issues—it’s that people don’t take the time to use what’s already out there.
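A minimal sketch of the kind of safeguard the article describes: a word filter applied before a learning bot stores user input. The blocklist entries and bot design here are invented placeholders, not the actual tools the article mentions.

```python
# Stand-ins for a real curated slur list, like the ones other bot
# creators had already published (hypothetical entries for illustration).
BLOCKLIST = {"badword1", "badword2"}

class LearningBot:
    def __init__(self):
        self.learned_phrases = []

    def learn_from(self, message):
        """Store a message only if it contains no blocklisted words."""
        words = set(message.lower().split())
        if words & BLOCKLIST:
            return False  # reject toxic input instead of absorbing it
        self.learned_phrases.append(message)
        return True

bot = LearningBot()
bot.learn_from("hello there")          # accepted
bot.learn_from("badword1 spam spam")   # rejected by the filter
print(bot.learned_phrases)             # prints ['hello there']
```

Even a crude filter like this changes what the bot can absorb, which is the article's point: the prevention tools existed and were cheap to adopt.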

    1. In 2016, Microsoft launched a Twitter bot that was intended to learn to speak from other Twitter users and have conversations. Twitter users quickly started tweeting racist comments at Tay, which Tay learned from and started tweeting out within one day.

      Reading about the Tay bot honestly shocked me. I knew AI could reflect biases in data, but I didn’t realize it could go downhill that fast. It reminded me of a time when I posted something pretty neutral online, and it somehow attracted a bunch of toxic or sarcastic replies. It’s weird how fast online spaces can turn negative—and Tay basically just absorbed all of that without any judgment. This really made me think about how important it is for developers to not just focus on what AI can do, but also what environments we’re placing it into. Without the right safeguards, even something meant to be harmless can turn harmful really fast.

    1. “A person is a person through other people.”

      The idea that “a person is a person through other people” made me think about how identity isn’t just individual, but shaped by our relationships. Reading about Ubuntu helped me understand how, especially in post-colonial Africa, this belief supported healing and unity through community.

    2. Being and becoming an exemplary person (e.g., benevolent; sincere; honoring and sacrificing to ancestors; respectful to parents, elders and authorities, taking care of children and the young; generous to family and others).

      Growing up in China, I often followed traditions like showing respect to elders and participating in family rituals, but I didn’t think much about why. This reading made me reflect on how those habits were shaping my ideas of what it means to be a “good person”—not through rules, but through relationships and small everyday actions.