19 Matching Annotations
  1. Last 7 days
  2. social-media-ethics-automation.github.io
    1. Karen Hao. How Facebook got addicted to spreading misinformation. MIT Technology Review, March 2021. URL: https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/ (visited on 2023-12-08).

      The amount of power these social media sites have over our data is something worth pondering. Facebook had a recent data breach that allowed certain parties to feed politically influential content to targeted users. With social media more prevalent than ever, this is especially dangerous. I think these companies should face harsher consequences when data breaches happen, and there should be a government-adjacent but independent entity that ensures the privacy of all social media users.

    1. In unmoderated online spaces who has the most power and ability to speak and be heard? Who has the least power and ability to speak and be heard?

      The people who have the least power are minorities. Minorities are often harassed on social media, but thanks to moderation, hate speech is usually reported and taken down. If there were no moderation, I think the amount of hate online, especially toward minority groups, would grow exponentially.

  3. social-media-ethics-automation.github.io
    1. Anya Kamenetz. Selfies, Filters, and Snapchat Dysmorphia: How Photo-Editing Harms Body Image. Psychology Today, February 2020. URL: https://www.psychologytoday.com/us/articles/202002/selfies-filters-and-snapchat-dysmorphia-how-photo-editing-harms-body-image (visited on 2023-12-08).

      It's very obvious that there is a huge link between social media and body image issues. When people go to get cosmetic procedures, however, there's no way to tell why someone is getting the procedure (without being very invasive). This rules out the idea of doctors asking patients to think carefully about their decisions. There's also the fact that many people do in fact feel a lot better after getting procedures done. That raises the question of whether there should be some way to oversee this process from a mental health standpoint.

    1. Researchers at Facebook decided to try to measure how their recommendation algorithm was influencing people’s mental health. So they changed their recommendation algorithm to show some people more negative posts and some people more positive posts. They found that people who were given more negative posts tended to post more negatively themselves. Now, this experiment was done without informing users that they were part of an experiment, and when people found out that they might be part of a secret mood manipulation experiment, they were upset [m5].

      A few things to note here. First, it's insane that a big company like Facebook can have its own researchers conclude that its app is harming young girls' mental health and still do nothing about it. They publish this research and still fail to make changes for the good of the masses. Beyond that, running experiments as invasive as the ones described here without consent would definitely feel violating and uncomfortable. I am shocked that there isn't any law against this.

  4. May 2026
  5. social-media-ethics-automation.github.io
    1. Matt Stopera. Monica Lewinsky Has Been Making Jokes About The Clinton Impeachment For Years, And It Really Is Funny Every Single Time. BuzzFeed, September 2021. URL: https://www.buzzfeed.com/mjs538/monica-lewinsky-twitter-comebacks (visited on 2023-12-08).

      I personally don't share my political opinions on social media, and I often find it bizarre to see people build whole internet personas around them. Monica Lewinsky is a good example here. She gets a wide range of reactions from people, from love to hate. Lots of people retweet and comment on her posts, and many even revisit very old tweets with new or updated opinions. This goes to show how a created persona and its posted content are basically online forever. People who craft such in-depth personas might expect that, but there are many cases where people who have newly risen to fame have had old tweets dug up to cause them trouble in the present.

    1. Groups and organizations make their own decisions on what social media content to replicate as well (e.g., a news organization might find a social media post newsworthy, so they write articles about it). Additionally, content may be replicated because of: paid promotion and ads, where someone pays money to have their content replicated; and astroturfing, where crowds, often of bots, are paid to replicate social media content (e.g., like, retweet).

      I recently watched a TikTok that touched on the reasons a person's online content gets replicated, and one huge reason is to spread a broader, more widespread message. In this case, the message was to warn people about college alcoholism. The creator also talked about how when your content gets replicated in this manner (here, the video being used as a bad example), it creates a harmful and non-erasable digital footprint.

  6. social-media-ethics-automation.github.io
    1. Elon Musk [@elonmusk]. Trashing accounts that you hate will cause our algorithm to show you more of those accounts, as it is keying off of your interactions. Basically saying if you love trashing *that* account, then you will probably also love trashing *this* account. Not actually wrong lol. January 2023. URL: https://twitter.com/elonmusk/status/1615194151737520128 (visited on 2023-12-07).

      This honestly makes a lot of sense. A trend on social media these days is influencers gaining traction by rage-baiting their audience. People engage with negative content, and that negative content keeps coming back, feeding into the negative mindset. Now it's clear to me that it's not only users pushing this forward; it's also the companies designing these algorithms.
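      The Musk tweet describes engagement-keyed ranking in one sentence. As a toy sketch of that idea (the account names, topics, and scoring are all invented for illustration, not how any real feed works), a ranker that counts every interaction as interest, hostile or not, ends up surfacing more of whatever you trash:

```python
from collections import Counter

# Hypothetical catalog mapping each account to a single topic.
catalog = {"@rage1": "politics", "@rage2": "politics", "@cats": "pets"}

def recommend(interactions, catalog, k=2):
    """Rank unseen accounts by how much the user engaged with their topic.
    Every interaction counts as interest -- an angry quote-tweet scores
    the same as a like, which is the behavior the tweet describes."""
    topic_weight = Counter()
    for account, n in interactions.items():
        topic_weight[catalog[account]] += n
    unseen = [a for a in catalog if a not in interactions]
    return sorted(unseen, key=lambda a: topic_weight[catalog[a]], reverse=True)[:k]

# The user "trashes" @rage1 fifty times and never touches @cats...
print(recommend({"@rage1": 50}, catalog))  # ['@rage2', '@cats']
```

      Because the score never distinguishes love from hate, the similar "politics" account outranks everything else, which is exactly the feedback loop the annotation above worries about.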

    1. Now, how these algorithms precisely work is hard to know, because social media sites keep these algorithms secret, probably for multiple reasons: they don't want another social media site copying their hard work in coming up with an algorithm; they don't want users to see the algorithm and then be able to complain about specific details; and they don't want malicious users to see the algorithm and figure out how to best make their content go viral.

      This point honestly shows me how capitalism overpowers morals and transparency. These companies may not want to reveal the algorithm because it would expose just how predatory it might be, though I think the biggest reason is to keep other companies from copying it. But because they are allowed to keep the algorithm hidden, there is also no regulation of it. That lack of regulation can have severe consequences for users of all ages (especially kids).

  7. Apr 2026
    1. Additionally, people with disabilities might change their behavior (whether intentionally or not) to hide the fact that they have a disability, which is called masking and may take a mental or physical toll on the person masking, which others around them won’t realize.

      I think this should make us pause and consider all the consequences of not making the world more flexible and adaptive for disabled people. There are mental health consequences, and there are physical consequences, when disabled people must constantly put extra effort into fitting in to be comfortable. As an alternative, society could change the design of settings, tools, etc. to be usable by more kinds of people rather than only by those considered "normal".

  8. social-media-ethics-automation.github.io
    1. David Robson. The women with superhuman vision. BBC, February 2022. URL: https://www.bbc.com/future/article/20140905-the-women-with-super-human-vision (visited on 2023-12-07).

      When reading this, as the title suggested, I thought this woman was someone with something like a superpower. As I read through the article, I realized that because of this superpower, some settings might be overstimulating for her due to excessive color (from her perspective). That's when I came to the further realization that some people would consider this a disability, as she might need accommodations to be comfortable in some spaces. This situation made me realize that "disabilities" aren't necessarily always visible and aren't inherently negative. It also made me realize there are so many types of "disabilities" that I might never have even heard of.

    1. Lyra Hale. New Book Says Facebook Employees Abused Access to Track and Stalk Women. The Mary Sue, July 2021. URL: https://www.themarysue.com/facebook-employees-abused-access-target-women/ (visited on 2023-12-06).

      With great power comes great responsibility. In order to do their jobs, I'm sure some engineers need access to certain data. But how can one determine whether a person will be responsible with that data? I think this comes down to concrete rules and security around data for everyone, even people inside the company. The negligence here lies in Facebook's regulation of people's access to certain data. This really shows how users' trust in platforms and companies starts with building a robust security system within the platform and company itself.

  9. social-media-ethics-automation.github.io
    1. But while that is the proper way to securely store passwords, not all sites do it. So for example, Facebook stored millions of Instagram passwords in plain text [i8], meaning the passwords weren't encrypted and anyone with access to the database could simply read everyone's passwords. And Adobe encrypted their passwords improperly, and then hackers leaked their password database of 153 million users [i9].

      When large data leaks occur, like the Facebook data leak, the people who suffer most are the people whose data got leaked. I am aware that Facebook still faces consequences like a hurt brand reputation and maybe some compensation losses; however, I don't feel that is enough of a consequence. I think that in our country, rules and regulations for these big companies are too lax, and that puts the greater majority of people at risk.
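      For contrast with the plain-text storage the quote describes, here is a minimal sketch of proper salted password hashing using Python's standard-library scrypt (the cost parameters are typical illustrative values, not a production recommendation):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store a random salt plus the scrypt digest -- never the password itself."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify("hunter2", salt, digest))  # True
print(verify("wrong", salt, digest))    # False
```

      Even if the database leaks, an attacker holding only salts and digests has to grind through the deliberately slow scrypt function per guess, unlike the Facebook and Adobe incidents above where the passwords were readable more or less directly.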

  10. social-media-ethics-automation.github.io
    1. Text analysis of Trump's tweets confirms he writes only the (angrier) Android half. August 2016. URL: http://varianceexplained.org/r/trump-tweets/ (visited on 2023-11-24).

      The discrepancy between the tweets written by Trump himself and the tweets written or revised by his team has become glaringly obvious. With most political figures, you can't tell the difference, but in this case you very much can. It paints the President as more unprofessional and causes audiences to lose trust in him, as they feel they are not always getting his true thoughts and feelings in every tweet.

    2. Text analysis of Trump's tweets confirms he writes only the (angrier) Android half. August 2016. URL: http://varianceexplained.org/r/trump-tweets/ (visited on 2023-11-24).

      I feel like for most political figures, people don't think about whether the figure themselves posted a tweet or whether it was their team, because their authentic selves are either filtered or match what is expected of them. With Trump, however, people are no longer wondering who wrote what. It is glaringly obvious when he has posted something versus when his team has. Even if some tweets "sound" like him, I feel I can tell when his team had a hand in editing them. This discrepancy makes the President seem more untrustworthy and unprofessional. All this is to say that discrepancies in how you present yourself cause people to lose trust in you.
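      The varianceexplained analysis itself relied on sentiment lexicons and tweet metadata such as the posting device; as a much cruder toy sketch of the same authorship idea, one can simply compare which words are disproportionately frequent in one set of tweets versus another (the sample tweets below are invented for illustration):

```python
from collections import Counter

def distinctive_words(tweets_a, tweets_b, k=3):
    """Return the words whose frequency is most skewed toward corpus A.
    Add-one smoothing keeps unseen words from dividing by zero."""
    count_a = Counter(w for t in tweets_a for w in t.lower().split())
    count_b = Counter(w for t in tweets_b for w in t.lower().split())
    ratio = {w: (count_a[w] + 1) / (count_b[w] + 1) for w in count_a}
    return sorted(ratio, key=ratio.get, reverse=True)[:k]

# Invented sample tweets standing in for the two halves of the account.
android = ["sad loser sad", "total loser very sad"]
iphone = ["great rally today", "thank you supporters"]
print(distinctive_words(android, iphone, k=2))  # ['sad', 'loser']
```

      A real stylometric analysis would control for corpus size and common stop words, but even this crude ratio surfaces the vocabulary gap that makes the two "voices" on one account so easy to tell apart.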

  11. social-media-ethics-automation.github.io
    1. Social media spaces have allowed humor and playfulness to flourish, and sometimes humor and play are not, strictly speaking, honest. Often, this does not bother us, because the kind of connection offered by joke accounts matches the jokey way they interact on social media. We get to know a lot about public figures and celebrities, but it is not usually considered problematic for celebrity social media accounts to be run by publicist teams. As long as we know where we stand, and the kind of connection being offered roughly matches the sort of connection we’re getting, things go okay.

      This is a very interesting point and something I have noticed more on platforms like TikTok. Some people who make these jokes are appreciated, while others are bashed for doing basically the same thing. I've also found that women are the ones who get bullied more for trying to be humorous, especially if people deem it inauthentic. I wonder if there is a deep-rooted misogynistic element to who is allowed to show humor in this way and who isn't.

    1. One classic example is the tendency to overlook the interests of children and/or people abroad when we post about travels, especially when fundraising for ‘charity tourism’.

      I think it's a very interesting phenomenon that people post pictures chasing social rewards without thinking about the ethics of it. Even when people do think about the ethics, they may decide that the social rewards far outweigh their moral obligations. This reminds me of when Logan Paul posted a video of a deceased person in the woods of Japan, completely disrespecting that person and subjecting viewers to traumatizing content. In my opinion, any normal person would find that inappropriate to post, but at the time, Logan Paul must have thought the social rewards outweighed his moral obligations.

  12. social-media-ethics-automation.github.io
    1. The Onion. 6-Day Visit To Rural African Village Completely Changes Woman’s Facebook Profile Picture. The Onion, January 2014. URL: https://www.theonion.com/6-day-visit-to-rural-african-village-completely-changes-1819576037 (visited on 2023-11-24).

      This woman visits Africa, and her main comment about it is that her Facebook profile picture will change forever. She also tells her friends that their profile pictures will definitely change too. She doesn't talk about the nature, the people (not much), or anything else specific. This reaction feels performative and paints her as a white savior rather than someone who actually cares about the people of Africa or appreciates its beauty.

  13. social-media-ethics-automation.github.io
    1. Tom Knowles. I’m so sorry, says inventor of endless online scrolling. The Times, April 2019. URL: https://www.thetimes.co.uk/article/i-m-so-sorry-says-inventor-of-endless-online-scrolling-9lrv59mdk (visited on 2023-11-24).

      The creator of the endless scroll feature that many apps use today, Aza Raskin, talks about how he regrets creating the feature without thinking much about its consequences. He explicitly says that it was designed to keep people online as long as possible and that he knows the feature can make users do what "he" wants. The article also points to the huge jump in teenage depression rates as a potential result of this.

    1. Books and news write-ups had to be copied by hand, so that only the most desired books went “viral” and spread

      I think one thing to note is how much more clearly standout pieces stood out back then. It took real effort to make something go viral, whereas now I find it's a lot easier. Even so, one could argue that even among the millions of viral videos, there are still standout viral books, videos, movies, etc. I just think it is definitely a lot more saturated now than before.