- Last 7 days
-
social-media-ethics-automation.github.io
-
Social media platforms have some ability to influence what goes viral and how (e.g., recommendation algorithms, what actions are available, what data is displayed, etc.),
Social media platforms are highly biased toward what they do and do not want to make viral. For example, one of my friends had a public TikTok account. Her videos were doing fairly well, but then TikTok started showing her a prompt that a paid promotion was available to get more views. She paid for it once, and her video went insanely viral. She didn't pay for promotion on any of her later videos, and every video posted after that did even worse than her older ones. The only explanation we could come up with was that TikTok was indirectly pushing her to keep paying for promotion if she wanted to go viral again, which feels so unfair.
-
-
social-media-ethics-automation.github.io
-
Similarly, in 2011, 13-year-old Rebecca Black made a music video called “Friday,” which spread virally for being cheesy and bad.
Going viral can either benefit someone or harm their life. For example, 13-year-old Rebecca Black made her music video as something fun. Putting yourself on such a large platform and showcasing your talent takes a lot of courage, so instead of applauding that, people made her go viral for all the wrong reasons and gave her backlash, which is so demotivating and can put someone off posting ever again.
-
-
social-media-ethics-automation.github.io
-
where people get filtered into groups and the recommendation algorithm only gives people content that reinforces and doesn’t challenge their interests or beliefs.
This seems very true to me. My friends and I seem to have been filtered into the same groups: on TikTok our For You pages are about 95% similar, and whenever one of us shares a TikTok the other has usually already seen it. The same goes for our Spotify recommendations, probably because we have really similar playlists. The point about interests also rings true; my friends and I are always talking about food and traveling, and that's what our whole For You page is about.
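As a side note, the "filtering into groups" the reading describes can be illustrated with a tiny, made-up recommendation step. This is only a sketch of the general idea (users with overlapping histories get pushed each other's content), not TikTok's or Spotify's actual algorithm, and all the names and numbers are invented.

```python
# Toy sketch: if two users' liked-video sets overlap heavily, each gets
# recommended what the other already watched, so their feeds converge
# instead of diversifying. Not any platform's real algorithm.

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of liked videos (0..1)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(target, likes, top_n=3):
    """Recommend videos liked by the users most similar to `target`."""
    scores = {}
    for other, other_likes in likes.items():
        if other == target:
            continue
        sim = jaccard(likes[target], other_likes)
        if sim == 0:
            continue  # ignore users with no overlap at all
        for video in other_likes - likes[target]:
            scores[video] = scores.get(video, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

likes = {
    "me":        {"food1", "travel1", "travel2"},
    "my_friend": {"food1", "travel1", "food2"},
    "stranger":  {"gaming1", "gaming2"},
}
print(recommend("me", likes))  # ['food2'] -- my friend's likes, not the stranger's
```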
-
-
social-media-ethics-automation.github.io
-
Recommendations can go poorly when they do something like recommend an ex or an abuser because they share many connections with you.
I have experienced this quite frequently myself, as I have three accounts: one for my school friends, one for my close friends, and another for my family. Because of these recommendation algorithms, all three accounts end up with the same synced account suggestions, which can be frustrating. For example, I see my family members sending requests to my friends-only account, so the algorithm isn't as smart as it portrays itself to be. Also, if someone made a separate account just to keep an eye on another person, it might start popping up in that person's suggested accounts. So these algorithms can have both positive and adverse effects. A positive example is that when you block someone on Instagram, it makes sure to block other accounts that the user has or may later make.
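To make that frustration concrete, here is a minimal "people you may know" sketch. It is only my assumption of how such suggestions roughly work (ranking candidates by how many connections they share with you), not Instagram's real code, and the account names are made up. A rule like this cannot tell a family member from an ex or a stalker: shared connections are all it sees.

```python
from collections import Counter

def suggest(user, follows, top_n=3):
    """Suggest accounts that share the most connections with `user`."""
    mutual_counts = Counter()
    for candidate, their_follows in follows.items():
        if candidate == user or candidate in follows[user]:
            continue  # skip yourself and accounts you already follow
        mutual_counts[candidate] = len(follows[user] & their_follows)
    return [name for name, count in mutual_counts.most_common(top_n) if count > 0]

follows = {
    "school_account": {"classmate1", "classmate2", "cousin"},
    "family_account": {"cousin", "aunt", "classmate1"},
    "aunt":           {"cousin", "family_account"},
    "classmate1":     {"classmate2", "school_account"},
}
print(suggest("school_account", follows))  # ['family_account', 'aunt']
```

The family account surfaces on the school account purely because of shared connections, which is exactly the kind of unwanted syncing described above.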
-
- Oct 2024
-
social-media-ethics-automation.github.io
-
When designers and programmers don’t think to take into account different groups of people, then they might make designs that don’t work for everyone.
This reminded me of a contrasting example I saw in my CSE 121 class, where some designers and programmers actually do take disabled people into account. It was intriguing to see how computer science is open to anyone determined to learn it, including people like Saqib Shaikh, who is blind, and others like him, without making them feel unrepresented in any way. He could code just by listening and speaking, and accessibility features such as sped-up speech can also strengthen the overall quality and effectiveness of the tools for all users.
-
-
social-media-ethics-automation.github.io
-
Additionally, people with disabilities might change their behavior (whether intentionally or not) to hide the fact that they have a disability, which is called masking and may take a mental or physical toll on the person masking, which others around them won’t realize.
This makes me realize how much we take for granted when we don't face societal pressure to hide something that is completely natural. Masking is a very understandable behaviour that disabled people adopt to be accepted and to fit in, because otherwise some inconsiderate people might treat them differently and see them as abnormal. However, this can take a toll on a person's mental health, so as a society we should make such people feel at ease, open, and welcomed, and treat them as one of us.
-
-
social-media-ethics-automation.github.io
-
Employees at the company misusing their access, like Facebook employees using their database permissions to stalk women
This can be highly problematic, as employees with that access are essentially logged into your account and can even view posts set to the "only me" privacy setting. It reminds me of someone I know who was mistreated by her manager and had a dispute over her wages; right before handing in her resignation letter, she leaked the company's database by posting it on Twitter, including its budgeting and balance sheet.
-
-
social-media-ethics-automation.github.io
-
When Elon Musk purchased Twitter, he also was purchasing access to all Twitter Direct Messages
This is concerning because we tend to use social media sites like Instagram to chat with our friends and family, which includes a lot of personal information we wouldn't want anyone else to know. Now that I have read this, I will think twice before saying something very personal over social media messages and will use SMS on my phone instead, because on social media there always seems to be a third party tracking your actions, which feels like an invasion of privacy.
-
-
social-media-ethics-automation.github.io
-
Then Sean Black, a programmer on TikTok saw this and decided to contribute by creating a bot that would automatically log in and fill out applications with random user info, increasing the rate at which he (and others who used his code) could spam the Kellogg’s job applications:
This is a great example of using social media for the right cause and of how much context matters. It shows that trolling can be done ethically, to seek justice for those who have been wronged and to force such a big company to act right. It's interesting to see how the company's decision backfired because of this trolling.
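For context on what "automatically filling out applications with random user info" looks like in code, here is a minimal, purely illustrative sketch. It is not Sean Black's actual bot: the endpoint and form field names are made up, and this version only prints each would-be submission (a dry run) instead of sending anything.

```python
import random
import string

FAKE_ENDPOINT = "https://example.com/apply"  # placeholder, not any real application form

def random_name(length=6):
    """Make a random capitalized fake name."""
    return "".join(random.choices(string.ascii_lowercase, k=length)).capitalize()

def build_application():
    """Fill one application with random user info."""
    first, last = random_name(), random_name()
    return {
        "first_name": first,
        "last_name": last,
        "email": f"{first.lower()}.{last.lower()}@example.com",
        "phone": "".join(random.choices(string.digits, k=10)),
    }

for _ in range(3):  # a real bot of this kind would loop far more times
    payload = build_application()
    print(f"Would submit to {FAKE_ENDPOINT}: {payload}")
```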
-
-
social-media-ethics-automation.github.io
-
So social media sites use the data they collect to try and figure out what keeps people using their site, and what can they do to convince those users they need to open it again later.
Social media achieved this goal long ago, since this generation is on their phones all day. For example, every day when I check my screen time it's over 8 hours, and about 70% of that is spent on TikTok. Through data mining, the app has pretty much figured out what phase of life I am in, and every TikTok I see is relatable, so I feel a connection with it. If someone goes through a break-up, their whole For You page fills with TikToks about break-ups, about someone going through the same thing, or something comforting, keeping them hooked. Whatever I am going through in my life, it's as if my TikTok knows all of it and shows me exactly matching posts. It makes me wonder whether data mining is being used to extract my conversations with my friends, and whether what I like and repost depending on my mood is also being tracked.
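The feedback loop I'm describing can be sketched in a few lines. This is only an assumption about the general mechanism (log what a user lingers on, then rank the next posts by it), not TikTok's real system; the topics and timings are invented.

```python
from collections import defaultdict

watch_seconds = defaultdict(float)

def log_view(topic, seconds):
    """Record how long the user watched a post about `topic`."""
    watch_seconds[topic] += seconds

def rank_feed(candidate_posts):
    """Order (topic, post) pairs by the user's accumulated watch time per topic."""
    ranked = sorted(candidate_posts, key=lambda p: watch_seconds[p[0]], reverse=True)
    return [post for topic, post in ranked]

# A user lingering on break-up content quickly reshapes their whole feed.
log_view("breakup", 45.0)
log_view("breakup", 60.0)
log_view("cooking", 5.0)
print(rank_feed([("cooking", "pasta tutorial"),
                 ("breakup", "comforting story"),
                 ("breakup", "someone going through the same thing")]))
# -> ['comforting story', 'someone going through the same thing', 'pasta tutorial']
```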
-
-
social-media-ethics-automation.github.io
-
Rule 43. The more beautiful and pure a thing is - the more satisfying it is to corrupt it
This captures the core purpose of trolling. If someone or something is confident in who they are and unbothered by what people say, that very peace stings the trolls: the person is so happy and at peace, so why not ruin it by being negative about them, putting them down, and inciting a reaction? There would be no fun in bringing down someone who is already low, which is exactly what the rule means by "The more beautiful and pure a thing is - the more satisfying it is to corrupt it."
-
-
social-media-ethics-automation.github.io
-
Parasocial relationships are when a viewer or follower of a public figure (that is, a celebrity) feel like they know the public figure, and may even feel a sort of friendship with them, but the public figure doesn’t know the viewer at all.
I feel like this is something much more serious than it looks and should be taken into consideration. When my friend's favourite celebrity died, she got so upset that she went into depression for a few months; she couldn't accept that he had passed away. She felt a relationship with him, but he didn't even know she existed. Similarly, when there is a celebrity couple, people act like they are part of the family, and when the couple breaks up they feel devastated and sometimes say, "We have lost hope in love for ourselves." It is ironic how social media can make us feel so connected to someone we don't even know personally.
-
-
social-media-ethics-automation.github.io
-
Dr. McLaughlin pretended to be a person (@Sciencing_Bi) who didn’t exist.
This is not a good reflection on how social media works. Something this serious involved the emotions of so many people following the account; when they heard that the person had passed away, they felt sad and even attended the Zoom memorial service. It makes me realize how fake social media can be and how it can play with people's minds and misuse their emotions.
-
-
social-media-ethics-automation.github.io
-
Sometimes designers add friction to sites intentionally. For example, ads in mobile games make the “x” you need to press incredibly small and hard to press to make it harder to leave their ad:
I feel like this is a very clever (if manipulative) approach by designers. Sometimes the "x" is too small to press, so tapping it just opens the website instead, or it takes a while to show up, and most users are too lazy to go back to what they were doing and end up watching the whole ad. That surely earns the product views and even sales. It is crazy how the whole world is hooked on social media and their phones all day, and the designers make money off of that.
-
-
social-media-ethics-automation.github.io
-
In the 1980s and 1990s, Bulletin board system (BBS) provided more communal ways of communicating and sharing messages. In these systems, someone would start a “thread” by posting an initial message. Others could reply to the previous set of messages in the thread.
I find it interesting how complicated things get simplified over time. Even though BBSs were commonly used in the 1980s and 1990s, they don't seem very intriguing to me; they look very dry without any pictures or a comment section where everyone can fight, lol. If we can't post memes on social media, what's the point? Now, though, social media has become so much simpler and more fun. We go to apps such as TikTok or Instagram to take our minds off work, and the picture of a BBS looks like coding to me, so I wouldn't really use a site that reminds me of work even in my free time.
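Fittingly, the thread structure the reading describes is simple enough to sketch as a small data structure: one initial message starts the thread, and every later message is appended as a reply. The field names and example messages here are my own invention, not from any particular BBS.

```python
from dataclasses import dataclass, field

@dataclass
class Thread:
    author: str
    initial_message: str
    replies: list = field(default_factory=list)  # list of (author, text) pairs

    def reply(self, author, text):
        """Append a reply to the previous set of messages in the thread."""
        self.replies.append((author, text))

    def show(self):
        print(f"{self.author}: {self.initial_message}")
        for author, text in self.replies:
            print(f"  {author}: {text}")

thread = Thread("alice", "Anyone else trying the new dial-up speeds?")
thread.reply("bob", "Yes, upgraded my modem last week.")
thread.reply("carol", "Which board are you two calling in from?")
thread.show()
```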
-
-
social-media-ethics-automation.github.io
-
Twitter is accusing Musk’s complaint of being an invented excuse to back out of the deal, and the case is now in court.
It's intriguing to see Musk claim that Twitter's stated percentage of spam bots wasn't true, and I do feel he had a point: the bot accounts we read about earlier in class already seemed numerous, and I'm sure there are a ton more hidden ones. Despite all that, Twitter pressured Musk into completing the purchase, and now that he has bought it I would be interested to see what the actual percentage is.
-
If we are writing down what someone said, we are losing their tone of voice, accent, etc.
This statement made me realize that it is something I have thought about a lot, almost subconsciously, when handling situations. I have often been badly misunderstood simply because I typed something out: the other person didn't hear me say it, misinterpreted it, and I had to explain myself, when it would have been easy to understand through my tone of voice. It's very easy to misjudge someone's perspective when it's typed out.
-
-
social-media-ethics-automation.github.io
-
Bots present a similar disconnect between intentions and actions
I believe that as the world rushes toward technology and gets almost everything done by bots, it is high time we focus on the connection between a bot's actions and the responsibility of the people behind it. Otherwise this gap can be badly misused: the people running the bots could get away with their bad intentions and actions simply by blaming the bot's behavior.
-
-
social-media-ethics-automation.github.io
-
Bay found that 50.9% of people tweeting negatively about “The Last Jedi” were “politically motivated or not even human,” with a number of these users appearing to be Russian trolls. The overall backlash against the film wasn’t even that great, with only 21.9% of tweets analyzed about the movie being negative in the first place.
It's concerning how bots can harm someone who is so invested in their work. In the middle of directing a movie, this kind of backlash can completely break a person's confidence, because they can't distinguish genuine criticism from fake negativity online; only later, once it was researched, did the reality turn out to be very different.
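To put rough numbers on that: if only 21.9% of the analyzed tweets were negative, and about 50.9% of the negative voices were politically motivated or not even human, then (treating tweets and accounts interchangeably as a rough approximation) only around 0.219 × (1 − 0.509) ≈ 11% of the analyzed tweets were genuine negative reactions.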
-
-
social-media-ethics-automation.github.io
-
“Rational Selfishness”: It is rational to seek your own self-interest above all else. Great feats of engineering happen when brilliant people ruthlessly follow their ambition.
Practicing this framework is highly debatable. Egoism is something almost every individual practices daily, keeping themselves above everyone else and making themselves the priority, which is understandable. However, when they see other people doing the same, those people are considered selfish and accused of not caring about anyone else's needs or views, which I feel is hypocritical: at the end of the day, everyone naturally thinks of themselves first, so we shouldn't blame a person for doing something that everyone does on a daily basis.
-