- Apr 2023
-
theconversation.com
-
To regain control, we need cognitive strategies that help us reclaim at least some autonomy and shield us from the excesses, traps and information disorders of today’s attention economy.
In today's world, this rings very true. We need something, or someone, to teach us as media consumers how to sift through what is real and what isn't. As of now there isn't a specific course or person to learn from, but maybe there should be some kind of shared standard. I just have the sense that I am always fighting to stay ahead of the newest change that allows misinformation to spread.
-
- Mar 2023
-
reutersinstitute.politics.ox.ac.uk
-
What makes these networks so appealing to some younger audiences?
This is a great question. To answer it: there is a huge disconnect between younger audiences and watching the news the way people did in the past. TikTok and platforms like it tend to summarize news into shorter pieces, which appeal to viewers who want a quick encapsulation of what has happened recently in the world.
-
TikTok entering the field
From my personal experience, the addition of TikTok has completely changed the game when it comes to getting news through social media. After a short amount of research, a viewer will find that TikTok has built recommendation algorithms that shape the user's experience by controlling what they see based on many different factors. The same goes for when a new news story comes out: there is no guarantee the user will even see the videos of the TikTokers they follow. The platform can also easily spread misinformation when a single video containing false information goes viral.
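
To make that point concrete, here is a minimal sketch, purely hypothetical and not TikTok's actual ranking system, of how an engagement-weighted feed can bury a followed creator's news video. Every weight, field name, and account name below is an invented assumption for illustration only.

```python
# Hypothetical sketch (not TikTok's real algorithm): a tiny engagement-weighted
# feed ranker showing how a followed creator's news video can still be pushed
# below viral entertainment clips. All weights and names are invented.
from dataclasses import dataclass

@dataclass
class Video:
    creator: str
    topic: str
    predicted_watch_time: float  # seconds the model expects this user to watch
    likes_per_view: float        # engagement signal
    followed_creator: bool       # does the user follow this creator?

def score(video: Video) -> float:
    # Engagement signals dominate; following a creator only adds a small boost,
    # so timely news from followed accounts can still be outranked.
    return (
        0.6 * video.predicted_watch_time
        + 40.0 * video.likes_per_view
        + 5.0 * (1.0 if video.followed_creator else 0.0)
    )

candidates = [
    Video("news_creator_i_follow", "breaking news",
          predicted_watch_time=12.0, likes_per_view=0.03, followed_creator=True),
    Video("viral_dance_account", "entertainment",
          predicted_watch_time=28.0, likes_per_view=0.12, followed_creator=False),
    Video("random_prank_channel", "entertainment",
          predicted_watch_time=22.0, likes_per_view=0.09, followed_creator=False),
]

# Rank the candidate videos; the followed news creator lands at the bottom
# of the feed despite being followed.
for video in sorted(candidates, key=score, reverse=True):
    print(f"{score(video):6.1f}  {video.creator} ({video.topic})")
```

Under these made-up weights, the breaking-news video from a followed account scores lowest, which mirrors the experience of following a news creator and still never seeing their coverage.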
-
- Feb 2022
-
www.wired.com
-
Americans’ free-speech rights weren’t harmed in the takedown of Russian troll pages.
I think this comes up a lot when we talk about fighting misinformation and disinformation. When we talk about getting rid of misinformation, a lot of people bring up the right to freedom of speech. I can understand why some people would be concerned about their rights being violated or infringed upon through censorship. However, we must not forget the responsibility that comes with having rights: you can say whatever you please, but you are responsible for the statements you make if they are known to cause harm to others.
-
That’s because setting policy around fake information that’s not seeded by a hostile state actor or a spam page remains an issue platforms are still deciding how to handle. YouTube chose to remove the video; Facebook chose to leave it up, and to leverage the “inform” approach (from its “remove, reduce, inform” framework).
Why are platforms such as Facebook deciding to leave this inaccurate information up on their sites? Although Facebook isn't a news site, it still should not encourage the spread of doctored information by allowing it to stay public.
-
“If we are not serious about facts and what’s true and what’s not, if we can’t discriminate between serious arguments and propaganda, then we have problems.”
I think something that is not acknowledged often enough in the conversation about misinformation and disinformation is the unspoken social pressure to seem "likeable," "in-the-know," or "always right." Especially now that social media is such a big part of today's social climate, many people may not admit it, but a lot of us are fearful that if we say or think the wrong things, people may not befriend us, embrace us, or like us. So a certain social engineering is occurring, where people subscribe to certain ideas simply because they seem to be the group consensus, and that creates pressure. We all must be accountable for debunking lies.
-
The velocity of social sharing, the power of recommendation algorithms, the scale of social networks, and the accessibility of media manipulation technology has created an environment where pseudo events, half-truths, and outright fabrications thrive.
As Daniel Kahneman has said, we are all "cognitively lazy." This is a very telling statement that helps reveal why we live in a world full of "half-truths" and, deeper than that, why we continue to accept them. A lot of the time we do not want to take the time needed to evaluate information, and instead simply accept things as true.
-
- Jan 2022
-
www.nbcnews.com
-
The announcement was the apparent end of one of the most haltingly successful companies to ride a wave of interest in online and directly sold alternative medicines — immunity-boosting oils, supplements, herbs, elixirs and so-called superfoods that, despite widespread concerns over their efficacy and safety, make up a lightly regulated, multibillion-dollar industry.
This is a perfect example of the contradiction in our system of industries, which in one breath communicates that we need to debunk misinformation, yet still encourages profiting off the spread of misinformation and the naiveté of its consumer market.
-
Sellers packed video calls mourning the death of their miracle cure, railing against executives who had taken their money and seemingly run, and wondering how they might recoup the thousands of dollars they paid for BOO that never arrived.
I had a similar reaction to many readers here. It is an interesting perspective: obviously, if something did not live up to the quality or value it was marketed as having, one would be very upset. But it is also a prominent example not only of misinformation, but of just how quickly misinformation can spread. More than that, this situation highlights why people so easily and quickly believe, defend, and spread misinformation: unfortunately, it usually stems from a prominent trend, or a need that creates desperation for more information on a subject, without verifying credible sources.
-