2 Matching Annotations
  1. Feb 2022
    1. People began to feel that tech companies were not just neutral hosts; they bore some responsibility for what their algorithms circulated.

      This sentiment reminds me of the learning materials we read through in Module 4 regarding Facebook knowing that teen girls were suffering from what its algorithms were suggesting to them. We even read this week that when it comes to changing algorithms to stop recommending misinformation, social media companies "want to do the right thing, but don’t follow through because it hurts engagement on the platform" (Van der Linden, 2021).

  2. Jan 2022
    1. Facebook did not respond to requests for comment on the BOO groups or whether their claims violated the company’s content policies.

      Yet another instance of Facebook passively allowing disinformation and outright MLM scams to thrive on its platform. One of the case studies we looked at this week noted a similar situation: the author found that a Black Lives Matter page was actually a personal page fraudulently raising money for false causes. Facebook did not take action until the information was made public. Even one of the tools the author used to uncover the fraud, Facebook Graph Search, is no longer available to the public. How is the average internet user supposed to sniff out disinformation without tools like this?