6 Matching Annotations
  1. Last 7 days
  2. social-media-ethics-automation.github.io
    1. Caroline Delbert. Some People Think 2+2=5, and They’re Right. Popular Mechanics, October 2023

      This reminds me of 1984 by George Orwell, where the government brainwashed everyone into believing that 2+2=5 and that this was objective truth. But this article makes me think that 2+2=4 may itself be a false objective truth we were all brainwashed to believe, and that we aren't opening our minds to the subjective truth of reality.

    1. While we don’t have direct access to all the data ourselves, we can imagine that different definitions would lead to different results. And there isn’t a “best” or “unbiased” definition we should be using, since all definitions are simplifications that will help with some tasks and hurt with others.

      I think this is a reason to be skeptical of all statistics we see or hear. It reminds me of the statistics about how much water AI actually uses: many of those numbers are fudged or inconsistent because people measure things in different ways, like what counts as water usage. So I think it's important to take statistics like that with a grain of salt, because we never know what has been simplified, and we can't verify the underlying data for ourselves.

  3. social-media-ethics-automation.github.io
    1. Steven Tweedie. This disturbing image of a Chinese worker with close to 100 iPhones reveals how App Store rankings can be manipulated. February 2015. URL:

      I really wonder how many other instances of supposed botting are actually real human beings doing tedious work that could easily be done by a bot. Seeing that she may also be working in poor conditions makes me upset that this may be someone's actual day-to-day job. I also dislike how dishonest the whole scheme is; manipulating App Store rankings this way seems unethical all around.

    1. How are people’s expectations different for a bot and a “normal” user?

      I think people expect bots to have no free will or thoughtful intent behind their actions, while a human user clearly has both. They might expect a bot to be straightforward and not accountable for its actions, because it only does what it is coded to do, while human users are expected to take responsibility for what they post because of their free thinking and capacity for intention.

  4. Apr 2026
    1. There are absolute moral rules and duties to follow (regardless of the consequences). They can be deduced by reasoning about the objective reality.

      I don't agree that morality is objective. Absolute moral rules fail to account for context, and morality in practice depends on the situation. Take the "lying is wrong" example: lying may be wrong in general, but in a situation where a lie might save someone's life, lying is actually the morally correct choice. So morality shifts with situational context rather than following a concrete rule.

    1. I absolutely believe that tech workers should be responsible for thinking through the ethical implications of their creations. For example, deepfake technology can easily be weaponized for many terrible things, and I think the people who create this technology know that but simply choose to turn a blind eye for the sake of their own profits. I can't think of any way to use that kind of technology ethically or in a way that benefits the world, and knowing that, the creators are being hugely irresponsible by still releasing it to the public.