8 Matching Annotations
  1. Feb 2023
    1. Removing bias from the existing decision process.

      This article is all about how we can utilize data for its intended purpose, but there are limitations in the way. I think we need to focus on removing bias from existing data before we start collecting new data. Data is useless if there is other data that contradicts it, which it often does. I feel as though I am always using statistics to defend a point, while the opposing side has statistics that defend their side just as well. Biases in data make it really hard to utilize.

    1. that the company over-relied on numbers

      This is what I was talking about in my annotation on the last article. The numbers in data are not the answer to the problem; they are the reason we come up with solutions or answers. Numbers can also become really distracting from everything else around them. Nokia, for example, was only focused on evaluating the data it was collecting, rather than using that data to realize it was missing something. Numbers can really blind people to aspects beyond quantitative data.

    1. And this lack of context becomes even more of a liability when accompanied by the kind of marketing hype we see in GDELT and other Big Dick Data projects.

      I think lack of context is a huge part of the reconstruction we need to consider in data collection. Just like how the data collected for the government in Brazil was really hard to evaluate and draw conclusions from, because the numbers raised so many questions. I think people miss that data collection is supposed to tell a story. Numbers are only a small part of the bigger picture being drawn.

    1. Algorithmic decision-making based on profitability and potential for cure threatens disabled and chronically ill people’s lives, as physicians increasingly rely on discriminatory algorithms for diagnoses, and insurance companies choose to overrule physicians’

      There was a huge healthcare scandal involving an algorithm deciding who needed care over others, which then translated into insurance decisions. The algorithm effectively factored in race: it saw white patients as more "sick" than Black patients because white patients went to receive care more often. The algorithm did not factor in the WHY behind this: income disparity. White people make more income on average, so they have more financial freedom to spend money on healthcare. By deciding who needed care on that basis, the algorithm created an even bigger racial divide in access to care.

    1. It might be to avoid collecting data on whether someone is cis or transgender, to make all gender data optional, to not collect gender data at all, or even to stick with binary gender categories. Social computation researcher Oliver Haimson has asserted that “in most non-health research, it’s often not necessary to know participants’ assigned gender at birth.”44 Heath Fogg Davis agrees: his book Beyond Trans argues that we don’t need to classify people by sex on passports and licenses, for bathrooms or sports, among other things.45 By contrast, J. Nathan Matias, Sarah Szalavitz, and Ethan Zuckerman chose to keep gender data in binary form for their application FollowBias, which detects gender from names, in order to avoid making a person’s gender identity public against their wishes.

      I think evaluating gender data is important for us as women. Did you know that most menstrual-cycle research is done and based on male anatomy? The information I was receiving for years never made sense for me and my period, especially as someone with PCOS. I recently bought a book called "The Female Factor" by Dr. Hazel Wallace, where she writes that the reason she wrote it is because "the male body has always been the default body in biomedical research." So she created the book using studies and research that pertain only to women.

      I feel as if not including gender in data puts more of a weight on us as women; we need statistics and numbers built around us. I also feel this applies to the non-binary community. The answer is not to erase gender, it is to embrace it.

    1. They also found that there was still no reporting mechanism for ensuring that hospitals follow national safety standards, as is required for both hip surgery and cardiac care.

      The healthcare system is very under the radar, especially in terms of tracking its data. One of my clients advocates for Black nurse protection, so I have learned a lot about abuses within the healthcare system. I am not sure if the problem is the data hospitals fail to track or the data they fail to report. Hospitals tend to want to keep everything hushed, and have a lot of orders in place to make it that way.

    1. What we see here instead is an ongoing social process in which scientific knowledge, technological invention, and corporate profit reinforce each other in deeply entrenched patterns that bear the unmistak­able stamp of political and economic power.

      This is a very powerful statement that defines our society's progression well. Unfortunately, ordinary people pay the price for the evolution of technology, just like how the mechanical tomato harvester cost 32,000 jobs. It really makes me question the intentions behind advancing technology. I feel as if only wealthy people can truly benefit from technology, which creates a bigger divide in our financial system.

    1. the value of personal data ultimately occlude the legal and economic structures, material conditions, and conceptual assumptions that make the capture and exploitation of digital data

      This point refers to how our "value[s] of personal data" are what lead to the "exploitation of digital data." We, as users, give away our personal data almost every day without questioning who has access to it or where it is going. For example, if a stranger walked up to you and asked for your address, you would most likely decline. Yet when a new app, website, or game asks for personal data such as phone numbers, addresses, or emails, we give it without hesitation. This is because we value our personal data differently when it is digital.