30 Matching Annotations
  1. Last 7 days
    1. Beauty-based discrimination is rarely addressed, but one such instance was the fuss about Facebook's "feeling fat" emoji. The episode was framed as though Facebook were the guilty party, but I would argue that the offended people were the guilty party here, as they're the ones who made the leap from "fat" to "sub-human". In some cultures, being fat is acceptable or even desirable.
    2. "Because beauty will be so readily accessible, and skin color and features will be similar, prejudices based on physical features will be nearly eradicated. Prejudice will be socioeconomically based.”
    3. films such as Who Framed Roger Rabbit and Beauty and the Beast also touch on the important topic of discrimination
    4. Saying for instance that someone's good deed makes them a "beautiful person"; or saying that a "beautiful 8-year-old girl is missing" encourages the idea that their beauty is some kind of symbol of their innocence or perhaps their status as a human
    1. A safe name

      "a safe name" could mean that the name was used to protect her for discrimination, A name that they see as is safe to use in their eyes.

  2. Mar 2020
    1. If these asset owners regarded the “robots” as having the same status as guide dogs, blind people or default human citizens, they would undoubtedly stop imposing CAPTCHA tests and just offer APIs with reasonable limits applied.
    2. Robots are currently suffering extreme discrimination due to a few false assumptions, mainly that they’re distinctly separate actors from humans. My point of view is that robots and humans often need to behave in the same way, so it’s a fruitless and pointless endeavour to try distinguishing them.
    3. In order to bypass these discriminatory CAPTCHA filters
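
      The suggestion above of offering "APIs with reasonable limits applied" instead of CAPTCHA walls is essentially per-client rate limiting. A minimal token-bucket sketch follows; it is only an illustration of that idea, and every name and number in it is assumed rather than taken from the annotated post.

      ```python
      # Minimal token-bucket rate limiter: the "reasonable limits" alternative
      # to CAPTCHA walls. Rates, capacities, and names are illustrative.
      import time


      class TokenBucket:
          """Allow up to `rate` requests per second, with bursts up to `capacity`."""

          def __init__(self, rate: float, capacity: int):
              self.rate = rate                  # tokens refilled per second
              self.capacity = capacity          # maximum burst size
              self.tokens = float(capacity)
              self.last = time.monotonic()

          def allow(self) -> bool:
              now = time.monotonic()
              # Refill in proportion to elapsed time, capped at capacity.
              self.tokens = min(self.capacity,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if self.tokens >= 1:
                  self.tokens -= 1
                  return True
              return False                      # caller would answer HTTP 429


      # One bucket per API key, so human-driven and robot-driven clients are
      # treated identically instead of being told apart by a CAPTCHA.
      buckets: dict[str, TokenBucket] = {}

      def handle_request(api_key: str) -> int:
          bucket = buckets.setdefault(api_key, TokenBucket(rate=5, capacity=20))
          return 200 if bucket.allow() else 429
      ```
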
    1. The digital divide is real, and in the coming months, those without internet access or devices that can run newer software will be shut out of many of the digital communities we’re building to support one another.
    1. The deception enabled by dark pattern design not only erodes privacy but has the chilling effect of putting web users under pervasive, clandestine surveillance; it also risks enabling damaging discrimination at scale.
  3. Dec 2019
    1. In low-income countries the vast majority are unwilling to pay for effective drugs simply because they are unable to pay. Low-income nations need more price discrimination—and vastly lower prices—if they are ever to afford the world's most effective medicines.

      Does price discrimination help poor countries here? Which countries have more price-inelastic demand? Does PD increase social welfare for this case?
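
      For reference, the textbook condition behind these questions (not stated in the article itself) is the standard third-degree price discrimination rule:

      ```latex
      % A seller facing two separable markets with demand elasticities
      % \varepsilon_1 and \varepsilon_2 maximises profit where
      \[
        MR_1 = MR_2 = MC,
        \qquad
        MR_i = P_i\!\left(1 - \frac{1}{|\varepsilon_i|}\right),
      \]
      % so that
      \[
        P_i = \frac{MC}{1 - 1/|\varepsilon_i|}.
      \]
      % The market with the less elastic (more price-inelastic) demand gets the
      % higher price; the more price-sensitive market gets the lower one.
      % Whether this raises social welfare depends largely on whether it expands
      % total output, for example by letting low-income markets be served at all.
      ```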

    1. She found a German seller offering packs of the same nappies she buys in Luxembourg for the same price she normally pays. Looking more closely at the unit price, however, Nadine realised that the German packs contained 140 nappies, whereas the packs in Luxembourg had only 90, making them much more expensive. She switched straight away to buying all her nappies from the German shop.

      If this was price discrimination... which country's consumers likely had the higher price elasticity?
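
      A quick back-of-the-envelope check of the gap described above. The pack price below is an assumed figure, since the article only says both packs sold for the same price; the percentage gap depends only on the pack sizes.

      ```python
      # Unit-price comparison for the nappies example. The 20 EUR pack price is
      # an assumption for illustration; the article gives only the pack sizes.
      pack_price = 20.00          # EUR, same in both shops (assumed)
      nappies_de = 140            # nappies per pack in the German shop
      nappies_lu = 90             # nappies per pack in Luxembourg

      unit_de = pack_price / nappies_de   # ~0.14 EUR per nappy
      unit_lu = pack_price / nappies_lu   # ~0.22 EUR per nappy

      # The ratio 140/90 means the Luxembourg unit price is ~56% higher,
      # whatever the common pack price happens to be.
      markup = nappies_de / nappies_lu - 1
      print(f"Luxembourg unit price is {markup:.0%} higher per nappy")
      ```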

  4. Aug 2019
    1. and annotation can tell us why that alternative view matters. With this potential social function, we are reminded that annotation is not neutral as it helps those who add notes to texts produce new discourses and knowledge.

      I wonder how big data overlaid on virtual reality might be helpful to the currently marginalized in the future. Would it be useful to have shared data about businesses and practices that tend to marginalize people further? I recall an African-American comedian recently talking about the Confederate flag in a (Netflix?) comedy special. They indicated that the flag actually had some worthwhile uses and reminisced about driving on rural highways at night looking for a place to stay. When they saw that flag flying over a motel, they knew to keep driving and stay at another hotel further down the road. In this case, the flag over the motel not-so-subtly annotated the establishment itself.

      I perceive a lot of social slights and institutionalized racism as being of a marginal sort: designed to be bothersome to some while going wholly unnoticed by others. What if it were possible to aggregate the data on a broader basis to bring these sorts of marginal harms to the forefront for society to see? As an example, consider big companies doing marginal harm to a community's environment over time, generally unnoticed until the company has long since divested and/or disappeared. It's hard to sue them for damages decades later, but if one could aggregate and surface those annotated data up front, the harms could be stopped before they got started.

      As a more concrete example, the Trump Management Corporation was hit with a consent decree in the early 1970s for prejudicial practices against people of color; subpoenaed evidence showed that applications from people of color were annotated with a big "C" on them. Now imagine that all the individuals who had made those applications had shared some of their basic data into a pool that future applicants could access and analyze; perhaps the Trumps would then have been caught far earlier. Individuals couldn't easily prove discrimination because of its marginal nature, but data in aggregate could have potentially saved the bigger group (see the sketch below).
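
      A rough sketch of the aggregation idea in the notes above. Everything here is invented for illustration: the records, the field names, and the disparity threshold; pooled real-world data would of course need far more care (sample sizes, consent, confounders).

      ```python
      # Pool individual application outcomes and flag landlords whose acceptance
      # rates differ sharply between applicant groups. All data are hypothetical.
      from collections import defaultdict

      applications = [
          # (landlord, applicant_group, accepted)
          ("Acme Mgmt", "white", True), ("Acme Mgmt", "white", True),
          ("Acme Mgmt", "white", True), ("Acme Mgmt", "black", False),
          ("Acme Mgmt", "black", False), ("Acme Mgmt", "black", True),
          ("Oak Realty", "white", True), ("Oak Realty", "white", False),
          ("Oak Realty", "black", True), ("Oak Realty", "black", False),
      ]

      # Tally accepted / total applications per landlord and group.
      tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))
      for landlord, group, accepted in applications:
          tallies[landlord][group][0] += int(accepted)
          tallies[landlord][group][1] += 1

      # Flag landlords whose per-group acceptance rates diverge widely.
      THRESHOLD = 0.4                     # arbitrary illustrative cut-off
      for landlord, groups in tallies.items():
          rates = {g: accepted / total for g, (accepted, total) in groups.items()}
          if max(rates.values()) - min(rates.values()) > THRESHOLD:
              print(f"{landlord}: possible disparity {rates}")
      ```

      No single rejected applicant in a pool like this could prove anything on their own, which is exactly the point of the note: the pattern only becomes visible in aggregate.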

  5. Jun 2019
    1. CBS SEX LAWSUIT GETS CLASS ACTION 

      An example of a 1996 lawsuit involving discrimination in the workforce and the "boys will be boys" attitude. Women are not staying silent; the years behind this fight should lead to more open doors and acknowledgment from a wider audience.

  6. Mar 2019
    1. crises of discrimination, particularly around such identity-based facets as gender, race and ethnicity

      crises of discrimination in open projects

  7. Feb 2019
    1. Nearly half of FBI rap sheets failed to include information on the outcome of a case after an arrest—for example, whether a charge was dismissed or otherwise disposed of without a conviction, or if a record was expunged

      This explains my personal experience here: https://hyp.is/EIfMfivUEem7SFcAiWxUpA/epic.org/privacy/global_entry/default.html (Why someone who had Global Entry was flagged for a police incident before he applied for Global Entry).

    2. Applicants also agree to have their fingerprints entered into DHS’ Automatic Biometric Identification System (IDENT) “for recurrent immigration, law enforcement, and intelligence checks, including checks against latent prints associated with unsolved crimes.”

      "Intelligence checks" is very concerning here, as it suggests pretty much what has already been leaked: that the US is running complex autonomous screening of all of this data all the time. This also opens up the possibility of discriminatory algorithms, since most of these checks are probably rooted in machine learning techniques, and the criminal justice system in the US today tends to be fairly biased against certain groups of people to begin with.

    3. It cited research, including some authored by the FBI, indicating that “some of the biometrics at the core of NGI, like facial recognition, may misidentify African Americans, young people, and women at higher rates than whites, older people, and men, respectively.

      This reaffirms the previous annotation: the training data for the intelligence checks the US runs on Global Entry data is biased against certain groups of people (a rough sketch of how such a disparity can be measured follows below).
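
      A sketch of how that kind of disparity is usually measured: compute the false-match rate separately for each demographic group and compare. The records below are invented; real evaluations use labelled benchmark pairs.

      ```python
      # Per-group false-match rate check. A "false match" is a pair the system
      # declares a match even though the photos are of different people.
      from collections import defaultdict

      # (demographic_group, system_said_match, actually_same_person) -- invented
      results = [
          ("group_a", True, False), ("group_a", True, True), ("group_a", False, False),
          ("group_a", True, False), ("group_b", True, True), ("group_b", False, False),
          ("group_b", True, False), ("group_b", False, False),
      ]

      false_matches = defaultdict(int)
      non_match_pairs = defaultdict(int)   # ground-truth different-person pairs

      for group, predicted_match, same_person in results:
          if not same_person:
              non_match_pairs[group] += 1
              if predicted_match:
                  false_matches[group] += 1

      # A higher rate for one group means members of that group are misidentified
      # more often -- the disparity the FBI-cited research describes.
      for group, total in non_match_pairs.items():
          print(f"{group}: false-match rate {false_matches[group] / total:.0%}")
      ```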

    1. In 1863, her medical credentials were finally accepted, so she moved to Tennessee, where she was appointed as a War Department surgeon

      The phrasing of this appears to be somewhat biased. It sounds as if her credentials weren't up to snuff, but really the military was short of surgeons at the time and simply didn't want a woman. https://hyp.is/vAWzXCtjEem5j1tLLCQ8dg/cfmedicine.nlm.nih.gov/physicians/biography_325.html

    2. Because of her credentials, she didn't want to be a nurse, either, so she chose to volunteer for the Union Army.

      This is somewhat conflicting information. According to https://hyp.is/vAWzXCtjEem5j1tLLCQ8dg/cfmedicine.nlm.nih.gov/physicians/biography_325.html she did work as a nurse; she just wasn't paid.

    1. in 1863 she was briefly appointed surgeon in an Ohio Regiment.

      She was finally appointed a surgeon near the end of the war.

    2. At the outbreak of the Civil War, she volunteered in Washington to join the Union effort, and worked as a nurse in a temporary hospital set up in the capital.

      She worked as an unpaid nurse because she was not allowed to join as a surgeon in the US military.

  8. Jan 2019
    1. Of course men haven't been discriminated against as much as women in the workplace. Men are "meant" to do jobs in STEM, while women aren't seen in STEM fields nearly as much. Women deserve to be recognized in anything as much as men are; they're just as good.

  9. Oct 2018
  10. Feb 2017
    1. The simple truth is that

      What follows is a good description of structural "everyday" inequalities that are not intentional but are still examples of racism and sexism.

  11. Jan 2017
    1. camps served as the special prisons of the secret police

      The camps were used to imprison Jews because the Nazis were against their beliefs.

  12. Nov 2016
    1. 19 May 2016. Republicans defeated an amendment by Rep. Sean Maloney (D-NY) aimed at upholding an executive order that bars discrimination against LGBT employees by federal contractors. Seven Republicans switched their votes under pressure from House leaders. Final vote: 213-212.

  13. Oct 2016
    1. Facebook is allowing advertisers to exclude users based on race.

      The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American, or Hispanic people.

      When we showed Facebook’s racial exclusion options to a prominent civil rights lawyer John Relman, he gasped and said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.”

  14. Nov 2015
    1. "Our nation's leaders need to speak out against this type of anti-Muslim hate. The American Muslim community is a small minority and we by ourselves, we can't push back against the tide of anti-Muslim sentiment," said Hooper. "What we're seeing is the end result of the mainstreaming of Islamophobia by leading public officials, such as Ben Carson and Donald Trump.

      Incidents of discrimination and harassment against Muslims have increased since the attacks in Paris on November 13th. This increase has received little or no attention in the mainstream press.

      talkingpointsmemo.com "The leader of a group of armed anti-Muslim protesters in Texas posted the addresses of dozens of local Muslims and 'Muslim sympathizer(s)' to Facebook on Tuesday."