17 Matching Annotations
  1. Aug 2019
    1. and annotation can tell us why that alternative view matters. With this potential social function, we are reminded that annotation is not neutral as it helps those who add notes to texts produce new discourses and knowledge.

      I wonder how big data overlaid on virtual reality might better serve the currently marginalized in the future. Would it be useful to have shared data about businesses and practices that tend to marginalize people further? I recall an African-American comedian recently talking about the Confederate flag in a (Netflix?) comedy special. They indicated that the flag actually had a worthwhile use and reminisced about driving on rural highways at night looking for a place to stay: when they saw that flag flying over a motel, they knew to keep driving and stay at another motel further down the road. In this case, the flag over the motel not-so-subtly annotated the establishment itself.

      I perceive a lot of social slights and institutionalized racism as being of a marginal sort: designed to be bothersome to some while going wholly unnoticed by others. What if it were possible to aggregate the data on a broader basis and bring these marginal harms to the forefront for society to see? As an example, consider big companies doing marginal harm to a community's environment over time, harm that goes generally unnoticed until the company has long since divested and/or disappeared. It's hard to sue them for damages decades later, but if one could aggregate and annotate those harms as they happen and surface the data up front, then they could be stopped before they got started.

      As a more concrete example, the Trump Management Corporation was hit with a consent decree in the early 1970s for prejudicial practices against people of color; subpoenaed evidence showed that applications from people of color were annotated with a big "C" on them. Now consider if all the individuals who had submitted those applications had shared some of their basic data into a pool that future applicants could access and analyze; perhaps the Trumps would have been caught far earlier. Individuals couldn't easily prove discrimination because of its marginal nature, but data in aggregate could have potentially saved the bigger group.

  2. Jun 2019
    1. CBS SEX LAWSUIT GETS CLASS ACTION 

      An example of a 1996 lawsuit involving discrimination in the workforce. "Boys will be boys." Women are not staying silent; the years behind this fight should lead to more open doors and broader public acknowledgment.

  3. Mar 2019
    1. crises of discrimination, particularly around such identity-based facets as gender, race and ethnicity

      crises of discrimination in open projects

  4. Feb 2019
    1. Nearly half of FBI rap sheets failed to include information on the outcome of a case after an arrest—for example, whether a charge was dismissed or otherwise disposed of without a conviction, or if a record was expunged

      This explains my personal experience here: https://hyp.is/EIfMfivUEem7SFcAiWxUpA/epic.org/privacy/global_entry/default.html (Why someone who had Global Entry was flagged for a police incident before he applied for Global Entry).

    2. Applicants also agree to have their fingerprints entered into DHS’ Automatic Biometric Identification System (IDENT) “for recurrent immigration, law enforcement, and intelligence checks, including checks against latent prints associated with unsolved crimes.

      "Intelligence checks" is very concerning here, as it suggests pretty much what has already been leaked: that the US is running complex autonomous screening of all of this data all the time. This also opens up the possibility of discriminatory algorithms, since most of these are probably rooted in machine learning techniques and the criminal justice system in the US today tends to be biased against certain groups of people to begin with.

    3. It cited research, including some authored by the FBI, indicating that “some of the biometrics at the core of NGI, like facial recognition, may misidentify African Americans, young people, and women at higher rates than whites, older people, and men, respectively.

      This reaffirms the previous annotation: the training data for the intelligence checks the US runs on Global Entry data is biased against certain groups of people.

    1. In 1863, her medical credentials were finally accepted, so she moved to Tennessee, where she was appointed as a War Department surgeon

      The phrasing of this appears somewhat biased. It sounds as though her credentials weren't up to snuff, but in reality the military was short on surgeons at that time and simply didn't want a woman. https://hyp.is/vAWzXCtjEem5j1tLLCQ8dg/cfmedicine.nlm.nih.gov/physicians/biography_325.html

    2. Because of her credentials, she didn't want to be a nurse, either, so she chose to volunteer for the Union Army.

      This is somewhat conflicting information. According to https://hyp.is/vAWzXCtjEem5j1tLLCQ8dg/cfmedicine.nlm.nih.gov/physicians/biography_325.html she did work as a nurse; she just wasn't paid.

    1. in 1863 she was briefly appointed surgeon in an Ohio Regiment.

      She finally was appointed a surgeon near the end of the war.

    2. At the outbreak of the Civil War, she volunteered in Washington to join the Union effort, and worked as a nurse in a temporary hospital set up in the capital.

      She worked as an unpaid nurse because she was not allowed to join as a surgeon in the US military.

  5. Jan 2019
    1. Of course men haven't been discriminated against as much as women in the workplace. Men are "meant" to do jobs in STEM, while women aren't really seen in STEM programs as much. Women deserve to be recognized in anything as much as men are; they're just as good.

  6. Oct 2018
  7. Feb 2017
    1. The simple truth is that

      What follows is a good description of structural "everyday" inequalities that are not intentional but are still examples of racism and sexism.

  8. Jan 2017
    1. camps served as the special prisons of the secret police

      The camps were used to imprison Jews because the Nazis were against their beliefs.

  9. Nov 2016
    1. 19 May 2016. Republicans defeated an amendment by Rep. Sean Maloney (D-NY) aimed at upholding an executive order that bars discrimination against LGBT employees by federal contractors. Seven Republicans switched their votes under pressure from House leaders. Final vote: 213-212.

  10. Oct 2016
    1. Facebook is allowing advertisers to exclude users based on race.

      The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American, or Hispanic people.

      When we showed Facebook’s racial exclusion options to a prominent civil rights lawyer John Relman, he gasped and said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.”

  11. Nov 2015
    1. "Our nation's leaders need to speak out against this type of anti-Muslim hate. The American Muslim community is a small minority and we by ourselves, we can't push back against the tide of anti-Muslim sentiment," said Hooper. "What we're seeing is the end result of the mainstreaming of Islamophobia by leading public officials, such as Ben Carson and Donald Trump.

      Incidents of discrimination and harassment against Muslims have increased since the attacks in Paris on November 13th, yet they have received little or no attention in the mainstream press.

      talkingpointsmemo.com "The leader of a group of armed anti-Muslim protesters in Texas posted the addresses of dozens of local Muslims and 'Muslim sympathizer(s)' to Facebook on Tuesday."