3 Matching Annotations
  1. Oct 2017
    1. The company that markets Compas says its formula is a trade secret. “The key to our product is the algorithms, and they’re proprietary,” one of its executives said last year. “We’ve created them, and we don’t release them because it’s certainly a core piece of our business.”

      Again, this goes back to the same issue of protecting algorithms for business purposes, but that also has implications for legal settings. In the near future, I predict there will be a better, more legitimate way to validate what these algorithms look at from a third-party point of view.

    2. The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”

      This is a fascinating way to flag potentially high-risk individuals. As I mentioned in a previous article annotation, I think it has potential to warn us about people who could be trouble, but to target someone or accuse them without evidence is unfair and unjust.

    3. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.

      The fact that these algorithms are secret makes this particular example a little sketchy. I do think there's a place for this kind of technology in the judicial system, but when there's no way to check or challenge how the algorithm works, it's fair for people to question its validity. I understand why the company wouldn't simply expose the algorithm, but some way of verifying that it's legitimate would make sense.