58 Matching Annotations
  1. Last 7 days
    1. Bellenson says that there are much simpler ways of discovering a person’s sexual preference, such as looking at their social-media accounts. “The idea that a government would need a DNA test to figure out if someone is gay is ridiculous,” he says.

      This is wild, but it shows the lengths people will go to in order to use data in such a negative way.

    2. “We are still using these algorithms called humans that are really biased,” says Ghani. “We’ve tested them and known that they’re horrible, but we still use them to make really important decisions every day.” …these unregulated tools can harm individuals and society, causing anxiety, unnecessary medical expenses, stigmatization and worse. “It’s the Wild West of genetics,” says Erin Demo, a genetic counsellor at Sibley Heart Center Cardiology in Atlanta, Georgia. “This is just going to get harder and harder.” Bellenson posted his app on GenePlaza, an online marketplace for DNA-interpretation tools, in early October. For US$5.50, a person could upload their genetic data — as supplied by consumer DNA sequencing companies such as 23andMe of Mountain View, California — and the app would place them along a (Nature | Vol 574 | 31 October 2019 | 609. ©2019 Springer Nature Limited. All rights reserved.)

      Scary

    3. Developers should routinely run tests such as those performed by Obermeyer’s group before they deploy an algorithm that affects human lives, says Rayid Ghani, a computer scientist at Carnegie Mellon University in Pittsburgh, Pennsylvania.

      I think this is a good direction toward finding a solution. But how often is this actually done?

    4. Developers should routinely run tests such as those performed by Obermeyer’s group before they deploy an algorithm that affects human lives, says Rayid Ghani.

      I think this is a good direction toward making sure there is some sort of regulation of these algorithms.

    5. had to be sicker than white people before being referred for additional help. Only 17.7% of patients that the algorithm assigned to receive extra care were black. The researchers calculate that the proportion would have been 46.5% if the algorithm was unbiased.

      This is all so sad.

    6. The scientists speculate that this reduced access to care is due to the effects of systemic racism, ranging from distrust of the health-care system to direct racial discrimination by health-care providers

      So even when the data is there to suggest who needs priority care, they choose to look the other way....

    7. The researchers found that the algorithm assigned risk scores to patients on the basis of total health-care costs accrued in one year. They say that this assumption might have seemed reasonable because higher health-care costs are generally associated with greater health needs. The average black person in the data set that the scientists used had similar health-care costs to the average white person

      This is a good example and breakdown of the algorithm
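      The passage pins down the mechanism: past cost is used as a proxy for health need. As a minimal sketch of that failure mode (all groups and numbers below are invented for illustration, not the study's data), a cost-based cutoff under-selects a group whose costs run lower at the same level of illness:

```python
# Hypothetical patients: (group, chronic_conditions, annual_cost_usd).
# Group "B" accrues lower costs than group "A" at the same illness level,
# mirroring the proxy problem the annotated study describes.
patients = [
    ("A", 5, 12000),
    ("A", 3, 9000),
    ("B", 5, 7000),   # as sick as the first patient, but cheaper on paper
    ("B", 4, 6500),
    ("A", 1, 3000),
    ("B", 1, 2000),
]

# Refer the top 3 by the proxy (cost) versus the top 3 by actual need.
by_cost = sorted(patients, key=lambda p: p[2], reverse=True)[:3]
by_need = sorted(patients, key=lambda p: p[1], reverse=True)[:3]

share_b_cost = sum(p[0] == "B" for p in by_cost) / 3
share_b_need = sum(p[0] == "B" for p in by_need) / 3
print(f"B share via cost proxy: {share_b_cost:.0%}")   # 33%
print(f"B share via true need:  {share_b_need:.0%}")   # 67%
```

      The gap between the two shares has the same shape as the study's 17.7% versus 46.5% finding: the proxy variable, not any explicit race input, produces the disparity.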

    8. stumbled across the problem while examining the impact of programmes that provide additional resources and closer medical supervision for people with multiple, sometimes overlapping, health problems.

      Sad that they had to stumble across this data.

    9. But smaller studies and anecdotal reports have documented unfair and biased decision-making by algorithms used in everything from criminal justice to education and health care

      I also think that it's important to have more representation in our healthcare systems to bring more of this information to light.

    10. Hospitals and insurers use the algorithm and others like it to help to manage care for about 200 million people in the United States each year.

      I had no idea, but it's interesting to see how algorithms are used in everyday decisions about basic rights such as healthcare.

    11. The study, published in Science on 24 October, concluded that the algorithm was less likely to refer black people than white people who were equally sick to programmes that aim to improve care for patients with complex medical needs.

      Again, what efforts are being made to create some form of regulation for these biased algorithms? Data is being used to prevent certain groups from accessing proper healthcare, and surely this kind of data manipulation happens in other forms as well.

    12. An algorithm widely used in US hospitals to allocate health care to patients has been systematically discriminating against black people, a sweeping analysis has found.

      Very manipulative

    1. And here’s one more thing about algorithms: they can leap from one field to the next, and they often do. Research in epidemiology can hold insights for box office predictions; spam filters are being retooled to identify the AIDS virus. This is true of WMDs as well. So if mathematical models in prisons appear to succeed at their job—which really boils down to efficient management of people—they could spread into the rest of the economy along with the other WMDs, leaving us as collateral damage.

      Could lead to serious damage in our economy too.

    2. It’s that so many suffer. These models, powered by algorithms, slam doors in the face of millions of people, often for the flimsiest of reasons, and offer no appeal. They’re unfair.

      This is why there is work to be done.

    3. The scoring algorithm is hidden. A couple of the other WMDs might not seem to satisfy the prerequisite for scale. They’re not huge, at least not yet.

      Good point.

    4. So to sum up, these are the three elements of a WMD: Opacity, Scale, and Damage. All of them will be present, to one degree or another, in the examples we’ll be covering.

      Good way to understand the tone of the rest of the readings and chapters.

    5. The penal system is teeming with data, especially since convicts enjoy even fewer privacy rights than the rest of us. What’s more, the system is so miserable, overcrowded, inefficient, expensive, and inhumane that it’s crying out for improvements. Who wouldn’t want a cheap solution like this?

      Seems like there always needs to be some type of regulation, though.

    6. Here, the LSI–R again easily qualifies as a WMD. The people putting it together in the 1990s no doubt saw it as a tool to bring evenhandedness and efficiency to the criminal justice system.

      More of these models are needed to bring evenhandedness....

    7. like Google, Amazon, and Facebook, these precisely tailored algorithms alone are worth hundreds of billions of dollars.

      Something that I never thought about...but makes sense.

    8. And yet many companies go out of their way to hide the results of their models or even their existence. One common justification is that the algorithm constitutes a “secret sauce” crucial to their business.

      I think as time passes and technology becomes more advanced and accessible, this type of data won't be easy to hide. Or at least I hope.

    9. And they keep quiet about the purpose of the LSI–R questionnaire. Otherwise, they know, many prisoners will attempt to game it, providing answers to make them look like model citizens the day they leave the joint.

      So messed up. Using this type of data and justifying it is a form of subconscious racism.

    10. the hypothetical family meal model. If my kids were to question the assumptions that underlie it, whether economic or dietary, I’d be all too happy to provide them.

      I think in this model it is important to know your data and be able to back it up.

    11. They are transparent and continuously updated, with both the assumptions and the conclusions clear for all to see.

      The most important aspect: continuously updated.

    12. The questionnaire does avoid asking about race, which is illegal. But with the wealth of detail each prisoner provides, that single

      Type of data that is not necessarily manipulated, but created due to the inevitable surroundings of historically marginalized youth.

    13. recidivism models. These help judges assess the danger posed by each convict. And by many measures they’re an improvement. They keep sentences more consistent and less likely to be swayed by the moods and biases of judges.

      A good way to stabilize the model

    14. That pattern isn’t unique to Texas. According to the American Civil Liberties Union, sentences imposed on black men in the federal system are nearly 20 percent longer than those for whites convicted of similar crimes. And though they make up only 13 percent of the population, blacks fill up 40 percent of America’s prison cells. So you might think that computerized

      Again, data that is not spoken about by those who seem to act on it.

    15. In a press release, he declared: “It is inappropriate to allow race to be considered as a factor in our criminal justice system.... The people of Texas want and deserve a system that affords the same fairness to everyone.”

      Absolutely agree. This is just one example of how those in power manipulate data (or their understanding of it) to further perpetuate racism.

    16. is powered by haphazard data gathering and spurious correlations, reinforced by institutional inequities, and polluted by confirmation bias.

      Beautifully said

    17. Whether it comes from experience or hearsay, the data indicates that certain types of people have behaved badly. That generates a binary prediction that all people of that race will behave that same way.

      Wow, interesting perspective.

    18. It takes some time to gather new data about the child and adjust their models.

      Important to keep up with data over time. Times change, data changes, yet some policies rely on old data and lack the new knowledge needed to make necessary changes.

    19. This is true of internal models as well. You can often see troubles when grandparents visit a grandchild they haven’t seen for a while.

      Great example. Makes sense.

    20. Preferences would count for little or nothing. By contrast, if my kids were creating the model, success might feature ice cream at every meal.

      Definitely depends on who is creating the model.

    21. evaluates teachers largely on the basis of students’ test scores, while ignoring how much the teachers engage the students, work on specific skills, deal with classroom management, or help students with personal and family problems. It’s overly simple, sacrificing accuracy and insight for efficiency.

      I see this and think of the harm done by following this model. Just like teachers' success can't be measured by numbers, neither can students. It completely eliminates the importance of qualitative data.

    22. No model can include all of the real world’s complexity or the nuance of human communication. Inevitably, some important information gets left out.

      So why use it?

    23. I would also include parameters, or constraints. I might limit the fruits and vegetables to what’s in season and dole out a certain amount of Pop-Tarts, but only enough to forestall an open rebellion. I also would add a number of rules. This one likes meat, this one likes bread and pasta, this one drinks lots of milk and insists on spreading Nutella on everything in sight.

      Ridiculous and tiresome to even go through all these options. A little glimpse into her mind as a mathematician.

    24. The better solution would be to train the model over time, entering data every day on what I’d bought and cooked and noting the responses of each family member.

      This is how we can get as close to the ideal model as possible.
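      The "train the model over time" idea in the quote can be sketched as a simple online update: log each meal and each person's response, and keep a running average preference per (person, food). Everything here (names, foods, ratings) is a hypothetical illustration, not from the book:

```python
from collections import defaultdict

scores = defaultdict(float)  # (person, food) -> running average rating
counts = defaultdict(int)    # (person, food) -> number of observations

def record(person, food, rating):
    """Fold one day's observed response into the running average."""
    key = (person, food)
    counts[key] += 1
    scores[key] += (rating - scores[key]) / counts[key]

# A few days of logged responses, rated 0-5.
for person, food, rating in [
    ("kid1", "pasta", 5), ("kid1", "broccoli", 1),
    ("kid1", "pasta", 4), ("kid2", "broccoli", 3),
]:
    record(person, food, rating)

print(scores[("kid1", "pasta")])  # 4.5, the average of 5 and 4
```

      Each new day's data nudges the estimate, which is exactly why a continuously updated model beats a one-shot snapshot.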

    25. n for me? Or what if my friend who has kids wants to know my methods? That’s when I’d start to formalize my model, making it much more systematic and, in some sense, mathematical. And if I were feeling ambitious, I might put it into a computer program. Ideally, the program would include all of the available food options, their nutritional value and cost, and a complete database of my family’s

      I think it's important to do all this, but it may not turn out to be the ideal "model."

    26. Moreover, their data is highly relevant to the outcomes they are trying to predict. This may sound obvious, but as we’ll see throughout this book, the folks building WMDs routinely lack data for the behaviors they’re most interested in.

      Can the same be said for any sport? I think of the times when underdogs win. I watch a lot of volleyball and never thought A&M would win the national championship, but their game got increasingly better throughout the season. Were their player statistics reaching the level of a #1 team?

    27. Shifting defenses is only one piece of a much larger question: What steps can baseball teams take to maximize the probability that they’ll win? In their hunt for answers, baseball statisticians have scrutinized every variable they can quantify and attached it to a value.

      Makes sense why there are baseball statisticians...can't remember the movie where stats were heavily used in baseball.....

    28. In other words, he was thinking like a data scientist. He had analyzed crude data, most of it observational: Ted Williams usually hit the ball to right field. Then he adjusted.

      Sports data amazes me, and I see why coaches make so much. Not only do they have to be experts in the game, but they also have to constantly monitor trends on game days and make critical decisions, especially during high-stakes games.

    29. But a math teacher named Sarah Bax continued to push the district administrator, a former colleague named Jason Kamras, for details. After a back-and-forth that extended for months, Kamras told her to wait for an upcoming technical report. Bax responded: “How do you justify evaluating people by a measure for which you are unable to provide explanation?”

      It shouldn't be this hard to seek an explanation. How are decision makers able to back data when they can't explain or justify it?

    30. This underscores another common feature of WMDs. They tend to punish the poor. This is, in part, because they are engineered to evaluate large numbers of people.

      Hard to read, but true.

    31. Attempting to calculate the impact that one person may have on another over the course of a school year is much more complex. “There are so many factors that go into learning and teaching that it would be very difficult to measure them all,”

      100% agree...teachers shouldn't be punished for scoring low on what seems to be an unreliable algorithm. It's like saying there's so much more value in quantitative data than in qualitative data.

    32. The going theory was that the students weren’t learning enough because their teachers weren’t doing a good job. So in 2009, Rhee implemented a plan to weed out the low-performing teachers. This is the trend in troubled school districts around the country, and from a systems engineering perspective the thinking makes perfect sense: Evaluate the teachers. Get rid of the worst ones, and place the best ones where they can do the most good. In the language of data scientists, this “optimizes” the school system, presumably ensuring better results for the kids. Except for “bad” teachers, who could argue with that?

      WILD! And perfectly explains the term Weapons of Math Destruction

    33. And they tended to punish the poor and the oppressed in our society, while making the rich richer.

      Funny how this is still true to this day. How can it stop when the benefits have been so lucrative for those in power?

    34. And increasingly they focused not on the movements of global financial markets but on human beings, on us. Mathematicians and statisticians were studying our desires, movements, and spending power. They were predicting our trustworthiness and calculating our potential as students, workers, lovers, criminals.

      I think this clearly explains how math is used to influence individuals into bad economic decisions.

    35. The housing crisis, the collapse of major financial institutions, the rise of unemployment—all had been aided and abetted by mathematicians wielding magic formulas. What’s more, thanks to the extraordinary powers that I loved so much, math was able to combine with technology to multiply the chaos and misfortune, adding efficiency and scale to systems that I now recognized as flawed.

      I can't imagine what it was like working during this time, especially in her field.