193 Matching Annotations
  1. Last 7 days
  2. Sep 2020
    1. Consider, for instance, the footage that has been circulating from a New York City Council hearing, held over Zoom in June, which shows Krug in her Afro-Latinx pose. She introduces herself as Jess La Bombalera, a nickname apparently of her own making, adapted from Bomba, an Afro-Puerto Rican genre of music and dance. Broadcasting live from “El Barrio,” and wearing purple-tinted shades and a hoop in her nose, she lambasts gentrifiers, shouts out her “black and brown siblings,” and twice calls out “white New Yorkers” for not yielding their speaking time. What stands out, though, is the way Krug speaks, in a patchy accent that begins with thickly rolled “R”s and transitions into what can best be described as B-movie gangster. This is where desire outruns expertise. The Times, in a piece on Krug’s exposure, last week, nonetheless called this a “Latina accent,” lending credence to Krug’s performance. (The phrase was later deleted.) The offhand notation is a tiny example of the buy-in Krug has been afforded her entire scholastic career, by advisers and committee members and editors and colleagues. They failed to recognize the gap not between real and faux, so much, as between something thrown-on and something lived-in. That inattentiveness was Krug’s escape hatch.

      If nothing else, this is indicative of human cognitive bias. We tend to take at face value what is presented to us, but once we "know," our confirmation bias kicks in in the other direction.

      I'm curious whether anyone called out her accent contemporaneously. We're also stuck with the bias of wanting to go along with the majority view: when you're the lone voice, you're less likely to speak up. This is also evinced in the story of her former colleagues who had "gut feelings" that something was wrong, but didn't say anything or do any research at the time.

    1. Stuart Ritchie [@StuartJRitchie] (2020) This encapsulates the problem nicely. Sure, there’s a paper. But actually read it & what do you find? p-values mostly juuuust under .05 (a red flag) and a sample size that’s FAR less than “25m”. If you think this is in any way compelling evidence, you’ve totally been sold a pup. Twitter. Retrieved from: https://twitter.com/StuartJRitchie/status/1305963050302877697
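      A quick way to see why a pile of p-values just under .05 is a red flag: under optional stopping (peeking at the data and stopping the moment p < .05), a true-null effect still produces "significant" results far more often than the nominal 5%, and the reported p-values pile up near the threshold. A minimal stdlib-only sketch; all parameter values here are illustrative, not drawn from the paper Ritchie criticizes:

```python
import math
import random

def p_value(sample):
    """Two-sided z-test p-value for H0: mean = 0, assuming known unit variance."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    # Normal CDF via erf: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def p_hacked_study(batch=10, max_n=100, alpha=0.05):
    """Collect true-null N(0, 1) data in batches, testing after each batch,
    and stop as soon as p < alpha (optional stopping).
    Returns the final p-value, or None if the study never 'worked'."""
    sample = []
    while len(sample) < max_n:
        sample.extend(random.gauss(0, 1) for _ in range(batch))
        p = p_value(sample)
        if p < alpha:
            return p
    return None

random.seed(1)
runs = 5000
significant = [p for p in (p_hacked_study() for _ in range(runs)) if p is not None]
near_threshold = sum(0.04 <= p < 0.05 for p in significant) / len(significant)
print("nominal false-positive rate: 5%")
print(f"actual 'significant' rate under optional stopping: {len(significant) / runs:.0%}")
print(f"share of those with p in [0.04, 0.05): {near_threshold:.0%}")
```

      Even though every simulated effect is null, the stop-when-significant rule inflates the false-positive rate well above 5% and concentrates the surviving p-values just below the cutoff, which is exactly the pattern Ritchie is flagging.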

    1. Over time we tend to develop confirmation bias, forever seeking evidence that reinforces what we already believe, and downplaying or dismissing what doesn’t. We’re also designed, both genetically and instinctively, to put our own safety first, and to avoid taking too much risk. Rather than using our capacity for critical thinking to assess new possibilities, we often co-opt our prefrontal cortex to rationalize choices that were actually driven by our emotions.
    1. loss aversion. We are way more scared of losing what we have than excited about getting something new.
    2. This super-sketchy experiment had one final phase, however: reconciliation. After successive scenarios were deployed where the Rattlers and the Eagles had common goals (unblocking a shared water supply, repairing a truck, etc.) they grew closer, even splitting drinks at the end (malts, come on people). In our work, we may not call them Rattlers and Eagles. Instead, we may call them IT and Legal and Marketing. Or “weird-code-name product-team one” versus “weird-code-name product-team two”. But if organizations incentivize based on scarcity and self-interest, we might as well just call it what it is, a scaled version of the Robbers Cave experiment. And to mitigate the siloing and combat ingroup bias, we’ll have to consider following a different approach.

      How can we do this for the Democrats and the Republicans?

    1. WEIRD people have a bad habit of universalizing from their own particularities. They think everyone thinks the way they do, and some of them (not all, of course) reinforce that assumption by studying themselves. In the run-up to writing the book, Henrich and two colleagues did a literature review of experimental psychology and found that 96 percent of subjects enlisted in the research came from northern Europe, North America, or Australia. About 70 percent of those were American undergraduates. Blinded by this kind of myopia, many Westerners assume that what’s good or bad for them is good or bad for everyone else.

      This is a painful reality. It's also even more specific to the current Republican party. Do as we say, not as we do.

      This is the sort of example that David Dylan Thomas will appreciate.

    1. “Motivation conditions cognition,” Jonathan Rauch, a senior fellow at the Brookings Institution and a contributing writer at The Atlantic, wisely told me. Very few Trump supporters I know are able to offer an honest appraisal of the man. To do so creates too much cognitive dissonance.
  3. Aug 2020
    1. A fascinating viewpoint on social media, journalism, and information. There are some great implied questions for web designers hiding in here.

    2. In discussing the appeal of the News Feed in that same interview with Kirkpatrick, Zuckerberg observed, “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” The statement is grotesque not because it’s false — it’s completely true — but because it’s a category error. It yokes together in an obscene comparison two events of radically different scale and import. And yet, in his tone-deaf way, Zuckerberg managed to express the reality of content collapse. When it comes to information, social media renders category errors obsolete.

      How can we minimize this sort of bias? How can we help to increase the importance of truly important things?

    1. The real enemy of independent thinking is not any external authority, but our own inertia. We need to find ways to counteract confirmation bias – our tendency to take into account only information that confirms what we already believe. We need to regularly confront our errors, mistakes, and misunderstandings. 
    1. name-pronunciation effect. And it’s exactly what you would expect. People with names you find easy to pronounce are viewed more favorably than those with names deemed difficult to pronounce, which can lead to promotions, votes, and more.
    2. http://bkaprt.com/dcb/02-33/

      I've been wondering about what I perceive as a dreadful editorial choice in how these footnote links are presented. Given the book's subject, however, I'm now wondering whether this was done consciously, as a blindfold of sorts to prevent bias either for or against the sources.

      Either way, I still wish they were more traditionally presented, or at least that I had the added context about them on their respective pages.

    3. Student evaluations of teachers are notoriously biased against women, with women routinely receiving lower scores than their male counterparts.

      I recall some work on this sort of gender bias in job recommendations as well; remember to dig it up for reference.

    4. The framing effect, which is the bias the above examples exploit, is in my opinion the most dangerous bias in the world.
    5. What if, as in the case of anonymous résumés, the DA had no clue about the race of the accused? For that matter, what if you also removed identifying information on the victim and even the location of the crime? In 2019, the San Francisco DA’s office began anonymous charging, removing potentially biasing information from crime reports DAs use to decide whether or not to bring charges (http://bkaprt.com/dcb/02-30/). It’s too soon to tell the outcome of that experiment but, again, the removal of a decisive element may enhance an experience rather than detract from it.

      Another potential approach: retain the biasing information but statistically adjust the charging decisions to negate the biased effects.

      Separately, how can this be done at the street level to allow policing resources to find and prosecute white collar criminals who may be having a more profoundly deleterious effect on society?

    6. As designers, we’re used to finding clever ways to reveal information to the user. But anonymous résumés challenge us with the notion that sometimes less information can lead to better decisions. We need to find artful ways to conceal information that might be biasing, even if true.
    7. mere-exposure effect, which occurs when you like something simply because you’ve seen it before. What’s remarkable about this effect is that it works even if you don’t remember seeing the thing before!
    8. you voted for Obama AND Hillary, fer cryin’ out loud!)

      There's some implicit statistical bias here because this likely isn't true for about half of the readership, presuming that they're American in the first place, which is another bias...

    9. The stories added meaning that couldn’t be matched by facts and figures about the items for sale. Meaning can be very difficult to pull off in design, but stories create cognitive fluency around meaning. Our minds love narratives because they love patterns; stories are like really well-packaged patterns. Beginning, middle, and end. Connecting that pattern to an object or action in your design can be achieved, in part, by making sure your design accommodates story—whether in the metaphorical sense of how the page is structured (i.e., the page has a clear beginning, middle, and end) or the more literal sense of actually making sure the design leaves room for text that tells a story.

      This can also be leveraged to help improve one's memory.

    10. If the original price is horizontally farther away on the page from the sale price (Fig 2.8), the customer is more likely to view it as a better deal, even if the dollar amounts do not change. We equate visual distance with fiscal distance (http://bkaprt.com/dcb/02-19/).
    11. we aren’t gullible so much as efficient. We tend to believe things that are easy for our minds to process.
    12. By the way, just to get back to notational bias for a sec, the term “dark pattern” is problematic for reasons that should be clear if you think about it for a minute or two so let’s collectively start working on better language for that. Mmmmkay?

      Naming is hard, but it would have been nice to have a suggestion or two of alternates here.

    13. Immune neglect describes another failure of affective forecasting, specifically around predicting our ability to cope with adverse outcomes.
    14. Our memories protect at all costs the idea that we’re good decision-makers.
    15. The interaction has been architected to benefit the grocer. It could just as easily have been architected to benefit the customer by putting the freshest fruit on top.

      There's also another bias going on here. We're biased to buy more when the shelves are heavily stocked, even if a lot of the food will ultimately go bad. So the grocer loses out, because they often sell far less than they stock.

    16. There’s even a bias called the bias blind spot,where you think you’re not biased but you’re sure everybody else is.
    17. Confirmation bias is pretty much what you think it is. You get an idea in your head and you go looking for evidence to confirm that it’s true. If any evidence comes up to challenge it, you cry “Fake news!” and move on with your life.
    18. We call these errors cognitive biases.

      or maybe even heuristics gone bad....

    19. At worst, they might actively harm them.

      Interesting that I've been listening to a series on behavioral economics this week and there've been a few examples of how to use people's cognitive bias against them.

      It can also be helpful for us to know our own biases so we can prevent people from using them against us as well.

    20. This is a book about people. Because design is about people. We design with and for people. The better we understand people, the more effective we’ll be at our jobs. In particular, this book is about the decision-making part of people. That’s where bias comes in.
    1. If a prominent journal like The Lancet is publishing such rubbish, who is to say smaller and less well-financed journals aren’t doing the same on a larger scale?

  4. Jul 2020
  5. Jun 2020
    1. Winman, A., Hansson, P., & Juslin, P. (2004). Subjective Probability Intervals: How to Reduce Overconfidence by Interval Evaluation. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30(6), 1167–1175. https://doi.org/10.1037/0278-7393.30.6.1167

  6. May 2020
  7. Apr 2020
    1. Other languages, German for example, are notorious for very long compounds like this and this, which are made up and written as one word directly. Perhaps the way your native language deals with compounds explains your (or other authors') personal preference and sense of "right"?
    1. From the eponymous Dunning of the Dunning-Kruger effect

      In our work, we ask survey respondents if they are familiar with certain technical concepts from physics, biology, politics, and geography. A fair number claim familiarity with genuine terms like centripetal force and photon. But interestingly, they also claim some familiarity with concepts that are entirely made up, such as the plates of parallax, ultra-lipid, and cholarine. In one study, roughly 90 percent claimed some knowledge of at least one of the nine fictitious concepts we asked them about. In fact, the more well versed respondents considered themselves in a general topic, the more familiarity they claimed with the meaningless terms associated with it in the survey.

      An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge. This clutter is an unfortunate by-product of one of our greatest strengths as a species. We are unbridled pattern recognizers and profligate theorizers. Often, our theories are good enough to get us through the day, or at least to an age when we can procreate. But our genius for creative storytelling, combined with our inability to detect our own ignorance, can sometimes lead to situations that are embarrassing, unfortunate, or downright dangerous—especially in a technologically advanced, complex democratic society that occasionally invests mistaken popular beliefs with immense destructive power (See: crisis, financial; war, Iraq). As the humorist Josh Billings once put it, “It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.”

      The way we traditionally conceive of ignorance—as an absence of knowledge—leads us to think of education as its natural antidote. But education, even when done skillfully, can produce illusory confidence. Here’s a particularly frightful example: Driver’s education courses, particularly those aimed at handling emergency maneuvers, tend to increase, rather than decrease, accident rates. They do so because training people to handle, say, snow and ice leaves them with the lasting impression that they’re permanent experts on the subject. In fact, their skills usually erode rapidly after they leave the course. And so, months or even decades later, they have confidence but little leftover competence when their wheels begin to spin.

      In these Wild West settings, it’s best not to repeat common misbeliefs at all. Telling people that Barack Obama is not a Muslim fails to change many people’s minds, because they frequently remember everything that was said—except for the crucial qualifier “not.” Rather, to successfully eradicate a misbelief requires not only removing the misbelief, but filling the void left behind (“Obama was baptized in 1988 as a member of the United Church of Christ”). If repeating the misbelief is absolutely necessary, researchers have found it helps to provide clear and repeated warnings that the misbelief is false. I repeat, false.

  8. Feb 2020
    1. people assume that by asking someone a question privately, they are doing everyone else a favor by bothering the fewest amount of people.
    2. We encourage team members to consider making private issues public wherever possible so that we can all learn from the experience, rather than requiring a small group to spend effort translating those learnings in the future.
    1. Wrong solutions can be fixed, but non-existent ones aren’t adjustable at all.
    2. It's important that we keep our focus on action, and don't fall into the trap of analysis paralysis or sticking to a slow, quiet path without risk.
    3. Decisions should be thoughtful, but delivering fast results requires the fearless acceptance of occasionally making mistakes; our bias for action also allows us to course correct quickly.
  9. Jan 2020
  10. Nov 2019
    1. It’s not enough to check the stuff that is suspicious: if you apply your investigations selectively, you’ve already lost the battle.

      This made me reflect on biases. If our research methodology is biased from the very beginning, then everything that comes after will also be biased.

  11. Sep 2019
    1. Is a film less good if it is produced by a rapist, a role less expertly performed if performed by a harasser, a routine less funny if an exploitative exhibitionist performs it?

      So important; though it can be a claim biased by the author's opinion.

  12. Jul 2019
  13. Apr 2019
  14. Feb 2019
    1. It’s about the student and his or her feelings and thoughts, though often articulated clumsily and from an as yet unthought through position.

      The advice to separate self from role is good... but let's think about this as a reaction to the student above who says they feel like the instructor doesn't allow equal opportunities to contribute in the class. Sometimes, despite all best efforts, the faculty member may be wrong, and deep listening and learning has to allow for that possibility. Don't take it personally, but model the kind of leadership which recognizes the need for personal change.

    2. For example, when discussing how women’s remarks are often ignored in business settings, the class or the instructor may be ignoring the remarks of women in the class. Seeing this and talking about it in the moment can enhance people’s understanding of the issue.

      True, but how does this interact with the power differential in the classroom? Can students really be expected to productively call out faculty members' biased behavior? It seems like an option not discussed in this paper is finding external facilitators to help navigate some of these issues.

  15. Jan 2019
    1. Likewise, merely telling students that motivated reasoning has an impact on their information processing is apt to yield mixed results because students who view themselves as intelligent, fair-minded people will likely meet this revelation with a level of disconfirmation bias.

      Students and faculty both. Many disciplines are reluctant to introduce critical perspectives on disciplinary publishing too early, feeling that students need grounding in accepted information flows before branching out into active debates.

    1. Of course men haven't been discriminated against as much as women in the workplace. Men are "meant" to do jobs in STEM, while women aren't really seen in STEM programs as much. Women deserve to be recognized in anything as much as men are; they're just as good.

    1. he would extend this to "science" tout court - does not use value-free language, that value-free language does not exist, and that we cannot posit a purely transparent language devoid of distracting ornament, through which we transact business with pure facts.

      This reminds me of an article I read in my Feminist Epistemologies class, "The Egg and the Sperm: How Science Has Constructed a Romance Based on Stereotypical Male-Female Roles," which shook me to my core. It argues that science and culture are intertwined and that they influence and reinforce one another. The scientific descriptions of egg, sperm, reproduction, and ovulation she provides to support her argument show how dangerous the perpetuation of the idea of "value-free" and/or unbiased language can be (and is).

    1. if packages and their elements are essential tools, then it makes a considerable difference that some are more readily available than others. Making sense of the world requires an effort, and those tools that are developed, spotlighted, and made readily accessible have a higher probability of being used

      i.e. the most available and accessible frames are the most likely to influence public opinion

  16. Nov 2018
    1. how does misrepresentative information make it to the top of the search result pile—and what is missing in the current culture of software design and programming that got us here?

      Two core questions in one? As to "how" bad info bubbles to the top of our search results, we know that the algorithms are proprietary—but the humans who design them bring their biases. As to "what is missing," Safiya Noble suggests here and elsewhere that the engineers in Silicon Valley could use a good dose of the humanities and social sciences in their decision-making. Is she right?

    1. Humans participate in social learning for a variety of adaptive reasons, such as reducing uncertainty (Kameda and Nakanishi, 2002), learning complex skills and knowledge that could not have been invented by a single individual alone (Richerson and Boyd, 2000; Tomasello, Kruger, and Ratner, 1993), and passing on beneficial cultural traits to offspring (Palmer, 2010). One proposed social-learning mechanism is prestige bias (Henrich and Gil-White, 2001), defined as the selective copying of certain “prestigious” individuals to whom others freely show deference or respect in order to increase the amount and accuracy of information available to the learner. Prestige bias allows a learner in a novel environment to quickly and inexpensively choose from whom to learn, thus maximizing his or her chances of acquiring adaptive behavioral solutions to a specific task or enterprise.