9 Matching Annotations
  1. Mar 2017
    1. The messages you send should not demoralize students, and dissemination strategies should ensure that students are able to access interventions with relative ease. Craft messages in the right way, and ensure interventions are accessible to target populations.

      This is a nice blanket statement. Would love to see some empirical data on this.

    2. Many consider their models and algorithms proprietary, meaning institutions are not involved in the design process or are deliberately kept out. You should make transparency a key criterion when choosing to work with any vendor.

      Oh, the joys of working with vendors. Q: "What's inside that black box?" A: "Oh, nothing."

    3. It is crucial to address bias in predictive models, ensure the statistical significance of predictions beyond race, ethnicity, and socioeconomic status, and forbid the use of algorithms that produce discriminatory results.

      We should not use predictors that do not help us make accurate interventions. Statistics should not be used to support bias.

    4. Communicate with students, staff, and others whose data are collected about their rights, including the methods used to obtain consent to use the data for predictive analytics and how long the information will be stored.

      Seems to completely skim over the issue of obtaining consent for predictive analytics: "Oh yeah, make sure that you have consent."

    5. The plan should also include a discussion about any possible unintended consequences and steps your institution and its partners (such as third-party vendors) can take to mitigate them.

      Need to create a risk management plan associated with the use of predictive analytics. Talking as an organization about the risks is important; that way we can hold each other accountable for using analytics responsibly.

    6. The user needs to know which dashboard answers what questions, as well as understand the functions within each view and how to ask the right questions in the first place.

      Great argument for data awareness.

    7. “The results made it very clear that successful course completion at the faculty level should be our first dashboard to be released.”

      Dashboard priorities were set by the institution asking the question, "What is the main pain point?"

    8. have helped Pierce College significantly improve its three-year graduation rates from 22 percent to 31 percent among new degree-seeking students and 21 percent to 30 percent among first-generation students.

      Causal statement, but with a qualification.

    9. Chancellor Michele Johnson said the dashboard was met with “such resistance even though we tried to frame it as being committed to student success.” But, she added, “We had to be patient enough to let some people mess around in the data.”

      Great evidence that faculty are initially resistant to analytics data.