- Jul 2019
-
eliterate.us
-
we first need to cultivate an academic ecosystem that can make proper use of better tools
This is putting the tool before the horse, if you'll let me mix two metaphors Michael uses. Could the academic ecosystem be renewed in such a way that the existing product category we are trying to justify here would be unnecessary?
-
One could also imagine colleges and universities reorganizing themselves and learning new skills to become better at the sort of cross-functional cooperation for serving students.
Yes, let's imagine (and actually work toward) this, rather than new kinds of vendor relationships.
-
They believed that students might learn the skill from the product.
Perhaps students could better learn skills from another human.
-
Academically at risk students often are not good at knowing when they need help and they are not good at knowing how to get it.
In the back of my mind is how some students at one community college I know of that deployed Starfish called it "Star Snitch" because they always felt it was ratting them out, not helping them.
-
"We're getting really good at predicting which students are likely to fail, but we're not getting much better at preventing them from failing."
Predicting is a product category, but preventing is not (unless you count the EDUs themselves, which is probably where the responsibility for prevention should be located).
-
each installation of the product would require a significant services component, which would raise the cost and make these systems less affordable to the access-oriented institutions that need them the most
Again, the commerciality of their social deployment would make it more difficult for them to tailor for success OR address ethical concerns.
-
You can lead a horse to water and all that.
This is where the social deployment of tech/practices can start to take over. First, commercial interests are now in charge of surfacing the ethical dilemma, which may not be in their best commercial interests to prioritize. Second, their customers have outsourced concerns and so are less interested/committed/resourced to address them.
-
the student data privacy challenge
Seems that this first challenge exists whether humans or machines are watching/intervening, even if machines might provide different options (e.g., being able to watch private data and surface it without revealing it).
-
This is not a "Facebook" problem.
I'm not sure what a "Facebook" problem is. It seems to me that many of the same ethical questions arise in the social media realm — or at least related questions, equally important.
-
Already, we are in a bit of an ethical rabbit hole here.
Indeed. And it should be of no surprise because the social deployment of a technology/practice pretty much always leads to ethical considerations.
-
The people who are authorized to see the data
When data exists, it is also often seen by unauthorized people.
-
One way to think about a way that this sensitive information could be handled is like a credit score.
This is a warning sign to me. Do we really want to use credit scores as a model for ANY activity in education?
-
These are basically four of the same very generic criteria that any instructor would look at to determine whether a student is starting to get in trouble. The system is just more objective and vigilant in applying these criteria than instructors can be at times, particularly in large classes (which is likely to be the norm for many first-year students).
This is key: the machines aren't really doing anything better than humans, but they do make it easier to scale. Are there studies that compare student success in larger classes assisted by machines with smaller classes with lower teacher-to-student ratios? Would human participation and intervention be better than machines? Would it be cheaper or more expensive? What other effects, desirable or not, would the machine and/or human models have?
-
a good case study in why a tool that could be tremendously useful in supporting students who need help the most often fails to live up to either its educational or commercial potential
From the start, the educational and commercial fates of predictive analytics have been intertwined.
-
we can make recommendations to the teacher or student on things they can do to increase their chances of success.
For me, a first question about how such information would actually be used is whether it might be used just as often to deter students from entering specific classrooms (or even institutions) as to help them succeed once there.
Behind any technology/practice being deployed in EDU today are the larger circumstances of trying to scale, optimize, and often privatize education. There are strong winds blowing tools that could be used to help in other directions entirely.
-