- Sep 2023
-
www.chronicle.com
-
Perhaps others are different.
I find that experts in an area can coach an AI to produce expert-level work, but novices in an area cannot. That has big implications for teaching novices, since we must help them develop expertise whether or not they use AI tools along the way. The really interesting question is, where can AI tools help develop that expertise?
-
Writing is the laboratory in which the basic structure of an insight or connection is worked out.
Writing is one such laboratory. I would argue there are others, like conversation. Perhaps even conversation with an AI-powered chatbot.
-
my criteria
"I'll know it when I see it" isn't really a criterion.
-
a student to notice something important in a poem that has never before been noticed in quite that way.
If one really wants a truly novel observation about a piece of writing, one might need to survey everything that has ever been written about that piece of writing, a task that AI might be good at.
If one is more interested in an observation that has never occurred to the current reader, then that's a different matter.
-
or generated — so far as I understand the principles underlying natural language processing — by processing thousands of essays on literature.
There is some debate over what kind of "creativity" generative AI has. I put that term in quotes because it clearly doesn't create like some humans do, but there are aspects of creativity or the creative process where AI can be useful. See, for instance, https://bsky.app/profile/jonippolito.bsky.social/post/3k7bymrziux2e and https://aiandacademia.substack.com/p/ai-for-futures-thinking.
-
What is a striking feature? Something that strikes a person. What is surprising? Something that surprises.
I'm guessing Clune doesn't have a rubric for assessing his students' writing. "I know it when I see it" isn't particularly useful for students who are learning these skills.
-
even after eliminating grade inflation
Again, Clune seems to have a working idea of grade inflation and what it would mean to eliminate it, an idea that he's not articulating well here.
-
Given that turning a mass of data into a coherent order seems eminently automatable
I would agree that some forms of this task are quite automatable, but turning data into actionable intelligence is still pretty complex work. I suspect Clune just isn't fully aware of what this work means in a field other than his own.
-
It seems to me that every discipline will have to engage in their own process of discerning what they want from student writing, and what — in a world where most low-level cognitive tasks may soon be automated — they want to teach students.
This is entirely sensible. Given the affordances of generative AI and its potential roles in a variety of professions, I think it's incumbent on higher education faculty to engage in this kind of discernment.
-
By my second year as a professor, I concluded that such papers have no conceivable educational or intellectual value — for myself, the student, the college, or the world.
And yet he's still getting these papers (a third of all that he's read!) nearly 18 years later! As John Warner pointed out on Twitter, perhaps he should have changed his teaching practices years ago.
-
ChatGPT has transformed the problem of grade inflation — which professors have been moaning about for decades — from a minor corruption to an enterprise-destroying blight.
Clune hasn't defined grade inflation, nor has he made the case why grade inflation means "merely competent" work is getting As and Bs. He seems to be assuming his reader already knows all about grade inflation and why it's a problem.
-
to give up on assigning substantial student papers
Some, however, are calling for more substantial work from students, given that AI can handle the easy stuff. See, for instance, Ted Underwood's excellent piece on this topic: https://tedunderwood.com/2023/07/31/we-can-save-what-matters-about-writing-at-a-price/.
-
The more-common response among instructors, as we begin the first academic year of AI writing’s ubiquity, has been to see in this advance in automated writing the destruction of liberal-arts education.
I've recently been polling faculty during workshops about their stance on AI this fall. There are plenty of "red lights," that is, faculty who are prohibiting students from using generative AI in their courses, but about the same number of "green lights" (allowing all kinds of AI use) and far more "yellow lights" (permitting AI with limitations).
-
if in fact there’s anyone in the world so perverse as to desire such a thing.
This is a key point. If these essays aren't interesting to anyone anywhere, why do we have students write them? And, for that matter, why is it useful that ChatGPT and its peers can write them? Are there perhaps other genres of writing that are both "merely competent" (and thus in ChatGPT's wheelhouse) and also useful?
-
Because I will often give a merely competent essay, in this era of grade inflation, a B.
So does Clune feel some pressure to assign a high grade to work that he clearly sees as inferior? Has the system really boxed him in like this, or is he just experiencing an external locus of control?
-
- Apr 2023
-
lse.ascb.org
-
The authors examined how groups composed of students with different problem-solving styles (i.e., adaptors and innovators) addressed a project.
Is problem-solving style something that can be easily deduced from early-semester surveys? Are students able to assess their own problem-solving styles through survey questions?
-
group interaction predicted student performance more strongly than did either student ability or the overall ability composition of a group.
Upshot: How the groups work together matters more than the group composition, at least along the achievement axis.
-
Specifically, they found that low-achieving students demonstrated stronger outcomes when placed in mixed-ability groups, whereas mid-achieving students demonstrated stronger outcomes when working in homogenous ability groups. No difference was seen for high-achieving students.
Upshot: Don't create groups that consist entirely of low-achieving students.
-
-
www.lifescied.org
-
for simpler tasks that require recall, definitions, or looking up information, students exhibit greater gains when they work on their own
This is interesting... Sometimes, the overhead costs of group work actually inhibit student learning compared to solo work.
-
students-as-producers
For more on the students-as-producers approach, see this book chapter: https://ir.vanderbilt.edu/handle/1803/9446
-
overbearing students
See this blog post for a few strategies for the overbearing student problem: https://derekbruff.org/?p=4047
-
Generally, groups that are gender balanced, are ethnically diverse, and have members with different problem-solving approaches have been shown to exhibit enhanced collaboration
Here's the short answer to "How should I form student groups?" The longer answer is that there is research showing the effects of some decisions, while other choices aren't well researched.
-
tasks should not be able to be completed by just one or two group members, but rather should require contributions from all group members
In short, if the task is too easy or too simple, there's no reason for students to work on it in groups.
-
deeply researched teaching approaches in the college classroom
This paragraph is a nice high-level summary of reasons to use group work.
-
These summaries are organized by teaching challenges, and actionable advice is provided in a checklist for instructors.
Co-author Cynthia Brame is all about problem solving in teaching!
-
- Feb 2023
-
www.lifescied.org
-
However, studies indicate that this approach focusing on depth shows no decrease in performance, but rather an improvement in many cases
This is key! The depth-first approach actually does handle the breadth. Why do you think that is?
-
It is precisely this kind of competency that is difficult to acquire through an exclusively content-coverage approach but can be readily practiced and learned through learner-centered teaching techniques
This is key... If the things we value don't benefit from the coverage approach, there's lots more motivation to move away from that approach.
-
Our experience as developers of teaching skill in faculty indicates that many instructors have difficulty prioritizing, because they believe all of their course material is important.
True! That's why I like the "worth being familiar with" category in the Wiggins & McTighe framework. It provides a place to put low-priority topics without dismissing them entirely.
-
For example, “membrane transport” may be a topic from a textbook, whereas a related core concept might be that a molecule’s movement across a cell membrane is affected by its size, biochemical properties, and the electrochemical gradient.
Thanks for the concrete example! I might categorize this core concept as an "enduring understanding" in the Wiggins & McTighe framework.
-
Once the shift from content coverage to student learning is made, instructors can move to the second step
I find a lot of instructors are already good with step 2, but are struggling with step 1.
-
Primary pedagogy is to focus in depth on a few chosen topics related to core concepts and competencies.
I wonder if this focus on topics and concepts doesn't go far enough. If we focus on competencies (e.g. thinking like a scientist), that might do more to shift away from coverage.
-
Indeed, just because a concept is mentioned or covered in class does not mean students have learned it.
To quote Angelo & Cross, "Teaching without learning is just talking."
-
By advocating for fewer concepts presented in greater depth, the AAAS has uncoupled the definition of an effective instructor from the idea of content coverage.
That's a pretty big move by an influential organization, and yet... I'm not sure it's changed the biology major.
-
Michael (2007) notes the fear of a negative perception by colleagues as another barrier that can push faculty toward a content-coverage teaching approach
Q for the FLC: Do you ever feel this negative perception? More generally, how do your peers' views on this topic affect your teaching choices?
-
research has begun to suggest
See also Theobald et al. (2020), https://www.pnas.org/doi/10.1073/pnas.1916903117.
-
We use the term “content” to refer to all of the material—often in the form of facts—that students are responsible for learning and mastery of which may be assessed.
I try to avoid using the term "content" because it has the connotation that the material is to be passively consumed, like a YouTube video.
-
continuous exposition
There's that "continuous exposition" language from Freeman et al. (2014).
-
- Dec 2022
-
www.insidehighered.com
-
involve collaboration across disciplines
This is also key. I've known some faculty who are fairly snobbish toward their colleagues in writing and rhetoric, but the W&R folks are way ahead of other disciplines in this area. We need to learn from them.
-
AI writing and AI research assistants
Most of the noise these past weeks has been about AI writing generators (and early, AI image generators). But Marc is pointing to a different category here: AI research assistants. Let's not conflate the two.
GitHub Copilot is yet another category: AI coding assistants. And I imagine there are many others that educators might want to know about.
-
We don’t want professionals like doctors and pharmacists using language models without understanding their limitations, and we definitely don’t want law enforcement or judges relying on biased AI in their decision-making.
This is a very strong data / info literacy argument.
That torch has usually been carried more by librarians than faculty. I wonder if the noise around AI will motivate more faculty to incorporate info literacy into their teaching.
-
One example is a counterargument generator, which allows students to explore different perspectives on topics they are interested in.
I need to check this out! This is the kind of targeted use of AI that has great potential for enhancing the writing process. And I can imagine other targeted uses in other disciplines. (I regularly use the Merlin app from the Cornell Lab of Ornithology to ID the birds I see.)
-
when future employers will likely want them to have a range of AI-related skills and competencies?
This is key. We need to prepare students for a professional world where these tools are used, just as we prepare engineering students for a professional world where tools like Matlab are used.
-
- Oct 2022
-
-
Would doing so, signal that we are ready to normalize hyflex instruction?
I feel that the challenges of hybrid / hyflex teaching are significant. I'm not sure the benefits are worth it. I'd rather have two sections--one in-person and one online.
-
when I inevitably have to go back to teaching in person in a traditional room, it will be hard, and I will whine about it
I have taught in more traditional classrooms after teaching in more flexible ones. I find myself sighing a lot in the traditional classrooms.
-
We later agreed that because of the quality of the camera and microphones, it was almost like actually being present in the room, or being present as a hologram.
In my experience, this is the only way "hybrid" teaching works well. You need quality cameras and microphones so the distant student can hear and see well.
-
but then relocate and roam around freely the rest of the time
Susan Hrach (author of Minding Bodies) recently tweeted about the need for instructors to have "spatial proficiency," and I think Robert is demonstrating that here. It's not enough to have a great learning space; we need to know how to navigate it with our bodies.
-
in a matter of seconds
Don't underestimate this. There are plenty of classrooms where it's theoretically possible to move the furniture around, but practically time consuming.
Also note that there's space in the room to move things! That's not always the case.
-
so that what students did in class meetings simulated the actual working practices of real mathematicians
This is really interesting. An active learning space might help students work in the ways that experts in the discipline work. That's an angle on ALCs I haven't heard before.
-
- Mar 2022
-
www.insidehighered.com
-
We have retired comments and introduced Letters to the Editor
All well and good, but IHE hasn't blocked Hypothesis!
-
because these methods of evaluation are baked into the fabric of higher education
That's likely why grading wasn't called out in the report. So many people just assume that's an unchangeable element of higher education. Why mention changing it?
That said, it seems that ungrading is having a moment. Why is that? Or is it just having a moment for me?
-
Changing our orientation to grades in no way means we need to abandon academic standards
This is a short essay, so I don't fault Eyler for leaving this point somewhat unsupported. But suppose a faculty member pushes back on the idea that we can "ungrade" while maintaining "rigor." What would you say to that person?
-
these practices
I would add what's sometimes called "shadow grading," where students receive grades as feedback for their own use, but the course is listed as pass/fail on the transcript and for GPA calculations. MIT does this for first-semester students to ease them into the rigors of college. https://registrar.mit.edu/classes-grades-evaluations/grades/grading-policies/first-year-grading
-
major research studies
I like Eyler's approach here, leading with research studies. I've heard some ungrading advocates lead with their own personal stories about the damaging effects grades had on their motivation or learning. Those stories don't resonate for me--grades were often very positive motivators for my education--and personal anecdotes shouldn't matter anyway. Most of our students aren't like us, so our personal experiences with grades aren't that relevant to our teaching.
-
identifies the pressure to get good grades as the most significant factor leading to these mental health issues
Wow. I have to admit, this is a little hard to relate to. But I'm a grown adult now with a lot more perspective on grades. And back in high school, I got very good grades, so grading didn't cause me that much anxiety.
-
Scarlet Letters
This is such a great title.
-
heavy workloads
Just earlier this week, I heard the Dean of the First-Year Commons here mention that perhaps we should help students avoid taking particular combinations of courses that are known to have incredibly high workloads.
-
latest institution to confront the necessity of finding better ways to care for the mental well-being of students
I'm glad our previous chancellor started a push for more resources for mental health and student well being a few years ago. The challenge isn't gone, but our campus has more resources for it than it used to.
-
-
www.jessestommel.com
-
Use words like "ask" or "invite," rather than "submit" or "required."
What verbs do you use to describe student assignments? Do you think these words really matter?
-
What happens with almost every single student is that any assumption I might make about them is squashed by what they write about themselves and their work.
This is a really powerful statement. How well do we know our own students' motivations and commitments? (How well does Stommel actually know his students?)
-
The statement about assessment from my own syllabi:
This link is dead, but the Wayback Machine tells me this syllabus is for a documentary film course (featuring both creation and critique of films). Is ungrading a lot easier in a small course in the arts and humanities than in other settings? I'm thinking, for instance, of a large-enrollment, required chemistry course.
-
Incessant surveillance
I know faculty who are all in on "incessant surveillance" as a way to maintain the integrity of their grading. These tend to be faculty who see themselves as gatekeepers for their disciplines and their courses as weed-out courses. How do we reach faculty at this end of the spectrum?
-
Who our students are is exactly relevant, and their specific challenges need to be accounted for in our approach to assessment.
Might blind grading be a useful approach to counteracting bias in some circumstances? Do I need to account for my students' "specific challenges" when I'm grading one of their mastery quizzes in my mathematics course? (These are quizzes that students get to retake until they pass.)
-
“Research shows three reliable effects when students are graded: They tend to think less deeply, avoid taking risks, and lose interest in learning itself.”
Kohn has been on-record as anti-grades for decades, typically leaning on the research around intrinsic and extrinsic motivation. Where I can't quite square the circle is that I often see my students thinking deeply, taking risks, and showing an interest in learning. Am I just recalling the exceptional students? Or are there contextual variables that need to be considered for sweeping statements like Kohn's?
-
They are very recent technology.
So is the flipped classroom and critical race theory. Recent isn't necessarily problematic. (Cathy Davidson often makes a similar argument, noting that letter grades come from the evaluation of meat quality, but the origin of an invention doesn't necessarily determine its current usefulness.)
-
Some foundational questions about assessment:
Here's where the nuance starts. These are all good and important questions to ask.
-
"Deciding to ungrade has to come from somewhere, has to do more than ring a bell, it has to have pedagogical purpose, and to be part of a larger picture of how and why we teach."
This is true of grading, as well.
-
What all those grades mostly weren’t measuring: student learning and/or content knowledge.
As usual, Stommel starts with a provocative generalization that lacks nuance. (The nuance will come later.)
-
- Jul 2020
-
cft.vanderbilt.edu
-
This may be one of the more challenging activities to conduct in a hybrid and socially distanced classroom.
Having tested out a classroom last Friday, I can confirm that classwide discussion will be challenging.
-
- May 2020
-
vanderbilt.app.box.com
-
Professor Derek Bruff
Hey, that's me.
-
- Jun 2019
-
www.vanderbilt.edu
-
Dinner
I'm really hungry.
-
Making digital pedagogy fun
When I plan a cryptography-themed escape room for my students, I list it as "Flex Day" in the syllabus so they don't know what's coming. One semester, the students took a group selfie for me, all flexing their biceps after they solved the clues.
-