- Jan 2025
-
link.springer.com
-
One critical element that must be considered is whether university administrations will be able to support the changes in instructional design and assessment that are necessary to ensure integrity in the learning process.
When generative AI became part of the learning process, I feel it exposed a glaring issue in classrooms, from kindergarten all the way through college courses. Many of the conversations I hear about AI, as it pertains to students, are about how students will misuse it to cheat. I read an article for an earlier class, and I wish I could remember which class and which article, but it covered how college professors would use AI detection tools to see if students used AI, and there are some truly horrible stories of students almost getting kicked out of their courses based on a false positive. The article also touched on the reasons a student would need or want to cheat in the first place. What I have learned about the learning process has really opened my eyes.
-
Projects that require creative thinking, application of knowledge to new situations, or the solving of real-world problems can be more indicative of a student's own work and understanding.
I teach a class that asks students to use creativity, think outside the box, and try new ideas. I have been in this role for the past four years, and I have seen some amazing thinking come from students. I have also seen some awesome learning from students who are traditionally given a negative label because they do not learn best in a traditional setting. Another thing I feel helps is allowing students to explain what they are thinking in their own words. This helps them convey what they are thinking, and ultimately what they have learned, to me. I have developed a few projects that involve real-world applications, which help students see how these issues affect people they know. This makes students more engaged and hopefully teaches them a little bit of empathy.
-
Educate students about the ethical use of AI, including discussions about academic integrity, the limitations of AI, and the importance of original work.
This is very important: our students need to know and understand digital and media literacy. The students I work with, 6th grade middle schoolers in a rural area, struggle in this area. Students need to learn how to navigate the digital world and use critical thinking skills in order to correctly analyze what they are looking at.
-
The focus should not be to try and design GenAI out of the learning experience, or necessarily to design it into the learning experience, but simply to design instruction so that students actually learn. The strategies suggested above, and others, may be productive paths to consider in this regard.
I found this to be an interesting idea. During my time in this master's degree, I have begun seeing the positive side of AI in learning. Before it all, I was one of those people who said, "AI is bad and students are never going to learn because they will probably just cheat with AI." Yet learning to use it within lessons has shown me how it can be a positive resource for students. This section telling us not to necessarily include it or exclude it in teaching, but simply to design instruction so that our students learn, makes me reset my head a bit. I am reading this as "add AI if it helps to enhance the learning experience. Do not add it just because you can. Do not force it." It makes me think that when developing lessons, I will only want to use AI if it naturally comes to mind when figuring out what to do.
-
There are emerging technologies designed to detect whether a piece of writing was generated by AI. Incorporating these tools may help educators identify work created by GenAI. However, the accuracy of these programs, both with respect to false negatives (i.e., GenAI was used but not detected) and false positives (i.e., GenAI was not used, but the student is accused of using it), is wanting. Educators who wish to incorporate AI monitoring tools should stay informed on the capabilities and limitations of these technologies in order to use them responsibly and effectively.
I have heard that students are learning to get around AI detection tools. This is quite interesting to me, considering AI is still so new overall. I also wonder whether learning to use AI detection tools correctly is time-consuming. Would that make them more of a chore for teachers to adopt?
-
I literally had a student this week decide to use ChatGPT on a fun social studies assignment. They are currently learning about the colonization of the United States, and they needed to write a letter to King James requesting a charter to the New World. This student decided to use ChatGPT but made the mistake of asking it to "write him a charter" instead of "write a letter requesting to be granted a charter." His letter contained many words that even I struggled to pronounce at first, which made it clear he had used AI to write it. I like the use of AI for some projects, but it makes me sad when students choose to use it even on the more entertaining assignments that are not difficult to do. Yes, the technology wrote the letter so he didn't have to, but using it incorrectly hurt him.
-
Frequent, Low-Stakes Assessments:
I think the solutions that will work best to reduce students' misuse of generative AI depend on the group of learners you are working with. As an elementary teacher, I feel that the best strategies are those that ensure class questions include a connection to self or the world. Typically, my younger students are not going to be able to do the higher-level thinking required to get generative AI to produce a sensible response to a question like that.
However, if you were a high school AP teacher, your students would be more likely to be able to use generative AI to build answers to complex questions. For this group, frequent, low-stakes assessments may be a better direction, because the students in these classes are likely driven by success. If they feel they will not be able to achieve their desired score without the use of AI, they may be more likely to cheat. Lowering the stress and pressure may prevent their use of AI.
In determining what strategies will help prevent the misuse of AI within your classroom, I think it is important to investigate the motivation of your group: ask why they are inclined to use AI, and then pick an appropriate mitigation strategy based on that discovery.
-
They do not think; they create human-like responses based on probabilities and, in doing so, also tend to make things up (i.e., hallucinate).
I think this is a very terrifying notion. Currently, many students are relying on generative AI to assist with, if not complete, their schoolwork. It would be unrealistic to assume that these individuals will not take that practice into their careers as well. This could cause a generation of people to join the workforce and perpetuate incorrect information and "fake news" without even knowing it.
Regardless of a person's position on generative AI usage in the classroom, I think a new role of educators (particularly secondary educators) will be teaching students how to question the information that AI provides to them and fact check it through research. While this type of instruction will generally happen at the secondary level, I think elementary teachers have a role as well in teaching digital citizenship skills along with working to decrease student apathy.
-
generate original written work that is virtually indistinguishable from that of human authors.
I think this is an interesting statement. While this work may be indistinguishable from some individuals' work and in some settings, I think in a K-12 face-to-face classroom it isn't nearly as difficult to distinguish as this quote implies.
First, most teachers have some level of familiarity with their students' writing voices and styles. I often find ChatGPT's writing to be very formulaic. ChatGPT also uses phrasing that would be uncharacteristic of a student. For instance, I just asked ChatGPT to write a book report for The Catcher in the Rye. This was a sentence from that report: "Salinger captures the feeling of alienation and confusion that accompanies adolescence with remarkable sensitivity." When considering the typical abilities of a high school student, this writing level should raise suspicions and indicate to an educator that artificial intelligence may be involved.
I can see how distinguishing AI from student responses may be far more challenging in fully virtual classes, where teachers may not be as familiar with a student's voice outside of technology. I also believe it would be far more difficult to distinguish at the college level, as some students' writing voices become more complex and advanced. Fortunately, as a third grade teacher, I am able to disagree with this statement, as it would be extremely simple for me to distinguish my students' writing from artificial intelligence writing.
-
-
www.hypeinnovation.com
-
If you've spent time watching your customers, understanding what task they need to perform, and posing questions about the circumstances and potential solutions, experimentation becomes a natural step to test those ideas.
This approach also saves time and helps prevent potential redundancies. It also helps you identify and focus on exactly what you should be solving for.
-