- Oct 2024
-
er.educause.edu
-
U.K.-based WPP is working with Microsoft to develop advanced audio description tools built on GPT-4. This technology generates enhanced audio descriptions of user-uploaded videos and images. The company is also working collaboratively with the Rijksmuseum, the national museum of the Netherlands, to provide enhanced audio descriptions for its collection of nearly one million works of art, opening the door to libraries with extensive special collections.Footnote6 This tool is expected to be available soon.
This tool will let students upload videos and photos and receive audio descriptions of them, so they can get the visual content they need.
-
A 2023 survey of assistive technology users found that fewer than 7 percent of respondents with disabilities believe there is adequate representation of their community in the development of AI products—although 87 percent would be willing to provide end-user feedback to developers.Footnote2
I would love to see the feedback that 87 percent would give. I hope they find these tools useful and helpful.
-
These are just a few of the exciting developments emerging from the intersection of AI and accessibility. Many people in the higher education community are rightfully cautious about the use of AI. However, numerous products and services that promise more equity and inclusion for people with disabilities are currently available or in development.
Hopefully, within four years AI will be a well-mastered tool. I will be able to help students with learning disabilities while also helping the other students.
-
The University of Illinois is working with Microsoft, Google, Amazon, and several nonprofit organizations on the Speech Accessibility Project, an interdisciplinary initiative "to make voice recognition technology useful for people with a range of speech patterns and disabilities." University researchers are recording individuals who have Parkinson's disease, Down syndrome, cerebral palsy, stroke, and amyotrophic lateral sclerosis. Those recordings are used to train an AI automatic speech recognition tool. According to the Speech Accessibility Project website, "Before using recordings from the Speech Accessibility Project, the tool misunderstood speech 20 percent of the time. With data from the speech accessibility project, this decreased to 12 percent."Footnote19
This new tool helps students with speech disabilities learn. Researchers are training the AI to recognize different voices and hard-to-understand speech patterns. These students will be able to talk to the AI, be understood, and get help with whatever they need. A rough sketch of how that error rate might be measured is below.
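As a rough illustration (my own sketch, not taken from the project), the "misunderstood speech 20 percent of the time" figure can be read as a word error rate: the number of word-level mistakes the recognizer makes divided by the number of words actually spoken. The example sentences are made up.

```python
# Minimal word error rate (WER) sketch: edit distance between the reference
# transcript and the recognizer's output, divided by the reference length.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words (substitutions, insertions, deletions).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One wrong word out of five gives a 20 percent error rate.
print(word_error_rate("open my course syllabus now",
                      "open my horse syllabus now"))  # 0.2
```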
-
In 2023, Microsoft partnered with OpenAI to develop Be My AI, a digital visual assistant within the Be My Eyes app. Be My AI is powered by OpenAI's Vision API, which contains a dynamic new image-to-text generator. Be My AI users can send images and ask questions via the Be My AI app. An AI-powered virtual volunteer answers any questions about the images and provides instantaneous visual assistance for a variety of tasks.Footnote7 This technology provides enhanced opportunities for learners who are blind or have low vision.
This allows students who are blind or have low vision to keep learning and be part of the class. These students can send an image to the AI, and it will describe what the picture shows; a rough sketch of that kind of request is below.
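As a hypothetical sketch of that interaction pattern (not Be My Eyes' actual code), an image and a question can be sent together to a vision-capable model through the OpenAI Python SDK. The model name and image URL here are placeholders.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Placeholder image URL and question; "gpt-4o" stands in for whatever
# vision-capable model a given assistant actually uses.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe this photo for a user who is blind."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/classroom-photo.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)  # the plain-language description
```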
-
These advancements afford people with disabilities more equitable access to the same educational services and resources offered to students without disabilities. Ironically, students with disabilities—who stand to gain the most from emerging AI tools and resources—are often the most disadvantaged or least able to use them.Footnote
Students with learning disabilities should be able to use AI, even more than other students. It can help them check their spelling and make sure their wording is correct. They could also use it to search for new ideas and to build outlines.
-
-
www.forbes.com
-
Despite these concerns, U.S. educators seem optimistic about the potential of AI in the classroom. Acknowledging that artificial intelligence will likely play an expanding role in education, most teachers have already begun to integrate AI tools into their daily work routines.
This might be the future of education, and I am glad that teachers and educators are already on board for it.
-
Ninety-eight percent of our survey respondents identified a need for at least some education on ethical AI usage. More than 60% recommended comprehensive education.
Teachers and educators should be informed about what AI is and how it works. This will keep students from walking all over them.
-
Academic dishonesty tops the list of educators' concerns about AI in education. Teachers also worry that increased use of AI may mean learners receive less human contact.
This is definitely true. Teachers should use AI-detection tools, though, so they can see how much of a student's work was AI-generated.
-
More than half of the teachers who responded to Forbes Advisor’s survey said they believe AI has had a positive effect on the teaching and learning process. Less than 1 in 5 cited a negative effect.
Surprising!! I thought teachers would not like it.
-
Teachers have long recognized the value of play-based learning, and schools have used educational computer games—such as The Oregon Trail, first released in 1974—since the early days of computer gaming. Today’s AI-powered games can deliver targeted learning thanks to user-responsive programming.
Teachers and educators should take advantage of AI in this sense. Students love taking brain breaks and playing games. A small sketch of what "user-responsive" difficulty could look like is below.
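As a small illustration of what user-responsive programming can mean in a learning game (my own sketch, not something described in the survey), the difficulty of the next question can be adjusted according to whether the student answered the previous one correctly.

```python
# Toy adaptive-difficulty rule: step difficulty up after a correct answer,
# down after a mistake, clamped to the allowed range.
def next_difficulty(current: int, was_correct: bool,
                    lowest: int = 1, highest: int = 5) -> int:
    step = 1 if was_correct else -1
    return max(lowest, min(highest, current + step))

level = 3
for was_correct in [True, True, False, True]:
    level = next_difficulty(level, was_correct)
    print(f"next question at difficulty {level}")  # 4, 5, 4, 5
```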
-
In October 2023, Forbes Advisor surveyed 500 practicing educators from around the U.S. about their experiences with AI in the classroom.
This was definitely needed. I am glad the researchers thought of it and carried it out.
-
-
pmc.ncbi.nlm.nih.gov
-
a human, animal, or AI system deserves moral consideration to exactly the extent it is capable of pleasure or pain.
AI should not be in the same category as a human or an animal. It is not living.
-
If technology continues on its current trajectory, we will increasingly face morally confusing cases like this. We will be sharing the world with systems of our own creation, which we won’t know how to treat. We won’t know what ethics demands of us.
Yes, this is so true. That is why we should have people overseeing these systems.
-
it might be useful to create oversight committees analogous to institutional review boards (IRBs) or institutional animal care and use committees (IACUCs) for evaluation of the most advanced AI research.25
They should create these committees and put them to use. People should watch AI closely so it does not become what people fear it will become.
-
The Emotional Alignment Design Policy: Design AI systems that invite emotional responses, in ordinary users, that are appropriate to the systems’ moral standing.
This is the best way to explain what AI is. AI responds to us with the best answer it thinks it has found or researched.
-
Do we actually grant rights to our most advanced AI systems? How much should we take their interests, or seeming interests, into account?
No, we do not grant rights. AI is technology, not a real or living thing. We as humans do not need to take its interests into account.
-
David Chalmers, for example, reviews theories of consciousness as applied to the likely near-term capacities of large language models.12 He argues that it is “entirely possible” that within the next decade AI systems that combine transformer-type language model architecture with other AI architectural features will have senses, embodiment, world- and self-models, recurrent processing, global workspace, and unified goal hierarchies—a combination of capacities sufficient for sentience according to several leading theories of consciousness.
This could be true, but it also might not be. AI could take over and become the "new" humans; it might run our jobs. Still, AI will always stay on computers.
-
-
www.proquest.com
-
OpenAI's GPT-3.5, which was released in March 2022, only managed to score in the 10th percentile on the bar exam, but GPT-4.0, introduced a year later, made a significant leap, scoring in the 90th percentile.
This is just the newer technology. Students are also getting smarter with AI, since it lets them search for ideas and examples.
-
Decreased social connection. There is a risk that more time spent using AI systems will come at the cost of less student interaction with both educators and classmates.
I don't think AI alone will do this. I think technology has already accomplished that.
-
Student cheating. Students might use AI to solve homework problems or take quizzes. AI-generated essays threaten to undermine learning as well as the college-entrance process. Aside from the ethical issues involved in such cheating, students who use AI to do their work for them may not be learning the content and skills they need.
This is definitely a worry that most school systems have. Even without AI, students could cheat easily, but with AI it is even easier. Educators and teachers should encourage students to use AI only when they are stuck or need some ideas.
-
Parents can use AI to generate letters requesting individualized education plan (IEP) services or to ask that a child be evaluated for gifted and talented programs.
This is a good idea. Parents are already busy; they should be able to get help with these letters. Drafting a letter to the school can be difficult.
-
The department had conducted listening sessions in 2022 with more than 700 people, including educators and parents, to gauge their views on AI. The report noted that "constituents believe that action is required now in order to get ahead of the expected increase of AI in education technology, and they want to roll up their sleeves and start working together." People expressed anxiety about "future potential risks" with AI but also felt that "AI may enable achieving educational priorities in better ways, at scale, and with lower costs."
This was really good research for the department to do. Before I read why, I thought educators probably would not like AI. It does make sense that educators and parents are scared of it.
-
Using generative AI systems such as ChatGPT, Bard, and Claude 2 is relatively simple
The people who created these AI systems make them easy and accessible for the users who need them.
-