While the term ‘artificial intelligence’ (AI) may be worn out by sheer repetition, the UNM community is just getting started with exploring this impactful phrase.
One of the many reasons not to write off the concept is that it may one day be partially responsible for your children’s education.
Literacy Professor Mary Rice in the College of Education & Human Sciences (COEHS) is exploring the role of AI in education. From future teachers to current educators to students of all ages, it’s a connection worth understanding.
“I still don't know all the answers. I think the place where we should be centering is thinking about how to help teachers and students learn what those sorts of tools can and cannot do in terms of where the information comes from and how it operates and things. Then they can decide,” Rice said.
Ph.D. student Jegason Phosphorus Diviant and Ph.D. candidate Lou Ellis Brassington are part of this cautiously optimistic area of study alongside Rice.
“I think it’s essential that we consider what the right ages might be for incorporating generative AI into our educational strategies,” Diviant said. “We need to consider whether its use is developmentally appropriate and whether it aligns with the curriculum standards and learning objectives for the class. We need to teach students how to critically assess AI-generated output and help them to strengthen their digital, information, and media literacy skills.”
Putting the A-I in equItAble
Rice approaches the potential positives of AI in the classroom with what she refers to as “techno-skepticism.”
“I think it's always good to be critical of whatever kinds of technologies that people are using, especially what we are asking young people to use. Oftentimes teachers are just commanded to take a tool and use it with the kids. Instead of saying ‘good luck’, I think we should be asking important questions about what the kids are really going to get out of it,” Rice said.
The questions she has asked so far focus on two key realms. The first, as published in Computers in the Schools, is how AI can help children with disabilities. Better yet, how can it put all kids on a level playing field, no matter where their district is or what funding it receives?
“We should talk a little bit about the whole notion of identification with disability because some people feel different ways about it,” Rice said. “Sometimes using that term is the only way to get access to services, so that would be important, but then also there is a notion of how come we don't just make a society that's accessible? How come we have to have a society where some people get sorted into having abilities and others don't?”
While the definition of disability is often contested, the research examined AI use from a variety of angles, including Autism Spectrum Disorder, dyslexia, intellectual disabilities, and language and learning disabilities.
“My research has indicated that students with disabilities certainly benefit from the usage of AI in multiple areas. I think that, with guidance and plenty of professional development workshops, lesson planning for students with disabilities, gifted students, and students in general can be tailored to enhance the academic experience for all classrooms of all grade levels,” Brassington said.
“AI can be used to analyze students’ strengths and weaknesses and suggest resources and activities specific to students’ individual needs. It can help us to identify students who may be at risk of falling behind so that we can intervene sooner and improve the likelihood of the student having a positive educational outcome. AI can automate grading and feedback, which could allow teachers more time to devote to students.” – PhD student Jegason Phosphorus Diviant
Diviant highlights that, beyond niche lesson planning, there are additional, concrete technologies that can be deployed to assist students who need that extra boost.
“AI technologies may be able to provide speech and language support, visual and auditory support, and enhance assistive technologies such as screen readers and voice recognition technologies,” he said.
AI, Rice emphasizes, could give students more even-handed learning and genuine accessibility, which could promote a more unified classroom.
“Instead of focusing educational situations on what people cannot do or making kids do things in the hardest ways possible, why are we not thinking about how to reposition them in ways that would help them be optimal, be successful? Let's spend money on that kind of stuff,” she said.
The give and take of large language models
This is the same mindset the group applies in published research to large language models (LLMs) and English literacy. LLMs, like the infamous ChatGPT, are themselves built around language scraped from existing human writing and communication.
“Generative AI can be used for personalized learning strategies. LLMs can tailor content to meet an individual’s needs, adapting to their pace and learning style. It may be able to help students with developing social and communication skills,” Diviant said. “I feel that each large language model has its own unique strengths and limitations, and depending on what our objectives are, some LLMs might be more suitable for a given task than others. It’s vital that we ask ourselves whether using an LLM would enhance our learning and productivity or potentially interfere with either. ”
While there are arguments that ChatGPT could spell the death of jobs, dedicated learning and creativity, LLMs like ChatGPT come directly from us and our own inputs. They adapt to our verbiage, our questions and our biases, so their output does not exist in a vacuum.
“Although we are in the early stages of AI development, I have a fairly positive perspective on the benefits of ChatGPT, Bard, Gemini, etc. I think that educators, if informed through professional development workshops, can find many ways to enhance writing, research, and creativity in the classroom through lesson plans that are designed around the implementation of AI as a tool and complement to curriculum content,” Brassington said.
While some school districts, Rice mentions, have outright banned these generative LLMs, she thinks that’s a mistake, especially when not every district is doing the same.
“Some New York City schools just outright banned generative AI at the beginning, which I don't know if it is a very practical policy. When you ban it, then teachers can't get access to professional development about it,” Rice said. “What they should have done is start by asking more questions. Who is this affecting? Who is going to get left out by this? What kinds of things do we need to be doing? Now we’ve got some policy landscape craziness going on.”
Building teachers up, not replacing them
Beyond student examples, other AI applications may supplement teacher work on tasks like assessment and grading. This is not a replacement, however, but rather a way to engage more personally with students’ strengths and weaknesses.
“Educators must do their homework and be well-versed in the negative and positive effects of AI so that when they lesson-plan, they can make sure that AI is used productively and affirmatively within their classrooms. As with any classroom protocol, the teacher needs to carefully and judiciously implement research and content practice,” Brassington said.
That’s something Diviant says he experienced firsthand.
“I was a graduate assistant for a class this semester that took about 10-12 hours each week to grade assignments. I’ve been thinking about how much more meaningful it might have been if I could have hosted more review sessions and had the time for more one-on-one meetings with the students who were struggling the most,” he said. “I feel that many teachers would welcome the opportunity to have more time to interact with students personally and AI can help to achieve that.”
Diviant also thinks it has use for children of all ages. While his 6-year-old son was experiencing how AI could generate new stories and creative art and be used to identify and describe the flora and fauna during their nature walks together, Diviant was able to use generative AI to converse with his thesis manuscript and role-play simulated thesis defense sessions from the roles of a student and a virtual thesis committee member.
“My experience using generative AI has been overwhelmingly positive, engaging, and rewarding. It can be an excellent tool for brainstorming, role-playing, tutoring, interactive learning, planning, researching, reflection, and feedback. I think for younger age groups, it can be used to enhance creativity and allow students’ imaginations to flourish,” Diviant said.
Caveats in the classroom
The drawn-out debate over AI also means its tools, and every positive idea referenced so far, must be taken with a grain of salt.
“I am still optimistic that all of us can benefit from the realities of our posthuman experience, whether it be AI, new practices, new assessments, and new modalities of teaching,” Brassington said.
“There's nothing that says that we have to fully embrace AI. People always talk about technology as an advancement, which I'm not sure that it always is, and that it's inevitable, which I'm not sure that it has to be so. Right now is about the movement.” – COEHS Professor Mary Rice
It’s a very delicate balancing act: knowing too little, students run the risk of using AI improperly; relying on it too much, educators and students run the risk of losing work ethic.
“Over-reliance on AI technologies can interfere with student learning and diminish independent and critical thinking. It can also diminish the role of an educator. The value of human interaction and personalized mentorship and guidance is crucial in education. Therefore, the approach of introducing AI into the learning environment needs to be carefully planned,” Diviant said. “There’s also a risk that if we don’t introduce and teach the ethical and responsible use of AI, students may learn bad habits involving the use of generative AI on their own that interferes with their learning, and these bad habits may lead to poor educational outcomes.”
Ethics also play a role, and ethical use depends on dutiful planning, which is why Rice pushes so hard for open discussion.
“A lot of people right now are trying to come up with ethical frameworks for AI and a lot of those ethical frameworks are theoretical, but they're not starting with foundational thinking or principles. They're just trying to make lists. Then teachers aren't going to have very good guidance the same way they didn't have very good guidance in the pandemic,” Rice said.
Everyone teaching and learning
Those solutions to make AI and students successful, Rice, Diviant and Brassington agree, start with genuine, sustained transparency.
“When you have something like online learning or AI come along, you can't do incremental changes. You get people together and ask the questions. You have to talk, be open, come up with the foundation, principles and then do big overhauls,” Rice said.
Then, of course, discussions about the ethics of it all, like those UNM has begun, must continue before future teachers even receive their diplomas.
“Classes should include language in the syllabi that discusses the ethical and responsible use of AI, and examples should be provided about what clearly is and is not ethical and responsible use. Also, instructors should discuss early in the semester a spectrum of AI use in the classroom, what is or is not allowed, and what can be done to make the use of generative AI for assignments transparent. I encourage assessments to be designed that require the use of generative AI so that we can demonstrate how these tools can be used to enhance learning and productivity without sacrificing creativity, independent thinking, and critical thinking. I also encourage students' entire generative AI chat histories to be turned in with the assessments so that they can demonstrate ethical and responsible use in a transparent manner,” Diviant said.
“It is necessary to stay current with new trends in technology and their applications within the classroom,” Brassington said.
Much like schools that once had to readjust to include computer labs, educators have long had to adapt to new technologies, from chalkboards to tablets.
“How are we going to make sure that if we're going to use AI, how do we make sure that the benefits of those are distributed equally?” Rice asked. “Teachers should not just be given a tool and be told ‘use this.’ What we should be asking is whether this is empowering the students.”