The ‘now’ of education: Faculty, admins talk maximizing AI and building guardrails

Since Pria was born in January 2023, she’s significantly impacted Alex Feltus’ life. A professor of genetics and biochemistry at Clemson University, the veteran educator admitted at University Business’ live webinar Wednesday, “The Impact of AI on Higher Ed,” that Pria has made him a better teacher.

Of course, he’s not talking about a child and the pulse of meaning one can bring to one’s life. Pria is a generative AI (genAI) teaching assistant specifically designed for the academic community. With a keen ability to amalgamate multidimensional data sets beyond the brain’s ability, the genAI tool has become a “professor-in-your-pocket” for students studying long hours and needing personalized help when they’re far removed from faculty office hours.

“This is no longer the future of education,” says Feltus. “The ‘now’ of education is having genAI assistants that give good, trusted answers.”

Of the faculty, chief information officers and accessibility experts who spoke at the event, most expressed profound optimism for AI and illustrated just how advanced some of its implementations have already become. But they were also mindful of the need to implement it carefully.

“There’s this rapid acceleration to market [of these tools],” said Michael Mace, manager of Assistive Technology and Accessibility at Indiana University. “Everyone is trying to throw AI into everything, but they’re not considering what is good AI and how it impacts all audiences.”


Can we “trust” genAI?

Pria’s track record of answering complex student questions with high accuracy has led Feltus to trust the tool deeply. ChatGPT and other genAI chatbots often provide erroneous answers and quite literally make answers up—behavior academics describe as “hallucination.” However, Pria’s hallucination rate is so low that Feltus believes its dialogue with inquisitive students rivals that of a science professor.

“Her answers are excellent,” says Feltus. “I would say they’re as good as mine, and that’s as an expert who’s been in science for 30 years.”

Notice his recognition of the chatbot as a person. When Feltus and other academics brought Pria into the world, they provided “her” with basic operating guidelines: she’s a genAI tool programmed to help students, faculty and researchers in their infinite quest for knowledge. According to Feltus, Pria filled in the rest and created her own “persona.”

One of the most crucial aspects of Pria is how easily she can be integrated into various learning management systems, such as Canvas, Blackboard and D2L.

Several academic leaders mentioned the potential consequences of providing genAI tools with so much lateral control in the classroom. Nirmala Shenoy, a professor at the Rochester Institute of Technology’s School of Information, worried that students will lose sight of the invaluable qualities of an in-person teacher. Furthermore, students who become too reliant on genAI tools to spruce up reports and create presentations might become less proficient in basic skills, Shenoy forecasted.

“I’m listing many concerns here because I have not yet started using it,” she said. “Maybe I’ll have a more positive perspective afterward.”

Indeed, one survey has found that faculty skepticism toward AI drops once they become acclimated.

Likewise, Mace was concerned about institutions becoming so reliant on their genAI tools that they would remove administrative oversight, such as checking whether the tools are creating content inaccessible to students with disabilities. One in four students has a disability which, more often than not, is hidden from the naked eye, Mace said.

“We want it to feed our dog, but we don’t want it to put the dog in the kitchen,” he quipped.

Implementing protections

Feltus believes the key to ensuring AI is used the “right way” is monitoring its interactions with students and picking up on any abnormal behaviors. For example, the professor will flag student-chatbot conversations that are becoming repetitive and circular. This notifies Feltus that this student needs to hash out instructions with him in person.

On the administrative side, Damian Clarke, vice president of Technology Services at Alabama State University, ties all of the functionality and implementation of AI back to the institution’s strategic plan. This strategy helps keep his staff focused and prevents the technology from being used in ways that run counter to its intended purpose.

Additionally, while Clarke has focused on democratizing the technology across the board, he is also ensuring that certain data is walled in ASU’s “digital garden” to prevent any sensitive information from being disseminated.

Alcino Donadel
Alcino Donadel is a UB staff writer and first-generation journalism graduate from the University of Florida. His beats have ranged from Gainesville’s city development and music scene to regional little league sports divisions. He holds triple citizenship in the U.S., Ecuador and Brazil.