Season 1 – Episode 40 – AI in Higher Education
The use of artificial intelligence (AI) in higher education is expanding, and with it comes a discussion about potential biases, accuracy, and grading methods. While AI can offer benefits such as readily generated information, human-like language, and sophisticated chatbots, it also presents drawbacks in the context of higher education. Modern AI can help answer academic inquiries, grade assignments, and identify plagiarism. Nevertheless, it is important to provide faculty with adequate training on how to properly use these tools and how to evaluate student performance in light of their use.
Darren Gaddis: From CITI Program, I’m Darren Gaddis, and this is On Campus. Today, how artificial intelligence can impact higher education, what potential influences artificial intelligence could have on grading practices, and is there a need to change grading practices within higher education?
I spoke with Fred Martin, who invents and studies new technologies to enable teaching and learning in computer science, data science, and artificial intelligence. He creates partnerships for bringing these technologies to learners in school and out of school, focusing on K-12 teachers and students.
He collaborates with researchers in other fields, particularly in education and psychology. He supports faculty and students in the Kennedy College of Sciences at the University of Massachusetts Lowell, focusing on experiential learning and student success.
As a reminder, this podcast is for educational purposes only. It is not intended to provide legal advice or guidance. You should consult with your organization’s attorneys if you have questions or concerns about relevant laws and regulations discussed in this podcast. Additionally, the views expressed in this podcast are solely those of their presenter, and do not represent the views of their employer.
Hi, Fred. Thank you for joining me today.
Fred Martin: Hi, Darren. It’s nice to meet you.
Darren Gaddis: Fred, to help us ground our conversation today, what is artificial intelligence, or AI? And how is it broadly used within higher education?
Fred Martin: Sure. So I used to define AI as something simpler: basically automated decision-making. So AI will evaluate a bunch of data and make a decision. Or it'll evaluate a situation and make a choice.
And for a long time, that was a pretty good definition of AI. I think in the last couple of years, and particularly in the last six months, we've come across this new category of AI, which existed before, but now it's affecting us and we're seeing it. That's generative AI: AI that is a creative agent and makes things that are really compelling.
There are three major ways that AI is used in higher ed. One of those is data analytics, often for student success types of applications. So looking at student grades and attendance in a large-scale, systemic way, really to help faculty and staff support students.
So it’s data analytics and predictive analytics to sort of identify which students are doing great and which students can use more attention. So that’s one big area of AI.
Another way it’s used is in grading systems. Faculty will use AI technology to sort of pre-process student exams, and categorize things, and it becomes a grading assistant to faculty. So I think the most prevalent use is where the AI will categorize student answers, and then faculty can apply rubrics to these groupings of answers. And it really becomes a way to support faculty in doing grading of exams.
And then the third way is basically computer-aided instruction, where students do their homework directly in a system, and the underlying AI builds a model of what the student knows and doesn't know, and can then give them the next steps of instructional material based on that.
There's a math education system called Newton that is pretty successful in supporting students in that way. To me, I'm really most partial to the systems that are supports for faculty, and in the case of advising, also supports for staff in assisting the student. Even with the Newton math system, there's a faculty member who's part of reviewing the student work.
Because I think it’s really important in higher ed, and in all sorts of teaching and learning, that students know there are people who care about them and are paying attention to them.
Darren Gaddis: And with that information, in the past year, how has AI changed the landscape of higher education?
Fred Martin: So ChatGPT burst onto the scene, I think it was last summer and certainly into the fall, as this technology that's really good at writing essays. Basically, you ask questions and it has conglomerated all of the knowledge, and the incorrect or biased information, on the web. And then it generates these pretty well-written essays.
And so I think we have yet to see the large-scale change that will be happening in the coming months and years. At my school, we organized an informal discussion session at the very beginning of the semester, at the end of January 2023. One of the faculty members who co-led the session is the head of our college writing program, which of course will obviously be impacted by a system that can write essays.
And she basically said, we're just getting through Covid, two years of disruption and redesign; can we take a breath before we have to do it again? And the answer is kind of no. We are going to have to do it again because it is such a disruptive technology. We have yet to see the direct impacts, but we're already starting to make accommodations across the higher ed landscape.
Darren Gaddis: How has AI impacted how students learn, but also how faculty members grade?
Fred Martin: Basically, is ChatGPT friend or foe? From one standpoint, we could try to fight it off and basically ban it. And I know that some K-12 school districts have done that, and I know that some schools have said that the default policy is that students aren't allowed to use it.
I think that's a mistake. We already know that Microsoft and Google are integrating ChatGPT and other generative AI into the everyday tools that we use. It's going to be integrated into Word, it's going to be integrated into email, it's going to be integrated into all the digital tools that we routinely use. And it's a pretty useful thing.
So I think in higher ed, we need to find ways to make peace with it, make friends with it, and teach our students how to use it. When they leave school, they're going to be in the working world, the professional world, and we're all going to be using those tools. So it's incumbent upon us in higher ed to figure out how to support students in the appropriate use of tools like ChatGPT.
Darren Gaddis: With this understanding, is there a need to change grading practices given the nuances of AI?
Fred Martin: I was just interacting with a colleague this week who had a student for whom English isn't her first language. And in this class, the student is writing lab reports; that's the primary product being assessed in the class. At the beginning of the semester, the student's writing was not fabulous. There were lots of errors, and this person is learning to use professional English.
And then they turned in a lab report that was grammatically perfect. So the faculty member suspected that ChatGPT was involved. My role in my college, one of the things I do is I support faculty with academic dishonesty cases, and then I have to adjudicate in the cases where students appeal. So I encouraged her to report this and to meet with the student.
So the faculty member met with the student, and the student admitted using it. They were under a lot of stress, and the thing was due, and so they admitted using it. The faculty member is going to give them a chance to rewrite the report on their own.
That one is a clear-cut case that had a happy ending because of the compassion of the faculty member. So faculty are on alert for work that doesn’t feel like the student did it themselves.
This makes it a lot easier for students to take these shortcuts in their learning. And so the pressure on faculty is that they care about academic quality and about student learning, and it's another thing to be aware of and worry about.
But I think the next step is for faculty to think, okay, how can I bring in ChatGPT and the things that follow it? And again, it's going to be integrated into our everyday writing tools. So it's something we're going to have to figure out how to support students in using and learning from.
Darren Gaddis: What are some ethical considerations when grading assignments which might have utilized AI technology?
Fred Martin: I mean, a lot of the work that happens in education is that students are making products that are assessed. And there's this dual nature to the work that they do.
So they’re making something that’s intended to be a learning process, and then the result of that work is assessed. And sometimes the thing that they write up is not the product itself. There could be some other work that then they’re documenting. Or maybe the writing product is the main output, and the thinking that went in to make that writing product is where the learning happens.
I mean, it adds work for faculty, because you need to evaluate more of the process the student goes through. So there could be more checkpoints where the faculty scaffold students' work. That way the process is more exposed, and students can more directly reveal the process by which they're learning and producing.
I do think it will imply changes. And broadly speaking, those changes add to the amount of work that faculty have to do to support their students.
There’s also an ongoing conversation about authorship. So what does it mean to create work where you’ve used these tools? I mean, that’s at a professional level, that’s a conversation. I think it folds back into teaching and learning.
I think the use of tools needs to be disclosed. It’s sort of like one of the trends in scientific work is, in journal articles, multi-author articles, to have a narrative of who did what. So it’s not just a list of names on the article, but it is that, and it’s a conversation about how each person contributed to being co-author of a given paper.
So I think that's a good analogy for what needs to go on with these AI tools that will be supporting our creative work: when someone creates something, they need to describe how they made use of, or collaborated with, these AI systems.
Darren Gaddis: What else should we know about the usage of AI in grading within higher education?
Fred Martin: When we started out, I was talking about one of the three uses, which is AI-supported grading. I think that's another dimension: faculty can use these tools as well.
So it’s really, these are professional tools in our world. And I think it’s important that we all learn how to use them to become more productive. Right?
People have conversations about the jobs that will be lost because of AI, and then whether even more jobs will be created. I think the main thing is that so much work will be affected by these AI systems, and there's the opportunity of accelerating our human creativity and productivity by using them.
So in higher ed, it's both: faculty will be using these tools for themselves, and they'll be using them to support student work. That was another dimension of what I shared at the beginning, that a lot of AI can surface information about an individual student or a group of students that faculty can then act on.
So I really believe that we need to make these systems our friends, and be able to be more successful because of that relationship that we have with these digital tools.
Darren Gaddis: Fred, thank you for joining me today. It’s been a pleasure.
Fred Martin: Thank you, Darren. Thank you so much.
Darren Gaddis: Be sure to follow, like, and subscribe to On Campus with CITI Program to stay in the know. If you enjoyed this podcast, you may also be interested in other podcasts from CITI Program, including On Research and On Tech Ethics.
You can listen to all of our podcasts on Apple Podcast, Spotify, and other streaming services. I also invite you to review our content offerings regularly, as we are continually adding new courses, subscriptions, and webinars that may be of interest to you, like CITI Program’s new Environmental Health and Safety Subscription.
The Environmental Health and Safety Subscription provides organizations with key content areas related to health and safety. All of our content is available to you anytime through organizational and individual subscriptions.
You may also be interested in CITI Program's AI and Higher Education: An Overview webinar. Please visit CITI Program's website to learn more about all of our offerings.
How to Listen and Subscribe to the Podcast
You can find On Campus with CITI Program available from several of the most popular podcast services. Subscribe on your favorite platform to receive updates when episodes are newly released. You can also subscribe to this podcast by pasting “https://feeds.buzzsprout.com/1896915.rss” into your podcast app.
- Episode 39: Study Abroad Programs
- Episode 38: Graduate Student Advising
- Episode 37: Mental Health and Student Health Services
- Episode 36: Data Management and Research
Meet the Guest
Fred Martin, PhD – University of Massachusetts Lowell
As Associate Dean, Fred Martin supports faculty and students in the Kennedy College of Sciences, focusing on experiential learning and student success.
Meet the Host
Darren Gaddis, Host, On Campus Podcast – CITI Program
He is the host of CITI Program's higher education podcast. Mr. Gaddis received his BA from the University of North Florida, his MA from The George Washington University, and is currently a doctoral student at Florida State University.