An English instructor at the University of Victoria is one of many B.C. educators trying to figure out how artificial intelligence can be used as a tool to help students while limiting its potential to harm their learning.
Associate professor Erin Kelly allows students in her young adult fiction class to use AI to write a 500-word essay for one of their assignments. Students can choose whether to experiment with ChatGPT to generate the short essay, but afterwards they are tasked with writing another essay critiquing the bot’s work.
“There are a number of students who have opted to pick those choices for assignments because in a lot of cases what they’re saying is, ‘This is something I hear a lot about. I’ve never used it. I’m interested in experimenting with it a little bit,’” Kelly said.
“Sometimes I will even get something like, ‘I am an English major interested in becoming a professional writer and researcher and I’m anxious about ChatGPT because of all the things I hear,’ and this is a new chance to finally get to play with it and see what it can do.”
Kelly said that allowing her students to use ChatGPT lets them engage with the course material from a different perspective, but also gives them an informed sense of what kind of tool the software is.
The associate professor also serves as director of the academic and technical writing program at UVic and says she’s had instructors come to her suspecting a student of submitting AI-generated writing as their own, a violation of UVic’s AI tool guidelines. The guidelines state, “handing in a paragraph generated by ChatGPT as an example of your writing is a form of plagiarism and thus would violate UVic’s Academic Integrity Policy.”
According to the policy, plagiarism can result in a failing grade for the assignment or even for the course.
Derek Murray, an education developer at Greater Victoria’s Camosun College, said AI has brought both opportunities and challenges to post-secondary education. He told Black Press Media in a statement that easy access to software like ChatGPT can make it tempting for students to take shortcuts, which has made academic integrity a major and valid concern for the college.
Murray has also found that students have a hard time identifying when it is okay to use AI.
“The lack of clarity and consistency around appropriate use of AI is something we need to address as educators,” said Murray in a statement.
Camosun College’s policy on plagiarism is similar to UVic’s: “plagiarism is the intentional or unintentional presentation of work, ideas and expression of ideas that is other than one’s own.”
Depending on the instructor, a student at the college who uses AI can fail the paper and have the incident go on their permanent record, according to Bronwen Welch, an English instructor at the school.
Welch tends to heavily discourage the use of AI in her writing and literature classes because she wants students to learn how to think critically for themselves. She says students might use AI to come up with ideas, but that could get in the way of the creative and analytical process.
“As far as I’m concerned, we should start brainstorming on our own before we seek out things which will do the brainstorming for us.”
Welch said she and her colleagues are at their limit because AI is constantly being used as a tool for plagiarism in their classes. Recently, Welch had to speak to eight students in one day about how AI-generated material is plagiarism. She said some of the students intentionally plagiarized while others were confused.
It’s become a cause of frustration for her and other instructors, as well as an enormous problem that takes several steps to weed out.
“We have to locate the plagiarized portion of the paper, but then we have to make sure that we can demonstrate that this is indeed AI. So it means we have to go back over the student’s previous writing. It means we have to have a conversation with the student.
“It all takes hours and it’s wasting so much time.”
At UVic, not only does using AI-generated material come with penalties, but none of the assignments Kelly has seen containing large amounts of AI-written content has even been passable.
“I have not seen a single piece of student work – where we believed that it was possible that a student might have been using an AI tool – where that piece of work was actually a successful, effective, good passing assignment,” Kelly says.
Allyson Hadwin, a professor of educational psychology, is another of the many UVic faculty members figuring out how the technology can be used in schools.
But she says it’s on educators to give students the skills and strategies to figure out when, how and why to leverage AI in their work.
“How can we really use these tools to improve human capacity to learn, to work, to be creative, those are things that are really going to matter more as AI can step in to do more menial types of things,” Hadwin said.
Hadwin has been specifically exploring the use of AI as a research tool. Semantic Scholar, Inciteful and Research Rabbit are some of the software she uses and compares with more traditional research tools. She has found that AI is good at finding a fair amount of research, but not as good at finding accurate research.
“When we’re using these types of tools and we find those articles, then how do we know if they’re good? How do we judge the quality of the research that we’re reviewing?”
Second-year psychology student Ziya Cassam has used ChatGPT in a similar way to research topics.
“It helps me get more information on topics that I’m kind of unsure about. If there’s a project and I want to get more information on something I’ll tell them to explain this topic and then I can use some of that in my work, but not directly.”
Cassam is also using ChatGPT to help her connect topics within her writing.
“I won’t copy exactly what they say, but I’ll just ask if it sounds well or if I should change anything in the sentence.”
However, the student has also noticed the flaws of ChatGPT, with the writing sometimes sounding unnatural or “really artificial.”
Hadwin said some of the examples ChatGPT will provide are unclear, but if the right questions are asked the AI can find better-quality answers. It can also be used to connect ideas while researching and writing.
“If you come at it enough ways you can start to see the patterns and sometimes it can help students who are new to research to figure out what the right terms are to be finding more information on that topic.”
Mariel Miller, an assistant professor of educational psychology, said that to use AI responsibly, students need to define what they’re aiming to achieve before jumping in and using the software.
She said they need to be aware of the capabilities – and limitations – of AI, so it’s about building that literacy.
“AI is not more inherently trustworthy or less prone to error than humans. The information might be inaccurate or out of date.”
Miller also emphasizes how important it is for students to understand the ways AI can be effectively and ethically used.
“Students need to ask themselves, how is AI helping me to learn in this situation? And on the other hand, how is it actually taking away from my learning?”
Instructors are uncertain exactly how AI will fit into the education system going forward. Regardless, Kelly believes it will impact several departments and areas of study across UVic, and she is certain that working with a variety of departments will get everyone closer to a consensus on how the new technology can be responsibly used.
“It really forces us all to talk to each other and to learn from each other and to have a bigger sense of what’s going on across the university in order to come up with good responses and policies. So I actually think that this is an opportunity for universities to think at a grander scale about what it is we do and how to do it well.”