The Lens
Generative AI:
A plagiarism machine, or a creator of equal opportunity?
With students across the world using Generative AI, is it fair that it benefits some students more than others? How can students use it for success?
By Louie Cornish
As detailed in QS’ report on attitudes towards Generative AI, academics and students agree: Generative AI is here to stay, and it’ll probably be a good thing for students’ learning experience.
Not everyone’s using it equally though. While many areas of the world hover around the global average, only 45 percent of North American students looking to study abroad have used Generative AI technologies.
What’s different about prospective students in North America that sees them using Generative AI less? The most common study destinations cited in the QS International Student Survey are English-speaking – the US, the UK, Australia and Canada. Large Language Models – ChatGPT being one example – create text that reads as if it were written by a native speaker. This benefits students who aren’t fluent English speakers far more than it does those who speak English as a first language.
How are students faring in the classroom if they’re using ChatGPT to show stronger language skills? Is it fair?
It has an impact before ‘Apply now’ is even hit
Equitable or not, AI is affecting prospective students’ career ambitions, their course choices, and even which country they want to study in.
According to the QS International Student Survey 2024, 30 percent of international students looking to study in the UK agree or strongly agree that AI is affecting their career choices. For those looking to study in Australia or the US, that number rises to 32 percent and 34 percent respectively.
Adapting the classroom
As explored in last month’s QS Insights Magazine, examples of academics using Generative AI have been popping up across the world. But when academics were first faced with students using the technology, there was alarm. When ChatGPT hit the mainstream, concern spread across academia – a Google Scholar search for “ChatGPT academic integrity” returns 22,000 results.
So, have those fears been realised and are students cheating? Cecilia K. Y. Chan, Professor at the University of Hong Kong, and Chief Expert of Future Readiness and AI Literacy in Higher Education for UNESCO, has written multiple academic papers on the topic and says there’s no definitive answer; the technology is too new. “Is using the internet to research a topic considered cheating when, in the past, we mainly used books or journals?” she says.
Igor Grosmann, PhD, Professor of Psychology at the University of Waterloo, Canada, offers a similar sentiment – it depends on the definition of cheating. “If the rules for a class dictate that no technological assistance should be used, be it a calculator or ChatGPT, and a student still uses it, it would clearly constitute a violation of conduct from an equity standpoint,” according to Professor Grosmann.
Students are using it, but know the tech’s risks
“My anecdotal experience has been that the uptake of Gen AI technology has increased quite dramatically, and especially among students from outside of Canada,” Professor Grosmann says. This aligns with QS International Student Survey 2024 data, where 65 percent of international students interested in studying in Canada said they’d used or interacted with Generative AI technology.
But using it does not necessarily mean abusing it. Professor Chan introduces QS Insights Magazine to her concept of “AI-guilt”: a psychological phenomenon in which individuals feel guilt or moral discomfort about using AI tools and worry about their reliance on the technology. This ties into another of Professor Chan’s concepts, “AI-giarism”, combining AI and plagiarism – “the unethical practice of using artificial intelligence technology, particularly generative language models, to generate content that is plagiarised...without appropriate acknowledgement of the original sources or AI’s contribution” (Chan, 2023).
According to her study of 393 students in Hong Kong, “students demonstrated a nuanced understanding of AI-giarism” and “viewed the outright use of AI tools to generate and copy content as a significant instance of academic misconduct”.
If Professor Chan’s findings apply more widely, students understand that using Generative AI can amount to plagiarism. There are still opportunities for students to use it ethically and equitably, according to Professor Grosmann. “Where Gen AI can help is when students can interact with the system as if it were a tutor: quiz it about their own writing; evaluate Gen AI writing and code to spot errors and identify ways to improve on it.”
It also depends on how the teacher has embedded the tech. “If Gen AI is integrated into the course by design, and used wisely by the instructor, it will not constitute violation of academic integrity, but rather can be used as a helpful tool to foster learning,” Professor Grosmann says. Professor Chan agrees, saying that understanding AI-guilt and AI-giarism helps academics and university leaders create guidelines that “ensure a balanced use of AI”, adding “at the moment, most schools and institutions have only a grey line or no guidelines, leaving students confused”.
For Professor Grosmann’s classes, he has taken matters into his own hands. “I have made a conscious decision not to use essays to evaluate students since summer 2023,” he says, instead opting for “in-class, practical activities” to measure student success.
Can you really use Generative AI to write your essays?
With the concept of AI-guilt in mind, let’s take a moment to consider the purpose of learning. As one teacher noted in response to Professor Chan’s study, “I fear that over-reliance on AIs in learning might adversely affect students’ learning. They may be deprived of the actual thinking process, which, to me, is the most crucial part of learning.” Should students choose to use Generative AI to write their essays, they may not develop the critical thinking skills and the ability to construct an argument that are so necessary in life after university. Despite increasing acceptance among academics, Professor Grosmann says that misuse of the technology could have an adverse impact on students’ grades and on how their tutors perceive them. “Gen AI is a professional ‘pseudo-profound bullshit generator,’ and universities don’t want to get the impression you are going to ‘bullshit’ yourself through school.”
“If English as a Second Language students are merely using Gen AI to draft or translate their essays, the current Gen AI systems will not be effective in making them competitive – the essays from Gen AI are by default too generic, focus on generalities and miss the critical nuances that can help the applicant to stand out. Such Gen AI generated essays are also very easy to spot,” Professor Grosmann says. Professor Chan agrees: “Using AI to complete all of their essays [is likely to] be less frequent, as university assignments are often too complex to be fully handled by AI.”
She adds: “Focus on the substance you want to communicate and what your personal style is. Always craft those elements by yourself, in your own words. Use Gen AI to help you identify any parts you think you may have missed. Use it to get general feedback on the flow of your writing and suggestions for more eloquent and clear prose.”