A survey among artificial intelligence undergraduates shows that more than 95 percent of students occasionally use ChatGPT for coursework. The UvA invariably labels the use of ChatGPT as plagiarism, but that does not resolve the dilemmas of teachers, students, and exam boards.
What should we do with ChatGPT? This question has occupied students and teachers for six months. It has made grading assignments a lot harder, especially now that everyone applies their own rules. Should I deduct points from a student who honestly admits to using ChatGPT? Or reward him for his honesty? The new “AI in Education” task force isn’t providing clarity either.
The reason everyone is “just doing whatever” is that the UvA only has “fraud and plagiarism” regulations. Those rules basically boil down to not copying someone else’s work without citing the source. Fortunately, the aforementioned survey shows that only seven percent of students copy ChatGPT answers verbatim. But that’s not where the problem lies either. More than three-quarters of the students surveyed use ChatGPT as a “starter”: to help them understand a question or find literature. And what does the regulation say about help without copying or a source list? Nothing, actually.
The “solution” offered by the UvA is adjusting assignments and providing more writing instruction. For example, asking questions on topics that ChatGPT has not yet been trained on. I find that idea mediocre: making up new assignments every year costs teachers a lot of effort, and it’s only a matter of time before a chatbot knows what’s in the paper before I do. But I do think more writing instruction is a good solution. In banning ChatGPT, the UvA does not seem to realize why students use it: they find writing difficult. Writing skills are already taught in tutorials during the bachelor’s. So why not teach these students how to write better both with AND without AI bots?
Dear task force, I have a plan for you. Instead of banning ChatGPT, come up with three categories: “Allowed,” “Conditional,” and “Unacceptable.” The last category would include anything that is plagiarism, such as copying from AI bots. This is already explicitly covered by the plagiarism regulations. “Allowed” would include anything not in the plagiarism regulations that is helpful to students, such as asking for sources for an assignment. And “Conditional”? That is discussed in tutorials, where students are challenged not to do creative cut-and-paste work, but to look critically at ChatGPT and at their own writing skills.
You could set clear rules for mentioning ChatGPT in assignments so that teachers know where they stand. Finally, you could share these rules with the entire UvA so that it is clear from Roeterseiland to Science Park what is and what is not allowed.
And if you still can’t figure it out? Then I would be happy to come over for coffee. Or submit the issue to ChatGPT. I know it’s not allowed, but I would turn a blind eye.
Pepijn Stoop is a UvA student of artificial intelligence.