Last week, the UvA completed the first round of an experiment in which a proprietary version of ChatGPT was used in 13 pilots. Students and lecturers could use this in a secure environment. Is this more reliable than ChatGPT, about which there are many privacy concerns?
Suppose you are a physics student working on your thesis. You are struggling with writing the introduction and you would really like someone or something to help you with it. Ideally, you would ask ChatGPT for help, but the University of Amsterdam actually prohibits the use of this form of AI due to privacy concerns.
Yet the UvA cannot ignore ChatGPT altogether. The university therefore looked for alternatives and built its own version of ChatGPT for Faculty of Science students, medical students, and AUC students: UvA AI Chat. The aim of these experiments was to make ChatGPT, the well-known AI chatbot, available for educational purposes without compromising on privacy.
Last Thursday, the Teaching & Learning Center Science, the UvA center of expertise on teaching for Faculty of Science lecturers, presented the results of 13 AI pilots conducted over the past year in which UvA AI Chat was used.
AI as a writing assistant
UvA AI Chat was specially developed to help teachers and students with tasks such as coming up with assignments and writing essays. The system works with so-called 'personas': specialized mini-ChatGPTs that focus on a specific task, such as giving feedback on a thesis. Students can choose personas in this system, which runs in an online UvA environment, and chat with them or upload their written work for the AI to assess. In addition, UvA AI Chat offers teachers the option of having personas developed on request, which can learn the curriculum based on teaching materials or lecture slides, to provide targeted support to students.
The pilot results show that students and teachers found UvA AI Chat to be of added value in their studies or work, for example as an aid in practicing writing skills and as a provider of feedback. “The ease of use was the deciding factor for many students,” says Koen van Elsen, a teacher in the bachelor’s program in Computer Science who was involved in the pilots. However, there were also concerns: “Some teachers and students did not always trust the answers provided by UvA AI Chat or its use in education.”
Fully managed by the university
The University of Amsterdam, in turn, has concerns about the regular ChatGPT and still prohibits its use in education. The university points out that OpenAI, the creator of ChatGPT, stores user data and conversations without offering any transparency about their further use. This raises the question of whether UvA AI Chat, which is built on ChatGPT, does in fact protect this data.
According to two creators of UvA AI Chat, IT specialists Rik Jager and Danny van den Berg, UvA AI Chat is designed in such a way that data is not stored with OpenAI. Jager: “All commands given to UvA AI Chat are first anonymized and then forwarded to ChatGPT. As soon as it has generated an answer and sent it back to the UvA AI Chat user, all data that ChatGPT has received or created is immediately deleted.”
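The flow Jager describes — anonymize the prompt, forward it, return the answer, retain nothing — can be sketched as a simple proxy. The sketch below is purely illustrative: the function names, the identifier patterns, and the stubbed model call are assumptions for the sake of the example, not the actual UvA AI Chat implementation.

```python
import re

# Assumed patterns for obvious personal identifiers; a real system would
# use far more thorough detection than these two regexes.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
STUDENT_ID = re.compile(r"\b\d{8}\b")  # hypothetical 8-digit student number

def anonymize(prompt: str) -> str:
    """Replace personal identifiers before the prompt leaves the proxy."""
    prompt = EMAIL.sub("[EMAIL]", prompt)
    return STUDENT_ID.sub("[STUDENT_ID]", prompt)

def call_model(prompt: str) -> str:
    """Stand-in for the leased AI model; a real proxy would call it here."""
    return f"(model answer to: {prompt})"

def handle_request(prompt: str) -> str:
    """Anonymize, forward, and return the answer without keeping copies."""
    answer = call_model(anonymize(prompt))
    # Nothing is logged or stored server-side in this sketch; the only
    # surviving copy of the exchange is the one returned to the user.
    return answer

print(handle_request("Hi, I'm jan@uva.nl, student 12345678, help with my intro"))
```

In this sketch the model only ever sees the scrubbed prompt, which mirrors the claim that ChatGPT receives anonymized input and that no data remains behind after the answer is returned.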
He explains that UvA AI Chat is fully managed by the UvA: “Compare it to a car that we have built completely ourselves. The only thing we lease, in our case from Microsoft, is the engine: the AI model itself. Apart from that, all the functions of that car are ours, so we can decide for ourselves which users can get in the car and what they can and cannot do with UvA AI Chat.”
Strategic collaboration
The UvA does use ChatGPT for UvA AI Chat, via Microsoft. According to Jager and Van den Berg, the UvA deliberately chose this intermediate step because it allows the model to be used under European privacy protection, rather than depending on the more permissive American privacy regulations.
At the same time, the university is critical of ChatGPT because it is a product of OpenAI, a commercial company. Microsoft is of course also a tech giant. Shouldn't the UvA, and therefore Jager and Van den Berg, be critical of this as well?
Although Jager and Van den Berg are in favor of greater independence from large tech companies, they see the collaboration with Microsoft as a strategic step to strengthen that independence in the long term. Van den Berg: “We do not have the capacity to build such a large AI model ourselves at the UvA in a short period of time. If we want an AI solution that is in line with our moral values as a university, it is important to have a strong negotiating position with Microsoft.” According to Jager, it would help if UvA AI Chat were rolled out at other universities: “Together with them, we could put pressure on Microsoft to enforce better conditions.”
No supervision
To guarantee students’ privacy, no one can see what a user does within UvA AI Chat except the user themselves. According to Van Elsen, this means that even teachers and the creators of the system “do not have access to the assignments that a student enters.” Van Elsen: “This is a conscious choice, so that students feel as free as possible to use it.”
This does raise the question of whether students know how to use UvA AI Chat responsibly without outside supervision. Van Elsen indicates that more attention will be paid to this in the second round of the experiment, which has now started and will run until the summer: “We want to make everyone aware of how to deal with AI and offer more guidance to students in this regard, while continuing to develop new pilots.” According to Jager, this is also an opportunity for the professional development of students: “As an institute, we must learn how students can safely use AI systems in their later work.”
Jager and Van den Berg indicate that they would like to engage in a dialog with students about how to use UvA AI Chat consciously. Jager: “The UvA gives us complete freedom to experiment with UvA AI Chat, so we want to discuss this with students and discover how we can best improve the system. We could add extra rules to make students more aware of how they use it, but we would prefer to do this together with them, rather than just for them.”