It finally seems to be here: the first legislation regulating AI. The European AI Act passed two EU committees on May 11 and may now proceed to the European Parliament. Right now, many Science Park students are working on theses on AI topics. What will the AI Act mean for scientific research and for students?
When the AI Act was agreed upon, I read about it while sitting in a stuffy study hall next to a fellow student slaving away at his thesis. The first thing I read was that certain types of artificial intelligence would soon no longer be allowed on the market. Among the listed examples, the most striking was “fraud detection.” I wondered whether his work would soon be made redundant by the new law.
The AI Act is the first attempt to slow the AI express train. The law categorizes all AI systems by the risks their deployment poses, for example to human rights. The highest category, “unacceptable,” will soon be banned outright. This includes “social scoring”: assigning people a “social score” based on their behavior, tracked through facial recognition. Black Mirror-style facial recognition as practiced in China will not be allowed on the EU market under the law.
In addition, AI that is not “unacceptable” but is “high risk” will require EU approval before it can be sold. It's about time: it's pretty crazy that my electric toothbrush has to meet more requirements than a technology that determines whether I've committed fraud. And it's not just fraud detection; the category also covers AI in healthcare and education. For example, stricter rules are likely to apply to anti-cheating software, which is now known to discriminate against students.
The AI Act is especially relevant for companies, which must now seek approval for “high-risk” AI before bringing it to market. That approval process will probably involve a lot of paperwork. Will you, as a student, notice any of that? Probably. Even if it is only companies that must meet the new requirements, this could inhibit AI innovation, and getting a graduate internship at an AI company would become more difficult. Moreover, major corporate AI developments, in facial recognition for example, feed science, and vice versa. What if the thesis itself is still legal, but the final product is not allowed to be developed, or is curtailed?
Fortunately, the law is not yet final, so there is room for negotiation. Eindhoven University of Technology has proposed changes to the AI Act to make it more research-friendly. The law won't take effect until next year at the earliest, so graduate students can get to work now without extra stress. And hopefully, by next year, we will finally have a handle on how to eliminate bad AI in education. Or at least regulate it as well as my toothbrush.
Pepijn Stoop is a student of artificial intelligence.