Robot making music
Photo: Possessed Photography (Unsplash)
Science

How do we protect artists from AI-companies stealing their music?

Pepijn Stoop,
19 March 2025 - 15:44

Dutch singers’ voices can be recognized in music generated by popular AI apps, even though the singers never gave permission for this. How do we ensure that AI music is created fairly? Associate professor João Pedro Quintais has been researching this question for years.

A new hit by Roxy Dekker or Jan Smit within thirty seconds without them having sung a single word themselves? Thanks to AI apps that generate lyrics, vocals and beats in the style of existing artists, this is no longer science fiction. With applications such as Udio and the music version of ChatGPT, anyone can get started with this.


These AI apps can imitate artists’ voices by analyzing a lot of their music. They learn the artist’s vocal sound and style and how their music is constructed. With this knowledge, the AI model can make new music that sounds as if the artist had sung it themselves, without them having been involved at all.  


This raises concerns among musicians. Last week, research showed that these apps can imitate the voices of Dutch artists such as Froukje and Trijntje Oosterhuis. Dutch artists are warning that their music is being used unlawfully and are accusing the makers of this AI of ‘theft’.  


Are these artists right, and how can we protect them against AI ‘theft’? Associate professor João Pedro Quintais of the Institute for Information Law has been researching this for years.

João Quintais
Photo: Private Collection

Is it true that music-generating apps such as Udio can just use or ‘steal’ artists’ music?  
“It’s not always clear. Many commercial AI models are developed by collecting music from publicly available websites on a large scale to train their AI. Under the EU’s 2019 Copyright Directive, an exception allows companies to do this under certain conditions: the music must be accessed legally, and artists and their record labels must have the opportunity to indicate that they do not want their music to be collected, which is known as the ‘opt-out’. However, it is not always clear whether and how AI companies adhere to these conditions.”


What makes it so difficult to verify whether AI companies adhere to these conditions?  
“These rules were drawn up before the emergence of music-generating AI and are not designed to deal with this new reality. For example, to ‘opt out’, artists need to know whether their music has actually been used to train an AI model. If you hear your song in the background of a YouTube video, it’s clear that your music is being used by that service. But it is much harder to prove that your work is in an AI model’s dataset and that a song similar to yours was created with it, since AI companies are not required to disclose all their training data. Additionally, artists themselves don’t want to be the ones chasing this information; they want AI companies to reach out to them directly.”


The EU will require AI companies to provide a detailed overview of the data used for their AI models, starting in August 2025. Will this solve the problem?
“This could increase transparency by making it easier to check what music these AI models were trained on and whether that complies with the 2019 Directive. If, for example, it turns out that the music version of ChatGPT was trained on music collected from an illegal website, legal action could be taken. However, the level of detail required in these overviews is still up for debate. AI companies may find ways to withhold problematic information, which would still make it hard to know which music was used to train their models.”

“Artists don't want to be the ones chasing this information; they want AI companies to reach out to them directly.”

BumaStemra, the Dutch copyright organization for musicians, believes that the solution lies in making usage agreements, also known as licenses, with AI companies. Do you agree?  
“This is an option, but it depends on how the licenses are negotiated. Because copyright law is territorial, BumaStemra would need to prove that the model was trained, or the data collected, in the Netherlands, which is challenging since most AI companies are based in the US. Another option is for BumaStemra and similar organizations to sue over unlawful training on their music and have these apps blocked in the Netherlands. Italy temporarily blocked ChatGPT over data protection issues, so a similar route could be followed here. However, this type of legal case is still relatively new, so it’s hard to predict its chances of success.”


Do you think this is the best remedy against music ’theft’ by AI companies?  
“License agreements are a powerful tool, but only if individual artists are strongly represented in them and if their contracts stipulate that they receive a fair share of the licensing income. I am concerned about the consequences AI-generated music will have for artists’ livelihoods, because measures such as license agreements currently put most of the money in the pockets of record companies, since they own most of the rights to this music.”


Usage agreements therefore seem to be part of the solution. How can legal researchers help artists to share fairly in the proceeds?  
“By carefully explaining how to interpret the complex legislation and which feasible options can help artists. The companies building these AI apps often exploit gray areas in the legislation and the resulting lack of certainty. I believe that as legal researchers we should not be afraid to talk to policy makers and journalists, who can help us reduce this lack of clarity and actually improve the law or its application. Hopefully that will then benefit the artists, who are likely to be hit hardest by the negative effects of AI-generated music, so that they too can profit from this technological development.”
 
