The EU’s AI Act is a vital piece of legislation for the music industry – but what does it actually say?

Credit: Melinda Nagy/Shutterstock
MBW Explains is a series of analytical features in which we explore the context behind major music industry talking points – and suggest what might happen next. Only MBW+ subscribers have unlimited access to these articles.
WHAT’S HAPPENED?

The European Union has done it: It’s hammered out the democratic world’s first comprehensive set of laws governing the development of artificial intelligence… or has it?

Following intensive negotiations between the European Parliament and the European Council, the Parliament announced on December 9 that negotiators had reached a deal “on a bill to ensure AI in Europe is safe, respects fundamental rights and democracy, while businesses can thrive and expand.”

The bill’s architects are casting it as a balance between the need to protect “fundamental rights, democracy, the rule of law and environmental sustainability” in the face of risks from AI, and the creation of an environment that will enable Europe’s AI businesses to thrive amidst intense global competition.

Not everyone agrees that the bill has struck that balance. It has garnered some high-profile detractors, most prominently French President Emmanuel Macron, who has criticized it as a potential threat to European tech companies’ ability to compete globally as they face regulation that is “much faster and much stronger” than anything seen elsewhere. (More on that below.)

Meanwhile, the reaction from rightsholders, and especially the music business, has been one of cautious optimism.

IFPI, the global organization representing the recorded music industry, described the bill as “a constructive and encouraging framework.”

“While technical details are not yet finalized, this agreement makes clear that essential principles – such as meaningful transparency obligations and respect of EU copyright standards for any [general purpose AI] model that operates in the internal market – must be fully reflected in the final legislation,” the group said following the deal’s announcement.

Of course, the deal goes well beyond the issues of copyright and transparency in the training of AI models that have preoccupied rightsholders when it comes to AI policy.

The bill introduces a number of restrictions on AI to ensure the protection of Europeans’ rights, including a ban on the use of biometric categorization systems that use “sensitive” characteristics, such as political, religious or philosophical views, or sexual orientation and race.

“This agreement makes clear that essential principles – such as meaningful transparency obligations and respect of EU copyright standards for any [general purpose AI] model that operates in the internal market – must be fully reflected in the final legislation.”

IFPI

It bans emotion recognition software in schools and workplaces; “untargeted scraping” of facial images from the internet or CCTV footage to create facial recognition databases; social scoring based on behavior or personal characteristics; AI systems that manipulate behavior to “circumvent free will”; and the use of AI to exploit people’s vulnerabilities, due to age, disability, or their social or economic situation.

It also requires developers of “high-risk” AI systems to carry out “fundamental rights impact assessments” on their technologies. And it gives EU citizens the right to launch complaints about AI systems and “receive explanations about decisions based on high-risk AI systems that impact their rights.”

But for the music industry, it’s what the law has to say about copyright infringement – and the transparency required of AI developers that would allow rightsholders to confirm infringement – that is the most salient part of the legislation.

And on that front, at least so far, things aren’t entirely clear.

HOW DOES THE EU AI ACT TREAT COPYRIGHT?

At first glance, the EU’s AI Act has good news for rightsholders, as it appears to confirm that the training of AI models on copyrighted materials will require permission from rightsholders.

“Any use of copyright protected content requires the authorization of the rightholder concerned unless relevant copyright exceptions apply,” states Article C of the proposed law, according to a document obtained by Politico.

That should certainly be welcomed by the music business and music rightsholders, who have been insistent, in the courts and in the court of public opinion, that AI developers should license any copyrighted content they use as part of training AI algorithms.

But there is that caveat in there – “unless relevant copyright exceptions apply.” And a few paragraphs further in the document, we meet one of these exceptions.

“Directive (EU) 2019/790 introduced exceptions allowing reproductions and extractions of works or other subject matter, for the purposes of text and data mining, under certain conditions,” the document states.

“Under these rules, rightholders may choose to reserve their rights over their works or other subject matter to prevent text and data mining, unless this is done for the purposes of scientific research.

“Where the rights to opt out has been expressly reserved in an appropriate manner, providers of general-purpose AI models need to obtain an authorisation from rightholders if they want to carry out text and data mining over such works.” [Emphasis added]
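
Neither that passage nor Directive (EU) 2019/790 spells out exactly what reserving rights “in an appropriate manner” looks like in practice, although for content posted online the directive points to machine-readable means, and robots.txt-style directives aimed at known AI crawlers are among the mechanisms most commonly discussed. Purely as an illustrative sketch, and not anything the Act itself prescribes, the snippet below (standard-library Python; GPTBot, CCBot and Google-Extended are real crawler tokens used by OpenAI, Common Crawl and Google, while the site URL is a placeholder) checks whether a given site currently signals such a reservation:

```python
# Illustrative only: neither the AI Act nor the DSM Directive mandates robots.txt,
# but machine-readable crawler directives are one commonly discussed way to
# "expressly reserve" rights against text and data mining.
from urllib.robotparser import RobotFileParser

# Real user-agent tokens published by OpenAI, Common Crawl and Google;
# the list is illustrative, not exhaustive.
AI_CRAWLERS = ["GPTBot", "CCBot", "Google-Extended"]

def tdm_reservation_signals(site_url: str) -> dict[str, bool]:
    """Return, per crawler token, whether the site's robots.txt disallows it."""
    parser = RobotFileParser()
    parser.set_url(site_url.rstrip("/") + "/robots.txt")
    parser.read()  # fetch and parse; a missing robots.txt means "allow all"
    return {agent: not parser.can_fetch(agent, site_url) for agent in AI_CRAWLERS}

if __name__ == "__main__":
    print(tdm_reservation_signals("https://example.com"))  # placeholder domain
```

Signals of this kind are typically addressed crawler by crawler, a limitation that feeds directly into the “opt-out” concerns discussed below.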

Now things are getting a little more complicated. Firstly, the “opt-out” language is likely to cause concern among at least some in the music industry.

“Opt-out regimes fundamentally undermine copyright protections by shifting the burden to obtain a license away from users.”

National Music Publishers Association

In a recent submission to the US Copyright Office, the National Music Publishers Association (NMPA) made it clear why an “opt-out” system, requiring rightsholders to expressly forbid their material from being used in AI training, could be a problem.

“Opt-out regimes fundamentally undermine copyright protections by shifting the burden to obtain a license away from users,” the NMPA stated.

“An opt-out scheme that requires rights holders to opt out on an AI company-by-AI company or application-by-application basis would not be feasible, given the sheer volume of AI companies and applications; it is nearly a full-time job to keep up with developments in the AI marketplace… Copyright owners, particularly individual creators and small businesses could not possibly meet such a burden.”

The NMPA advocates instead for an “opt-in” system that would require AI developers to license copyrighted materials before they are used to train AI. Yet that doesn’t seem to be the direction that the EU’s law is headed.


Secondly, it appears, given the wording of the document, that someone developing AI “for research purposes” wouldn’t have to license the use of copyrighted materials.

This could potentially create a loophole in the law. In their own submission to the US Copyright Office, the Recording Industry Association of America (RIAA) and the American Association of Independent Music (A2IM) argued that creating carve-outs for AI development for research purposes would allow AI developers to skirt copyright laws.

They noted that OpenAI, the company behind the ChatGPT app that launched the current craze surrounding AI, was initially a research-focused non-profit – before it switched to being a commercial enterprise whose market value was recently estimated to be as high as $80 billion.

With an exemption for research and/or non-profits, AI developers could set up shop as research enterprises, suck up copyrighted materials in their training without paying for them, then convert to for-profit businesses, the RIAA and A2IM argued in their USCO submission.

“The final Act appears to take a more flexible approach to transparency around use of training data.”

Creative Commons

Could that happen under the EU AI Act? It’s not entirely clear, but the wording of the draft document does suggest that, when an AI model is used for commercial purposes, it would be subject to the transparency requirements that would enable rightsholders to find out if that AI was trained on their materials. Rightsholders could then pursue compensation.

But those transparency requirements themselves might not allow rightsholders to discover whether or not their materials were used in training, at least not in all instances.

“The final Act appears to take a more flexible approach to transparency around use of training data,” explains Creative Commons, a non-profit that advocates for public domain content.

“Rather than expecting [general-purpose AI] providers to list every specific work used for training and determine whether it is under copyright, it instead indicates that a summary of the collections and sources of data is enough (for example, it might be sufficient to state that one uses data from the web contained in Common Crawl’s dataset).”

That could be a problem for rightsholders, given, for example, that Common Crawl’s datasets include petabytes of data scraped from the internet. Common Crawl’s data is cited in a copyright infringement lawsuit against OpenAI.
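
To give a sense of what verification might involve even under that more flexible, sources-level approach, here is a minimal and purely hypothetical sketch of how a rightsholder might check whether pages from their own domain appear in a single Common Crawl snapshot. It assumes Python with the third-party requests package and Common Crawl’s public CDX-style index API, and the crawl identifier shown is only an example. Even a positive result would show only that a page was crawled, not that it was ultimately used to train any particular model:

```python
# Hypothetical check against one Common Crawl snapshot; assumes the public
# CDX-style index at index.commoncrawl.org and the third-party "requests" package.
import json
import requests

# Example crawl identifier only; Common Crawl publishes many snapshots.
CRAWL_INDEX = "https://index.commoncrawl.org/CC-MAIN-2023-50-index"

def captured_urls(domain: str, limit: int = 5) -> list[str]:
    """Return up to `limit` URLs from `domain` recorded in this crawl's index."""
    response = requests.get(
        CRAWL_INDEX,
        params={"url": f"{domain}/*", "output": "json", "limit": limit},
        timeout=30,
    )
    response.raise_for_status()
    # The index returns one JSON record per line.
    return [json.loads(line)["url"] for line in response.text.splitlines() if line.strip()]

if __name__ == "__main__":
    for url in captured_urls("example.com"):  # placeholder domain
        print(url)
```

Multiply that check across the many snapshots Common Crawl publishes, the countless other datasets in use and every model provider on the market, and the scale of the discovery problem facing rightsholders becomes clear.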


So is the EU AI Act a win for the music industry, and rightsholders more generally? Not entirely.

There is an ambivalence towards the copyright and transparency rules in the proposed law, reflected in German collection society and performance rights group GEMA’s comment that the law is a “step in the right direction” but “needs to be sharpened further on a technical level.”

Warner Music Group CEO Robert Kyncl reflected this ambivalence as well, describing the AI Act’s rules as “a light touch” – implying he would prefer to see a heavier one.

“While the Parliament’s text needs to be strengthened to make the obligations truly effective, it lays a constructive foundation for establishing responsible generative AI in Europe,” Kyncl wrote in an opinion piece published at MBW.


TECH INDUSTRY REACTION AND THE POLITICAL DIMENSION

The agreement between the European Parliament and the European Council on the broad strokes of the law may, in fact, be the beginning of a new fight over the law’s details.

No sooner had the deal been announced than prominent voices rose up to criticize it – from all sides.

French President Emmanuel Macron has been among the most vocal critics of the deal, arguing that the rules could put Europe behind the US, China and other parts of the world in the race to develop AI.

“We can decide to regulate much faster and much stronger than our major competitors. But we will regulate things that we will no longer produce or invent. This is never a good idea,” Macron said, as quoted by the Financial Times.

He added: “When I look at France, it is probably the first country in terms of artificial intelligence in continental Europe. We are neck and neck with the British. They will not have this regulation on foundational models. But above all, we are all very far behind the Chinese and the Americans.”

Credit: Victor Velter/Shutterstock

“We can decide to regulate much faster and much stronger than our major competitors. But we will regulate things that we will no longer produce or invent. This is never a good idea.”

Emmanuel Macron

Some European tech industry groups have also voiced concerns about the relative stringency of the regulations.

The Computer & Communications Industry Association (CCIA) lamented that the proposed law departs from the “sensible risk-based approach” proposed by the European Commission, and argued that the compromise text places “stringent obligations” on AI developers that are likely to undermine AI development in Europe and lead to an exodus of AI talent, Euronews reported.

“Regrettably speed seems to have prevailed over quality, with potentially disastrous consequences for the European economy. The negative impact could be felt far beyond the AI sector alone,” CCIA Europe Senior Vice President Daniel Friedlaender said.

So it appears Europe has a fight on its hands over its AI Act – and that fight arrives just in time for the next round of European Parliament elections, to be held in June.


A FINAL THOUGHT…

One very relevant aspect of this legislation is the long timeline involved in turning it into law.

The EU’s labyrinthine process for turning bills into law means that, even if things go smoothly from here on in, it won’t be in force until 2025. And, according to Axios, some parts of the bill won’t become law until 2027.

Those parts of the bill involving “high-risk” uses of AI, such as credit scoring and decisions on education and employment, are subject to a 24-month delay, while regulations for AI in highly-regulated fields, such as medicine, will come with a 36-month delay.

And given that France, at least, has indicated it plans to keep fighting for changes to the law, there’s little guarantee it will be formally adopted by the European Parliament and the Council before the parliamentary elections in June.

“While the Parliament’s text needs to be strengthened to make the obligations truly effective, it lays a constructive foundation for establishing responsible generative AI in Europe.”

Robert Kyncl, Warner Music Group

Consider this in the context of how quickly AI technology is developing and changing. The capabilities of AI algorithms today are considerably greater than they were just six months ago.

Six months from now, the AI landscape could be a very different place than it is today.

The proposed law itself seems to acknowledge this problem: it gives the EU’s AI Office – a new agency the legislation would create – some flexibility to alter the shape of certain regulations after the law has been passed.

But that, too, could prove controversial. European AI developers are already warning that it could create an unpredictable regulatory environment, further setting back the continent’s AI sector.

It’s not hard to imagine that the EU’s law, hailed as a first-of-its-kind legislative action, could yet be sent back to the drawing board – or prove obsolete on arrival.
