The biggest heist in music history: Why AI needs to be built on transparency, consent and remuneration

Danish CMO Koda suing US-based AI music generator Suno

MBW Views is a series of op-eds from eminent music industry people… with something to say. The following op-ed comes from Gorm Arildsen, CEO of Danish CMO Koda.

Arildsen’s comments follow this morning’s (November 4) news that the org is suing US-based AI music generator Suno, claiming the company trained its AI model on copyrighted music without permission or payment.

Koda, which has over 52,000 members – composers, songwriters, and music publishers – is accusing Suno of what it calls “the biggest theft in music history”.


Koda has taken legal action against US-based Suno in a first-of-its-kind lawsuit out of Denmark. Why? Because the generative AI music service didn’t just train on our members’ copyrighted repertoire without permission – Suno also failed to offer any way to pay for that use or for making those works and the service available. Let’s call it what it is: theft.

We already live with too much friction in the rights landscape – from black boxes to delayed payouts, there is a general lack of transparency throughout. If we don’t set a new, clear standard, someone else will, and creators will pay the price. That’s why Koda is stepping forward to define what fair and ethical use of our members’ human-created works – also known to the tech companies merely as “data” – should look like in the age of AI.

Theft disguised as innovation 

AI promises to expand music’s creative frontier, but when innovation rides on concealed inputs and unpaid labour, it undermines the very ecosystem that makes creativity possible. Recent debate around AI music has exposed a glaring hole in the market’s ethics: transparency.

Without it, claims of “fairness” or “fair use” are paper-thin, and secrecy becomes the structural flaw at the heart of the AI gold rush. Too many services refuse to disclose what they train on, inviting the very outcome everyone fears: AI outputs displacing human works in a race to the bottom on price, quality, and integrity.

The solution is clear. At Koda, we call it a simple, universal standard: transparency, consent, and remuneration. These principles are shared across the global rights community and make AI investable and culturally sustainable.

Last week’s UMG–Udio deal proves responsible licensing isn’t just possible – it’s happening. And the stakes are high: Koda’s analysis with IFPI Denmark shows generative AI could erode up to 28% of the local music economy by 2030.

Other markets report similar findings, introducing the term “cannibalization rates” to describe revenue lost when human-created music is replaced by fully AI-generated works. Even in a single market, the loss is measured in billions. Chew on that.

The law is catching up

The warning signs are mounting. The UMG–Udio deal didn’t happen in a vacuum. It followed UMG’s legal action against Udio, which pushed the service to the table. Legal action can open the door to fair licensing – and that’s exactly what we want to see. In the United States, lawsuits allege mass copying and even “stream-ripping” from YouTube – piracy by another name.

In Europe, our sister organization GEMA’s case against Suno underscores the same pattern, citing AI outputs that closely resemble protected works. Koda’s lawsuit complements this trend by targeting systemic infringement and the platform’s role in distributing AI-generated tracks.

The claim that these services produce entirely new outputs collapses under scrutiny: wholesale ingestion – or scraping – of protected works is the rule, not the exception. Courts and regulators in the US are signalling that unlicensed training on expressive works will not pass as fair use, especially when the output competes in the same market as human-created music.

What our lawsuit is – and isn’t – about

We welcome innovation, including AI – but not when major tech companies acquire and use creative works without consent. Suno’s approach amounts to copyright infringement: using protected works to train and generate outputs clearly derived from the originals, while withholding the transparency that would allow rightsholders to verify and license that use. That is why Koda is asking the court to affirm that creators’ rights apply in the age of AI.

“We welcome innovation, including AI – but not when major tech companies acquire and use creative works without consent.”

Gorm Arildsen, Koda

AI isn’t a threat if it operates with transparency and fair licensing. In fact, it can strengthen the music ecosystem and unlock new possibilities for creators. The UMG–Udio deal is a positive step in that direction, but it only covers works under UMG’s umbrella. Koda represents a broad spectrum of repertoire beyond the major-label system, and those creators deserve the same protection and licensing opportunities in the AI era.

Our aim is constructive. Koda exists to make music usable – lawfully and at scale. Here’s a practical path forward:

  • License the inputs, the outputs, and the act of making AI services available
  • Disclose data sources and enable audit
  • Pay fairly wherever protected works create value – at training, at service availability, and when an AI service generates outputs

These points reflect the joint Nordic position announced earlier this year by the Nordic Collective Management Organizations. We already do this across other uses of music; there is no reason AI should be exempt.

Licensing at scale works – we’ve done it before. The difference now is urgency: models can be trained overnight on catalogs that took decades to build. Koda will collaborate with developers who respect a sustainable music ecosystem and operate within clear legal and ethical frameworks. But we cannot and will not normalize a market where the fastest way to build value is to steal everything first and never ask permission.


Responsibility starts with permission

Music is cultural memory and economic livelihood. When tech companies train on unlicensed works, they strip creators of agency and income while flooding the market with soundalikes and tracks that would never exist without the human-made music underpinning AI systems like Suno.

For smaller European markets – many home to globally recognized repertoire – this is more than a legal dispute; it’s an economic and cultural threat.

“When tech companies train on unlicensed works, they strip creators of agency and income while flooding the market with soundalikes and tracks that would never exist without the human-made music underpinning AI systems like Suno.”

Gorm Arildsen, Koda

If we want a future where music thrives, innovation must reinforce the foundations of creative work rather than cannibalize them. Koda supports artists who use AI as a tool and technologists who build responsibly.

But responsibility starts with permission. The alternative to unlicensed scraping? Licensing at source, auditable use, and fair remuneration. This isn’t just feasible – it’s the only way to ensure AI expands rather than erodes the market for human creativity.


An open invitation to serious innovators

Transparency and licensing aren’t barriers to progress but rather the fastest route to sustainable business models. If you’re building AI music responsibly, we invite you to work with us. If you’re not, expect scrutiny from societies like Koda and, if necessary, from the courts.

Progress can’t stop at isolated deals. Standards must apply universally so every creator – not just the biggest players – benefits from AI done right. The future of music depends on it.

Music Business Worldwide
