74% of music fans believe that AI should not be used to clone or impersonate an artist without permission, says IFPI study


A strong majority of music fans agree with record companies on the basic principles that should govern the use of AI in music, and an even larger majority say human talent remains “essential” to music creation, even in the age of artificial intelligence.

Those are among the key takeaways from the results of a survey carried out by global recorded music body IFPI.

The org’s soon-to-be-released Engaging With Music 2023 report – which IFPI describes as “the largest music study of its kind,” with more than 43,000 respondents across 26 countries – found that 76% of respondents believe that an artist’s music or vocals should not be used or ingested by AI without permission.

This year’s edition of the Engaging With Music survey marked the first time the IFPI asked music fans about the role of AI in music.

Almost the same percentage – 74% – say that AI should not be used to clone or impersonate an artist without authorization. These figures apply to music fans who are aware of AI’s capabilities, and the survey found that awareness to be broad, at 89% of respondents.

Both of these issues – the use/ingestion of copyrighted music and the mimicking of artists’ vocals and music – have come to the forefront this year.

The music industry was set abuzz earlier this year when an unauthorized “fake Drake” track went viral on social media. Though the track, which featured AI-generated vocals mimicking artists Drake and The Weeknd, was one of numerous artist-mimicking AI-generated tracks released this year, its rapid rise to popularity triggered a strong response from Universal Music Group, to whose labels both Drake and The Weeknd are signed.

“The training of generative AI using our artists’ music (which represents both a breach of our agreements and a violation of copyright law) as well as the availability of infringing content created with generative AI on DSPs, begs the question as to which side of history all stakeholders in the music ecosystem want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation,” UMG said in a statement in April.

The unauthorized use of copyrighted music in training AI algorithms has made its way into courtrooms. In October, UMG, along with Concord Music Group and ABKCO, sued AI company Anthropic for alleged “systematic and widespread infringement of their copyrighted song lyrics”.

A number of authors, including George R.R. Martin and Sarah Silverman, have also sued AI companies – specifically, ChatGPT maker OpenAI and Meta Platforms – for what they say is unauthorized use of their works in training AI models, which were then allegedly able to reproduce or at least summarize those works.

However, even as rights holders find themselves in confrontations with AI developers over how their works are used, many in the music industry are embracing the potential of AI to improve the music creation process – though many in the business stress that human talent should, and does, remain at the forefront of music creation.

The IFPI survey finds fans are aligned with the industry on this sentiment as well. The survey found that 79% of fans feel that human creativity remains “essential to the creation of music.”

“While music fans around the world see both opportunities and threats for music from artificial intelligence, their message is clear: authenticity matters,” IFPI’s Chief Executive, Frances Moore, said in a statement.

“In particular, fans believe that AI systems should only use music if pre-approved permission is obtained and that they should be transparent about the material ingested by their systems. These are timely reminders for policymakers as they consider how to implement standards for responsible and safe AI.”


The survey found widespread support for transparency in the use of generative AI, with 73% saying that AI systems should clearly list any music they have ingested or used for training.

Social media platforms that feature music are already moving to apply the transparency principle at the consumer level, with both TikTok and YouTube announcing policies that will require AI-generated content to be labeled as such.

Meanwhile, legislators in the European Union have proposed a new set of laws – together known as the AI Act – that, among other things, would require AI developers to disclose the materials they used to train their AI models. That legislation is currently working its way through the EU’s legislative process.

The IFPI survey found that 70% of respondents agreed that there should be restrictions on what AI can do, while 64% said governments should play a role in setting those restrictions.

Music Business Worldwide
