AI vocal cloning app Voicify offers 3,000 deepfake models to replicate artists’ voices. Now it faces legal action from the UK’s music industry

Vocal cloning service Jammable – formerly known as Voicify.ai – has been on the music industry’s radar for some time, and for good reason: It’s among the most popular services out there allowing users to clone voices of famous artists, without permission, and use them to create their own musical deepfakes.

At last count, Jammable’s website offered some 3,000 apparently unlicensed AI-generated voice models. Among the voices are those of Adele, Justin Bieber, Phil Collins, Eminem, Ariana Grande, Michael Jackson, Bruno Mars, George Michael, Elvis Presley, Prince, Tupac Shakur, Ed Sheeran, Taylor Swift, and Amy Winehouse.

The voice models are available for the creation of “covers,” which on Jammable means taking a piece of recorded music, stripping out the vocal line, and replacing it with the AI-generated vocals of another artist. So, for example, a user can generate “Ed Sheeran” singing Michael Jackson’s Billie Jean – without licenses for either Sheeran’s vocals or Jackson’s song.

The site, which has described itself as the “#1 platform for making high quality AI covers in seconds,” has become big enough to warrant being the only voice-cloning tech to be called out by name by the Recording Industry Association of America (RIAA).

“The service stream-rips the YouTube video selected by the user, copies the acapella from the track, modifies the acapella using the AI vocal model, and then provides to the user unauthorized copies of the modified acapella stem, the underlying instrumental bed, and the modified remixed recording,” the RIAA said in a submission last fall to the US Trade Representative’s annual “notorious markets” report.

“This unauthorized activity infringes copyright as well as infringing the sound recording artist’s rights of publicity.”
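
For readers curious about the mechanics behind that description, the sketch below illustrates the general shape of such a pipeline: download the audio, separate the vocal stem from the instrumental bed, run the vocal through a voice-conversion model, and remix the result. It is a minimal illustration, not Jammable’s code: it assumes the open-source yt-dlp and Demucs command-line tools plus the soundfile library, and the convert_voice step is a hypothetical placeholder, since the proprietary voice models at issue are not public.

```python
# Illustrative sketch of the general pipeline the RIAA describes:
# download -> separate vocals from instrumental -> convert the vocal stem
# with a voice model -> remix. This is NOT Jammable's code; it assumes the
# open-source yt-dlp and Demucs CLIs, and the voice-conversion step is a
# hypothetical placeholder.
import subprocess
from pathlib import Path

import soundfile as sf


def rip_audio(url: str, out: str = "song.wav") -> Path:
    """Download a video's audio track as WAV using yt-dlp."""
    subprocess.run(
        ["yt-dlp", "-x", "--audio-format", "wav", "-o", "song.%(ext)s", url],
        check=True,
    )
    return Path(out)


def separate_stems(track: Path) -> tuple[Path, Path]:
    """Split the track into a vocal stem and an instrumental bed with Demucs."""
    subprocess.run(
        ["demucs", "--two-stems=vocals", "-o", "separated", str(track)],
        check=True,
    )
    # Demucs writes <out>/<model name>/<track name>/{vocals,no_vocals}.wav;
    # glob rather than hard-coding the model name.
    stem_dir = next(Path("separated").glob(f"*/{track.stem}"))
    return stem_dir / "vocals.wav", stem_dir / "no_vocals.wav"


def convert_voice(vocal_stem: Path, voice_model: str) -> Path:
    """Hypothetical placeholder for the AI voice-conversion step."""
    raise NotImplementedError(f"voice conversion with model {voice_model!r}")


def remix(converted_vocals: Path, instrumental: Path, out: Path) -> Path:
    """Sum the converted vocal stem and the instrumental bed into one mix."""
    vox, sr = sf.read(converted_vocals)
    inst, _ = sf.read(instrumental)
    n = min(len(vox), len(inst))
    sf.write(out, vox[:n] + inst[:n], sr)
    return out
```

Notably, the download, separation and remixing steps rely on widely available open-source tools; it is the unlicensed voice models applied in the conversion step, and the recordings used to train them, that the RIAA and BPI object to.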

The BPI, the trade body representing the UK’s recorded music industry, also took notice of Voicify, and in late February it sent a letter through its solicitors to the company, threatening legal action unless the vocal cloning site stopped its copyright-infringing activity.

It’s the first time the BPI has taken legal action against a service that enables “deepfakes” of musical artists.

The letter seems to have elicited a reaction from Voicify – though likely not the one the BPI had been looking for. Since the letter was sent, the service has changed its name to Jammable and apparently altered some of its functionality as well. However, it continues to offer users access to cloned voice models.

“Music is precious to us all, and the human artistry that creates it must be valued, protected and rewarded. But increasingly it is being threatened by deepfake AI companies who are taking copyright works without permission, building big businesses that enrich their founders and shareholders, while ripping off artists’ talent and hard work,” said Kiaron Whitehead, General Counsel of the BPI.

“The music industry has long embraced new technology to innovate and grow, but Voicify (now known as Jammable), and a growing number of others like them, are misusing AI technology by taking other people’s creativity without permission and making fake content. In so doing, they are endangering the future success of British musicians and their music.

“We, like all true music fans, believe that human artists must be supported, and we reserve our right to take action against any operation that infringes artists’ rights and damages their creative talent and prospects in this way.”

Jammable does indeed seem to be “building a big business.” The company’s founder – Aditya Bansal, a computer science student at the UK’s Southampton University – told the Financial Times in May 2023 that he had made “a lot” of money from the app. And that was just months after it had launched.

A subscription to Jammable ranges in price from $1.99 a month for a “starter” account to $89.99 a month for a “power user” account.

In its efforts to confront Jammable/Voicify, the BPI has garnered support from various music industry groups, including the UK Musicians’ Union, the Ivors Academy, the Music Publishers Association, UK Music, PPL and the Association of Independent Music (AIM).

“The unethical use of AI by platforms such as Voicify AI (now known as Jammable) threatens not only the livelihood of creators but also the trust of music fans,” said Paul Clements, CEO of the Music Publishers Association.

“For Artificial Intelligence to be successful for the UK music industry and the UK economy, we require a responsible cooperative approach by all stakeholders, working in tandem and not aiming for the short-term gain for individuals abusing the system at the expense of the UK creative industries and the UK as a whole.”

Naomi Pohl, General Secretary of the Musicians’ Union, said that Jammable “is just one worrying example of AI developers encroaching on the personal rights of music creators for their own financial gain. It can’t be right that a commercial enterprise can just steal someone’s voice in order to generate unlimited soundalike tracks with no labelling to clarify to the public the output tracks are not genuine recordings by the original artist, no permission from the original artist and no share of the money paid to them either.”

She added: “The fact that Jammable seem to be doing this with impunity is a reflection of the fact that action needs to be taken.”

Nick Eziefula, a copyright and AI lawyer with law firm Simkins LLP, said that BPI’s efforts to rein in Jammable are a sign that things are changing in the unregulated world of AI.

“Today’s news proves that the ‘Wild West’ era of unlicensed AI music generation may not last much longer if music rights-holders have any say in the matter,” he said in a statement emailed to media.

“It is impossible to see how an AI platform that flagrantly and deliberately mimics artists’ voices could be built without using recordings of those voices as training data. Using copyright recordings in this way is unlawful in England unless permission has been granted or any of the relevant exceptions or defenses to copyright infringement apply.”

Eziefula added that, while an artist’s voice may not itself be a copyrighted work, “it may amount to personal data, in which case unauthorized use may be a breach of data privacy laws. There may also be a misappropriation of the artist’s brand, so false endorsement principles could well apply, and even claims for reputational damage if AI-generated works appear to be attributed falsely to the original artists.”

He added: “These are risks day-to-day consumers of the service are unlikely to be aware of – and Voicify seems to hide behind its small print here, expressly asserting that it is not responsible for its customers’ use of generated content.”

BPI’s actions come at a time when the music industry is increasingly asserting its rights in the face of a massive proliferation of generative AI tools that have made it easier than ever to clone vocals, music or other forms of intellectual property.

In the US, rightsholders have filed numerous lawsuits against AI developers, asserting that the developers infringed on their rights by using copyrighted materials without permission in the training of AI models.

One of the most closely watched cases involves Universal Music Group, ABKCO and Concord Music Group, which jointly sued Anthropic, developer of the Claude chatbot. The music companies allege that Anthropic infringed their copyrights by training Claude on lyrics they own, and that Claude not only regurgitates those lyrics when asked, but has also plagiarized them when asked to generate lyrics of its own.

As the alarm over deepfakes grows louder, lawmakers are beginning to react. In the US, a bill titled the No AI FRAUD Act is currently before the House of Representatives. If passed into law, it would effectively enshrine a right of publicity into federal law for the first time, giving individuals or rights holders the ability to sue when a person’s voice or likeness has been used in AI-generated content without permission.

Meanwhile, the European Union’s parliament recently passed the AI Act, the Western world’s first comprehensive set of laws regulating the development and use of AI. Among its provisions is a requirement that developers of “general-purpose AI models” obtain authorization to use copyrighted materials in developing their models. However, doubts remain as to whether the law is sufficient to capture all the potential acts of infringement being seen today.
