AIM’s Gee Davy on the future of generative Artificial Intelligence in music

The following blog comes from Gee Davy, Head of Legal & Business Affairs at The Association of Independent Music (AIM). AIM Connected is AIM’s new flagship conference, taking place from 1-3 April at Studio Spaces, London. More information and tickets are available from www.aimconnected.com.


This period may well be looked back on as AI’s awkward toddler phase.

It’s mobile and likely to be under your feet at every turn, banging its virtual spoon on its virtual tray, constantly wanting to be ‘fed’ and still very much a work in progress.

Compared with the other buzzword technology phenomenon still in its infancy, Blockchain, AI has a far more immediate and clear impact and already significantly more practical applications.

Blockchain may or may not give us new models for the storage, transfer and use of information, but it poses many complex challenges, not least substantial issues around power consumption at scale and a significant paradigm shift in our understanding, both of which will have to be reconciled before practical systems appear. In the meantime, there is more music than ever being created and enjoyed, and the businesses that financially support that creation need to run and turn a profit.

Next week, AIM’s new flagship conference, AIM Connected, will weave together business, tech and people into one multi-day event with a plethora of AI keynotes and related panels. The event has brought all things AI to my mind, and it seemed a fitting time to explore some of the business implications that this noisy new tech poses.

Though we are only now seeing AI reach a publicly usable stage, it has been around for some time. What now seems like a lifetime ago, while studying for my first degree in electronic engineering, I programmed, as part of a small team of undergraduates, our very own simplistic AI: a limited version of TfL’s journey planner. We were by no means the first; AI was already being developed in academic, defence and other circles, and the advent of apps and fast broadband has now opened AI applications up to a public and business audience.

AI is already ubiquitous. It is difficult to find a new app or music tool which doesn’t include the word, albeit to varying extents of application. Applications range from discovery tools, which produce everything from personalised playlists to licensing pitches, to systems which can auto-tag tracks or extract all kinds of information, to creative or generative AI which can create new music.

Generative AI is already putting out entire albums (with the help of some human performers) and we are moving quickly towards AI which can create an album of music cradle-to-grave. Holly Herndon will be describing one such project in her keynote at AIM Connected.

There are clear financial benefits to releasing music using your very own generative AI rather than humans to create and perform. In an extreme scenario, it could mean no royalty payments, no messy clearance issues, no interpersonal issues and no fighting off competitors just when things are getting successful. After your initial investment in the technology is paid off, all profits could go to your pocket. There are some potential legal catches to this, but that’s a discussion for another day.

“The music industry already struggles to ensure that high quality professional music is heard above the noise, and AI tools are likely to increase this challenge.”

Gee Davy, AIM

It is all too likely that those who can afford significant investment in these technologies, the larger corporations in music, media and tech, will be at least first to market, but it is encouraging that artists themselves, such as Holly Herndon, Bjork and Brian Eno, are engaging with the technology directly and creating more of a fusion between human and AI.

It is in the interests of the independent music community to enable and encourage this interaction, or risk being left out as bigger players develop their systems and audiences.

The increasing spread and accessibility of generative AI music apps for the public mean that, in theory at least, anyone can create music and, as these are ‘trained’ tools, the output is likely to be qualitatively better than the current cloud of extremely variable user-generated music.

The music industry already struggles to ensure that high quality professional music is heard above the noise, and AI tools are likely to increase this challenge. Music businesses must continue to update their knowledge of the landscape and the tools and strategies available to them to ensure they can compete.

Generative AIs can create music but can’t play live in front of you, at least not in the same way a human can (yet). There are certainly innovative gigs but even holograms can’t quite replace the human-to-human experience. Whether or not that will change or matter to future audiences is arguable and, again, engagement and innovation are key.

Additionally, though AIs can create music ‘on their own’, they are trained or ‘fed’ with human music, at least in the first generations of AI. Aside from any conversations within legal seminars about the potential implications of this for future copyright infringement litigation, there are wider philosophical and business implications to participation in this process.

If generative AI music is to replace or dilute the market for human-created music, having gained from the history of human musical creativity, it is, to my mind, essential to ensure that this input is recognised and that some part of the profits generated flows back into investment in future human creativity.

Looking at the nearer-term implications of AI in music, though I think we are a little way off replacing all A&Rs with AI&Rs, there are discovery AIs which trawl social media to find new breakthrough artists, which could well cut down your time and costs.

AIs can’t go to live shows and experience the thrill (though perhaps they may be ‘taught’ the variables to recognise it!). Nor can AI form relationships with the artist and think strategically about their career and musical development, though it could again help to crunch the data and learn patterns that help with release campaign planning.

AI can also help by, for example, organising and extracting useful data from a music catalogue to help service it better. In related applications, AI can use audio or catalogue information to recommend music to listeners, to playlisters, to music supervisors and beyond.

For the time being at least, it seems likely that there will continue to be an audience for humanity in music and a need for humans in the music business. But the technology is evolving quickly, audiences will react and adapt, and the knock-on effects will be felt by us all; the importance for independent music businesses of staying informed and involved can’t be overstated.

The master of Cyberpunk, Bruce Sterling, said in his notorious keynote speech at SXSW this year that “cyberpunk is now reality”, so don your mirrorshades and learn to hack the new music business.
