Mr DAACI: AI Music - Pride & Prejudice

In the past year we’ve seen AI for music move at an incredible pace. Whether you view this as a good thing or not, it’s happening and it’s not about to stop. As part of this year’s Sŵn festival, the University of South Wales held a demonstration of DAACI, a new AI production tool that creates music to spec within seconds. 



Now I know next to nothing about music theory or production, so there are definitely things I missed about how it works exactly, but the demonstration was impressive. The team took an original track by Welsh singer Kate Westall, whose music I was introduced to through her Hospital Records collaborations with Fred V & Grafix, and used the tool to create three new remixes.



First they played the track in question, a lovely, folky, ukulele-led song called ‘Words’, into the system, and ten seconds later DAACI had come up with a passable pop version with a bit of a Glass Animals vibe. Producer Tom Manning then tweaked it a little, adding more chords and rhythm. Whilst I couldn’t work out exactly how he was doing that, it looked a hell of a lot easier than any recording package I’ve seen previously. What came out was admittedly pretty middle-of-the-road Spotify pop fodder, but reworking the track in that time frame was impressive.


Next Kate suggested they create a more stomping folk version. Again, I’m not sure exactly what prompts they used, but a few seconds later the original ukulele version had switched to an uptempo guitar and a folky 4/4 kick drum. This version was definitely more my style, if a little Mumford & Sons. I should add that each time it was just the music itself that changed; Kate sang her vocals live, adapting remarkably well to every version.



Last up was a drum and bass version. Tom put in an extra prompt to add another harmonic instrument, and DAACI chose a cello, which worked nicely. Every version came out so neatly, with no clashing notes or rhythms. Whilst this last version wouldn’t exactly have Hospital knocking on the door, it really started to show the benefits and issues inherent in an AI (or RA, ‘Real Algorithms’, as host Damon Minchella preferred to call it) system like DAACI.



DAACI can definitely cut down trial-and-error time for creatives, allowing musicians to play around with different styles and sounds easily before working more intimately on the final version of their track. It can suggest ideas like a co-producer, opening up more creative possibilities. But if it only plays by the rules that have been fed into it, will it only put out generic options? I certainly didn’t see anything to suggest otherwise in this demonstration.



As I said at the beginning, though, there’s no escaping AI, so I’d urge creatives to embrace it to whatever degree they find it useful. The last thing I’d like to see is it being put solely in the hands of those who wish to erase creatives from the process entirely. The risks are definitely there, especially for things like soundtracks and other sync opportunities, which can serve as a lifeline for musicians when revenue from sales seems to be ever decreasing.

There’s also the possibility of a music industry that erases creatives almost entirely. I don’t wish to scaremonger, and it never pays to panic about these things, but AI pop stars already exist, and I’ve seen some live shows recently where the performer hardly features at all, so it’s not such a giant step to have AI recording and touring artists.



The best that can happen is that it pushes and facilitates artists to become even more creative in their work. There was another Hospital recording artist at the demonstration, Landslide, who made a good point: the moment AI becomes really exciting is when people learn how to properly misappropriate it. I’m sure that’s already happening, and I look forward to seeing how that plays out.


