Algorithm Is A Dancer: How artificial intelligence is reshaping electronic music

If artists and music lovers don’t approach technology with a sense of agency, we will be left behind and potentially duped by those who are savvy with it; for every innovative experiment and collaboration with AI, there are many more cynical commercial applications of it within mainstream music.

In an interesting, in-depth feature on The Verge last year, musician and DJ Mag writer Dani Deahl took a deep dive into AI-based music software such as Amper, IBM Watson Beat and Google Magenta: programs that digest mountains of data from decades of recorded music and its commercial successes in an effort to formulate, or help create, hits.

AI-generated music is already passable enough for adverts, backing music for videos and broadcasts, jingles, hold music and ‘lite’ music for playlists in public places.

Another interesting concept, founded on access and co-creativity, comes from UK firm AI Music, which applies artificial intelligence to understand a listener’s mood, location, activity and time of day.

AI Music claims its algorithms can, through nuanced learning, create thousands of different versions of a song, hyper-customising your experience and relationship with music, and how you interact with it.
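
As a rough illustration of that idea only, the sketch below shows how a context-aware player might choose between pre-rendered versions of the same track based on mood, activity and time of day. This is not AI Music’s actual system; every name and rule here is hypothetical.

```python
# Hypothetical sketch of context-aware track-version selection.
# NOT AI Music's implementation: the rules below are invented purely
# to illustrate the concept of serving different versions of one song.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ListeningContext:
    mood: str        # e.g. "relaxed", "energetic"
    activity: str    # e.g. "running", "working", "commuting"
    hour: int        # local hour of day, 0-23


def pick_version(context: ListeningContext) -> str:
    """Choose one of several pre-rendered versions of the same song."""
    if context.activity == "running" or context.mood == "energetic":
        return "uptempo_remix"          # pushed tempo, heavier percussion
    if context.hour >= 22 or context.hour < 6:
        return "ambient_rework"         # stripped-back, late-night mix
    if context.activity == "working":
        return "instrumental_version"   # vocals removed to reduce distraction
    return "original_mix"


if __name__ == "__main__":
    now = datetime.now()
    listener = ListeningContext(mood="relaxed", activity="working", hour=now.hour)
    print(pick_version(listener))
```

A production system would presumably learn these mappings from listening data rather than hard-coding them, but the basic shape, context in, version of the song out, is the same.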

AI Music founder and CEO Siavash Mahdavi has explained in interviews how people increasingly pick music according to their activity and mood, and how the company’s software encourages collaboration and experimentation from the user.

“What we’re looking at doing,” he says, “is shifting music to a similar paradigm, so we get more and more people playing with music, lowering the barriers of entry to music creation using these tools. Looking at photography, we still have the artist level photography. That’s there to stay, and it’ll be a similar thing to music. But we’ll have more people playing with and creating music.”
