Artificial intelligence is earning itself a seat in just about every field; it’s in mental health, it’s in productivity and office assistance, it’s in web design, it’s in fashion, and it’s in legal research. Now, it’s smack dab in the middle of music.
Last year we told you about AI systems that could automatically compose instrumental songs based on an understanding of moods, genres, and some “basic building blocks.” Now, welcome IBM Watson to the show; it’s collaborating with Grammy-winning producers and pop stars. Specifically, Watson’s music algorithm, Beat, is helping fill the lyric books and ledger lines.
To provide writing assistance to producer Alex Da Kid, Watson did extensive research: it studied tens of thousands of songs to pick up trends in structure, key signatures, melodies, and even note velocity (how hard a note is struck); it reviewed everything from speeches to New York Times articles to uncover the themes that most commonly capture the public’s attention; then it reviewed the social activity surrounding all that content so it could map themes to sentiments and, as IBM puts it, “build the ‘emotional fingerprint’ of each of the last five years.”
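If you’re curious what that last step might look like in practice, here’s a minimal toy sketch. To be clear, this is not IBM’s actual pipeline or data; it assumes themes and sentiment scores have already been extracted from articles and social reactions, and simply averages them by year and theme to form a crude “emotional fingerprint.”

```python
from collections import defaultdict
from statistics import mean

# Hypothetical stand-in data: (year, theme, sentiment in [-1, 1]) triples,
# representing articles/speeches and the social reaction each one drew.
documents = [
    (2012, "resilience", 0.6),
    (2012, "economy", -0.3),
    (2013, "technology", 0.7),
    (2014, "economy", -0.1),
    (2015, "resilience", 0.8),
    (2016, "economy", -0.4),
]

def emotional_fingerprint(docs):
    """Average sentiment per (year, theme) pair -- a toy version of
    mapping themes to sentiments over the last five years."""
    buckets = defaultdict(list)
    for year, theme, score in docs:
        buckets[(year, theme)].append(score)
    return {key: mean(scores) for key, scores in buckets.items()}

if __name__ == "__main__":
    for (year, theme), score in sorted(emotional_fingerprint(documents).items()):
        print(f"{year}  {theme:<12} {score:+.2f}")
```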
From all that, Watson pumped out scores, song snippets, bass lines, lyrics, and other components—it even created an interactive visualization of the data based on a palette of preferred images, colors, and album artwork. Alex Da Kid then took over, absorbed the inspiration, took what he liked, discarded what he didn’t, added his own ideas, and ultimately produced the song below, “Not Easy.” Don’t worry, it’s only the first song of a four-song EP that Alex, alongside Watson, plans to release.
Enjoy, SnapMunkers: you’re listening to the future. It sounds a lot like a Budweiser Super Bowl commercial.
Also, here is some behind-the-scenes footage that provides more background and context on the process and the final product.