Music composed by AI breathes new life into video games

AI-generated music could help indie video game developers incorporate another element into procedurally generated worlds, making them more dynamic and lifelike.

Back in 2016, Hello Games released its much-anticipated exploration game "No Man's Sky" on PC and PS4. While the launch version largely failed to live up to buyers' expectations, players found that Hello Games had delivered on its promise to create a game with a truly massive world to explore.

The game offers an open universe with some 18 quintillion planets, all created using procedural generation, a term used frequently in the gaming industry for content generated automatically and algorithmically rather than by hand. Each world features its own flora and fauna to discover and unique creatures to hunt or tame.
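Procedural generation usually comes down to deriving content deterministically from a seed. The short Python sketch below is a hypothetical illustration of that idea, not Hello Games' actual engine: the same seed and planet index always produce the same planet, so a vast universe never has to be stored on disk.

import random

# Hypothetical illustration of seeded procedural generation; not Hello Games' code.
BIOMES = ["lush", "barren", "toxic", "frozen", "volcanic"]
FAUNA = ["grazer", "predator", "flyer", "burrower"]

def generate_planet(world_seed: int, planet_index: int) -> dict:
    """Derive a planet's attributes deterministically from a seed and an index."""
    rng = random.Random(f"{world_seed}:{planet_index}")  # same inputs always give the same planet
    return {
        "biome": rng.choice(BIOMES),
        "gravity": round(rng.uniform(0.5, 2.0), 2),
        "creatures": rng.sample(FAUNA, k=rng.randint(1, 3)),
    }

if __name__ == "__main__":
    # Two calls with the same seed and index return identical planets, so the
    # universe can be regenerated on demand instead of being stored.
    print(generate_planet(world_seed=42, planet_index=7))
    print(generate_planet(world_seed=42, planet_index=7))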

Yet, even as the visuals change, the game's soundtrack, written by the band 65daysofstatic, doesn't. The soundtrack, though met with acclaim, is static, something that seems almost at odds with the procedurally generated world around it. What the game might need instead is music composed by AI in real time.

The use of music composed by AI has steadily increased over the last few years, but it has yet to take hold in the video game industry. There are signs, though, that could be changing.

Old technologies

The idea of music composed by AI isn't new, and neither are some of the basic technologies behind it. In the mid-2000s, prominent scientist Stephen Wolfram, perhaps best known for developing Wolfram Alpha and the Wolfram Language, created WolframTones, a simple AI program that generates unique clips of music in one of 15 styles a user can choose from. The program is free to use online, but it is little more than a way to showcase the Wolfram Language.

On the legacy site for Ludum Dare, a well-known and popular game jam event that tasks developers with creating a game from scratch in a single weekend, there are several forum posts about WolframTones and similar tools for creating in-game music. Among those posts is the idea that WolframTones, with some tinkering and added repetition, can produce music in a pinch or on an indie developer's budget. What ultimately comes out of it, however, will probably sound like it was created by AI: tinny, hollow, emotionless and ultimately better suited as a source of inspiration.

Aside from the fact that the tool couldn't feasibly generate meaningful in-game music in real time, that lack of emotion was one of its biggest problems.

Music composed by AI aids rather than replaces the human composer

What's lacking in most AI music production tools is the human element, something that Pierre Barreau, co-founder and CEO of Aiva Technologies, is quick to point out.


His company is one of several currently trying to make it in the AI-generated music scene, and so far, the two-year-old startup has been fairly successful. Aiva's software was used to create simple music for some of Nvidia's trade show talks and product announcements, and it even helped create the theme for the free-to-play mobile and browser game "Pixelfield." But in all of its use cases, there was a catch -- like WolframTones, Aiva's software generates ideas, not necessarily full, immediately usable songs.


According to Barreau, a trained musician as well as a programmer, the Aiva platform works from a database of tens of thousands of scores, using machine learning tools to build numerous new scores based on a client's needs. It applies music theory concepts to create what Barreau describes as emotional music. But, ultimately, he and his team have to sort through the generated scores to find the best one for the client, then refine it and even add parts, because the system produces scores only in piano notation.
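That generate-then-curate workflow can be pictured as a simple loop: a model proposes many candidate scores, software narrows them down and a human composer makes the final selection and does the refinement. The Python sketch below is purely hypothetical; none of the names or functions come from Aiva's actual software.

import random
from dataclasses import dataclass

@dataclass
class CandidateScore:
    piano_notes: list[str]   # piano notation only; other parts are added by hand later
    brief_match: float       # rough estimate of how well the piece fits the client's brief

def generate_candidate(rng: random.Random) -> CandidateScore:
    """Stand-in for the trained model; a real system samples from a learned distribution."""
    pitches = ["C4", "D4", "E4", "F4", "G4", "A4", "B4"]
    return CandidateScore(
        piano_notes=[rng.choice(pitches) for _ in range(16)],
        brief_match=rng.random(),
    )

def shortlist_for_human_review(n_candidates: int = 50, keep: int = 5) -> list[CandidateScore]:
    """Automatic ranking only narrows the field; a composer still picks, refines and orchestrates."""
    rng = random.Random(0)
    candidates = [generate_candidate(rng) for _ in range(n_candidates)]
    return sorted(candidates, key=lambda c: c.brief_match, reverse=True)[:keep]

if __name__ == "__main__":
    for score in shortlist_for_human_review():
        print(round(score.brief_match, 2), " ".join(score.piano_notes[:8]), "...")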

"While AI is great at creating emotional music now, it's not necessarily creating music with the desired meaning," Barreau said. "We think AI can be used by composers to make their jobs easier, make their lives easier." Ultimately, however, a human composer does need to be involved.

Performers need to be involved as well, to play and record the scores, which rules out real-time music composed by AI for now. With the "Pixelfield" theme, for example, a whole orchestra contributed to the recording.

Still, Barreau believes that music composed by AI, particularly for video games, "will be a big thing of the future. And it will have to be created by AI, as human composers simply don't have that amount of time."

Maybe that future isn't so far off.

Real-time is real

Melodrive Inc., a fairly new company based in Germany, is working on an early-stage engine for creating music in real time, and it plans to give the software away for free to independent game developers, many of whom have limited budgets.

"We found that users would love to create music if it was something simple to achieve," said Valerio Velardo, co-founder and CEO of Melodrive.

The engine, which is in the alpha stage right now, looks to replicate how a human composer would work, Velardo said.


Built entirely on in-house code, the engine uses machine learning and a large database of music to create original, potentially never-ending pieces. According to Velardo, the engine relies on multiple "high-level systems": one that creates an abstract score, one that adds expressive dynamics to that score and another that works to "render the abstract score with sounds."
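Velardo's description suggests a pipeline of loosely coupled layers. The hypothetical Python sketch below illustrates that shape, abstract score first, then expressive dynamics, then sound rendering; the stage names and data structures are invented for illustration and do not reflect Melodrive's in-house code.

import random
from dataclasses import dataclass

@dataclass
class Note:
    pitch: str
    duration: float           # in beats
    velocity: float = 0.8     # expressive loudness, filled in by the dynamics stage

def compose_abstract_score(rng: random.Random, length: int = 8) -> list[Note]:
    """Stage 1: generate an abstract score (just pitches and durations)."""
    scale = ["C4", "D4", "E4", "G4", "A4"]
    return [Note(pitch=rng.choice(scale), duration=rng.choice([0.5, 1.0])) for _ in range(length)]

def add_expressive_dynamics(score: list[Note], intensity: float) -> list[Note]:
    """Stage 2: shape loudness to match a desired emotional intensity."""
    for i, note in enumerate(score):
        note.velocity = min(1.0, 0.4 + 0.5 * intensity + 0.05 * (i % 4))
    return score

def render_with_sounds(score: list[Note], instrument: str) -> list[str]:
    """Stage 3: map the abstract score onto concrete sounds (a real engine would
    synthesize audio or trigger instrument samples here)."""
    return [f"{instrument}:{n.pitch} dur={n.duration} vel={n.velocity:.2f}" for n in score]

if __name__ == "__main__":
    rng = random.Random(7)
    score = compose_abstract_score(rng)
    score = add_expressive_dynamics(score, intensity=0.9)   # e.g., a tense moment in a game
    print("\n".join(render_with_sounds(score, instrument="strings")))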

Currently, there are four styles to choose from, each with its own sounds and instruments. In total, Velardo said, there are close to 100 instruments the engine can call on, and there are plans to grow both the number of instruments and the number of styles.

In terms of actually incorporating the engine into a game, Velardo said it would likely "go in the same direction as customizing your own avatar in a game. [But] imagine you can have an extra level with that where you can also co-create music with AI."

A game developer is skeptical

Melodrive has several free demos available on its website, which veteran game developer and musician Matt Sughrue downloaded and tested.

With 26 years of experience in the game industry, Sughrue said that while Melodrive appears capable of integrating AI-generated music into gameplay, the quality of the music still isn't up to par.

"The world example they gave had four variants of a piece that wasn't very good in any of the four styles," he said, adding that the company "has a long way to go before the quality of the music matches what a decent game composer can do."

For indie game developers without a music budget, using Melodrive, if indeed the software is given out for free, "is great," Sughrue said. "[But] for a small budget, there are hundreds of tunes that can be handpicked by the developer and licensed cheaply" from outlets like AudioJungle or Soundstripe.

"Great music can add a lot of depth to gameplay, and merely okay or bad music can detract from gameplay," Sughrue said. "I can appreciate what it takes to generate music from a technical standpoint, but I don't think generated music will ever match the quality of handcrafted tracks for a game."

It's likely too soon to know if the use of music composed by AI will take off in the gaming world, but other creative processes, including journalism, are already being augmented by AI or passed off to machines entirely. Time will tell if music will take the same route, but, given that so much else is procedurally generated in modern games, it doesn't seem like too much of a stretch.
