At the Barbican Centre in London, on October 31st, 2018, the London Contemporary Orchestra will play two great pieces of contemporary classical music by Giacinto Scelsi and John Luther Adams. However, there is something new and unique about this event. In an unprecedented step, the music will be accompanied by visuals that are automatically generated by an artificial intelligence algorithm in response to the live music, with aesthetics inspired by nature.
Displaying visuals to accompany music is typical in live pop concerts, and sometimes in classical music concerts too. However, such visuals are usually prepared offline and scripted to play along with the music; they are mostly not reactive to it. Making such visuals requires lengthy work by computer graphics and animation experts using various software tools, and elaborate effort to give the illusion that the visuals are synchronized with the music. In offline scenarios, when making a music video or a movie, art directors, animators, and graphic designers have the time to generate visuals that synchronize with the music; however, this is an intricate and time-demanding process. The 1940 Disney movie Fantasia was groundbreaking on that front: around 1,000 artists and technicians worked on making it.
What is happening at the Barbican event is quite different. The visual art is generated by an artificial intelligence algorithm in real time, in response to the live music as it is played. The algorithm generates images with aesthetics drawn from nature, using the music as inspiration for the generation.
At Artrendex we developed Music2Art, a generative model that takes music as inspiration to make art. The algorithm uses a novel generative model that learns aesthetics from a collection of images or videos offline. At the concert, in real time, the algorithm takes the music stream, decomposes it into its basic frequencies, and aligns the audio frequency content with the representation of the aesthetics that it learned offline. The generative model then renders new images that are directly reactive to and synchronized with the music.
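To make the pipeline concrete, here is a minimal sketch of such a real-time loop in Python. The actual Music2Art model is not public, so everything below is an illustrative assumption: the frequency decomposition uses a plain FFT, the band grouping is log-spaced, and `generator`, `LATENT_DIM`, and the frame size are hypothetical stand-ins for the real system.

```python
import numpy as np

LATENT_DIM = 128    # assumed size of the generative model's latent space
FRAME_SIZE = 2048   # audio samples per analysis window (assumption)

def audio_frame_to_latent(frame: np.ndarray) -> np.ndarray:
    """Decompose an audio frame into frequency bands and map the band
    energies to a latent vector for the generative model."""
    # Windowed FFT gives the frame's frequency content.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Group the spectrum into LATENT_DIM log-spaced bands; each band's
    # mean energy becomes one coordinate of the latent vector.
    edges = np.logspace(0, np.log10(len(spectrum)), LATENT_DIM + 1).astype(int)
    bands = np.array([
        spectrum[edges[i]:max(edges[i] + 1, edges[i + 1])].mean()
        for i in range(LATENT_DIM)
    ])
    # Normalize so the vector stays in a range the generator expects.
    return (bands - bands.mean()) / (bands.std() + 1e-8)

def render_frame(generator, frame: np.ndarray):
    """One step of the loop: audio frame in, rendered image out."""
    z = audio_frame_to_latent(frame)
    return generator(z)  # generator: latent vector -> image (assumed API)
```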
Earlier Experiments:
We first experimented with Music2Art using AICAN art. AICAN uses a Creative Adversarial Network to learn aesthetics by looking at art from art history, and generates art that follows those aesthetics but doesn't fit existing styles. Music2Art aligns the music content with the modes of variation in the generated art, as the sketch below illustrates. In this example we used the 5th movement of Giacinto Scelsi's Uaxuctum. We chose only the first frame for the visual.
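As a rough illustration of what "aligning music with modes of variation" could look like, the sketch below estimates principal directions of a latent space with PCA and displaces the latent code along each direction in proportion to one frequency band's energy. AICAN's real mechanism is not described in detail, so all names and numbers here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM, N_MODES = 128, 16

# Stand-in data: correlated latent samples play the role of latent codes
# fitted to a corpus of generated artworks (illustrative only).
mixing = rng.standard_normal((LATENT_DIM, LATENT_DIM))
samples = rng.standard_normal((10_000, LATENT_DIM)) @ mixing

# PCA via SVD: the top rows of vt are the main modes of variation.
_, _, vt = np.linalg.svd(samples - samples.mean(axis=0), full_matrices=False)
modes = vt[:N_MODES]  # (N_MODES, LATENT_DIM) unit-norm directions

def music_driven_latent(z_base: np.ndarray, band_energies: np.ndarray) -> np.ndarray:
    """Shift a base latent code along each mode of variation by the
    energy of the corresponding audio frequency band."""
    return z_base + modes.T @ band_energies  # result: (LATENT_DIM,)
```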
In another example, we tried a very fast piece, the Sabre Dance from Khachaturian's Gayaneh ballet; the results were very colorful and dazzling.
Nature-Inspired Aesthetics
For the concert with the London Contemporary Orchestra, in collaboration with the Universal Assembly Unit, we trained the system with nature-inspired visuals, including videos of volcanoes, thunderstorms, caves, underwater scenes, and others. The algorithm learns the aesthetics from these sources and uses them in its generation. The result is a very captivating chromatic visual experience.
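For the curious, training from video sources could start with something as simple as sampling frames, as in the sketch below. It assumes OpenCV is available; the sampling rate and file handling are illustrative, not the project's actual pipeline.

```python
import cv2  # OpenCV, assumed available

def extract_frames(video_path: str, every_n: int = 10):
    """Yield every n-th frame of a nature video as an RGB training image."""
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            # OpenCV reads frames as BGR; convert to RGB for training.
            yield cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        idx += 1
    cap.release()
```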
This is a trial where we used aesthetics from a cave for the 5th movement of Giacinto Scelsi's Uaxuctum.
In another trial, we used underwater aesthetics to accompany John Luther Adams' Become Ocean.
At the concert, several other visuals will be used. It is exciting to see what will happen: we have little control over the generation in real time. The process was only tested in a studio setting with recorded music, never in a concert hall with a live full-size orchestra. We cannot predict what will be generated. Anything could go wrong. We hope everything will go right.
The Music
At the concert, two great pieces of contemporary classical music will be played. The first is by Italian composer Giacinto Scelsi, titled "Uaxuctum: The Legend of the Maya City, destroyed by the Maya people themselves for religious reasons". "It's a work so demanding that this is the first time it has been heard in the UK, even though it was written over 50 years ago," said Huw Humphreys, Head of Music at the Barbican Centre. The second is by American composer John Luther Adams, titled "Become Ocean", which musically depicts a world consumed by the seas. Alex Ross, the New Yorker music critic, wrote that it "may be the loveliest apocalypse in musical history."
Project Credits:
The algorithm was developed by Artrendex Inc., a NY-based startup that builds innovative AI for the creative domain. Art direction is by the London-based Universal Assembly Unit. The project is the result of over six months of research and development, aided by Devin Gharakhanian, Benjamin Heim, and Simon Hendry. The concert will be performed by the London Contemporary Orchestra and Choir, conducted by Robert Ames.
Credit: BecomingHuman. By Ahmed Elgammal.