Artificial intelligence is arguably the pinnacle of humankind’s inventions. But who would have thought that this technology could bring us closer to nature in ways we could only have dreamed of, from the extremes of infinite night skies to enigmatic ocean depths? The limits of AI were explored recently in two fascinating experiments – one involving the magnificent Northern Lights, and the other the gentle Humpback Whale.
Listening to the light
The Sound of Light project was a months-long collaboration connecting technology and artistry. It tested the limits of how humans can teach AI to read images and create new pieces of sound, transforming a fascinating natural phenomenon, the Northern Lights, into a multi-sensory Sound of Light symphony.
The project began in Tromsø, Norway, where aurora chaser Kjetil Skogli filmed the Northern Lights. Composer Sayfritz then worked with a custom-built AI system on the Huawei Mate 20 Pro to create a symphony based on the footage. The AI system had been trained to recognise and analyse the different characteristics of light.
Sayfritz remarks, ‘Introducing artificial intelligence into my music has opened up a whole new range of creative possibilities. I can now feed footage of the Northern Lights into an AI system which analyses things such as colour, size, shape and speed. It then generates variations of my musical phrases, based upon what it sees. So, when the AI recognises a particularly dramatic light show, it can modulate the sounds to reflect that intensity in the composition. I can then incorporate those elements into the symphony.’
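The mapping the composer describes can be pictured with a minimal sketch: frame-level features of the aurora footage steer variations of a base musical phrase. The feature names, thresholds and phrase below are illustrative assumptions, not Huawei’s actual system.

```python
BASE_PHRASE = [60, 62, 64, 67]  # MIDI note numbers: C4, D4, E4, G4

def vary_phrase(features):
    """Vary a phrase based on features of the light show.

    features: dict with 'coverage' (0-1, fraction of the frame lit)
    and 'speed' (0-1, normalised motion of the lights). Colour and
    shape could steer mode and timbre in the same way.
    """
    # Larger, faster displays transpose the phrase upward, up to an octave.
    transpose = round(12 * features["coverage"] * features["speed"])
    phrase = [note + transpose for note in BASE_PHRASE]
    # A particularly dramatic light show is reflected in the dynamics.
    dynamic = "ff" if features["speed"] > 0.7 else "mp"
    return phrase, dynamic
```

In this toy version, a calm, slow display leaves the phrase at pitch and marked mp, while a fast, full-sky display lifts it a whole octave at ff – variations a composer could then fold into the larger work.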
Once the musical pieces were composed, James Shearman, one of the world’s best conductors and composers, arranged them for an orchestral performance of the Sound of Light symphony in Vienna, Austria. He comments, ‘My role as conductor is to shape, inspire and direct the performance of the symphony. With the help of AI, I can breathe life into it and make the audience feel as if the Northern Lights were actually speaking to us through the music.’
The project culminated in a live audio-visual performance in the famous Brahms Hall of the Musikverein in Vienna on 28 November. With Shearman conducting, the incredible sound of the Northern Lights was performed by the Synchron Stage Orchestra to an audience of over 300 people.
The frequency of love
Conceived in Italy and developed with the support of WWF Italy, the Frequency of Love project focused on bringing humanity closer to the animal world through artificial intelligence. The subject of this project was the Humpback Whale, a baleen species of whale that measures around 16 metres and weighs around 40 tons.
The Humpback Whale is well-known for its song, which can last for up to 20 minutes and can be repeated for hours. The males sing during mating season and their song can be heard by female Humpback Whales hundreds of kilometres away.
For this project, an instrument known as a hydrophone, which registers underwater sounds, was used to record the mating call of the Humpback Whale. The Huawei Mate 20 Pro’s Kirin 980 processor was then trained to recognise the sound of Humpback Whales and link each sound automatically to a progression of music notes. This resulted in a ‘translation’ of these faraway sounds into music inspired by the whales’ original song, which can be shared with a human audience. By bringing the love story of Humpback Whales to a much bigger audience through AI technology, Huawei hopes that more people will become aware of how important it is to preserve the natural world.
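The ‘translation’ step described above – linking each recognised sound to a progression of notes – can be sketched as a simple lookup. The call labels and note mappings here are illustrative assumptions, not the project’s actual vocabulary.

```python
# Hypothetical mapping from recognised whale-call types to short
# note progressions (MIDI note numbers).
CALL_TO_NOTES = {
    "moan":  [48, 55, 52],  # low C3-G3-E3 figure for long, low calls
    "cry":   [60, 65, 64],
    "chirp": [72, 74, 76],
}

def translate_calls(detected_calls):
    """Concatenate the note progression for each detected call,
    skipping sounds the recogniser did not classify."""
    melody = []
    for call in detected_calls:
        melody.extend(CALL_TO_NOTES.get(call, []))
    return melody
```

Fed a sequence of classified calls from the hydrophone recording, such a mapping would produce a continuous melody shaped by the structure of the whales’ original song.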