The Montreux Jazz Festival’s Digital Archive is Changing What it Means to Experience Music
A new immersive concert experience combines VR, 360-degree video, positional audio and heat map technology to put music fans next to their favorite performers
Montreux, Switzerland is a picture-postcard town on Lake Geneva, in a country famous for inventing milk chocolate and bringing a wide variety of cheeses to the world. It’s also the setting of one person’s journey to create a legacy that has endured disappointment and flames, launched the careers of world-renowned musicians…and fueled the march of entertainment technology. We’re about to look at how this all came to be, but if you’d rather skip straight to the festival’s latest innovation, jump ahead to the section on experience recording below.
A long list of firsts
Starting in 1967, the then-sleepy town of Montreux became home to the Montreux Jazz Festival, an annual gathering created by organizer Claude Nobs of both famous and up-and-coming musicians from around the world. Performers included Aretha Franklin, Miles Davis, Quincy Jones, Deep Purple, David Bowie, Prince, and Ray Charles – a lengthy and impressive list. The festival also saw the launch of never-before-seen technologies such as multi-track audio recording, high-definition video, and much more.
Entertainment has long driven technological innovation. This union of art and tech proved the right catalyst to help Claude Nobs realize his dream of sharing the festival’s concerts with future generations via audio and video recordings. Those recordings can be enjoyed to this day thanks to the Montreux Jazz Digital Project, a partnership with the École Polytechnique Fédérale de Lausanne (EPFL), a major national university, to develop a fully digital archive of 5,000 concerts stretching back 50 years: 11,000 hours of video, 6,000 hours of audio, and 80,000 photos. The archive is recognized by UNESCO as part of the official Memory of the World registry.
Some of the recordings can be enjoyed via the Montreux Jazz website or YouTube, but many more can be experienced at kiosks and cafés, such as the Montreux Jazz Cafés in Montreux and Lausanne, and at venues around the festival itself.
A vision born from an overwritten video tape
But Nobs’ vision hit a bump in the road. In 1987, two decades after the festival’s creation, Nobs learned that a recording of Aretha Franklin’s performance at the 1971 festival, made by the national television station, had been taped over with a football game. That was when Nobs realized he needed to take ownership of preserving the concert recordings. His love of technology drove him to create an archive to protect them, to record the concerts in the highest broadcast video and audio quality possible, and more recently to go even beyond that: recording the experience of being at a Montreux Jazz Festival concert.
Eventually, technology progressed to the point where converting the entire archive to digital form – where it could be preserved and, ultimately, shared – became cost-effective and realistic. In 2007, the Claude Nobs Foundation, curator of the archives, and EPFL embarked upon their digitization project. Today the archive is duplicated across three Western Digital ActiveScale storage systems, each at a different location in Switzerland to safeguard against system failure or other disaster, totaling many petabytes of storage and growing rapidly.
To give you a sense of how Nobs’ vision translates into reality, check out this HD recording of a Montreux concert from 1991. That’s 1080p video from a time when you could still buy a black-and-white television at retail. It looks like it was recorded today!
Going from audio recording to experience recording
In 2018, the festival pushed boundaries once again, this time creating an entirely new way for anyone to experience a Montreux concert. It brings several exciting fields of research together into a virtual reality (VR) based experience: it starts with 360-degree video of a concert streamed to a VR headset, but grows considerably in complexity from there.
- Heat mapping: Using software developed by EPFL’s Multimedia Signal Processing Group, heat mapping monitors where people using the VR headset tend to look within a scene most often. This information is used to determine where the video needs to be kept at the highest quality level and where it can be compressed. This eases the burden on network and data infrastructure by reducing the amount of data that must be transmitted in real time.
- Optical flow-based panoramic video coding and streaming: While heat mapping addresses network data capacity, there’s also the challenge of getting data from the camera to a server and encoded in real time, with minimal latency, while maintaining the highest level of video quality. An optical flow-based acceleration of the encoding makes this possible with barely any compromise in video quality.
- Optical flow-based stitching: Using optical flow-based stitching technology developed at Tsinghua University in China – an improvement over the template-based stitching found in most live 360-degree video streaming today – the videos from the individual cameras are stitched together seamlessly and without distortion, so the viewer feels truly immersed in the scene. This all happens in real time with low latency between the cameras and the headset, meaning you can enjoy a concert live using this technology instead of merely on demand!
- 3D audio: This is what ties the experience together and makes it truly immersive. It starts with microphones that are attached to each instrument, capturing sound straight from the source (as opposed to being recorded on a microphone a few feet away). Then, audio mixing and rendering technology developed by German company Sphereo is used to recreate the audio scene authentically and change what you’re hearing relative to your position in a scene. This is not possible with a conventional mixing approach. The technology is so precise that it even adapts the audio based on where you are looking – just like real life.
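To make the positional-audio idea concrete, here is a minimal sketch of distance-based loudness and left/right panning. This is an illustrative toy, not Sphereo’s actual renderer: the function name, the inverse-distance law, and the two-channel constant-power pan are all simplifying assumptions (real 3D audio uses many sources, HRTFs, and room modeling).

```python
import math

def positional_gains(listener_pos, listener_yaw, source_pos, ref_dist=1.0):
    """Return (left, right) gains for one instrument's audio feed.

    A toy stand-in for a full 3D renderer: loudness falls off with
    distance (inverse-distance law), and the source is panned between
    the ears based on where the listener is looking.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    dist = max(math.hypot(dx, dy), ref_dist)
    gain = ref_dist / dist                  # louder when closer

    # Angle of the source relative to the gaze direction; positive
    # means the source sits counterclockwise (to the listener's left).
    angle = math.atan2(dy, dx) - listener_yaw
    pan = -math.sin(angle)                  # -1 = hard left, +1 = hard right
    # Constant-power pan keeps perceived loudness steady as you turn.
    theta = (pan + 1.0) * math.pi / 4.0
    return gain * math.cos(theta), gain * math.sin(theta)
```

Standing next to the guitarist (small `dist`) pushes the gain up; turning your head changes `angle` and shifts the instrument between your ears, which is the effect described above.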
So, what’s it like?
Imagine sitting down and putting on a VR headset. You’re standing on stage, literally in the middle of a band. You look around – up, down, side to side – and can see whatever you want. You could see angles no one would ever think, or bother, to record. If you’re standing next to a guitarist, the guitar will sound louder because of the positional audio. You click a button and move to the other side of the stage. Thanks to the multiple 360-degree cameras you’re standing next to the drummer now, and the drums are noticeably clearer and closer to you.
One thing that might not pop into your mind is how great everything looks. This is thanks to the heat map, the head tracking built into the VR headset, and custom software. The software tracks where people tend to look most often during the concert and prioritizes that video for the highest quality in the experience, which means elements that aren’t viewed as often are delivered at lower quality.
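The mechanic behind that prioritization can be sketched in a few lines: bin logged gaze positions into a coarse tile grid, then split a fixed bitrate budget in proportion to how often each tile is watched. The frame size, tile grid, gaze distribution, and 40 Mbit/s budget are all illustrative assumptions, not EPFL’s actual parameters.

```python
import numpy as np

# Hypothetical gaze log: (x, y) pixel positions where viewers looked in an
# equirectangular frame, pooled across many playback sessions.
rng = np.random.default_rng(0)
frame_w, frame_h = 3840, 1920
gazes = rng.normal(loc=(frame_w * 0.5, frame_h * 0.5),
                   scale=(400, 200), size=(10_000, 2))

# Bin the gazes into a coarse grid of tiles -> the "heat map".
tiles_x, tiles_y = 8, 4
heat, _, _ = np.histogram2d(
    gazes[:, 0], gazes[:, 1],
    bins=[tiles_x, tiles_y],
    range=[[0, frame_w], [0, frame_h]],
)

# Spend a fixed bitrate budget in proportion to how often each tile is
# watched: hot tiles stay sharp, cold tiles get compressed hard.
budget_mbps = 40.0
bitrate_per_tile = budget_mbps * heat / heat.sum()
```

With gazes clustered at the stage center, the central tiles receive most of the budget while the tiles behind the viewer get only a trickle, which is exactly the trade-off described above.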
Why the compromise? It’s because this is where the bleeding edge meets reality. The amount of storage required to record an entire concert in 360 degrees, from multiple angles and in high-resolution, is tremendous. But perhaps the bigger compromise comes from bandwidth at multiple points. Networks, and the cables that transfer data from storage devices to computers or VR headsets, can only move so much data per second, and the data from this immersive experience is too great to transmit the entire experience at maximum resolution.
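Some back-of-the-envelope arithmetic shows why the bandwidth wall is unavoidable. The resolution, frame rate, and compression ratio below are illustrative assumptions, not festival specifications.

```python
# Why a full-quality 360-degree stream is too big to ship uncompressed.
width, height = 7680, 3840     # one 8K equirectangular frame
fps = 30
bits_per_pixel = 12            # 8-bit color with 4:2:0 chroma subsampling

raw_bps = width * height * bits_per_pixel * fps
print(f"uncompressed: {raw_bps / 1e9:.1f} Gbit/s")   # ~10.6 Gbit/s

# Even a strong ~300:1 codec ratio leaves roughly 35 Mbit/s -- and that's
# per camera, before multiple angles and audio are added on top.
compressed_bps = raw_bps / 300
print(f"compressed:   {compressed_bps / 1e6:.1f} Mbit/s")
```

A single stream at ~10.6 Gbit/s uncompressed dwarfs typical home connections, and even after aggressive compression several camera feeds quickly exhaust a viewer’s bandwidth, hence the heat-map-driven prioritization.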
Still to come: The Bleeding Edge
There was one more technology being tested at Montreux that we haven’t talked about yet: light field image capture. You may have heard of light field cameras – their standout feature was letting you alter the focus of a scene after a photo was taken. This is possible because they capture more information about the light in a scene than a typical camera does. In the context of the immersive VR experience, light field technology would let you float through a scene. In the current VR experience, if you want a different spot to watch the concert from, you’re instantly teleported there. With light field data you could float over, or simply wander through, the scene, with the sound adjusting accordingly – just like moving through a first-person video game. Exciting stuff!
How you can experience the Montreux Jazz Festival archive yourself
The concert ends. The crowd roars. The band takes a bow. You remove the VR headset. Congratulations – you just experienced a concert that took place the year before. But that’s not the only way you can enjoy past performances.
Besides the aforementioned Montreux Jazz homepage, you can also visit a variety of interactive exhibits throughout the festival itself, some of them free to the public. For example, this year the festival featured a car converted to faithfully reproduce the acoustics of past concerts: two people sit inside, select a recording, and relax as they are bathed in audio and video. It’s as close to VR as you can get without the headset. You can also visit one of the Montreux Jazz Cafés where, alongside a great meal, you can see photos and other memorabilia about Montreux and use kiosks to choose a concert from the festival’s archive for playback.
Finally, we’d be remiss if we didn’t suggest attending the Montreux Jazz Festival – or any music festival, really – yourself. Experiencing a performance in person is truly unforgettable, whether on the shores of Lake Geneva or at a venue near you, especially with friends and family in tow.