Stage Visuals, Custom Software, User Interface, Live Performance
To showcase the power of its new line of Ultrabook laptops for musicians and creators, Intel held a national DJ competition in which over 400 DJs entered and battled it out across six regional heats.
For the final night of the event series, the London-based studio Marshmallow Laser Feast was tasked with creating an immersive audio-visual experience and teamed up with elektropastete to bring their installation to life.
I was part of the team that created the visual content for the show, providing each of the six DJs in the final round with a personal visual style that would react to their performance on stage and the tracks they played.
We realized early on that no existing software tool would offer enough creative freedom to combine all the data inputs coming from the DJs while also integrating live control for a direct interplay between performer, music, and surrounding visuals, so we decided to build our own using vvvv.
My task was to design the software core and the user interface we would use to drive the show, as well as to create visual content and perform with it during the event.
Provided with a number of Ultrabooks to work with, I had the opportunity to iterate on some advanced concepts for our custom live performance interface and was able to add features not usually present in more general-purpose VJ programs.
The basic structure of the tool was a two-deck video mixer with some post-processing visual effects before the output stage. Each of the two decks would contain a module that rendered generative visual content in realtime.
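The two-deck structure can be sketched as a simple signal chain: each deck renders a frame, the two frames are mixed, and the result passes through the post-FX stage. This is a minimal illustrative sketch in Python; all names are assumptions, as the actual tool was built as a vvvv patch.

```python
class Deck:
    """Holds one generative content module and renders its output."""
    def __init__(self, module):
        self.module = module  # callable: time -> frame (modeled here as a number)

    def render(self, t):
        return self.module(t)

def crossfade(a, b, fader):
    """Linear A/B mix; fader in [0, 1]: 0 = deck A only, 1 = deck B only."""
    return a * (1.0 - fader) + b * fader

def post_fx(frame, effects):
    """Apply the post-processing chain before the output stage."""
    for fx in effects:
        frame = fx(frame)
    return frame

# Usage: two toy modules, a 50/50 mix, then one invert-style effect.
deck_a = Deck(lambda t: 0.2)
deck_b = Deck(lambda t: 0.8)
mixed = crossfade(deck_a.render(0.0), deck_b.render(0.0), 0.5)
out = post_fx(mixed, [lambda f: 1.0 - f])
```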
To the visual artists preparing the animation modules, the system architecture provided audio analysis input and various clock sources to help build rhythmic animations synchronized to the music. On top of that, it gave access to Kinect data capturing the DJs' movements, as well as a preset system for internal parameters with 8 slots and up to 6 additional external parameters that could be tweaked live.
To enable the visual performers to easily play modules they had not created themselves, and to give the system more coherence, four of the external parameters were predefined across all modules in how they should influence the visual output. This left two »magic« parameters that the module author could use freely. For further clarity later on, the author could annotate all parameters during development.
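The parameter scheme above could be modeled roughly like this. The sketch is illustrative only: the predefined parameter names are invented, and the real system was a vvvv patch, not Python.

```python
from dataclasses import dataclass, field

# Hypothetical names for the four parameters predefined across all modules.
PREDEFINED = ("intensity", "speed", "complexity", "color")

@dataclass
class Parameter:
    label: str          # author annotation shown in the deck UI
    used: bool = True   # whether a preset actually reacts to this parameter
    value: float = 0.0

@dataclass
class Module:
    name: str
    presets: list = field(default_factory=list)  # up to 8 preset slots
    magic: dict = field(default_factory=dict)    # the two free parameters

    def add_magic(self, key, label):
        """Register one of the two freely usable »magic« parameters."""
        if len(self.magic) >= 2:
            raise ValueError("only two free parameters per module")
        self.magic[key] = Parameter(label)

# Usage: a module author annotates their two free parameters.
mod = Module("particles")
mod.add_magic("m1", "turbulence")
mod.add_magic("m2", "trail length")
```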
During performance, module and preset selection, as well as parameter control, were handled using MIDI controllers, eliminating the need for keyboard and mouse interaction.
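A MIDI-driven control surface like this boils down to routing incoming messages to named controls. The following is a hedged sketch; the CC numbers and action names are invented for illustration and do not reflect the actual mapping used in the show.

```python
# Map (status byte, controller number) pairs to named controls.
MIDI_MAP = {
    (0xB0, 20): "select_module",   # control change on channel 1
    (0xB0, 21): "select_preset",
    (0xB0, 22): "param_1",
}

def handle_midi(status, cc, value, state):
    """Route a (status, controller, value) triple to a named control."""
    action = MIDI_MAP.get((status, cc))
    if action is None:
        return state  # unmapped message: leave state untouched
    state = dict(state)
    state[action] = value / 127.0  # normalize 7-bit MIDI value to [0, 1]
    return state

# Usage: a fully turned knob on CC 22 drives param_1 to 1.0.
state = handle_midi(0xB0, 22, 127, {})
```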
For each DJ playing, two modules were created and performed live during their set.
Having multiple powerful laptops at my disposal enabled me to provide generous visual feedback to the performer.
One Ultrabook served as the master in a network of four machines. Its responsibility was to generate the actual video output sent to the LED wall elements on the stage, and to provide the main user interface for monitoring output, audio analysis, module selection, and post-FX.
An additional Ultrabook was positioned in close proximity to the DJ, with a Kinect depth camera connected to track the DJ's position and movement. It sent this information over Ethernet to the master, where the content modules consumed it.
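Streaming tracking data from the Kinect machine to the master amounts to sending small datagrams over the local network. This is a minimal loopback sketch assuming a JSON-over-UDP format; the actual show used vvvv, and the message format here is an assumption.

```python
import json
import socket

def send_tracking(sock, addr, x, y, z):
    """Pack the DJ's tracked position and send it as one UDP datagram."""
    payload = json.dumps({"x": x, "y": y, "z": z}).encode()
    sock.sendto(payload, addr)

# Loopback demo: a receiver standing in for the master machine.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))                  # OS picks a free port
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_tracking(send, recv.getsockname(), 0.1, 1.5, 2.0)
data, _ = recv.recvfrom(1024)
recv.close()
send.close()
```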
For each of the two decks, a dedicated Ultrabook was synchronized to the master over Ethernet, receiving the current state of all control parameters. It displayed all available presets of its deck as previews rendered in realtime, and visualized the state of the additional parameters, including the custom labelling and an indication of whether a given preset actually made use of a specific parameter.
The positioning of all on-screen elements followed the spatial configuration of the physical controls on the MIDI controllers, enabling a very intuitive understanding of the control layout.
Having the opportunity to apply this level of attention to detail resulted in a very smooth experience during the live performance part of the project.