Lost in Space is a 40-minute show developed and written by Andy Lawrence (Regius Professor of Astronomy at the Royal Observatory of Scotland), myself, and our co-writers and producers. The show combines a live presenter, a new surround-sound electronic score, and 3D visuals.
My role in this was as composer and producer – I developed all the electronic music, wrote the string score, developed the software for performing the audio live, ran recording sessions, and worked with venues on booking and audio setup. We performed the show nearly 50 times, with an estimated reach of around 5,000 people.
All of the music can be performed live or played back from pre-recorded tracks, which were recorded by members of the Scottish Chamber Orchestra. There was also room for various extras – the last performance included a ‘Puffer Sphere’, a spherical touch screen that brought some of the objects in the 3D visuals to life. It was placed very close to the audience and was popular with children.
The 3D visuals are provided by Robert Motyka, who is developing a new kind of technology that delivers a 3D experience without the need for special glasses. The idea is to use two screens – a solid screen at the back and a mesh screen at the front. By projecting the same image onto both, altered slightly for each, the setup creates a perception of depth.
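The core idea – two copies of the same frame, slightly offset between the back screen and the front mesh so the eye reads depth – can be illustrated with a toy sketch. This is only a minimal illustration of the parallax principle with assumed parameters; it is not Robert's actual rendering pipeline.

```python
import numpy as np

def screen_pair(frame, disparity=4):
    """Toy two-screen parallax: the back screen shows the frame as-is,
    the front mesh shows the same frame shifted a few pixels sideways.
    The small horizontal disparity is what creates the sense of depth."""
    back = frame
    front = np.roll(frame, disparity, axis=1)  # shift columns sideways
    return back, front

# A tiny stand-in "frame" (3 rows x 4 columns of pixel values)
frame = np.arange(12).reshape(3, 4)
back, front = screen_pair(frame, disparity=1)
```

Here each column of the front-mesh image is the back-screen image displaced by `disparity` pixels; in practice the offset would vary per object to place it nearer or farther from the viewer.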
Lost in Space combines the experience of astronomers and artists to paint a unique picture of the relationship between humans and the universe. We wish to present something genuinely gripping and visceral that uses information and data from expert astronomers and the craft of emerging artists, in a one-of-a-kind science drama.
Below is a 2D rendering with stereo audio with Andy as the narrator.
My role as composer/producer entailed a variety of jobs. The first was to come up with an electronic score that could function as the backbone of the work. This was done in a variety of ways, including granular synthesis, audio convolution, and time stretching – but the result of each process was a different ‘drone’ that could serve as an emotional anchor for a given section.
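To give a flavour of one of these processes, here is a minimal granular-synthesis sketch: many short, windowed grains pulled from random positions in a source recording are overlapped to produce a static drone texture. This is a generic illustration of the technique (with an assumed toy sine-wave source and made-up parameters), not the actual patch used in the show.

```python
import numpy as np

def granular_drone(source, out_len, grain_len=2048, n_grains=400, seed=0):
    """Overlap-add many short Hann-windowed grains, each copied from a
    random position in `source` to a random position in the output,
    smearing the source into a sustained 'drone' texture."""
    rng = np.random.default_rng(seed)
    out = np.zeros(out_len)
    window = np.hanning(grain_len)  # tapers each grain to avoid clicks
    for _ in range(n_grains):
        src = rng.integers(0, len(source) - grain_len)
        dst = rng.integers(0, out_len - grain_len)
        out[dst:dst + grain_len] += source[src:src + grain_len] * window
    return out / np.max(np.abs(out))  # normalise to [-1, 1]

# Toy source: one second of a 220 Hz sine at 44.1 kHz
sr = 44100
t = np.arange(sr) / sr
source = np.sin(2 * np.pi * 220 * t)
drone = granular_drone(source, out_len=5 * sr)  # five seconds of drone
```

Changing grain length, density, and where grains are drawn from shifts the character of the texture, which is what makes the technique useful for building distinct emotional anchors per section.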
To be able to play the musical score, I developed a piece of software in MaxMSP that functions as my interface for the show (below). It is a one-screen interface with volume control for all the speakers and pre-timed cues for fading audio cues and sound effects in and out, with the ability to change or delay things on the fly depending on how the narration is going.
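The cue logic can be sketched in a few lines. The actual software is a MaxMSP patch; the Python below is only an assumed reconstruction of the idea – each cue has a start time on the show clock and a linear fade envelope, and pending cues can be pushed back when the narration runs long.

```python
from dataclasses import dataclass

@dataclass
class Cue:
    name: str
    start: float   # seconds into the show when the cue begins
    length: float  # how long it sounds, in seconds
    fade: float    # fade-in / fade-out duration, in seconds

    def gain(self, t):
        """Linear fade-in/out envelope for show-clock time t."""
        if t < self.start or t > self.start + self.length:
            return 0.0
        into = t - self.start                 # time since cue start
        left = self.start + self.length - t   # time until cue end
        return min(1.0, into / self.fade, left / self.fade)

class CueSheet:
    def __init__(self, cues):
        self.cues = cues

    def delay_pending(self, now, seconds):
        """Narration is running long: push back every cue that
        hasn't started yet, leaving sounding cues untouched."""
        for cue in self.cues:
            if cue.start > now:
                cue.start += seconds

# Example: a drone cue and a later effect cue
drone = Cue("section drone", start=10.0, length=20.0, fade=2.0)
hit = Cue("late effect", start=50.0, length=4.0, fade=0.5)
sheet = CueSheet([drone, hit])
sheet.delay_pending(now=20.0, seconds=5.0)  # only `hit` moves
```

In the real patch the fades would drive per-speaker gains rather than a single envelope, but the scheduling idea is the same.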
I also added a variety of sound effects that the presenter could trigger on cue: the ‘Trigger Effect!!!’ button can be fired remotely using a small USB remote. With both the presenter and myself in control, he had the flexibility to trigger things when he wanted, but if he missed a cue, I was able to trigger it myself.
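With two people able to press the same button, one practical detail is making sure a near-simultaneous double press doesn't play the effect twice. A hedged sketch of that shared-trigger idea (an assumed lockout mechanism, not the actual Max patch):

```python
import time

class EffectTrigger:
    """One effect, two controllers: either the presenter's USB remote
    or the operator can fire it, and a short lockout window swallows
    a duplicate press from the other person."""
    def __init__(self, play_fn, lockout=1.0):
        self.play_fn = play_fn        # callback that actually plays audio
        self.lockout = lockout        # seconds during which repeats are ignored
        self._last = -float("inf")

    def fire(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self._last < self.lockout:
            return False              # duplicate press ignored
        self._last = now
        self.play_fn()
        return True

# Simulated presses: presenter fires, operator double-presses moments later
played = []
trigger = EffectTrigger(lambda: played.append("whoosh"), lockout=1.0)
trigger.fire(now=0.0)   # presenter – plays
trigger.fire(now=0.4)   # operator backup – swallowed
trigger.fire(now=5.0)   # next cue – plays
```

The same structure covers the missed-cue case: if the presenter never presses, the operator's press is simply the first one inside the window and fires normally.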
My next task was to develop a score for string quartet – I was lucky enough to work with the amazing Aisling O’Dea and other members of the Scottish Chamber Orchestra. The score uses a series of cues to line up metered and unmetered textures. The cues keep the ensemble together and subtly change the material each string player performs, so one doesn’t necessarily feel every musical change from moment to moment but does feel the change from minute to minute. For example, in the musical cue below (heard in the video at the 16:00 mark), the first violin plays a solo on top of this texture.
To see the entirety of the score and how it lines up with the words, there is a PDF below.