Method & Process

The guiding concept behind this project centres on using seismic data to create an immersive sensory experience, conveying the volatile union between life on Earth and the planet’s natural forces through artistic interpretation.

All of the source material in this installation came from seismometers located at the Tungurahua volcano in Ecuador, specifically from two significant seismic events in 2013 and 2016. Some theorise that the name Tungurahua combines the Quichua tunguri (throat) and rahua (fire), meaning “Throat of Fire”; another theory derives it from the Quichua uraua, ‘crater’. Tungurahua is also known as “The Black Giant”, and in local indigenous mythology it is reportedly referred to as Mama Tungurahua (“Mother Tungurahua”).

The seismic waveform data was extracted using ObsPy and the IRIS web service, yielding a full day of recordings for each event. This data was then compressed, or ‘sped up’, into the range of human hearing: by raising the sample rate, a full day of waveform data is squeezed into just a few minutes of audio. What can then be heard is a wide range of clicks, cracks, pops and bangs produced by seismic events, such as earthquakes, of differing amplitudes and intensities. The 2013 and 2016 events play back in that order, at 4 minutes in length each. The 2013 event features a major eruption around its halfway point, bookended by a variety of tremors in the build-up to and aftermath of the eruption. The 2016 event has quite a different palette of sounds, mostly consisting of strange screaming or wailing noises emerging from the volcano. These two events were chosen to offer contrast and variety in the sounds to work with and to experience.
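The arithmetic behind this audification step can be sketched as follows. The sample rates here are illustrative assumptions, not the project’s actual values; in ObsPy this amounts to fetching the day’s waveform Stream and writing it out with a much higher playback rate.

```python
# Audification arithmetic: reinterpret a day of seismic samples at an
# audio-range sample rate so the whole day plays back in a few minutes.
# The seismometer rate below is an assumption for illustration.

SEISMIC_RATE_HZ = 100                 # assumed seismometer sampling rate
DAY_SECONDS = 24 * 60 * 60
n_samples = SEISMIC_RATE_HZ * DAY_SECONDS   # samples recorded in one day

TARGET_DURATION_S = 4 * 60            # each event plays for roughly 4 minutes
playback_rate_hz = n_samples / TARGET_DURATION_S
speed_up = playback_rate_hz / SEISMIC_RATE_HZ

print(playback_rate_hz)  # 36000.0 Hz, comfortably within the audible range
print(speed_up)          # 360.0, i.e. one day compressed 360-fold
```

Played back at 36 kHz, signals that rumbled over minutes or hours become audible clicks and bangs, which is what produces the sound palette described above.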

From these original audio files, we listened for frequencies that stood out, isolated them through EQ filtering, upward expansion and downward compression, then exported them as individual tracks. These isolated frequency tracks were processed further using delay, reverb, pitch manipulation, granular synthesis and more. All of the isolated audio files were also converted into harmony, melody and drum MIDI tracks using Ableton’s audio-to-MIDI converter, yielding interesting results. MIDI instruments were then created by assigning edited audio samples to particular notes, which were triggered by these converted MIDI files. The Expression Control Max for Live device was used to map velocity and keytracking to plug-in parameters (e.g. dry/wet, pitch, formant) and control them automatically. The decision to let the seismic data control the composition was made to emphasise the natural phenomena being observed here: to give nature creative control, in a sense.
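The kind of mapping Expression Control performs can be sketched in plain Python. The function name and the full-scale reference value are hypothetical stand-ins; the installation itself did this inside Ableton rather than in code.

```python
# Hypothetical sketch: rescale a window of seismic samples to a 0-1
# control value, the way velocity or keytracking is mapped onto a
# plug-in parameter such as dry/wet, pitch or formant.

def amplitude_to_param(samples, lo=0.0, hi=1.0):
    """Map the peak absolute amplitude of a window into [lo, hi]."""
    peak = max(abs(s) for s in samples)
    full_scale = 1000.0        # assumed full-scale amplitude (hypothetical)
    norm = min(peak / full_scale, 1.0)   # clamp so loud events cap at hi
    return lo + norm * (hi - lo)

print(amplitude_to_param([120, -340, 80]))  # 0.34
```

Larger tremors push the parameter further towards its maximum, so the intensity of the seismic event directly shapes the sound processing.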

The visuals used in this installation also originate from the volcanic regions of Ecuador, this time as captured video footage. One short panning shot of the volcanic landscape was heavily transformed and processed using Max/Jitter visual effects, with amplitude and frequency changes from the original seismic waveform data controlling the effect parameters, again with the intention of ceding creative control to the natural arrangement of the data.
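One common way to drive an effect parameter from waveform data over time is an envelope follower: a smoothed running measure of amplitude, sampled once per video frame. The sketch below is a hypothetical illustration of that idea, not the Jitter patch’s actual mapping.

```python
# Hypothetical sketch: a simple moving-average envelope follower that
# turns raw waveform amplitudes into a smooth control signal, suitable
# for driving a video effect parameter frame by frame.

def envelope(samples, window=4):
    """Moving average of absolute amplitude, one value per sample."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(abs(s) for s in chunk) / len(chunk))
    return out

print(envelope([0, 4, -8, 4], window=2))
# [0.0, 2.0, 6.0, 6.0]
```

Smoothing matters here because raw seismic samples fluctuate too quickly to read as visual motion; the envelope turns bursts of tremor into gradual swells in the video effect.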