
Soundscapes I & II 2002

“There is geometry in the humming of the strings ... there is music in the spacing of the spheres.” – Pythagoras

It seemed clear to the Pythagoreans that the distances between the planets had the same ratios that produce harmonious sounds in the plucked string of a musical instrument. To them, the solar system consisted of ten spheres revolving in circles around a central fire, each sphere giving off a sound the way a projectile does as it moves through the air; the closer spheres gave lower tones, while those furthest away moved faster and gave higher-pitched sounds. All combined into a beautiful harmony... the music of the spheres.

Equally, Aboriginal culture embodies sound creation in different forms, one of which is expressed through the labyrinth of invisible pathways that meander all over Australia, known as 'Dreaming-tracks' or 'Songlines'. To Aboriginal people these were the 'Footprints of the Ancestors'. Aboriginal creation myths tell of legendary totemic beings who wandered over the continent in the Dreamtime, singing out the names of everything that crossed their path - birds, animals, plants, rocks - and so singing the world into existence.

There is nothing new, culturally or historically, in the belief that the Universe, at both a micro and a macro level, has its own resonance, an inner voice that can sometimes be heard if we care to listen closely. 'Soundscapes' is a contemporary exploration of this theme. By re-synthesising and giving voice to images of the environments within which we exist - open landscapes, cityscapes and urban sprawls - the work digitally re-interprets our surroundings, generating sound from moving imagery of landscapes and cityscapes.


General Description

Soundscapes consists of a database (catalogue) of looped moving-image sequences, each of ten seconds' duration, that can be selected and onto which five moveable 'targets' are superimposed (see images below), whose velocity and position can be controlled by the user. As these targets scan across the image, data is collected from the colour, tonality and patterns of the underlying image and fed into sound-generation and compositional algorithms.
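The target-scanning step can be sketched in outline. Soundscapes itself is built in MAX/MSP and NATO rather than Python, so everything below - the Target class, the sample_frame function, the toy two-by-two frame - is a hypothetical illustration of the idea, not the installation's code.

```python
# Illustrative sketch only: the real work runs as MAX/MSP + NATO patches.
# All names here (Target, sample_frame) are invented for this example.
from dataclasses import dataclass

@dataclass
class Target:
    x: float       # horizontal position, 0..1 across the frame
    y: float       # vertical position, 0..1 down the frame
    vx: float      # horizontal velocity per tick
    vy: float      # vertical velocity per tick

    def step(self):
        # Advance the target, wrapping around the frame edges
        self.x = (self.x + self.vx) % 1.0
        self.y = (self.y + self.vy) % 1.0

def sample_frame(frame, target):
    """Read the pixel under a target from a frame (a list of rows of
    (r, g, b) tuples, values 0-255) and return its tonality as 0..1."""
    h, w = len(frame), len(frame[0])
    r, g, b = frame[int(target.y * h) % h][int(target.x * w) % w]
    return (r + g + b) / (3 * 255)

# A tiny 2x2 'frame': dark top row, bright bottom row
frame = [[(0, 0, 0), (30, 30, 30)],
         [(255, 255, 255), (200, 200, 200)]]
t = Target(x=0.0, y=0.9, vx=0.5, vy=0.0)
print(round(sample_frame(frame, t), 2))  # → 1.0 (white pixel under the target)
```

Each tick, every target would advance via step() and yield a fresh stream of values for the compositional algorithms to consume.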

The compositional algorithms are plans or methods for performing actions, operations or procedures on musical material or other data. In this case a number of different strategies have been adapted to create the sonic landscapes, or soundscapes. In this way Soundscapes uses both simple and more complex algorithms to determine variations in pitch, frequency, duration, transposition, tonal range and other factors in its interpretation of the image data.
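As a hedged illustration of what such a strategy might look like, the sketch below maps a 0..1 tonality value onto pitch in two contrasting ways: one snapped to a pentatonic scale, one mapped straight to raw frequency with no reference to any scale. The function names and the scale choice are invented for this example; the actual algorithms are MAX patches.

```python
# Hypothetical mappings from image data to pitch - not the work's own code.
PENTATONIC = [0, 2, 4, 7, 9]          # scale degrees in semitones

def brightness_to_midi(brightness, low=48, span=24):
    """Map a 0..1 brightness onto a MIDI note in a two-octave range,
    snapped to the nearest pentatonic degree (a 'melodic' strategy)."""
    raw = low + brightness * span
    octave, rest = divmod(int(raw) - low, 12)
    degree = min(PENTATONIC, key=lambda d: abs(d - rest))
    return low + 12 * octave + degree

def brightness_to_hz(brightness, lo=110.0, hi=1760.0):
    """Map 0..1 brightness straight onto frequency (a 'discordant'
    strategy): exponential sweep, unrelated to any musical scale."""
    return lo * (hi / lo) ** brightness

print(brightness_to_midi(0.5), round(brightness_to_hz(0.5), 1))  # → 60 440.0
```

The same incoming data stream can thus sound consonant or dissonant depending purely on which mapping - which 'mood' - is active.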

Rather than using pre-recorded or sampled sounds, Soundscapes generates its own compositions in real time from the image data being processed. It also processes the selected image data in real time, applying a compositional strategy (algorithm), or 'mood', chosen by the user.

These moods therefore form the basic building blocks of the work and can vary considerably, from simple melodic forms and natural sounds to abrupt, discordant passages that bear no relation to classical scales, harmony or melody. Currently the majority of these moods are based on synthesis techniques such as amplitude modulation (AM synthesis) and frequency modulation (FM synthesis). FM synthesis, for example, is very good at emulating acoustic instruments and at producing complex and unusual tones in a computationally efficient way, which is particularly crucial in real-time work such as this.
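The FM technique mentioned above reduces to a single formula: a carrier sine wave whose phase is modulated by a second sine. The sketch below is a generic two-operator FM voice written in Python for illustration; the installation's own moods are implemented as MSP signal patches, not this code.

```python
import math

def fm_sample(t, fc=220.0, fm=110.0, index=2.0):
    """One sample of two-operator FM synthesis: a carrier at fc Hz whose
    phase is modulated by a sine at fm Hz. 'index' sets the sideband
    richness (and hence timbre) at almost no computational cost."""
    return math.sin(2 * math.pi * fc * t
                    + index * math.sin(2 * math.pi * fm * t))

SR = 44100                                            # sample rate in Hz
tone = [fm_sample(n / SR) for n in range(SR // 10)]   # 100 ms of audio
print(len(tone), max(tone) <= 1.0)                    # → 4410 True
```

Varying the modulation index over time - here fixed at 2.0 - is what lets a handful of multiplies and a table lookup per sample produce anything from bell-like tones to noisy clangour, which is why FM suits real-time work.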


Installation Description

The installation consists of a single screen, an Apple Mac G4 with mouse (no keyboard) and a data projector, together with a Harman Kardon USB speaker system and base unit. Selections are made from the user menus, enabling control of the moving-image database, target speed, target positioning and mood, amongst other settings.

The installation itself is flexible in its arrangement of screen and projector, and requires a darkened area and some sound isolation to take full advantage of the sound output. A larger-scale version, which uses spectator positioning and movement to determine the evolution of the soundscapes, was also developed.


Technical Description

The installation utilises a number of programs running together, namely MAX/MSP and NATO.0+55+3d modular. NATO acts as the real-time image-processing element, handling the data gathering and the manipulation of the targets; MSP looks after the real-time sound synthesis; and MAX is the main programming environment.

The database of images comprises numerous ten-second looped sequences of various urban and rural landscapes, shot by the artist with the intention of building an ongoing repository of moving-image landscapes and cityscapes.

MAX is a graphical programming language, based on C, that allows the user to create simple or complex objects. For example, whilst the interface to Soundscapes looks very plain, hidden beneath the surface are numerous program levels, referred to as 'patches', with deeper levels of patches within patches.

NATO.0+55+3d modular comprises a set of QuickTime 'externals' for MAX; that is, NATO permits any sort of QuickTime media (films, images, sound, QuickTime VR, QuickDraw 3D, Flash movies, etc.) to be handled from within MAX in the same fashion as MIDI or audio data is handled by the built-in functions of MSP. NATO interfaces with MAX in the same manner as MSP does. MIDI and numerical data can be used to control any NATO function, and consequently complex structures can be built around QuickTime data, permitting control at whatever level is preferred.

© 1996-2024 Nigel Johnson. Interactive Digital Media Artist. All Rights Reserved. www.nigel-johnson.com
