30 March 2009


The Navigation Experience is primarily sonic. Although it is informed by visually oriented experiences using innovative internet tools in the style of Google Earth (including satellite imagery, maps, terrain, 3D buildings, etc.), the system distances itself from these and from gaming-like navigation experiences. The role of geodata and cartographic aids is simply to provide the listener with a sense of orientation within the narrative. However, there is no reason why these visually oriented technologies could not be taken into consideration in the future, should other composers wish to write for this system, as long as the driving force is sound.
Navigation and musical structure
To define the structure of the musical work, the performer constructs the journey by making continuous compositional decisions about which destinations are to be visited and in what order. A touch-screen map and a carousel menu built with Macromedia Flash were created to facilitate this task. As seen in the picture below, there are two basic states, Segments and Key-points:

- Segments are sound-walk trajectories between destinations (Key-points). Each segment has an associated set of audio files, but sound diffusion is mostly automated: the composer/performer sonically recreates the segment as she/he drives. The steering wheel and the throttle controller also have their own associated sounds and can control DSP processes.
- Key-points: each sub-destination, a so-called Key-point for 3D recreation, is a tactical stop in a distinctive location with an associated sound-world. At these points, the performer/composer makes choices and distributes the sound materials (those provided by the system for this particular place) across the space, using the Classic System (traditional sound-diffusion techniques, as in acousmatic concerts). The steering wheel and the throttle controller remain locked during stops.
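As a minimal sketch of the journey structure described above (hypothetical names and fields, not the actual implementation), a performance could be modelled as an alternating chain of Key-points and Segments:

```python
from dataclasses import dataclass, field

@dataclass
class KeyPoint:
    """A tactical stop with its own sound-world; wheel/throttle are locked here."""
    name: str
    sound_world: list          # audio files provided for this location
    controls_locked: bool = True

@dataclass
class Segment:
    """A sound-walk trajectory between two Key-points; diffusion mostly automated."""
    start: KeyPoint
    end: KeyPoint
    audio_files: list = field(default_factory=list)
    automated_diffusion: bool = True

def build_journey(keypoints, segment_audio):
    """Chain an ordered list of Key-points into the Segments that connect them."""
    pairs = zip(keypoints, keypoints[1:])
    return [Segment(a, b, audio) for (a, b), audio in zip(pairs, segment_audio)]

# Example journey with two (invented) destinations:
harbour = KeyPoint("Harbour", ["gulls.wav", "foghorn.wav"])
plaza = KeyPoint("Plaza", ["bells.wav", "crowd.wav"])
journey = build_journey([harbour, plaza], [["engine_hum.wav"]])
```

The point of the sketch is only the alternation: compositional choices happen at Key-points, while Segments carry the automated sound-walk between them.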

Mission control and the compositional paddock
Mission control is where compositional and performative decisions are taken while navigating. It is placed in the middle of the concert arena and surrounded by speakers (a 5.0 configuration or above is desirable). The navigation wheel rotates 360 degrees around its vertical axis; therefore, any position in the room could be considered the front, and the system may include more than one projection screen.

The compositional paddock: Sound materials associated with a work created for the system are pre-composed as small musical audio files, but remain to be ‘arranged’ in mosaic form during performance. Processes of chance and determinacy, and the use of metadata in sound files, are incorporated to facilitate this task. This research on reconstructed ‘scores’ focuses on the phrasing and articulation of musical ideas, including the exploration and definition of small sonic cells and their reconstruction, to create new alternatives for structuring sound and to classify sonic data. The system provides the user with alternative strategies for creative expression and responds to the performer’s decisions, while the audio materials and visuals are semi-orchestrated according to geo-position and time. The system may also retrieve and interpret real-time data from each location via the Internet (e.g. weather conditions).
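The combination of sound-file metadata and chance procedures described above could be sketched as a weighted lookup keyed by location (the metadata fields and file names here are invented for illustration; the actual system's metadata scheme is not specified):

```python
import random

# Hypothetical metadata: each pre-composed audio file is tagged with the
# Key-point it belongs to and a weight used by the chance procedure.
SOUND_BANK = [
    {"file": "bells.wav", "keypoint": "Plaza",   "weight": 3},
    {"file": "crowd.wav", "keypoint": "Plaza",   "weight": 1},
    {"file": "gulls.wav", "keypoint": "Harbour", "weight": 2},
]

def select_materials(keypoint, k=2, rng=random):
    """Chance-weighted selection among the materials tagged for this location.

    Determinacy lives in the metadata (what is eligible where); chance lives
    in the weighted draw that arranges the mosaic during performance.
    """
    pool = [s for s in SOUND_BANK if s["keypoint"] == keypoint]
    files = [s["file"] for s in pool]
    weights = [s["weight"] for s in pool]
    return rng.choices(files, weights=weights, k=k)
```

Real-time data (such as weather at the current geo-position) could then simply adjust the weights before the draw, biasing rather than dictating the selection.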