A musical machine, an event, an installation and a sonic landscape, Prototype for a Spatialised Instrument explores and interrogates the relationships between instrument, player, composer, audience and location. Borrowing from the acoustic principles of piano construction, the bespoke musical instrument is arranged as an inhabitable environment intended to reveal new or different understandings of place, space and sound. A responsive environment, it is intended to challenge our appreciation of how sound interacts with objects, our bodies and our surroundings.

Architecture that deals with sound is predominantly space designed for sound – buildings for performance – but this installation treats architecture as the performative provocateur. More than a passive container for sound, it actively shapes, modulates, conditions and makes sound. Developed in Diploma Unit 23 at the Bartlett, the project originated at Orford Ness on the Suffolk coast near Aldeburgh, where the composer Benjamin Britten lived and worked and established the Aldeburgh Music Festival.

The intention was to design a building that responded to this musical context, focusing on the composer’s instrument, the piano – a remarkable and surprisingly strange device on close inspection. It is a stringed instrument, but its tuning is pre-determined. It is a percussion instrument and at the same time a keyboard instrument: its strings are struck with hammers, but it is played from a keyboard. The piano is also compromised by its size, its bass strings wound with copper wire to lower their frequency. Ideally the strings should be as thin and at as high a tension as possible, so the project sets out to answer the question: what happens if you design a piano technically as it wants to be, and then design architecture around it?

The project deconstructs the piano, re-designing and fabricating it into a new spatial proposition. It has 13 notes, each akin to a monochord constructed from the principal parts of a piano: frame, soundboard, strings and action. The instrument, however, does not have a keyboard; it is driven instead by an Arduino microcontroller and a computer running custom software built in Processing. Solenoids placed on each note actuate in response to site conditions such as ambient sound and movement. The audience and elements of the site become players and composers of a generated, site- and time-specific composition. A series of software programs was developed to drive the behaviour of the piece, exploring a range of potential relationships between location, instrument, player, composer and audience.

•Digital Player Piano: Playback of a pre-composed arrangement using MIDI (Musical Instrument Digital Interface).

•Sound Recognition, Analysis and Interpretation: A Fourier transform algorithm allowing ambient noise, someone speaking, singing or another musician playing to interact with the piece.

•Body Presence: Infrared sensing to detect the location of an audience.

•Motion Recognition: Using a live video feed, an optical flow algorithm senses motion, triggering a composition according to the types of movement through and around a space.
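As a rough illustration of the sound-recognition mode, the sketch below (written in Python with NumPy rather than the project’s actual Processing code, which is not reproduced here) finds the dominant frequency of a signal with a Fourier transform and maps it to the nearest of the instrument’s 13 equal-tempered notes. The sample rate and the note-mapping logic are assumptions made for the sketch.

```python
import numpy as np

SAMPLE_RATE = 8000   # Hz; an assumption for this sketch
MIDDLE_C = 261.626   # Hz; the instrument's lowest note (from the text)

def dominant_frequency(signal, sample_rate=SAMPLE_RATE):
    """Return the strongest frequency component of a mono signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def nearest_note(freq, n_notes=13):
    """Index (0-12) of the equal-tempered note closest to freq."""
    pitches = MIDDLE_C * 2.0 ** (np.arange(n_notes) / 12.0)
    return int(np.argmin(np.abs(pitches - freq)))

# One second of a 440 Hz sine wave standing in for ambient sound:
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
heard = dominant_frequency(np.sin(2 * np.pi * 440.0 * t))
note = nearest_note(heard)   # A, nine semitones above middle C
```

In the installation the detected pitch would then be routed to the solenoid of the corresponding note; here the mapping simply returns a note index.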

The project is a digital-analogue hybrid, both in the way it is designed and made and in the way it is played and interacted with: many of the parts are designed and cut using digital processes and then assembled by hand, and the sound produced is acoustic but generated by physical interaction with a digital interface. Through its development, some of the possibilities that digital fabrication opens up to the architect are explored. Two types of manufacturing technology were used: laser-cutting, of both steel for the frames and plywood for the soundboard stitches, and CNC routing for cutting the soundboards.

The frame of a piano is cast iron and has to withstand tension forces of around 20 tonnes. As each note of the prototype was a different size, governed by the speaking length of its string – from 1000mm for middle C on a piano (261.626Hz) to 500mm for the C one octave above (523.251Hz) – a method of making that would allow for bespoke one-offs was needed. The shape of the notes is partly generated by pragmatic considerations of structure – the frame has to resist the tension of the strings while supporting a soundboard of sufficient size to act as a transducer and radiate their vibration – and partly intuitive, developed through a series of models.
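The inverse relationship between speaking length and pitch can be checked with a short calculation. This is a sketch based on Mersenne’s law (f = (1/2L)·√(T/μ)): at fixed tension and string density, frequency is inversely proportional to length, so halving the 1000mm middle-C string yields the octave at 500mm, and the intermediate lengths of the 13 equal-tempered notes follow the same rule.

```python
MIDDLE_C = 261.626   # Hz, speaking length 1000 mm (from the text)

def speaking_length_mm(semitones_above_middle_c):
    """Speaking length for an equal-tempered note, assuming fixed
    tension and string density, so frequency is inversely
    proportional to length (Mersenne's law)."""
    freq = MIDDLE_C * 2.0 ** (semitones_above_middle_c / 12.0)
    return 1000.0 * MIDDLE_C / freq

lengths = [speaking_length_mm(n) for n in range(13)]
# lengths runs from 1000.0 mm (middle C, 261.626 Hz)
# down to 500.0 mm (C one octave above, 523.251 Hz)
```

In the built prototype each note’s frame and soundboard size follows from this string length, which is why a bespoke one-off was needed for every note.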

The notes were modelled in three dimensions and the individual components extracted and flattened to produce digital cutting patterns. Achieving a perfect fit between parts, and accurately jigging and aligning pieces in order to weld them, was difficult and time-consuming to do by hand. This led to the development of a self-jigging frame kit with interlocking slots and tabs requiring no more than a single clamp to align the pieces. The ring piece acts as the former around which the edge pieces are bent, whilst a series of stiffeners hold the edge and ring at ninety degrees to one another. The soundboard requires a belly to vibrate freely; it was modelled, cut and flattened virtually and the stitch profiles added in 2D. The patterns, cut out on a CNC router, were finished by hand. The ‘stitches’, also ply, were laser-cut due to their size and tested in 0.05mm increments to achieve a tight friction fit. Half of the first set of soundboards broke whilst being formed into shape.

Experiments with steaming the ply, the sequence of stitching the seams, types of ply, seam spacing and pattern were carried out in order to avoid the board cracking. A technique was developed to bring the whole board up into shape in one go using zip ties, avoiding over-stressing any one area of the board. The pin block, hitch pins and capo d’astro bar were all made by hand, and the actions from an old piano were fitted to a new bracket which set the relative positions of the hammer and damper. The bridge connecting the strings to the soundboard is a critical component; it was templated once the soundboard and strings were in place, drawn on computer and then laser-cut from mahogany to ensure even foot contact and pressure.

Prototype for a Spatialised Instrument, made possible and affordable by digital fabrication tools, is crafted through a reflexive process of making and digital modelling. As the design spaces of the architect and the fabrication spaces of the manufacturer draw together, it suggests that a more refined understanding of detail and material can be found through such processes. The role of the drawing is changed, as each line represents not only the finished article but the process of making it – the act of drawing and the act of making become inseparable.

This project has become the starting point for a longer-term proposition for the exploration and investigation of sound and architecture. Future collaboration with performers, dancers and musicians will test the boundaries and possibilities of the prototype. The exploration of scale – building the notes at the extremes, the bottom A and the top C – will present particular problems associated with manufacturing, transport, hammer and action operation, volume, decay and sustain. Experimentation with further soundboard materials and manufacturing techniques – testing CNC-milled spruce, moulded carbon fibre and formed aluminium – offers ways in which the volume and tone quality of the instrument could be improved or varied. Further opportunities include dynamically adjustable tuning, so that the instrument tunes and re-tunes in response to site feedback, and the addition of a ‘bowing’ mechanism as a means of playing to produce a constant spatial drone. Currently under development are a spatialised instrument whose form and size are dictated by its installation site, and an enclosure for the current prototype whose size and form are derived from the resonant frequencies of the twelve-tone equal temperament tuning.

This project would not have been possible without generous help from Bob Sheil, Emmanuel Vercruysse, Paul Bavister, Abi Abdolwahabi, Bim Burton, Martin Avery, Christian Nold, Jon Mercer, Justin Goodyear, Fin Fraser, Javiera Izquierdo Ieiva, Ric Lipson and Lucy Voice. Also grateful thanks to the Centre for Creative Collaboration, Brian Condon, Thias Martin and Neil Gregory.