The Development of the !trumpet
From Firenze University Press Journal: Music/Technology
The !trumpet is a software synthesis system controlled from, and playing back through, a trumpet. It is not an electronically extended trumpet (like those of Ben Neill, Axel Dörner or Jonathan Impett, among others): the player produces no acoustic sounds by blowing through the mouthpiece. Instead, breath pressure and valve movement on the brass instrument are read by an embedded Arduino microcontroller and sent to a laptop, where the data is mapped onto various parameters in synthesis software. The resulting electronic sound is returned to the trumpet, where it plays through a loudspeaker inside the bell and is further processed acoustically by valve position (changes in the length of tubing filter the speaker output), movement of a plunger mute (wah-wah-style filtering), and orientation of the instrument in space (panning). The built-in speaker gives the !trumpet a self-contained acoustic quality, rare among electronic instruments, that blends well with more conventional instruments on stage.
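The mapping stage described above — raw Arduino sensor readings translated into synthesis parameters on the laptop — might look something like the following minimal sketch. The function name, parameter ranges, and the particular mapping scheme (a 10-bit breath-pressure value normalized to amplitude, and the three valve states treated as a 3-bit index) are illustrative assumptions, not the actual !trumpet code.

```python
# Hypothetical sensor-to-parameter mapping for a !trumpet-like instrument.
# Assumes the Arduino sends a 10-bit breath-pressure reading (0-1023, the
# range of its analog-to-digital converter) plus three valve states.

def map_sensors(breath: int, valves: tuple) -> dict:
    """Map raw sensor values to two example synthesis parameters."""
    # Normalize breath pressure to 0.0-1.0 and use it as amplitude.
    amplitude = max(0, min(breath, 1023)) / 1023.0
    # Treat the valve combination as a 3-bit index selecting one of
    # eight synthesis presets (an assumed scheme for illustration).
    index = (int(valves[0]) << 2) | (int(valves[1]) << 1) | int(valves[2])
    return {"amplitude": amplitude, "preset_index": index}


# Example: half breath pressure, first and third valves depressed.
params = map_sensors(512, (True, False, True))
```

In practice such a mapping would run continuously on incoming serial or OSC messages, with smoothing applied to the breath signal to avoid zipper noise.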
The speaker is constrained to the bandwidth of a conventional trumpet (mid to high frequencies), but the performer can direct a full-range signal to stereo line outputs for connection to a PA system when bass frequencies or higher sound levels are desired. The mute contains seven momentary switches for controlling various functions in the software. Switch closures are sent to the Arduino on the trumpet body via an infrared link (similar to a TV remote control). Two additional momentary switches, mounted on the trumpet itself, control the routing of the audio to the built-in speaker and the line output.

In a nod to David Tudor’s legendary composition Bandoneon! I dubbed this instrument “!trumpet”. But where Tudor employed the “!” to indicate “factorial”, I use the sign for its logical property of negation: this is definitely not a trumpet.
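The mute's infrared link has to get seven switch states to the Arduino over a noisy optical channel. One simple way to do this — an assumed scheme for illustration only, not the !trumpet's documented protocol — is to pack the switch states into a single byte, one bit per switch, and reserve the top bit for parity so corrupted transmissions can be rejected:

```python
# Illustrative encoding of seven momentary-switch states into one byte
# for transmission over an IR link. One bit per switch; the most
# significant bit carries even parity for basic error detection.
# This is an assumed scheme, not the !trumpet's actual protocol.

def encode_switches(switches) -> int:
    """Pack seven switch states (booleans) into a single byte."""
    if len(switches) != 7:
        raise ValueError("expected exactly seven switch states")
    value = 0
    for i, closed in enumerate(switches):
        value |= int(bool(closed)) << i
    parity = bin(value).count("1") & 1
    return value | (parity << 7)


def decode_switches(byte: int):
    """Unpack a received byte; return None if the parity check fails."""
    value = byte & 0x7F
    parity = bin(value).count("1") & 1
    if (byte >> 7) != parity:
        return None  # transmission error: discard the message
    return [bool((value >> i) & 1) for i in range(7)]
```

On the hardware side, the byte would be modulated onto a carrier (TV remotes typically use ~38 kHz) and demodulated by an IR receiver module feeding the Arduino.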
The !trumpet is the latest iteration of a design concept that dates back to instruments I built as an undergraduate at Wesleyan University in the 1970s: adapting conventional instruments for the acoustic manipulation of electronic sound. Under the influence of Alvin Lucier I composed a number of pieces that used feedback to articulate acoustical characteristics of architectural spaces, and eventually extended those techniques to “playing” acoustic instruments. In Feetback (1975), for example, I fitted small microphones inside the mouthpieces of two brass or woodwind instruments, and connected them to speakers (high-frequency horn drivers) coupled to the mouthpieces of two other instruments. The players changed fingering and spatial orientation of the instruments to elicit different feedback pitches as they walked through the performance space — using feedback to overblow the harmonic series of the instruments as they intersected with that of the room.

In 1982 I built the first of a series of “backwards electric guitars”: electric guitars whose pickups are wired to the speaker outputs of amplifiers, so that the strings can be resonated with sound (similar to shouting into a piano with the sustain pedal down); chording and dampening the strings filter the sounds. As with the feedback-driven wind and brass instruments, the overtones of the guitar strings were elicited electronically instead of through the usual playing techniques.

In 1986 I added a small keypad to a speaker-loaded trombone, linked an optical shaft encoder to the slide, and wired the instrument to a home-made digital signal processor and sampler. By pressing switches and moving the slide I could increment and decrement values in a computer program — in effect clicking and dragging a mouse without having to look at a screen.
Read Full Text: https://oajournals.fupress.net/index.php/mt/article/view/13302