This year’s WOMAD (World of Music, Arts and Dance) festival will see a performer from Goldsmiths, University of London create music using only electrical impulses from his body.
Atau Tanaka, Professor of Media Computing at Goldsmiths, will perform his composition Myogram at the Bowers & Wilkins sound system on 29 July 2017. The piece uses a system that turns the electrical impulses generated by tensing muscles into sound.
Myogram, created in musical collaboration with Miguel Ortiz, was premiered at Goldsmiths in 2015 and since then has been performed around the world.
Ahead of his WOMAD appearance alongside artists such as Speech Debelle and Howie B, I asked Atau about the technology behind the music and the challenges of turning the human body into a musical instrument.
Pete Wilton: How did you first become interested in music controlled by gestures/the body?
Atau Tanaka: Back when I was a student at Harvard, I experienced a performance of the Japanese butoh troupe, Sankai Juku. This was back in the 80s and they were on their first tour of the US, and didn’t have the fame they have today. So at each city they came to, they would incite a “happening” by hanging off buildings in public space, their bodies completely whitened out. This inevitably earned front-page newspaper coverage, which drove ticket sales. I was captivated by this, bought a ticket, and was moved by their performance of concentrated movement.
About five years after that, I heard a CD recording by the Dutch composer/performer, Michel Waisvisz. It was a CD compilation of computer music, and all the pieces on the album were very cerebral. Waisvisz’s track was different - it was visceral. He was using his instrument, called The Hands, to translate body movement into control of digital synthesizers. With just the photo on the album jacket and the intensity of the music, I became enthralled with the idea of corporeal, digital computer music.
PW: What are the challenges of creating instruments played with body movements and signals?
AT: Digital technology has the image of being cold and sterile, based on pre-meditated programming and binary code. Instrumental musical performance, on the other hand, is organic, requires sweat, and is all about the subtle, expressive differences in sound that touch and technique articulate. So the challenge for me of making musical instruments out of digital human-interface technologies was to find this warmth and life in digital technologies, which I’m convinced exists. I’m looking for the infinity of human expression in between the binary zeros and ones.
PW: How does the instrument you will use at WOMAD work?
AT: I perform with physiological signals of the body, electrical impulses that the nervous system produces to cause muscle tension. Much as in a hospital electrocardiogram, electrodes pick up these nerve impulses as tiny electrical voltages. These are amplified, then digitised, and sent to the computer to control sound.
Whereas the electrocardiogram measures the activity of the heart, I use the electromyogram signal from my arms - the physiological signal that causes muscle tension. By tensing my arms and making concentrated gestures, I sculpt sound coming from the computer, shaping and modulating parameters of sound synthesis and sampling.
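The signal chain described here - electrodes, amplification, digitisation, then mapping muscle tension onto synthesis parameters - can be illustrated with a small sketch. This is not Tanaka's actual system; it is a minimal, hypothetical example assuming a digitised EMG stream, using a common approach: full-wave rectification plus exponential smoothing to extract an amplitude envelope, then mapping that envelope to a filter cutoff frequency on a logarithmic scale.

```python
import math

def emg_envelope(samples, alpha=0.1):
    """Full-wave rectify, then exponentially smooth, a digitised EMG signal.
    Returns the amplitude envelope - a rough proxy for muscle tension.
    (Illustrative only; a real system would also band-pass filter the raw signal.)"""
    env = []
    level = 0.0
    for s in samples:
        level += alpha * (abs(s) - level)  # one-pole low-pass of the rectified sample
        env.append(level)
    return env

def map_to_cutoff(envelope_value, lo=200.0, hi=4000.0):
    """Map a normalised envelope value (0..1) to a filter cutoff in Hz,
    spaced logarithmically so equal tension changes sound perceptually even."""
    e = min(max(envelope_value, 0.0), 1.0)
    return lo * (hi / lo) ** e

# Synthetic stand-in for a digitised EMG stream: an oscillation whose
# intensity grows, as if the performer were gradually tensing an arm.
emg = [math.sin(7 * n) * (0.2 + 0.8 * n / 500) for n in range(500)]
envelope = emg_envelope(emg)
print(f"final cutoff: {map_to_cutoff(envelope[-1]):.1f} Hz")
```

As the simulated tension rises, the envelope climbs and the mapped cutoff sweeps upward through the 200-4000 Hz range - the same gesture-to-parameter idea, reduced to a few lines.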
PW: How can such technologies change how we think about music?
AT: I suppose I’m interested in gestural technologies as a way to re-connect computer-based music with traditional instrumental music performance practice. With the huge success of electronic music, people know that music is produced on computers. But we imagine the creative process as screen-based: clicking the mouse and using software production tools. There are also performers in the Live Coding scene, who write computer programs live on stage as a way of making music.
In my case, I want to shift focus back to the human body, to capture its expressive richness digitally, and to have its movements be the source of musical expression. In this way, I want to think about the computer as an instrument, or the nexus of the performer’s body, physiological interface, and computer as an extended musical instrument.
PW: Why is showcasing these technologies at festivals such as WOMAD important?
AT: I think it’s great that festivals like WOMAD, initiated by Peter Gabriel as a way to showcase music from different corners of the world, are interested in my work, which is experimental and perhaps esoteric. I’m thrilled to be part of the WOMAD programme for the Bowers & Wilkins stage - it’s a stage where the sound system is of ultra-high quality, and where the audience can focus on the quality and detail of sound.
Meanwhile, with the pervasiveness of technology in daily life, people are curious about what mobile and wearable technologies can do. Whereas ten years ago I needed to build bespoke systems to do what I do, today the technologies I use, despite sounding exotic, are readily available to anyone. What remains research-driven, and perhaps a bit exotic, is the way I use these gesture-capture technologies to turn my body into the source of the music.
The gesture-sensing musical technology is the outcome of Professor Tanaka’s European Research Council project, MetaGesture Music, and follows the release of the CD on Goldsmiths Press, to be distributed by NX Records.