Body Rock: The Art and Science of Wearable Instruments

Artists and inventors seek to redefine what it means to be a musician. 

Tyler Freeman is pushing the boundaries of traditional music and live performance, with pants. Drumpants are a set of sensors created by Freeman that turn any piece of clothing into a playable instrument. Instead of breaking out pricey percussion instruments, musicians armed with Drumpants sensors simply place them in the clothing item of their choice, select which sounds they’d like to hear, and tap to create songs.
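
Freeman hasn’t published the Drumpants internals, but the tap-to-sound mapping he describes is simple to picture. The Python sketch below is a hedged illustration, not the product’s actual firmware or protocol: the sensor indices, tap threshold, and drum-note assignments are all invented for the example, and the `mido` library stands in for whatever MIDI layer the real device uses.

```python
# Illustrative sketch: turning wearable tap-sensor readings into MIDI notes.
# The threshold, pad indices, and note numbers are assumptions, not
# Drumpants' real protocol.
import mido

TAP_THRESHOLD = 60                       # minimum reading that counts as a tap
SENSOR_TO_NOTE = {0: 36, 1: 38, 2: 42}   # pad index -> General MIDI drum note

def taps_to_midi(readings):
    """Yield a note_on message for each reading that crosses the threshold.

    `readings` is an iterable of (sensor_index, value) pairs, e.g. parsed
    from the serial or Bluetooth stream coming off the clothing-mounted pads.
    """
    for sensor, value in readings:
        if value >= TAP_THRESHOLD:
            # Scale tap strength into MIDI velocity (0-127) so harder taps
            # play louder, the way a physical drum pad responds.
            yield mido.Message('note_on', note=SENSOR_TO_NOTE[sensor],
                               velocity=min(127, value))

# Simulated stream: a soft brush (ignored) followed by two real taps.
for msg in taps_to_midi([(0, 20), (0, 95), (2, 80)]):
    print(msg)   # in a live rig you would send this via mido.open_output()
```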

“[Drumpants] play not only drum sounds, but they can play synthesizers or pianos,” and more than 100 other sounds, says the 30-year-old inventor. They can sync to smartphones and desktop apps and can be used for live and video performances as well. “You can [use them to] trigger video projections. You can trigger explosions on stage if you wanted.”
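
The video and pyrotechnic triggers Freeman mentions typically ride on Open Sound Control (OSC), the network protocol that VJ and show-control software commonly listens on. As a hedged sketch of that idea, the snippet below uses the `python-osc` library; the host, port, and address paths are made up for illustration rather than taken from Drumpants.

```python
# Illustrative sketch: forwarding a pad hit to stage visuals over OSC.
# The port and the "/cue/..." addresses are invented for the example.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7000)   # address of the video rig

def on_tap(sensor_index: int, strength: int) -> None:
    """Fire a visual cue alongside the sound whenever a pad is struck."""
    client.send_message("/cue/projection", sensor_index)
    if strength > 110:                        # only the hardest hits get pyro
        client.send_message("/cue/explosion", 1)

on_tap(sensor_index=2, strength=120)
```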

Drumpants are still in the testing stage; Freeman has launched a Kickstarter campaign to fund commercial production.

The future of music may not be in traditional instruments. An increasing number of wearable and movement-based instrument projects, some of which incorporate biomedical technologies, are changing the landscape of both music and live performance. The examples range from the motion-sensor-enabled gloves Grammy Award-winning artist Imogen Heap uses to loop and add musical effects on the fly, to the prosthetic instruments created by researchers at McGill University in Montreal, which span sound-enhancing visors and flexible musical spines performers can attach to their bodies. Experts say this innovation in instruments will inevitably drive the evolution of how we learn, enjoy, and consume music.

“More and more, we’re going to get to the point where no matter what instrument you buy, it will come with all these sensors embedded and hook up wirelessly to laptops,” says Dr. Ajay Kapur, associate dean for research and development in digital arts and director of the Music Technology: Interaction Intelligence and Design program at the California Institute of the Arts in Valencia. Dr. Kapur’s research focuses on how robotics technology can impact music; his projects include founding the Karmetik Machine Orchestra, an ensemble in which human performers play alongside musical robots.

Sensors are already working their way into the mainstream. “Digital guitars” are available on the commercial market in several varieties, including instruments that use touchpad sensors instead of physical strings, as well as traditional guitars that connect to software that can evaluate and perfect your shredding technique. Tech-savvy musicians don’t even need to leave home to get sensor-enhanced instruments. Smartphone apps that create sounds based on anything from gesture and movement to your heartbeat and vital signs are already available with the tap of a button and are far more portable than traditional instruments.
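
How might an app turn vital signs into music? One plausible approach, sketched below in plain Python, is to clamp a heart-rate reading into a playable tempo and spread it across a pitch range; the specific ranges and the two-octave mapping are illustrative choices, not any particular app’s algorithm.

```python
# Illustrative sketch: mapping heart-rate samples (BPM) to musical parameters.
# The clamping range and two-octave spread are assumptions for the example.
def heartbeat_to_music(bpm: float) -> tuple[float, int]:
    """Return a (tempo_bpm, midi_note) pair derived from one heart-rate reading."""
    tempo = max(60.0, min(180.0, bpm))    # clamp to a playable tempo range
    # Spread resting-to-exercise tempos (60-180 BPM) across the two octaves
    # above middle C (MIDI notes 60-84): a racing heart plays higher notes.
    note = 60 + round((tempo - 60.0) / 120.0 * 24)
    return tempo, note

for sample in (52, 74, 140):              # resting, walking, exercising
    print(sample, "->", heartbeat_to_music(sample))
```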

Smart, mobile, and customizable are where instruments are headed, says Freeman, especially since all three allow performers maximum stage space and minimum travel and set-up baggage. It may be only a matter of time until instruments are so compact that they are a literal part of the musician.

“Maybe people aren’t quite comfortable with embedding things into their bodies, like cyborgs, but we’re definitely moving towards wearable accessories,” he says. “It’s not as invasive as actually embedding technology into your flesh, but you still get the same benefit of having this technology with you wherever you go all the time.”

With sensor-enhanced instruments, musicians of the future may also be able to create ear-palatable tunes without spending years learning to read music. Twenty-five-year-old Pieter-Jan Pieters loves music but couldn’t get into music school because he doesn’t read notes. Instead, he headed to design school in Eindhoven, Netherlands, where he created his own series of instruments that produce sound based on a player’s movement.

“I wanted to create something new and some instruments that you could play intuitively,” Pieters says.

Sound on Intuition, Pieters’ five-piece collection of intuitive instruments, includes a band that wraps around a musician’s foot and produces sound effects with every tap, a flexible sensor that fits around a fingertip and allows players to manipulate sounds with the slightest hand movement, and a scanner that translates drawings and handwriting into music.
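
Pieters hasn’t published his signal chain, but the fingertip sensor suggests a familiar pattern: a continuous bend reading mapped onto a continuous sound parameter. The sketch below is an assumed version of that mapping using `mido`, with the normalized 0-to-1 reading and the full pitch-bend range chosen purely for illustration.

```python
# Illustrative sketch: translating how far a fingertip flex sensor is bent
# (0.0 = flat, 1.0 = fully curled) into a MIDI pitch-bend, so small hand
# movements warp the sound continuously. The ranges are assumptions.
import mido

def bend_to_pitchwheel(bend: float) -> mido.Message:
    """Map a normalized flex reading onto the full MIDI pitch-bend range."""
    bend = max(0.0, min(1.0, bend))       # clamp noisy sensor readings
    pitch = int(-8192 + bend * 16383)     # -8192 (full down) .. 8191 (full up)
    return mido.Message('pitchwheel', pitch=pitch)

for reading in (0.0, 0.5, 1.0):
    print(bend_to_pitchwheel(reading))
```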

“[This technology] gives you more of the human factor. It kind of brings back some personality, because now everybody just uses his mouse and keyboard to code music almost,” says Pieters. “If I draw a line which evolves into music, and you draw a line, it will sound exactly the same, but if you move your hand or tap your finger, those gestures are unique because your body movement is unique. The instruments are a way to kind of put some personality in digital music.”

While the wearable revolution will obviously have an enormous impact on how instruments are both created and played, smart instruments and those that incorporate robotic technologies may have an even greater effect on live performance.

“It used to be that we would just go to a show and sounds [would be] coming from the speakers,” says Dr. Kapur. “Now with these types of devices and robotic instruments, the instruments are all over the stage. Depending on where you’re sitting, you’re getting sound from all over the place. It really gives you a more enchanted experience. It really turns into an experience, where you can buy an album from a musician on iTunes, but really that is just advertisement for you to go to their show, which you could not experience in your home. All of this technology is helping to push towards that type of future.”

The technology is also taking performance out of theaters and concert venues. From fall of 2011 through spring of 2012, New Orleans played host to Dithyrambalina, a series of mini-cottages built with non-traditional instruments embedded in the architecture. Spearheaded by the artist collective New Orleans Airlift, the “shantytown sound laboratory” featured instruments built into walls, ceilings, and floorboards, including weather-driven instruments and a rocking chair that creates bass notes powered by movement.

The installation drew guest performances from artists including Thurston Moore, Andrew W.K., and Diplo. Earlier this year, sound artist Di Mainstone unveiled Human Harp, a piece wherein a player straps on a cyborg-like suit outfitted with retractable, sensor-enhanced strings. The strings are then connected to an urban structure, in this case the Brooklyn Bridge, and produce sounds based on the structure’s vibrations and the player’s movements.
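
Mainstone hasn’t detailed Human Harp’s signal chain, but vibration-driven sound generally starts the same way: measure the energy in a short window of sensor samples and map it onto a sound parameter. The sketch below is a hedged stand-in for that step; the window size, the sample scale, and the choice of MIDI volume (CC 7) as the target are all assumptions.

```python
# Illustrative sketch: turning a window of raw vibration samples from a
# string anchored to a structure into a volume control message. The window
# size, sample scale, and CC choice are assumptions, not Human Harp's design.
import math
import mido

def vibration_to_volume(samples: list[float]) -> mido.Message:
    """Map the RMS energy of a vibration window onto MIDI volume (CC 7)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    value = min(127, int(rms * 127))      # samples assumed roughly -1..1
    return mido.Message('control_change', control=7, value=value)

bridge_window = [0.02, -0.31, 0.44, -0.18, 0.27]   # simulated bridge rumble
print(vibration_to_volume(bridge_window))
```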

Widespread musical architecture is still a long way off, but communities of musicians who specialize in areas like sensor-based instruments are already emerging, says Dr. Sidney Fels, a professor of electrical and computer engineering at the University of British Columbia in Canada. Fels was one of the pioneers behind DiVA, a synthesizer that transforms hand gestures into both music and speech. Fels cites projects like reacTable, a musical table that produces sound based on manipulating physical objects set upon it, as examples of new technologies that are garnering a significant following.
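
The reacTable’s core idea, tracked tangible objects whose placement drives a synthesizer, can be caricatured in a few lines. The sketch below assumes each object is reported as a normalized (x, y) position and maps that to an oscillator’s frequency and loudness; the four-octave spread and top-is-louder convention are invented for the example, not the reacTable’s actual engine.

```python
# Illustrative sketch: mapping a tracked tabletop object's normalized
# (x, y) position onto synthesizer parameters. The specific mappings are
# assumptions for the example, not reacTable's actual engine.
def object_to_oscillator(x: float, y: float) -> dict:
    """Map an object position (0..1 on each axis) to oscillator settings."""
    return {
        "freq_hz": 110.0 * 2 ** (4 * x),     # left-to-right spans 4 octaves up from A2
        "amp": max(0.0, min(1.0, 1.0 - y)),  # assumed: nearer the top edge = louder
    }

print(object_to_oscillator(0.5, 0.25))   # an object just above the table's center
```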

Also available as an iOS and Android app that allows players to write their own music by moving on-screen objects, reacTable gained international attention in 2007 when the table version was played live by Icelandic singer Björk, and it has since built a strong following of music makers who upload their own original compositions and remixes. On a smaller scale, iPhone-based and other alternative digital instruments are gathering their own bands of followers who not only play the instruments differently than traditional musicians do; they judge them differently too, says Fels.

“Because these instruments are kind of unusual and your expectations are very different, the idea of a virtuoso is not so strong,” says Fels. “A guitar, if you went onto YouTube and played shitty guitar, nobody’s going to watch that. On the other hand, if you go on there and you have some funky new thing that nobody really knows what it does and you play it and do kind of an OK job, it’s kind of interesting and you become part of a community of other people who are trying to play this weird thing. [These communities of] people are into this making of the music, not the listening of the music, and these new interfaces and these new forms of expression kind of allow you to bring that back because they’re easier to play, potentially.”

The near-infinite creative potential of sensor-embedded instruments, combined with their increasing accessibility, means that widespread use is almost inevitable in the not-so-distant future. It also means that as instruments evolve, so will the requirements for musicians and those learning the craft.

“I think a musician [will] need to become more interdisciplinary,” says Dr. Kapur. “I think it’s already happening. It’s very rare to find a musician who doesn’t have a laptop. More and more, they’re going to have to learn how to program and be able to speak to these technically inclined artists who are going to work side-by-side with them to push their work to contemporary and modern concert halls.”

Whether artists of the next generation will benefit more from an intro computer programming course than from a class in theory or composition remains to be seen. One thing is clear: the digital revolution is just getting started.
