Baltimore art-pop provocateur Dan Deacon is famous for getting his fans in on the show. He’s decidedly anti-stage, preferring not only to orchestrate his maximalist electro-classical punk opuses from within the audience, but to lead said audience in all kinds of group activities ranging from synchronized gestures to feel-good gauntlets. He’s kinda like a down-and-dirty Flaming Lips in that regard, but while those wacky Okies focus their technological efforts on figuring out how to get music inside of a human skull, Deacon has something else in mind.
His new app for iPhone and Android has the potential to turn any mobile phone in his vicinity into a light and sound instrument. By sending out a coded audio signal, Deacon can convert app-equipped devices within the crowd, hijacking screens, speakers and LED bulbs so that they become synchronized with the music he’s playing. We figured he’d do a better job at explaining this, so we hopped on the phone with the mad scientist himself to learn more. Check out the promo clip for the app, then find out what old modems and future revolutions have to do with it.
What inspired you to make an app?
Two things. One, I wrote a piece in my chamber music alter ego where the audience was given instructions that utilized cell phones as an element. They would call each other to create feedback, set their alarms for a specific time, or put someone else on speaker and ask them to sing. I really liked the way it sounded spatially. It created a unique environment. And then I remember watching the Beijing Olympics opening ceremony where they gave everyone LEDs. It looked so cool, but it must’ve cost a fucking fortune, and the majority of people in that stadium already had something in their pocket that functions as a computer. I started thinking, if you can control all of the phones in a room the same way you control a venue’s sound and lights, you’d really have something.
I mean, it would be crazy, because phones are never thought of that way. You never really see five phones doing something at the same time. It’s traditionally something that takes someone out of an experience rather than bringing them deeper into it. I kept thinking about it, that it’d be awesome to have a show where the light and the sound comes from the audience and there’s an actual symbiotic relationship.
So what was the next step?
I brought it to a bunch of friends who work in digital arts and the main programmer Keith Lea, who used to work for Google. Five of us would get together once a week or so and hash out a plan. The main thing was that we wanted to be able to use this anywhere. We don’t want people to have to rely on wi-fi because you can’t really get more than 100 people on a router, and if we’re at a festival, there’s never data.
Keith came up with the idea of using what old modems used: sounds. We’d have unique audio signals carry information to the app, which the app would then decode and know, “This means turn red. This means turn blue. This means play the lighting sequence for ‘True Thrush.'” We send it about 80 sine waves every eighth of a second. Then the phone interprets where it is in the room, which is split into quadrants.
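To get a feel for the modem-style signaling Deacon describes, here’s a minimal sketch of encoding commands as a set of audible sine tones and detecting them with the Goertzel algorithm. Everything specific here — the carrier frequencies, frame length, detection threshold, and the idea of one tone per bit — is an illustrative assumption, not the app’s actual protocol.

```python
import math

SAMPLE_RATE = 44100
FRAME_SECONDS = 0.125   # one frame every eighth of a second, per the interview
BASE_FREQ = 1000.0      # hypothetical lowest carrier tone (Hz)
FREQ_STEP = 200.0       # hypothetical spacing between carrier tones (Hz)

def encode_frame(bits):
    """Mix one sine tone into the frame for every '1' bit."""
    n = int(SAMPLE_RATE * FRAME_SECONDS)
    samples = [0.0] * n
    for i, bit in enumerate(bits):
        if not bit:
            continue
        freq = BASE_FREQ + i * FREQ_STEP
        for t in range(n):
            samples[t] += math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
    return samples

def goertzel_power(samples, freq):
    """Energy at one frequency -- how a phone could check for a tone."""
    coeff = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def decode_frame(samples, n_bits, threshold=1e6):
    """Recover the bit pattern by testing which carrier tones are present."""
    bits = []
    for i in range(n_bits):
        freq = BASE_FREQ + i * FREQ_STEP
        bits.append(goertzel_power(samples, freq) > threshold)
    return bits

# A bit pattern like [1, 0, 1] could then be looked up in a command table
# ("turn red", "turn blue", ...) -- the mapping itself is hypothetical.
frame = encode_frame([True, False, True])
assert decode_frame(frame, 3) == [True, False, True]
```

The appeal of the approach is exactly what Deacon says: it needs no wi-fi or data connection, just a PA loud enough for the phones’ microphones to hear.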
The next chapter in experimental crowd involvement…
Yeah, but everything before was based around the movement of the human body, or simply trying to recontextualize space, like taking the audience from inside the venue to outside, using different sections of the room, or changing the focal point of the performance. This goes along with that, but it’s the first time we’ve had an additional element — the element of technology.
So what is the crowd going to see on this tour?
I have no idea. [Laughs] We’re trying to get the basics down first. It’s not a perfected technology. It’s very much gonna be trial and error for the bulk of this tour, but we’ll learn something every night. We’re constantly pushing new updates, so by the end of the week, there’ll be a new update that’ll contain new choreography, tweaked based on what we’ve seen over the past few days.
Like in Cleveland, we couldn’t get them to turn the lights all the way off for security reasons. I’m like, “You realize that everyone is holding a fucking flashlight right now. You’re killing me here.” So it’s good to know — ultimately, they’re just phones. The goal is to change things from, “I’m looking at a stage and the sound is coming at me,” to, “I’m looking around me and the sound and light are engulfing me.”
There’s a keyboard thingy included with the app. What is it?
It’s a little sequencer. You decide the number of steps, and if you’re holding the phone landscape, when you tilt it forward or backward, it speeds up or slows down the pattern. And if you tilt it to the left or the right, it changes the wave shape. We have a piece that we’re not doing on this tour that utilizes it, but we figured the app might as well not be completely useless if you’re not at one of my shows.
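The tilt controls Deacon describes could be modeled as a simple mapping from two tilt axes to tempo and wave shape. This is a rough sketch under stated assumptions — the angle ranges, base tempo, doubling curve, and list of wave shapes are all illustrative, not pulled from the app:

```python
# Hypothetical mapping from device tilt to sequencer parameters.
# pitch_deg: forward/back tilt; roll_deg: left/right tilt (both -90..90).

WAVE_SHAPES = ("sine", "square", "saw", "triangle")  # assumed shape list

def sequencer_params(pitch_deg, roll_deg, base_bpm=120.0):
    # Forward/back tilt scales tempo: +90 degrees doubles it, -90 halves it.
    bpm = base_bpm * 2 ** (pitch_deg / 90.0)
    # Left/right tilt sweeps through the available wave shapes.
    idx = int((roll_deg + 90) / 180 * len(WAVE_SHAPES))
    idx = min(len(WAVE_SHAPES) - 1, max(0, idx))
    return bpm, WAVE_SHAPES[idx]

# Held flat, the sequencer runs at the base tempo:
assert sequencer_params(0, -90) == (120.0, "sine")
# Tilted all the way forward, the pattern runs twice as fast:
assert sequencer_params(90, -90) == (240.0, "sine")
```

Reading the phone’s accelerometer and rendering the selected waveform would sit on top of a function like this.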
Any other plans for Wham City Apps?
Well, we’re making this app open-source, so I’m excited to see what people do with it, how they place it into new contexts. I’m very excited about it in an activist sort of way, because anything that can be turned into binary can be sent using this method. Think about how important Twitter was in the Arab Spring after they turned off the phone lines and shut down the cell towers. We could be taking it to the next level.