Spiderman

May 31 2011

For most of us, the Broadway musical represents a very traditional, distinctly American, form of entertainment, something we don’t associate with high technology, let alone the kind of music technology used in contemporary music production. That rather quaint notion of musical theater is apparently giving way to the type of high tech spectacle we’re now used to experiencing in a typical concert performance. I had the opportunity to see Spiderman – Turn Off the Dark while visiting New York City this week. In spite of the mixed reviews and a very public reorganization of the show’s creative direction, the revised show I saw last night was spectacular.

Hiro Iida

To anyone versed in superhero scenarios, the story line is familiar: geek kid, bullied in school, accidentally gets fortified with supernatural powers that he uses to battle the forces of evil, and gets the girl of his dreams. The music here is by Bono and the Edge, and while it has a strong U2 flavor, it’s really their take on musical theater, familiar both to avid theatergoers raised on radio and to a younger generation used to music videos and iTunes. While the set and lighting design were decidedly high tech, employing massive LCD video panels, the technology never distracted from the storyline. In fact, for a generation coming of age in an era of sensory overload entertainment, the level of visual immersion here is probably essential to keep a large portion of the audience engaged. This is no Annie Get Your Gun….

The real treat for me was a backstage tour and conversation with Hiro Iida, a friend and former colleague at Berklee who was deeply involved in implementing the music technology used in the show. Hiro is truly passionate about electronic music and is an absolute wizard at anything to do with synthesizers. For Spiderman, he worked closely with the show’s lead keyboardist, Billy Jay Stein, on the electronic music design. Stein is a journeyman New York keyboardist and producer whose resume runs the gamut of popular music. Together, Stein and Hiro spent months creating each synthesizer patch used in the show. That currently runs to about 200 patches for the main keyboard parts Stein covers live, plus perhaps another 150 used by a second keyboard player and an electronic percussionist. As the show matures, these are revised and updated to match changes in the music.

The system Hiro and Stein designed to support this is made up of eight Power Macs, four running the show and four as backups, ready to step in at the first sign of trouble. Like any major concert production, the technology supporting the show is "mission critical," as nobody wants to tell an audience to hold on while a computer reboots…. All performance patches are built in Apple’s MainStage, using Native Instruments’ Kontakt sampler and Spectrasonics’ Omnisphere as the uber-synth of choice. In addition, the keyboard rig includes a Moog Voyager for special touches of analog beef at select times during the show. All patch switching is done with foot controllers that simply scroll through the MainStage presets sequentially.
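MainStage handles that sequential preset scrolling internally, but the idea itself is simple enough to sketch. Below is a minimal, hypothetical Python illustration of the concept, not the show’s actual software: a footswitch sends a MIDI control-change message, and each press steps to the next patch in setlist order, so the player never has to hunt for sounds mid-song. The patch names and CC number are made up, and the mido package stands in for whatever MIDI plumbing a real rig would use.

```python
# A hypothetical sketch of sequential patch scrolling, not the show's software:
# MainStage does this internally. A footswitch sends a MIDI control-change
# message; each press steps to the next patch in setlist order.
# Patch names and the CC number are made up; mido stands in for the MIDI I/O.
import mido

SETLIST = ["Overture Pad", "Bully Scene Clav", "Web-Swing Lead", "Finale Strings"]


class PatchScroller:
    def __init__(self, patches, footswitch_cc=64):
        self.patches = patches
        self.cc = footswitch_cc
        self.index = 0

    def handle(self, msg):
        # Advance only on a footswitch press (CC value > 0); ignore the release.
        if msg.type == "control_change" and msg.control == self.cc and msg.value > 0:
            self.index = (self.index + 1) % len(self.patches)
            print(f"Loaded patch {self.index + 1}: {self.patches[self.index]}")


if __name__ == "__main__":
    scroller = PatchScroller(SETLIST)
    # Simulate three presses; a real rig would read from mido.open_input().
    for _ in range(3):
        scroller.handle(mido.Message("control_change", control=64, value=127))
        scroller.handle(mido.Message("control_change", control=64, value=0))
```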

The Spiderman keyboard rig.

While the music for the show is performed live by a band that includes two keyboardists, three guitarists, two bass players, drums, two percussionists and a small pit orchestra, Ableton Live is used throughout for some loops, sampled effects and to provide a click when needed. Tempos can be set by the conductor using Live’s tap tempo function. While Live has many powerful tools for performing, the conductor and music director in the pit use only a handful of them during the show or in rehearsals. Changes often need to be made immediately to a few key parameters, such as transposition and loop length. To make this as easy and intuitive as possible, the show commissioned New York Max for Live wizard David Linnenbank, a Berklee alum, to create a custom interface for controlling Live in the "heat of battle." Linnenbank made great use of the Max for Live API to gain deeper access to the program than is available through the standard MIDI mapping functions. The result is a full-screen interface tailored to the exact needs of the Spiderman music crew.
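Linnenbank’s real interface is a full-screen Max for Live device talking to the Live API directly, so take the following only as a conceptual sketch of the "narrow facade" idea: expose just the two or three parameters the conductor needs, with sensible limits baked in, and hide everything else. The Python below assumes a hypothetical Max for Live bridge listening for OSC messages on port 9000; the /spidey/... addresses are invented for illustration.

```python
# Conceptual sketch only: expose a tiny, safe control surface over Live,
# assuming a hypothetical Max for Live bridge listening for OSC on port 9000.
# The /spidey/... addresses are invented for illustration.
from pythonosc.udp_client import SimpleUDPClient


class ShowControls:
    """Expose only the handful of parameters the conductor actually touches."""

    def __init__(self, host="127.0.0.1", port=9000):
        self.client = SimpleUDPClient(host, port)

    def set_transposition(self, semitones: int):
        # Clamp to a sane range before anything reaches Live.
        semitones = max(-12, min(12, semitones))
        self.client.send_message("/spidey/transpose", semitones)

    def set_loop_length(self, bars: int):
        self.client.send_message("/spidey/loop_bars", max(1, bars))


if __name__ == "__main__":
    controls = ShowControls()
    controls.set_transposition(-2)  # drop a cue a whole step for tonight's vocal
    controls.set_loop_length(8)     # stretch a vamp while a scene change runs long
```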

Max for Live user interface.

My backstage tour included a visit to the band room. While a typical show has an orchestra pit directly in front of the stage, where the conductor has a clear view of both the actors and musicians, the amount of isolation needed to create an effective, studio-quality live mix of the music demands that the musicians be in a completely separate studio space, isolated from the stage. The conductor watches the show on a video monitor, and the actors, in turn, follow the conductor on LCD video monitors in front of the stage. The amount of technology used to produce the live music and sound for the show is quite impressive, and it really demands of the key musical players a completely separate set of skills to manage the show, on top of their traditional musical skills.

The Spiderman conductor podium.

I also had the opportunity to chat with one of the guitarists, Ben Butler. Being a guitarist myself, I was fascinated by his work in the band. The show uses the entire gamut of guitar sounds and techniques found in current pop music, so his guitar rack held everything from a Gibson Les Paul and a Martin acoustic to a Jerry Jones baritone guitar and a Rickenbacker twelve-string electric built for the Edge especially for this show. Both Bono and the Edge were deeply involved in producing music for the show, and Ben said the Edge helped develop some of the specific guitar parts.

Spiderman guitarist Ben Butler

The technology surrounding the music for Spiderman was truly a tour-de-force of techniques and strategies used by modern musicians both on stage and in the studio.

AES New York

Nov 01 2009

The Audio Engineering Society held its annual convention in New York City the weekend of October 9. The event has a number of components, from working group meetings that discuss proposals for various audio standards, to technical papers and workshops, as well as the mother of all professional audio trade shows. This year’s show was noticeably smaller, as the economy forced many to cut back from their usual presence. Nowhere was this more evident than in the eerie absence of Digidesign. While Pro Tools 8 captured this year’s TEC award for DAW Technology, the company spent the year downsizing, losing many key engineering and management positions. Prior to the show, Digi announced Eleven, a new product for guitarists, leaving many in the pro audio community wondering if the company was shifting its attention to the potentially more lucrative mass market. While Pro Tools remains a kind of industry standard, one wonders what might happen if an industry standard goes out of business…

While the industry as a whole is having a hard time, there’s a common thread that runs through all the players who are weathering the storm and made it to AES: they all have a real love for high quality audio, and regardless of shifting trends and economic conditions, they’re in it for the long haul. Nowhere was this more evident than at API, who manufacture high end analog mixing consoles and modules. This year they celebrated their 40th year in business with a party and concert featuring guitarist Sonny Landreth with guest Bob Weir of Grateful Dead fame. Their slogan, "celebrating 40 years of ups and downs," says it all. With the rise of DAW systems and "mixing in the box," many thought the end was near for many of these manufacturers. API was quick to realize that the analog technologies they developed for high end consoles could be repurposed for the digital age. Their "lunchbox" series of preamps, EQs and compressors provides a flexible and cost effective way to assemble a high quality, analog signal path for a variety of recording and mixing scenarios.

Sonny Landreth and Bob Weir of Grateful Dead

Although AES is primarily a pro audio show, a number of musical instrument manufacturers make an appearance. Korg has a stake in both camps, with their revolutionary MR series of digital recorders, a decidedly pro audio product, on one hand, and their line of keyboards, a dominant player in the instrument arena, on the other. This year they rolled out the SV-1, a new modeled stage keyboard that’s designed from the ground up to be a player’s instrument. To emphasize this, they’re promoting it with video presentations from respected players such as Neal Evans of the group Soulive. Korg also rolled out a new version of the fabled Korg Wavedrum. The original was an innovative product that was really ahead of its time, and while it was a kind of secret weapon for innovative percussionists, it never really took off in the mass market. The drum itself is not a pad, but uses a real drum head to provide the feel of an acoustic instrument. The drumhead serves as input to a physical modeling engine capable of sounds that range from organic to electronic. Korg updated the design and dropped the price, and with a renewed interest in electronic performance, this should be an important addition to just about any performer’s arsenal.
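Korg hasn’t published what’s inside the Wavedrum’s engine, but the excitation-driven physical modeling idea behind it is easy to illustrate. The sketch below, which is emphatically not Korg’s algorithm, uses the classic Karplus-Strong model in Python/numpy: a short burst of noise stands in for the strike on the head, and the delay-line "string" it excites determines the pitch and decay of the resulting tone.

```python
# Not Korg's algorithm: a classic Karplus-Strong model to illustrate
# excitation-driven physical modeling. A burst of noise stands in for the
# strike on the head; the delay-line "string" it excites sets pitch and decay.
import wave

import numpy as np


def karplus_strong(freq=146.8, duration=2.0, sr=44100, damping=0.996):
    delay = int(sr / freq)                  # delay-line length sets the pitch
    buf = np.random.uniform(-1, 1, delay)   # noise burst = the "strike" excitation
    out = np.zeros(int(sr * duration))
    for i in range(len(out)):
        out[i] = buf[i % delay]
        # Average adjacent samples and damp them: the simplest "string" filter.
        buf[i % delay] = damping * 0.5 * (buf[i % delay] + buf[(i + 1) % delay])
    return out


if __name__ == "__main__":
    tone = karplus_strong()
    pcm = (tone / np.max(np.abs(tone)) * 32767).astype(np.int16)
    with wave.open("hit.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(44100)
        f.writeframes(pcm.tobytes())
```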

Moldover in Brooklyn

One of the real highlights of the weekend was a trip to Brooklyn’s Williamsburg district for a late Saturday night performance by Moldover, whom I featured in one of my early blog posts. He’s now a resident of San Francisco, but he was on an East Coast tour supporting a new CD release, Circuit Board Instrument. While Ableton Live was the engine for the show, his laptop was off to the side, out of the spotlight. As a key proponent of "controllerism," Moldover believes that electronic music performance should be an entertaining visual experience for the audience, and this show was a tour de force of that aesthetic. After years of hacking and customizing existing controllers, he’s now using a custom-built unit. It faces the audience, and his nimble manipulation of the controls provides a clear visual connection to the sound being produced. Combine this with good writing, guitar playing, and clever use of effects processing, and you get a thoroughly engaging performance.

The hardest part of getting through a NAMM show is wearing a badge that identifies me with Berklee. Now don’t get me wrong, I’m proud to represent the institution, and my affiliation opens many doors, but there are scores of alums in all aspects of the music industry who love re-connecting with their alma mater. If you want to travel to NAMM incognito, get your badge from Harvard.

The best spokesperson for any product is an artist who uses the product, and uses it well. This year, I was pleasantly surprised to see one of my former students, New York electronic artist Matt Moldover, talking about his work and performing at the Ableton Live booth. While he shares a common school experience with fellow alum Dan Lehrich, profiled in an earlier blog entry, Matt has taken a very different path, establishing a profile as a performing artist.

Moldover performing 

Matt was one of the legion of guitar players that comes to Berklee each year. While most are looking to follow in the footsteps of one fretted deity or another, Matt always wanted to forge his own path, and after getting into the Music Synthesis major, that path became combining interactive electronic performance with the guitar. At Berklee he discovered MAX, and was soon on to the idea of extending what he did as a player to sound from electronic sources. Matt didn’t want to play in a band, he wanted to play with sound.

Matt also got turned on to DJ and club culture. Moving to New York after graduation, he found a scene for like-minded electronic performers, and jettisoned his first name, becoming the artist known as Moldover. Being a player and a geek, he was in the right place at the right time when Native Instruments came out with Guitar Rig. The first time I saw him at NAMM, he was the Guitar Rig guy at NI. While he gave knowledgeable and convincing demos, I got the sense that a different muse was calling. At a party in LA we had a chance to talk, and I got a glimpse of some of the projects he was working on, the first of which was the Interstellar ReMix Wagon for Burning Man, 2004.

The thing I didn’t quite realize about Moldover was that he was really pretty good at building stuff. His next project was the Octamasher, a performance system fueled by Ableton Live that gave eight “mashers” a tool to communally create a club mix. Social networking and interactive performance might sound like a research project at the MIT Media Lab, but this is a guy with a laptop, hacking a bunch of cheap keyboard controllers and hitting parties…. pretty cool.

Sometime last fall came a new website and the birth of “controllerism.” According to the site, controllerism is “the art of manipulating sounds and creating music live using computer controllers and software.” Perhaps Matt will be the first to make both YouTube and dictionary.com. But what I saw from him at the Ableton booth this year was a virtuoso performance that combined electronic music with the spontaneity and inventiveness of a jazz soloist, swapping clips of sound for notes and scales.

Dan Lehrich and Moldover may seem at opposite ends of a very wide playing field, but what really fascinates me is the real passion they both have for creating immersive performance experiences using computers and physical interfaces. While research in the field of interactive music systems continues at the highest levels of academia, it’s really cool to see real innovation happening on the street as well.