Copyright 1992 by Mary Eisenhart and MicroTimes
For MicroTimes #97

Truckin' In Style
Iris Jams With The Dead

By Mary Eisenhart

Some months ago, the Grateful Dead, contemplating their 27th year on the road and the ever-larger venues needed to accommodate their growing crowds, decided the computer video at their shows needed work. Long noted for fanatical perfectionism in the area of sound, justly renowned for effects created by lighting director Candace Brightman, the technology-conscious band decided that the video portion of the GD concert experience wasn't up to their standards.

These days most self-respecting rock 'n' roll groups hit the stadium tour circuit with a dazzling array of techno-toys designed to enhance the concert experience--lights, videos, animations and graphics add new dimensions to the music itself. Even for unfortunate audience members a football field's length from the musicians, technology goes a long way toward creating a compelling show.

But a Dead show presents unique problems for a video crew. Says Grateful Dead video director Bob Hartnett, who's previously toured with the likes of Janet Jackson and ZZ Top, "Whereas most tours are highly choreographed, right down to the seeming ad libs, here it's totally improvisational all the time."

The Dead have a working repertoire of around 250 songs. Nearly 100 times a year, they walk onstage with no very clear idea of what they're going to play that night. It will probably not include anything they played the night before--six-day runs with only three songs repeated are not uncommon. New songs appear without warning.

Consequently, says Hartnett, "It's really hard for us to bank or pre-store effects, pre-produce material for it, because on any given night, any song could follow any other song. So that aspect is really challenging."

Not only that, the Deadhead audience is none too receptive to the prefabricated. For one thing, many of them work in technology fields and are quite familiar with the state of the art. For another, unlike other fans who are content to catch the Madonna tour when it comes to their home town, Deadheads expect every show to be different, and travel a good deal in order to catch as many concerts as possible.

"Night to night," Hartnett explains, "we have a lot of the same audience. They have such a loyal following--we can't do the same thing twice. What gets cheers on one night is commonplace the next night, so we have to constantly reinvent and redo the way we do each individual song."

Skull And Iris

Hartnett, a freelance video professional based in San Francisco, spends most of his time working in live concert video, usually in conjunction with Bill Graham Presents at the Shoreline Amphitheatre in Mountain View, which has its own dedicated video system. But he says it was his work in broadcast video--MTV, VH-1, CNN, Satellite News Channel, and CNN Headline News, where he was design director--that attracted the Dead, who were especially interested in effects and show design.

"They wanted to bring video up to the same state-of-the-art level that their lighting and sound have always been," he says. He laughs, "There was so much technology available, because so many Deadheads are into high technology--computer design, NASA...So I started trying to find out what freebies we could have!"

Meanwhile, across the street from the Shoreline at Silicon Graphics, another of those fabled SGI under-the-table projects was in full swing--an attempt to develop a computer system for rock 'n' roll touring.

SGI's Ron Fischer had been working with percussionists D'Cuckoo, and when D'Cuckoo opened the Dead's Mardi Gras show in February, lighting director Candace Brightman asked him to keep the equipment set up and do graphics for the Dead's set. Everybody liked the result.

And so, for the June tour, SGI lent the Grateful Dead a top-of-the-line system: a 4D/420 VGXT workstation, with two CPU boards with MIPS processors running at 40 MHz, lots of RAM and lots of disk. Along with the system went operator and programmer David Tristram, a former NASA employee. After a pair of shakedown shows at Shoreline in May, Tristram and the Iris went to Buffalo, New York, Washington DC, Chicago and Columbus.

Within hours of the first Shoreline shows, the nets were abuzz with talk about the new computer video. Tristram was combining live video from the stage with footage from videodisk, and adding effects created by ElectroPaint, a real-time paint program he'd written, and Dancing Marionette, a program written by Fischer using the Iris Inventor toolkit.

"What we do is pretty much generate live computer animation in real time," explains Hartnett. "Before, the Dead had used something called a Fairlight, which is kind of antiquated at this point--it was sort of a disco early-graphics-station paintbox kind of thing. But the VGXT workstation from SGI is much more sophisticated in terms of application and output. All the psychedelic overlays that we use to layer in keys during the performances are generated in real time, with the operator, David Tristram of SGI, pretty much jamming video-wise when the band jams onstage music-wise.
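(A note for the technically curious: "layering in keys" means using a key, a pixel-by-pixel matte, to decide where generated graphics replace the camera picture. The crew's actual keying setup isn't described in that much detail, so the fragment below is only a minimal sketch of a hard luminance key, written in Python with NumPy rather than in any of the tools named here; the frame size, value range and threshold are illustrative assumptions.)

    import numpy as np

    def luma(frame_rgb):
        # Approximate luminance of an RGB frame with values in 0.0-1.0.
        r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
        return 0.299 * r + 0.587 * g + 0.114 * b

    def key_overlay(camera_rgb, overlay_rgb, threshold=0.1):
        # Hard luminance key: wherever the generated overlay is brighter than
        # the threshold it replaces the camera picture; elsewhere the camera
        # picture shows through.
        matte = luma(overlay_rgb) > threshold
        return np.where(matte[..., None], overlay_rgb, camera_rgb)

    # Illustration: a camera frame and a mostly-black generated overlay
    camera = np.random.rand(486, 720, 3)          # stand-in for a live camera frame
    overlay = np.zeros_like(camera)
    overlay[100:200, 300:400] = [1.0, 0.2, 0.8]   # one bright painted shape
    composite = key_overlay(camera, overlay)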
"It's a rather sophisticated arrangement. We needed to go through a bunch of different decoder systems to get the one that really lets us time the SGI gear to the switcher, so we can interface it with the various other elements of the show--cameras and tape decks and laser discs and all that kind of stuff.

"We ended up using a Grass Valley encoder and decoder. Basically the computer puts out a component signal, and we're using an analog system, so when we're going out of the computer into our system we have to go from component to composite into our system. And then to input stuff that David can grab and treat--camera pictures, or tape pictures, or anything like that--we have to go from composite on our end back into component to get it back into the SGI computer.

"The SGI computer is really fascinating, because he can designate a portion of his workscreen as a standard television NTSC 525-line output. He can designate any part of his monitor as the output that we're seeing in video. He can move that area around so we can center things and position things. Then we also use it to capture stills during the show--pick up a still picture and save it, that we'll put up in between numbers during a blackout. So we're getting a lot of mileage out of that one box."
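(Another aside: a component signal carries luminance and two color-difference channels separately, while composite folds everything into a single NTSC signal, which is why an encoder and a decoder sit at each crossing. The fragment below, again a Python/NumPy sketch and not the actual SGI or Grass Valley interfaces, illustrates the two ideas Hartnett describes: splitting RGB into rough component form, and carving a video-sized region out of a larger workstation screen to grab a still. The 720x486 active-picture size is a standard NTSC-era figure; the function names, screen size and region position are assumptions.)

    import numpy as np

    def rgb_to_component(frame_rgb):
        # Rough split of an RGB frame into luminance (Y) and the two
        # color-difference channels (R-Y, B-Y) that a component signal carries.
        r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b
        return y, r - y, b - y

    def grab_still(screen_rgb, top, left, height=486, width=720):
        # Copy a video-sized window out of a larger workstation screen, in the
        # spirit of designating part of the workscreen as the video output and
        # saving a still from it.
        return screen_rgb[top:top + height, left:left + width].copy()

    # Illustration: a 1280x1024 screen with the video region positioned by hand
    screen = np.random.rand(1024, 1280, 3)
    still = grab_still(screen, top=269, left=280)   # saved for a between-song blackout
    y, r_y, b_y = rgb_to_component(still)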
Morphing In The Band

The real crowd-pleaser among the effects seems to be a morphing sequence in which the faces of the musicians melt sequentially into each other. Creating the sequence was a large collaborative undertaking, and involved a visit to SGI by the band members to have their heads scanned for a database.

SGI CEO Ed McCracken explains: "You sit still for seventeen seconds, and a 3-D camera goes around you. And instantly what pops up on the screen is a three-dimensional model of your entire head, with a photograph of your face from every angle texture-mapped onto it. We've had lots of people do it now, so we have this database of all of our heads. What we're going to *do* with it I haven't the slightest idea...but we have this library now.

"It's amazing that in seventeen seconds you can have this image on the screen that you can then move around with the mouse and look at it from different angles. The resolution's as good as a photograph. As good as a video camera photograph, of course."

The three-dimensional camera is the product of Cyberware Laboratory in Monterey. The camera is connected to a PC, which is in turn connected to an SGI 4D/220 workstation with 40Mb of RAM, VGX graphics, and two CPU boards with MIPS R2000A/R3000 CPUs running at 25MHz. Graphics transformations are done in hardware, which supports anti-aliasing, motion blur, depth cuing, and texture mapping.

Explains programmer Jeanne Rich, who took existing software created with Iris Inventor by SGI's Thad Beier and Gavin Bell and enhanced it for the project: "The scanner projects a beam of red light, which the scanner uses to pick up the 3-D information and the color along the beam of light. As the camera moves around the subject, it continues to pick up the longitudinal information, and from this a 3-D quadmesh can be determined.

"If you look at the wireframe of the model, you see four-sided shapes. A quadmesh connects the four-sided shapes together to create a model.

"The color information picked up by the camera is mapped onto the quadmesh. Various Cyberware products which run on the SGI system are used to clean up the data. Once the quadmesh has been cleaned up, a program is used which registers the color data so the heads can align--that is, for each head, the eyes, ears, nose and mouth are in the same location. From this, a 40x40 quadmesh is created, which can be used for morphing.

"The morph program allows control of the animation, that is, control of the change of one head to the other. The user can change the transition rate and manually control the morphing. A sequencer allows changing the objects selected for morphing, and the process is viewed through an Inventor examiner viewer which allows the user to rotate and translate the object in real time. Rotation can be made to operate without user intervention. The user can also zoom in and out on the object."
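(One last technical aside: once two scanned heads have been registered onto matching 40x40 quadmeshes, a morph step comes down to blending corresponding vertices and the colors mapped onto them as a transition parameter moves from one head to the other. The sketch below shows that idea in Python with NumPy; it illustrates the general technique rather than the Inventor-based program Rich describes, and the array shapes and names are assumptions.)

    import numpy as np

    MESH = (40, 40)   # 40x40 quadmesh, per the description above

    def morph(verts_a, colors_a, verts_b, colors_b, t):
        # Blend head A into head B. With the meshes registered so eyes, nose
        # and mouth share grid cells, one morph step is a weighted average of
        # corresponding vertex positions and of their mapped colors.
        # t=0.0 shows all of A, t=1.0 all of B.
        verts = (1.0 - t) * verts_a + t * verts_b
        colors = (1.0 - t) * colors_a + t * colors_b
        return verts, colors

    # Illustration: two registered meshes of (x, y, z) vertices with RGB color
    verts_a, verts_b = np.random.rand(*MESH, 3), np.random.rand(*MESH, 3)
    colors_a, colors_b = np.random.rand(*MESH, 3), np.random.rand(*MESH, 3)

    # The transition rate sets how fast t advances from frame to frame;
    # the operator can also drag t by hand to control the morph manually.
    for t in np.linspace(0.0, 1.0, 30):
        frame_verts, frame_colors = morph(verts_a, colors_a, verts_b, colors_b, t)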
The data and programs used to generate these effects are quite large and reside on the workstation's hard disk. However, in order to keep the effects smooth and seamless, they are running in memory. Consequently a system designed for this purpose tends to have a lot of RAM, usually at least 60-80Mb.

Further

"It's certainly a unique concert situation," says Hartnett. "The Grateful Dead are unlike any other band in the world. The crowd is so in tune with them that they pick up on every single nuance of the performance. So it's keeping us all on our toes. Everybody's constantly pushing each other to go a little bit higher, just in terms of our output. With a fan base that's so loyal and that follows every single element of the show itself, we're hoping that video will go along for the entire tour at some point."

Plans for the future include a number of projects to take further advantage of the Iris system's capabilities. Not surprisingly, Son of Morph is high on the list.

"The band really got into it," Hartnett says. "In the long run, we're hoping to scan the band in completely, so we have a computer model of all of them. We'll go back in and do their whole bodies. Eventually what we want is an animation character for each member of the band, and looking further down the line, have them be able to do interactive video, so they could actually interact with the combination of what we put up on screens and the animation we're generating."

He's also working with lighting director Brightman to achieve interesting new combinations of computer effects and conventional lighting tools. "Candace wanted to start using video on the floor, probably starting off small with playback and effects, and then gradually building up.

"All the elements are working together. The lighting, the sound, the staging, the video. What we're trying to do, and I think we're doing a real good job now, is creating one whole palette. With Candace's lighting, she involves everybody in the stadium in the show. There's a lot of lighting effects designed specifically for the crowd--lighting different parts of the audience, really bringing everybody into the performance and making the entire arena part of the set.

"We shoot out into the back of the hall, following light trails as they go back into the audience. The other night in Giants' Stadium, it was a full moon, so we were getting moon shots and incorporating them into the show. The Deadheads are both a colorful and an enthusiastic bunch, so we've been trying to get more pictures of them up on the screens, drawing them into the show further.

"At its best, what I'm doing on video is complementing what the lighting is doing, which is complementing what the band's doing. It's trying to be one cohesive unit that has one head and three hundred arms, and it's really a challenge to do, night to night. But this is actually the most fun gig I've ever had."