Real-Time Immersive Theater in Pittsburgh

  • Ryan Wyatt
    Message 1 of 3 , Jun 5 7:59 PM
      For those of us who attended MAPS this past month, we had a great
      opportunity to spend a few hours with Kerry Handron at the Earth
      Theater in the Carnegie Museum of Natural History. The theater is
      a Sky-Skan set-up: five LCD video projectors edge-blended into a
      1000x7000 image projected onto a dome section that wraps about
      220 degrees around the audience.
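
      To give a rough sense of how that kind of edge-blended coverage divides
      up, here is a little sketch in Python (the 10-degree overlap is purely
      illustrative, not Sky-Skan's or the Earth Theater's actual blend width):

          # Illustrative only: split a 220-degree dome section among five
          # edge-blended projector channels. The overlap width is an
          # assumption, not the theater's real specification.
          def channel_spans(total_deg=220.0, channels=5, overlap_deg=10.0):
              """Return (start, end) azimuths for each projector channel;
              adjacent channels share overlap_deg degrees, which is where
              the blend (baffles or software ramps) hides the seam."""
              slice_deg = (total_deg + (channels - 1) * overlap_deg) / channels
              spans, start = [], -total_deg / 2.0
              for _ in range(channels):
                  spans.append((start, start + slice_deg))
                  start += slice_deg - overlap_deg
              return spans

          for i, (a, b) in enumerate(channel_spans()):
              print(f"channel {i}: {a:6.1f} to {b:6.1f} degrees")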

      (Kerry can correct me if any of what follows is factually incorrect,
      because I'm working from memory here.)

      On Friday afternoon, Kerry showed her opening program, "The Millennium
      Show," followed by some playback pieces that didn't make it into the
      previous evening's SkyVision demo. This included some work by University
      of Pittsburgh and Carnegie-Mellon students... Great to see!

      After the playback sequences, though, Kerry showed *more* in the way
      of experimentation... She has had eight Carnegie-Mellon students
      in her theater working on a real-time piece that runs through her
      system (because she uses physical baffles to blend her projectors,
      she can do real-time graphics much more easily than the average
      SkyVision theater). They run the entire thing off a *single* PII CPU
      packed with *five* video cards! Furthermore, the audience could
      interact with the system by making noise and moving around in our
      seats -- standing up, leaning right, leaning left.

      The program they showed us, "Cretaceous Chaos," had a time-travel
      theme, back to (you guessed it) the Cretaceous. We basically saw
      a very rough cut: the landscape looked a little plain (pun intended),
      and dinosaurs had a tendency to emerge from mountainsides and slide
      across the terrain. But Kerry and the students were doing something
      most people only *talk* about, and I for one was delighted to see the
      results of their efforts! Representing the work of only eight
      students over one term, it was quite impressive.

      We saw another single-channel interactive piece, which used laser
      pointers as the input device... We played Missile Command, drew
      pictures, and voted on questions. Pretty kewl, but until we feel
      comfortable handing out laser pointers to our entire audience,
      probably not the direction many planetariums will go.

      Both these projects result from collaboration with Carnegie-Mellon's
      Entertainment Technology Center (ETC). On ETC's website at
      http://www.etc.cmu.edu, under the "Projects" page and the Earth
      Theater heading, you'll find the aforementioned "Cretaceous Chaos."

      So...

      Now that I've taken everybody's time to describe all this, I thought
      I'd open the floor to discussion. :) Does anyone else have a reaction
      to what we saw in Pittsburgh? For those of you involved in real-time
      interactive programming, what kind of experimentation have you tried?
      What's worked and what hasn't? For anybody and everybody, how do you
      see the two technologies, immersive video and interactivity, working
      together?

      Just some late-night thoughts to see if we could get some discussion
      going...


      Ryan Wyatt, Science Visualizer
      Rose Center for Earth & Space
      American Museum of Natural History
      79th Street & Central Park West
      New York, NY 10024
      212.313.7903 vox
      212.313.7868 fax
    • kerry
      Message 2 of 3 , Jun 14 7:06 AM
        Thank you Ryan for the post about the theater. Please refer to his post as
        I don't wish to repeat. I would just like to add a little more.....

        From the audience's perspective, the ability to respond to audience
        input is what sets us apart from film. Most audience members don't know
        or care that we can create the stuff on our computers rather than
        needing a film crew. Most audience members don't notice or care about
        the technical advantages digital projection has over film. What they
        can see is that they are changing the course of the show as it goes
        along. That being said, it is much harder to write and create something
        that is more than a branching story or one that stops occasionally.

        The most successful approach in the Earth Theater is GlobeTrotting: XXX. So
        far the XXX's are The Americas, Africa, and I am working on Bugs. This is a
        geography show for grades 2-5 with a flow based closely on the standard
        children's live star show. There are lots of questions like "what do you
        see?", "what is this?", and "where are we?". There are many prerendered
        bits, and the operator picks what comes next based on what the kids know
        and like, and what the rest of their museum experience that day is. This
        works, but only for homogeneous school groups.
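
        (For a rough idea of the structure only, not the actual show: the flow
        is essentially a set of prerendered segments, each with a handful of
        follow-ups the operator can jump to. A toy Python sketch, with made-up
        segment names:)

            # Toy sketch of an operator-driven branching flow; the segment
            # names are hypothetical, not from the GlobeTrotting shows.
            SEGMENTS = {
                "intro": ["where_are_we", "what_do_you_see"],
                "where_are_we": ["continent_map"],
                "what_do_you_see": ["continent_map"],
                "continent_map": ["animals", "landmarks", "wrap_up"],
                "animals": ["landmarks", "wrap_up"],
                "landmarks": ["animals", "wrap_up"],
                "wrap_up": [],
            }

            def run_show(choose_next, start="intro"):
                """Play segments; after each one the operator picks the next
                based on what the kids know, like, and have seen that day."""
                current = start
                while current:
                    print(f"[playing prerendered segment: {current}]")
                    options = SEGMENTS[current]
                    if not options:
                        break
                    picked = choose_next(options)
                    current = picked if picked in options else options[0]

            # e.g. run_show(lambda options: options[0]) always takes the
            # first branch; a real operator would choose interactively.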

        Cretaceous Chaos was the first real attempt at total audience interaction.
        At MAPS you saw an 8-minute version with frame rates as low as 5/sec. We
        have trimmed it to 6 minutes and have a minimum frame rate of 11 now, and
        we will be running it for school groups this summer. The total interaction
        can be summed up as two leaning, three yelling, and one motion-detection
        interaction, with many questions which don't trigger anything, but are
        asked as real questions. The interactive system is very solid. The ETC's
        vision code can reliably pick up four inputs from the audience... lean
        left, right, center, and arms up... and can break the audience into teams
        by blocks. We use an active IR camera that really can see in the dark. The
        audio code can either feed the rendering software a number or compare it
        to trigger values. It is all off-the-shelf hardware. The rendering
        software works but is less stable. The interesting part is that it is free
        at alice.org. When the students come back in the fall we will work on
        something that combines real-time generated material with prerendered
        material.
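
        (To make the input side concrete, here is a rough Python sketch; the
        threshold, pose names, and function names are my own placeholders,
        not what the ETC or Alice software actually calls anything.)

            # Sketch of the two audio modes described above: feed the renderer
            # a raw loudness number, or fire a discrete event when the level
            # crosses a trigger value. Callbacks and threshold are hypothetical.
            def audio_input(level, send_value, fire_event,
                            trigger=0.6, mode="continuous"):
                """level is a normalized microphone loudness in [0, 1]."""
                if mode == "continuous":
                    send_value(level)      # e.g. louder yelling, faster dinosaur
                elif mode == "trigger" and level >= trigger:
                    fire_event()           # e.g. the herd finally stampedes

            # The vision side reliably distinguishes four audience poses:
            POSES = ("left", "center", "right", "arms_up")

            def steer_from_pose(pose):
                """Map a detected pose to a steering value (a hypothetical
                mapping; the real show wires poses to its own actions)."""
                return {"left": -1.0, "right": 1.0}.get(pose, 0.0)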

        As for the laser games, like Ryan, I think they are great fun but don't
        know how to use them in public shows.

        As a last note, I am wondering if anyone has seen the Smithsonian's new
        theater experience?
        http://www.cnn.com/2001/TRAVEL/DESTINATIONS/06/12/smithsonian.immersion.ap/

        -kerry

        Kerry Handron
        Earth Theater Director
        Carnegie Museum of Natural History
        HandronK@...
        412-578-2580
      • Ryan Wyatt
        Message 3 of 3 , Jun 15 1:06 AM
          Thanks for the follow-up on your theater, Kerry. Now I want
          to come back to Pittsburgh for your school shows...

          As far as the Smithsonian thing goes, the show sounds like
          "Body Wars," which I saw at iSci in Montréal last November.
          I reviewed it for a colleague, so I've appended my (somewhat
          flippant) review below. Background info: the Immersion Studios
          theater has three HD images placed side-by-side to create one
          single 3x16 image, plus about thirty workstations with
          touchscreens. People sit two to a screen (unless the crowds
          are small enough for people to work on their own) and answer
          questions to control the flow of the show. (Not just three
          buttons...)
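
          (For illustration only, since I am guessing at their back end:
          tallying touchscreen votes to pick the next branch might look
          something like this in Python; station IDs and option names are
          made up, not Immersion Studios' actual protocol.)

              from collections import Counter

              def tally(votes, options):
                  """votes maps station id -> chosen option (None if that
                  screen didn't answer in time); ties go to the option
                  listed first."""
                  counts = Counter(v for v in votes.values() if v in options)
                  return max(options, key=lambda o: counts[o])

              # e.g. thirty stations choosing which body system to enter next:
              votes = {f"station_{i}": ("circulatory" if i % 3 else "nervous")
                       for i in range(30)}
              print(tally(votes, ["circulatory", "nervous"]))  # circulatory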

          "Also, my take on 'Body Wars'... Interesting concept, but less
          engaging in its execution than I would have hoped. It seemed
          unable to decide whether it was a video game or an immersive
          experience: should I look at the large screen or the small one?
          I found the small screen too enormous a distraction from the
          large screen, and after all the voting and such, I didn't feel
          as though I had much effect on the outcome (although I tied for
          fourth place out of about forty people). The experience seemed
          to me very unconvincing in terms of feeling *truly* interactive.

          "On top of that, the program doesn't really seem to *teach*
          anything. Nor could I follow the effect that any of our choices
          had on the plot.

          "I didn't exactly look under the hood at iSci, but all the Immersion
          Studios equipment is behind glass, so I was able to guess at most
          of the functions. A lot of computing power -- and the response
          time still seemed sluggish to me! What kind of world do we live
          in?"

          BTW, Immersion Studios has a similar ride/experience/thingie in
          Boston... I'm actually *in* Boston at the moment for MIT's "Image
          and Meaning" conference, but I don't think I'll have time to explore
          much off MIT's campus (aside from a trip to the Museum and the
          *other* Hayden Planetarium this past Tuesday).

          I'll report back on "Image and Meaning" next week.


          Ryan Wyatt, Science Visualizer
          Rose Center for Earth & Space
          American Museum of Natural History
          79th Street & Central Park West
          New York, NY 10024
          212.313.7903 vox
          212.313.7868 fax