Re: Highest resolution fulldome projection

Expand Messages
    Message 1 of 15, Aug 16, 2006
      Lance, to answer your earlier question regarding highest resolution
      system for a Dome...

      At IPS in Melbourne we announced and showed our new Zorro projector
      which by early 2007 will be available in a native QXGA resolution (which
      is 2048 x 1536 pixels). In a 'standard' 6 channel configuration (five
      around the periphery and one at the cap) this will provide approximately
      2 arc minutes per pixel (measured from dome centre). Eye limiting
      resolution is considered to be around 1 arc minute (which is starball
      territory I understand).
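
      A rough back-of-envelope check of that figure, as a Python sketch (my
      simplifying assumptions: the channels' pixels spread evenly over the
      hemisphere, and blend overlap is ignored):

          import math

          # Hemisphere solid angle in square arc minutes:
          # 2*pi steradians; 1 sr = (180/pi)^2 deg^2; 1 deg^2 = 3600 arcmin^2
          hemisphere_arcmin2 = 2 * math.pi * (180 / math.pi) ** 2 * 3600

          # Six QXGA channels; overlap ignored (simplifying assumption)
          total_pixels = 6 * 2048 * 1536

          arcmin_per_pixel = math.sqrt(hemisphere_arcmin2 / total_pixels)
          print(round(arcmin_per_pixel, 2))  # ~1.98, i.e. roughly 2 arc minutes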

      We can of course add more channels to increase this resolution. By way
      of an example we are installing over 20 channels in to a flight
      simulator partial sphere to provide eye limiting resolution over a much
      greater field of view (oh and it's rear projected as well!).

      Whilst most of our business up to now has been based upon designing to
      meet a particular specification (we normally build systems or develop
      technologies to meet a client's specific requirements) the feedback from
      IPS indicates that a lot of people would be quite happy with a
      resolution based upon a 6 channel QXGA configuration to provide 2 arc
      minutes.

      I would be interested to hear from the community if there is a
      significant demand to match starball resolution in the near future.

      I hope this is helpful,

      Martin

      Martin Howe
      Vice President, Visualization & Operations
      SEOS Ltd
      tel: +44 (0) 1444 870 888
      mob +44 (0) 7793 414 553
      martin.howe@...
      www.seos.com

      -----Original Message-----
      From: fulldome@yahoogroups.com [mailto:fulldome@yahoogroups.com] On
      Behalf Of Martin Howe
      Sent: 18 July 2006 20:41
      To: fulldome@yahoogroups.com
      Subject: [fulldome] Re: Highest resolution fulldome projection

      I would like to propose that when answering Lance's question a
      normalising 'currency' of resolution is used.

      In the Fulldome group we have already discussed the measurement of arc
      minute resolution, measured from the 'design eye point' (or dome
      centre), assuming a full hemisphere in this case (180 degree), and
      either using pixel size (for instance 4 arc minutes per pixel) or per
      pixel pair (i.e. 8 arc minutes, normally referred to as an optical line
      pair), which also helps determine the modulation transfer function of
      the imaging technology - which in layperson's terms relates to sharpness.

      Simply stating the number of pixels available at the projection device
      can give very misleading figures.
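
      A minimal sketch of that normalising arithmetic (assuming a full
      180-degree hemisphere measured across the fisheye diameter; the
      2700-pixel width below is a made-up example chosen to reproduce the
      4 and 8 arc minute figures above):

          def arcmin_per_pixel(fisheye_width_px, fov_deg=180):
              """Arc minutes subtended by one pixel, seen from dome centre."""
              return fov_deg * 60 / fisheye_width_px

          def arcmin_per_line_pair(fisheye_width_px, fov_deg=180):
              """One optical line pair = two pixels."""
              return 2 * arcmin_per_pixel(fisheye_width_px, fov_deg)

          print(arcmin_per_pixel(2700))      # 4.0 arc minutes per pixel
          print(arcmin_per_line_pair(2700))  # 8.0 arc minutes per line pair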

      I look forward to reading the responses, we shall post ours shortly
      Lance.

      Kind regards

      Martin

      ________________________________

      From: fulldome@yahoogroups.com on behalf of Lance Tankersley
      Sent: Mon 17/07/2006 22:01
      To: fulldome@yahoogroups.com
      Subject: [fulldome] Highest resolution fulldome projection

      Thought I would cheat and save a little bit of time/research and ask the
      group what is currently, or will be within the next 6 months, the
      highest resolution fulldome projection system available today for a 50ft
      dome?

      Thanks
      -Lance

      --
      *****
      Lance Tankersley
      Omnisphere Director
      The Coca-Cola Space Science Center
      www.CCSSC.org
      Columbus State University
      706-649-1484
      lance@...
      *****
    • Ed Lantz
      Message 2 of 15, Aug 16, 2006
        The highest resolution multi-projector fulldome theater I know of is Denver Museum of Nature and Science, with 11 Barco DLP SXGA (I think) projectors - a SEOS project. It approaches 4k x 4k resolution. Next in line, I think, is Mexico City's Papalote Museo del Niño with 9 SXGA projectors (next-gen Barco DLPs).

        While I've not yet seen it, E&S's laser projector boasts 4k x 4k resolution in a single projector.

        Martin writes:

        <<At IPS in Melbourne we announced and showed our new Zorro projector which by early 2007 will be available in a native QXGA resolution (which is 2048 x 1536 pixels). In a 'standard' 6 channel configuration (five around the periphery and one at the cap) this will provide approximately 2 arc minutes per pixel (measured from dome centre). Eye limiting resolution is considered to be around 1 arc minute (which is starball territory I understand).>>

        Sounds impressive. Certainly better than 11 projectors.

        Eye limiting resolution is approx. 1 arc minute per _line pair_, no? That would be 0.5 arc minute per pixel. So this system could theoretically be improved by a factor of 4 in linear resolution (16x the pixel count) - not that anyone would want to manage that many pixels.

        Two arc minutes per pixel is 30 pixels per degree or a 5400 x 5400 pixel fisheye frame. That's about 23 million active pixels on a hemisphere (out of the total available 29 million pixels in the square frame, 23 million actually map within the fisheye circle - or a bit more if you use a truncated hemi). Good enough for screening IMAX films (in my opinion), provided you can hit a brightness of at least a foot-lambert or two. If driven by a real-time system this would make a killer starfield. However, there are not many fulldome producers who would render to this resolution anytime soon unless the economic model were there to support it.
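
        The arithmetic, as a quick Python check (all numbers straight from the
        paragraph above):

            import math

            arcmin_per_px = 2.0
            px_per_degree = 60 / arcmin_per_px    # 30 pixels per degree
            frame = int(180 * px_per_degree)      # 5400-pixel fisheye frame
            square_px = frame ** 2                # ~29.2 million in the square frame
            active_px = math.pi / 4 * square_px   # ~22.9 million inside the circle
            print(frame, square_px, round(active_px / 1e6, 1))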

        The larger issue with a multiprojector system as described is contrast and edge-blending artifacts. What good are all those pixels if there are double-images due to geometry mismatch, bright bands across the sky from poor color balancing, or a non-uniform gray background instead of black?

        Multiprojector systems have many issues:
        1) specular properties of the dome screen can make brightness levels between projectors depend on viewer position
        2) temperature fluctuations can cause mechanical drift resulting in optical/geometry mismatch
        3) lamp ageing can cause brightness and color nonuniformities between projectors
        4) routine maintenance can result in projectors being accidentally bumped and misaligned
        5) both white and black levels must be precisely matched between all projectors - resulting system contrast is limited by "least common denominator"
        6) electronic-only blends require very high contrast ratio projectors, since black levels are additive in blends (see the sketch after this list)
        7) maintenance nightmares scale linearly with the number of projectors used
        8) alignment nightmares scale exponentially with the number of projectors used (only half-joking here)
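
        To put numbers on item 6, a toy model (my own simplification: two
        projectors cross-fading linearly in the blend zone, brightness in
        arbitrary units):

            # Electronic blends ramp the video signal, but a projector's
            # black-level light leakage cannot be ramped away, so blacks add.
            white, contrast_ratio = 100.0, 2000.0
            black = white / contrast_ratio           # leakage of one projector

            blend_white = 0.5 * white + 0.5 * white  # ramps sum to full white
            blend_black = black + black              # leakage is additive
            print(white / black)                     # 2000:1 outside the blend
            print(blend_white / blend_black)         # 1000:1 inside the blend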

        Possible solutions to keep an eye on:
        1) single-lens projector design
        a. E&S laser projector is one approach to this - time will tell what other issues are raised with this fundamentally new technology
        b. tiling of multiple DLP or LCoS panels INSIDE the projector (likely approach being used by Imax Corp. in their new digital projector)
        c. dual Sony SXRD projectors leave only a single edge-blend to manage (Zeiss' 4Dome and Sky-Skan's definiti HD)
        d. with LCoS technology (Sony and JVC) pushing 4k x 2k resolution, 4k x 4k is probably right around the corner

        2) co-located, clustered multiprojector system with single lamp design

        3) hands-off imager-based auto-alignment system
        a. resulting system contrast is still limited by "least common denominator" effect
        b. black levels are still additive in blends
        c. bottom line - requires very high contrast projectors for best results

        I applaud all those actively contributing to improving state-of-the art in fulldome projector design!

        Ed Lantz
        Visual Bandwidth, Inc.
      • Paul Mowbray
        Message 3 of 15, Aug 16, 2006
          From a hardware point of view this is great news as more is always
          better.

          From a production point of view, argh!!

          It is challenging enough working with the currently accepted 3600x3600
          dome master files. The amount of effort required to produce these things
          can be painful to say the least. The detail required in texturing,
          modeling and rendering sucks up a lot of resources, not to mention
          storing all the files once they are rendered!
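
          To put rough numbers on the storage pain, a sketch (the 30 fps rate,
          25-minute running time and uncompressed 8-bit RGB frames are my
          assumptions, not production figures):

              width = height = 3600     # current dome master resolution
              bytes_per_px = 3          # uncompressed 8-bit RGB (assumed)
              fps, minutes = 30, 25     # assumed frame rate and show length

              frame_mb = width * height * bytes_per_px / 1e6  # ~38.9 MB per frame
              show_tb = frame_mb * fps * minutes * 60 / 1e6   # ~1.75 TB per show
              print(round(frame_mb, 1), round(show_tb, 2))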

          I am of course talking about pre-rendered shows; I have yet to see
          evidence (although I was not in attendance at IPS this year) that real
          time is ready for prime time. You just can't re-create a cinematic
          experience this way at the moment.

          Also I'm sure you guys at SEOS will have some spangly playback hardware,
          but our D3 can't even play back 1280x1024 MPEG-2 files consistently, so
          playback hardware is going to have to be ramped up to handle all the
          extra data.

          The worst bit of this whole equation is that the hardware is getting
          fancier and more expensive while the budgets for producing fulldome
          shows and the licensing revenue are in decline.

          What good is a dome that has detail approaching the level of what the
          human eye can perceive if all you have to show on it is the same shows
          that have been kicking around for the past few years, only available at
          a lower res than the system is capable of?

          I know that the production problems can be overcome with enough money
          thrown at them, but when the content producers are facing diminishing
          returns it is hard to justify.

          I love technology and I am excited about the quality of the video stars
          on this and future systems, but at the end of the day there is more to
          life than stars, and for the fulldome medium (which I think is one of
          the few remaining passive experiences with a distinct wow factor) to
          even think about any kind of mainstream penetration we need more
          happening on the content side of things - it is King after all ;)

          Just my 2cents

          Paul


          Paul Mowbray

          Digital Animation/Design Artist
          National Space Centre, Exploration Drive, Leicester, LE4 5NS, UK.
          Tel: +44 (0) 116 2582117
          Fax: +44 (0) 116 2582100
          www.spacecentre.co.uk <http://www.spacecentre.co.uk/>
        • Steve Cooper
          Message 4 of 15, Aug 16, 2006
            Our D3 is handling the 3600X3600 masters well, even swapping between
            that and older 2400X2400 or 2200X2200 masters. The communications
            issues are resolving as quickly as the resolution standards. We got
            ours in May of 2005, if that tells you anything.

            While the theater does have its passive 'WOW' factor, we have been
            using our theater to do more and more interactive approaches. Our
            'Cosmic Jukebox' is doing well (16,071...now 16,072...now 16,073 five to
            seven minute customer-built shows since May 2005), and our skies show is
            the best attended. We even make an effort to make 'Cardboard Rocket'
            sort of interactive by insisting the kids do all of the countdowns as
            loud as they possibly can. After the first 'BLASTOFF' the kids feel
            more involved and are totally engrossed.

            Later this year we will be doing a show in the theater with a 'live
            reporter' who will interact with the audience, the dome visuals, and two
            Real Time robot reporters on the dome.

            This might not work for the more commercial venues, but we do it
            because:
            Guests have responded well to it.
            We want to be as different from our IMAX as we possibly can.
            We wanted to exploit the newer capabilities of the system.
            Because we can (heh! heh!).

            Steve Cooper
            Technical Coordinator
            Science Center of Iowa
            401 W. ML King.
            Des Moines, Iowa 50309
            515-274-6868 X:231
          • Martin Howe
            Message 5 of 15, Aug 17, 2006
              Good to see more discussion on this point, a few threads are developing;
              hardware and content...

              Since Ed Lantz's first reply commenting on what eye limiting resolution
              is I've done a little more research (partly just by visiting our R&D
              guys down the corridor), and as always, things are not what they first
              appear.

              What the eye can resolve varies based on what it is looking at (dots,
              lines, etc), which part of the eye is looking at it, the brightness of
              the object and to some degree the brightness of the environment in which
              the eye is operating.

              I'd also like to raise the point that we're now dealing with square
              pixels, not round ones - more a commentary on technology progress
              than on what we can see (my guess is that E&S laser pixels are
              squarish - care to comment?)

              Generally though, around 2 arc minutes per pixel is regarded as the
              practical limit today, mainly for reasons of content generation and
              distribution. The only other limit I can think of is budget. I heard
              recently that someone paid $7m for a starball - is this true? That
              level of budget can buy a lot of pixels (for delivery Q1 07!).

              Ed goes on to point out the issues of channelis/zation (not a word in
              either language I'm sure). However I suspect that a true 4k x 4k
              projector is a while off whilst the cinema industry drive is to true
              4k x 2k (the actual native resolution of the imaging device; the
              current offerings still comprise multiple lower-resolution
              sub-components), which should come a lot sooner (within 2 years;
              installable!).

              Even then, the issue of channelisation stays with us. For instance
              the Sony 4k projector needs 4 inputs of lower resolutions, each of
              which has to be matched and synchronised, and whilst the point about
              projector matching is true, the issue of matching all the components
              upstream applies equally well. So even if a system uses 'only' two
              projectors, for example, to make up an image (which still need to be
              matched, by the way), there are still eight sets of electronics that
              need to perform identically too; if not, you may see synchronisation
              issues, image tearing, fill rate differences etc.

              It is definitely the case that whatever approach is taken in the medium
              term, multiple devices will need to be matched very well in a system to
              make the image appear truly coherent.
              The good news is that it is now increasingly possible to do this with
              today's technology, and there are fewer excuses for why the audience
              should be able to see blends on multiple-channel systems (however many)
              or image synchronisation issues.

              To answer Paul Mowbray's points relating to creating content and playing
              back these resolutions;

              1 - Er yes, we do have spangly playback hardware that we believe can
              deal with these resolutions :-) (we call it SEOS Media Server).
              Importantly it is based upon commodity PC hardware (albeit hi-end), so
              it should be relatively affordable to acquire and maintain (I say
              relatively because we all have our own benchmarks)
              2 - I was lucky enough to go to IPS and I saw real time software that I
              thought looked truly excellent. One was on our booth and the other on
              Sky-Skan's. Both were showing the Digital Universe Atlas
              (http://haydenplanetarium.org/universe/) and whilst I thought one was
              better than the other (and I will be accused of bias), both offer
              compelling audience experiences, I believe.
              3 - I agree, content is still King; I hope we can find a way for that
              to keep up with what the technology is now capable of!

              Oh, & screen technology needs to keep up too!!! (look forward to reading
              those replies)


              Martin



              Martin Howe
              Vice President, Visualization & Operations
              SEOS Ltd
              tel: +44 (0) 1444 870 888
              mob +44 (0) 7793 414 553
              www.seos.com
            • david mcconville
              Message 6 of 15, Aug 17, 2006
                At 11:33 AM 8/16/2006, Paul Mowbray wrote:
                >I am of course talking about pre-rendered shows, I am still yet to see
                >evidence (although I was not in attendance at IPS this year) that real
                >time is ready for prime time. You just can't re-create a cinematic
                >experience this way at the moment.

                Paul,

                I'm assuming you're talking about real-time astronomical applications
                specific to the system you're running? Uniview is running successfully in
                numerous SEOS installations and SkySkan has Digital Sky (both of which use
                the Digital Universe database). Additionally, there are many, many
                "dome-enabled" real-time applications for single-projector systems,
                including astronomy applications, gaming engines, flight/driving
                simulators, VJ applications, and many more that are permanently installed
                or touring with portable systems. I'm sure I'm missing some, but real-time
                is definitely already prime time and used in many locations. In our
                systems, real-time is often used more than pre-rendered movies.

                cheers,
                david



                --------------------------
                david mcconville
                http://www.elumenati.com
                612.605.0826 x5
              • Paul Mowbray
                Message 7 of 15, Aug 18, 2006
                  David,

                  I failed to explain my point sufficiently, I'll have another go.

                  The space-based real-time systems and data sets that are available
                  are very good and provide a compelling way to explore the universe.
                  What I was trying to say is that not all shows are about space or,
                  to be more specific, stars, planets and other cool space stuff.

                  Our latest show Astronaut featured a section explaining the effects
                  of microgravity on the human body. We go inside the inner ear,
                  observe bone decaying, etc. Another scene features advanced
                  character animation containing cloth simulation, particle effects,
                  and soft- and hard-body dynamics simulations. To my knowledge it is
                  not possible to implement all those rendering effects and that
                  density of geometry in any current fulldome real-time system.
                  Modern graphics cards can do some amazing things, but performing
                  indirect lighting, displacement mapping, raytracing, etc. at the
                  kind of resolutions we require and are looking at for the future is
                  just not possible as we speak.

                  Real-time offers many, many advantages over pre-rendered, opening
                  up a new paradigm for storytelling, but you cannot achieve the look
                  and feel that people have come to expect from their experiences
                  with the world of cinema... yet.

                  My personal passion lies in trying to recreate a cinematic
                  experience within the unique environment of the dome. I think there
                  is room for both pre-rendered and real-time applications for some
                  time to come, but my initial point was that you cannot achieve what
                  I want to put up on the dome with current real-time systems. When
                  the real-time systems can do what I want and better (which is
                  inevitable... maybe) I will be more than happy to transition over!

                  Thanks

                  Paul
                • ed@visualbandwidth.com
                  Message 8 of 15, Aug 18, 2006
                    Clearly there will always be the need for pre-rendered no matter how good real-time gets. Some astrophysical simulations take CPU-years to run. Artist Scott Draves uses 30,000 CPUs to run his particle animations. And live-action is rendered on an image capture system requiring linear playback.

                    I think the most popular use of real-time is to replace (and exceed) planetarium functionality in a digital system. Planetarians have traditionally told stories interactively, reaching down to turn a knob to produce diurnal, latitude, or heading star motion, advance through the annual seasons, display grids, etc. The digital planetarium adds a “z” axis allowing liftoff from the planet and travel through the known universe as well. This adds a tremendous educational dimension - knowledge of our place in the universe.

                    Regarding Ken's comments on HYBRID, I greatly respect what GOTO has done here. Many planetarians DO place emphasis on naked eye astronomy, and optomechanical does produce a superior sky simulation. A dedicated “star theater” tasked with naked-eye simulation of the night sky may require an optical projector for the best possible simulation.

                    A facility with broader educational goals may not see the value in such an investment, since naked eye astronomy is a narrow educational goal within the field of astronomy and astrophysics, which is a narrow branch of physics, which is only one of the sciences, which in turn is only one possible curriculum at an educational institution which might include art, drama and other schools of study that would take an interest in a digital theater resource (planetarians may have to learn to share). Still, starballs are kewl and I do hope they don't fade away too soon...

                    How a HYBRID system can be cheaper than a digital system alone is beyond me, Ken.

                    Ed

                    Ed Lantz
                    President
                    Visual Bandwidth, Inc.
                    Tel: 610.590.4269
                    Mobile: 484.467.1267
                    ed@...
                    www.visualbandwidth.com
                  • Todd Slisher
                    Message 9 of 15, Aug 18, 2006
                      Just a quick note here. With respect to all the cinematic concerns (and of course pre-rendered will always be more cinematically beautiful than real-time, as both real-time and rendering capabilities evolve), what people are trying to achieve with real-time systems is NOT a cinematic experience. The most effective use of real-time systems that I've seen is with a live presenter and an interactive program. These will necessarily be different than a cinematic experience. Including the live presenter element breaks some of the immersive qualities of a cinematic experience. And this is fine, if interactivity is your goal. In fact, it could be said that part of the goal of an interactive experience is to break the cinematic 'trance' that often comes over audiences and cause them to think, rather than just view.

                      That said - I'm a big believer that both interactive and pre-rendered cinematic experiences should be part of our virtual 'bag of tricks' that we can deliver to our audiences. Let's have the best both worlds can deliver.

                      Just my $0.02

                      Todd


                      Todd K. Slisher
                      Vice President of Science Programs
                      Detroit Science Center
                      5020 John R Street
                      Detroit, MI 48202
                      Tslisher@...
                    • david mcconville
                      Message 10 of 15, Aug 18, 2006
                        Paul,

                        It seems that the true utility of real-time systems is not to re-create the
                        cinematic experience but to evolve the immersive mediated experience beyond
                        a solely passive one. Granted, simply rendering pre-determined shots and
                        camera paths in real-time to replace their pre-rendered equivalent for most
                        material is currently not feasible (though many gaming engines are heading
                        in that direction). But watch AMNH's Passport to the Universe and then look
                        at the same shots rendered in real-time in Uniview - you'll be hard-pressed
                        to tell the difference.

                        That said, Source, Unreal Tournament, Crytek and others are great examples
                        of gaming engines that are increasingly being used to create Machinima
                        real-time cinema (see http://www.machinima.com for examples). Any real-time
                        engine can theoretically be adapted for use in domes (we've already helped
                        create a dome version of Unreal Tournament at http://planetjeff.net), so
                        it's really a matter of the value proposition to implement these
                        modifications. It is very unlikely that the best real-time fulldome
                        engines, especially for non-astronomy applications, will be developed by
                        fulldome hardware companies - they will likely be modified engines that are
                        already being developed for gaming and visualization applications.

                        All of this was discussed at length last year at the NEI fulldome meeting
                        at Chabot
                        (http://www.ips2008.org/partners/images/Chabot_NEI_report.pdf#search=%22%22will%20wright%22%20fulldome%22).
                        We theorized and waxed poetic about how we could use Will Wright's new Spore
                        game in the dome and how to interface numerous handheld devices with the
                        larger display (i.e. everyone having a wireless PSP and playing each other on
                        the dome). Do yourself a favor and watch
                        http://video.google.com/videoplay?docid=8372603330420559198 - you'll
                        quickly get the idea of why real-time environments could be incredible for
                        interactive education. CMU Entertainment Technology Center students
                        addressed these issues as well with their Interactive Dome Project
                        (http://www.etc.cmu.edu/projects/dome/) last year.

                        And Tom is right - just because it's real-time doesn't mean it's cheap,
                        especially for high-quality data collection, modeling, animation,
                        simulation, lighting, etc. But serious real-time capabilities do
                        dramatically increase possibilities for new forms of interactivity,
                        networking, education, exploration, and experimentation - in other words,
                        some of the reasons that many of us became interested in immersive virtual
                        environments to begin with...

                        cheers,
                        david

                        --------------------------
                        david mcconville
                        http://www.elumenati.com
                        612.605.0826 x5
                      • pauldavidbourke
                        Message 11 of 15, Aug 19, 2006
                          My quick comment on full dome resolution... the final resolution, as it relates to the informational content on the final dome surface, depends on a whole raft of factors.

                          The resolution of the raw frames that make up the content is only one part of the story; it only determines the limit of the visual quality of the final projected result, given a perfect projection system. There are, however, lots of factors that reduce the effective resolution of the result reflected off the dome surface, some of these are:

                          1. Codec choice. It seems many (perhaps most) planetariums are not using lossless encoding/playback, in which case the pixels being sent to the projection hardware are at a lower fidelity than the original rendered material. Personally I've always been surprised that some/many systems even use MPEG and variants... surely with today's hardware we can move past such compromising technologies (see the data-rate sketch after this list).

                          2. Projectors don't give a 1:1 representation of pixels on the projection surface; this is especially so for CRT technology but also true for digital projectors. There are all sorts of sources of this, including the use of analog signals, the reality of lens physics, focusing on a curved surface, etc.

                          3. Many multiprojector systems use digital warping to correct for the spherical geometry; this lowers the information content... individual pixels get contributions from neighbours, so they are no longer independent.

                          4. Edge blending is rarely (if ever) perfect, the result is usually a blurring (often significant) of the image between the projection patches.
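
                          The data-rate sketch referred to in point 1 (the 3600x3600
                          master size comes from earlier in the thread; the 30 fps rate
                          and 8-bit RGB depth are my assumptions):

                              width = height = 3600      # dome master resolution
                              fps, bytes_per_px = 30, 3  # assumed rate, 8-bit RGB

                              gb_per_s = width * height * bytes_per_px * fps / 1e9
                              print(round(gb_per_s, 2))  # ~1.17 GB/s uncompressed

                          Sustaining over a gigabyte per second of playback is why so
                          many systems fall back on lossy codecs, with the fidelity
                          cost point 1 describes.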

                          I'm sure it is easy to find examples of planetariums with significantly higher theoretical projected resolution but where content looks worse than planetariums with lower spec'ed projection hardware.

                          One should also note that just because raw pristine frames may have a certain pixel count, that certainly does not mean the content is higher resolution (in the informational sense) than content with a lower pixel count. This is clearly the case for filmed material, but also for CG. The resolution as it relates to quality depends on things such as antialiasing settings, quality of the model geometry, texture resolution, and other factors related to rendering technique.
                        • Ryan Wyatt
                          Message 12 of 15, Aug 20, 2006
                            Posted for Brad Thompson:

                            Hello,

                            Sorry for the long diatribe, but Paul and David touched lightly on
                            something that I've encountered over and over again throughout my
                            years sitting under a dome. This is a bit off-topic and my long post
                            isn't meant as a retort to anything anyone said here. There is this
                            myth about real-time technology that I seem to keep encountering over
                            and over, even from people who really should know better. The key to
                            Paul's statement is that you "can't re-create a CINEMATIC experience
                            this way at the moment." I'm guessing that by "cinematic
                            experience", he's referring to passive linear cinematic
                            storytelling. Sure, Uniview, Digital Sky, D3, Starry Night, game
                            engines, etc. are out there and being used to create great and unique
                            experiences. However, they aren't currently the most powerful/
                            effective/efficient tools for passive linear cinematic storytelling
                            on the dome (aka a cinematic experience.)

                            For that to be the case, a system would have to be able to generate
                            The Enchanted Reef, Black Holes: The other side of Infinity, DarkStar
                            Adventure, Astronaut, or (insert name of your favorite prerendered
                            dome show here) frames in "real-time" without visual compromises. It
                            would have to be a more efficient platform to produce upon than the
                            applications that those shows were produced with (3dsMax, Maya,
                            custom supercomputer apps, etc.), and the resulting production would
                            have to be as portable as prerendered content is now. To my
                            knowledge, this doesn't even come close to describing any system that
                            currently exists or will exist in the foreseeable future. I'm not
                            saying that this can never happen, but ever since high-end graphics
                            technology became a commodity item, Nvidia, ATI and the like have
                            been trying to convince us that the next generation of their
                            technology will bring cinema quality graphics in real-time. So far,
                            they've always been at least 10 years behind. On the tech side, what
                            happens is that new methods are pioneered in software labs at
                            universities or the R&D teams at animation or FX studios. These
                            methods and tools quickly show up in animation software, then in the
                            cinema. People's expectations are raised, and then finally, the
                            propeller-heads at the video card companies and the engine
                            programmers at game companies figure out ways to accelerate these new
                            methods via new hardware. I could mention Blinn's law here as well,
                            which is a long standing maxim credited to CG pioneer Jim Blinn that
                            states “any renderer, no matter how fast processors get, will always
                            take a couple of hours, because that's the tolerance level of artists.”

                            Furthermore, I find that people often don't recognize that the same
                            graphics technology that powers all the cool real-time digital dome
                            systems out there also powers 3dsMax, Maya, FinalCut, AfterEffects,
                            etc. I have a real-time viewport where I can spin around, set up my
                            animations, press play, and immediately see the results. If I've
                            kept things basic, I don't have to wait for a render to know what
                            things will look like. It's only when I start taking my visual
                            fidelity or complexity beyond what real-time graphics systems are
                            capable of that I have to do test renders to see the results. Also,
                            what it means to "keep things basic" is evolving at the same pace as
                            real-time graphics technology. Finally, these tools have been
                            hammered on and optimized for efficiency of production much more, and
                            by many MANY more people than the latest narrowly focused tool that
                            us dome-specific guys/gals invent.

                            In the end, it all comes down to the question of which tools will
                            help me tell my story the best (highest quality, fastest, most
                            efficient, etc.) in the dome. Software applications usually have to
                            make a choice between simplicity and power. If I only ever produced
                            "the sky tonight" type talks, then I'd probably choose something like
                            Starry Night, D3, or one of the other simple streamlined focused
                            applications for doing that (unless I'm concerned about wide
                            distribution). If I wanted to do something truly interactive, then
                            obviously a real-time application is the only choice. If I want to
                            craft a linear cinematic experience with no limitations except my own
                            imagination, budget and ability, then the hundreds of tools that fall
                            into the "prerendered" category are the only logical choice.
                            Remember, all of these things are just tools to create the
                            experience. Understand and use your tools or else they will use you.

                            --
                            Brad Thompson - bthompson@...
                            Digital Animation & Design - Spitz, Inc.
                            http://www.spitzinc.com

                            -- "Hush, may I ask you all for silence? The dreamer is still asleep"
                          • Ryan Wyatt
                            Message 13 of 15, Aug 22, 2006
                              I'll echo others' sentiments about leaving room for both real-time
                              and rendered technologies in our domes, in particular, Todd Slisher's
                              words:

                              > I'm a big believer that both interactive and pre-rendered cinematic
                              > experiences should be part of our virtual 'bag of tricks' that we
                              > can deliver to our audiences. Let's have the best both worlds can
                              > deliver.

                              Agreed!

                              In many ways, the two are apples and oranges. I think AMNH's attempt
                              to proceed down both routes shows how we value the differing
                              experiences they can offer, even though our 430-seat theater doesn't
                              give us the same kind of opportunities for real-time experimentation
                              that a smaller, classroom-sized theater would (which is why we need
                              to build a second theater, in my opinion, but that's another story).

                              However, the "apples and oranges" mentality doesn't do the
                              relationship justice. Our monthly real-time program, "Virtual
                              Universe," has given us an opportunity to explore topics in the dome
                              that have in turn informed our big-budget "space shows." One huge
                              benefit is that the same people who create the shows (e.g., Carter
                              Emmart and I) also share responsibilities for the monthly program.
                              So we get to experience first-hand audience reactions to our work,
                              tweak and modify the presentation both visually and verbally, and
                              basically play with our ideas before they appear in a more formal,
                              "cinematic" context.

                              (Using my preferred term "narrative journey" for immersive
                              experiences, the difference is a bit like the slick, pre-recorded
                              tour versus the friend pointing out places of interest. A well-
                              informed friend, with enough experience, can provide some insight
                              into improving the pre-produced version.)

                              We also use our dome's real-time capabilities for testing and
                              creating flight paths. In particular, we were able to collaborate
                              with NCSA over the phone as we shared the same virtual environment:
                              they would make a flight path, share it with us, then we could fly it
                              together through an identical virtual space. But this idea could be
                              carried further... If a production work flow were designed
                              correctly, it seems that one could effectively create a real-time
                              version of a show (or portions thereof), test it with audiences, then
                              use that feedback to inform development of a final production. Seems
                              like it would be worth a try, at any rate. N.B., however, that real-
                              time needs must be considered specifically in advance of a
                              production; done correctly, they can proceed naturally from work done
                              for the rendered work, but that doesn't happen for free or without an
                              investment of time and resources.

                              Finally, I see potential for real-time show distribution. Because
                              flightpaths and models could be downloaded off an FTP site or sent on
                              a DVD-R, they represent a much more (and quite literally)
                              "lightweight" means of distributing content than shipping hard drives
                              around the world. As real-time engines continue to improve, there
                              are many things they do just as well as pre-rendered media (e.g.,
                              most solar system shows, some general astronomical content), so why
                              not save on the FedEx bill? Perhaps one could even disassemble the
                              show kits, in fine old planetarium tradition, to create mini-shows or
                              portions of a program that could be used in different contexts?

                              Just my $0.02.


                              Ryan, a.k.a.
                              Ryan Wyatt, Science Visualizer
                              Rose Center for Earth & Space
                              American Museum of Natural History
                              79th Street at Central Park West
                              New York, NY 10024