Re: Highest resolution fulldome projection

  • Paul Mowbray
    Message 1 of 15 , Aug 18, 2006
      David,

      I failed to explain my point sufficiently, I'll have another go.

      The space-based real-time systems and data sets that are available are
      very good and provide a compelling way to explore the universe. What I
      was trying to say is that not all shows are about space or, to be more
      specific, stars, planets, and other cool space stuff.

      Our latest show Astronaut featured a section explaining the effects of
      microgravity on the human body. We go inside the inner ear, observe bone
      decaying, etc. Another scene features advanced character animation
      containing cloth simulation, particle effects, and soft- and hard-body
      dynamics simulations. To my knowledge it is not possible to implement
      all of the rendering effects and density of geometry in any current
      fulldome real-time system. Modern-day graphics cards can do some amazing
      things, but performing indirect lighting, displacement mapping,
      raytracing, etc. at the kind of resolutions we require and are looking at
      for the future is simply not possible as we speak.

      Real-time offers many, many advantages over pre-rendered, opening up a
      new paradigm for storytelling, but you cannot achieve the look and
      feel that people have come to expect from their experiences with the
      world of cinema... yet.

      My personal passion lies in trying to recreate a cinematic experience
      within the unique environment of the dome. I think there is room for
      both pre-rendered and real-time applications for some time to come, but
      my initial point was that you cannot achieve what I want to put up on
      the dome with current real-time systems. When the real-time systems can
      do what I want and better (which is inevitable... maybe), I will be more
      than happy to transition over!

      Thanks

      Paul
    • ed@visualbandwidth.com
      Message 2 of 15 , Aug 18, 2006
        Clearly there will always be a need for pre-rendered no matter how good real-time gets. Some astrophysical simulations take CPU-years to run. Artist Scott Draves uses 30,000 CPUs to run his particle animations. And live action is rendered on an image-capture system, requiring linear playback.

        I think the most popular use of real-time is to replace (and exceed) planetarium functionality in a digital system. Planetarians have traditionally told stories interactively, reaching down to turn a knob to produce diurnal, latitude, or heading star motion, advance through the annual seasons, display grids, etc. The digital planetarium adds a “z” axis allowing liftoff from the planet and travel through the known universe as well. This adds a tremendous educational dimension - knowledge of our place in the universe.
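
        To make the knob-turning functionality concrete, the heart of diurnal, latitude, and heading motion is just the classical equatorial-to-horizontal coordinate conversion, evaluated every frame. Here is a minimal sketch of that math in Python (illustrative only; the function and variable names are mine, not any vendor's code): advancing the local sidereal time spins the sky, and changing the latitude re-tilts it.

        import math

        def equatorial_to_horizontal(ra_hours, dec_deg, lat_deg, lst_hours):
            """Convert a star's RA/Dec to altitude/azimuth for an observer.

            Advancing lst_hours produces diurnal motion; changing lat_deg
            tilts the sky; adding an offset to the azimuth rotates the
            heading. Azimuth is returned measured from north through east.
            """
            hour_angle = math.radians((lst_hours - ra_hours) * 15.0)
            dec = math.radians(dec_deg)
            lat = math.radians(lat_deg)

            sin_alt = (math.sin(dec) * math.sin(lat)
                       + math.cos(dec) * math.cos(lat) * math.cos(hour_angle))
            alt = math.asin(sin_alt)
            az = math.atan2(-math.cos(dec) * math.sin(hour_angle),
                            math.sin(dec) * math.cos(lat)
                            - math.cos(dec) * math.cos(hour_angle) * math.sin(lat))
            return math.degrees(alt), math.degrees(az) % 360.0

        # Example: Vega (RA ~18.6 h, Dec ~+38.8 deg) seen from latitude 40 N
        # at local sidereal time 20 h.
        print(equatorial_to_horizontal(18.62, 38.78, 40.0, 20.0))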

        Regarding Ken's comments on HYBRID, I greatly respect what GOTO has done here. Many planetarians DO place emphasis on naked eye astronomy, and optomechanical does produce a superior sky simulation. A dedicated “star theater” tasked with naked-eye simulation of the night sky may require an optical projector for the best possible simulation.

        A facility with broader educational goals may not see the value in such an investment, since naked-eye astronomy is a narrow educational goal within the field of astronomy and astrophysics, which is a narrow branch of physics, which is only one of the sciences, which in turn is only one possible curriculum at an educational institution that might include art, drama, and other schools of study that would take an interest in a digital theater resource (planetarians may have to learn to share). Still, starballs are kewl and I do hope they don't fade away too soon...

        How a HYBRID system can be cheaper than a digital system alone is beyond me, Ken.

        Ed

        Ed Lantz
        President
        Visual Bandwidth, Inc.
        Tel: 610.590.4269
        Mobile: 484.467.1267
        ed@...
        www.visualbandwidth.com
      • Todd Slisher
        Message 3 of 15 , Aug 18, 2006
          Just a quick note here. With respect to all the cinematic concerns (and of course pre-rendered will always be more cinematically beautiful than real-time, as both real-time and rendering capabilities evolve), what people are trying to achieve with real-time systems is NOT a cinematic experience. The most effective uses of real-time systems that I've seen involve a live presenter and an interactive program. These will necessarily be different from a cinematic experience. Including the live presenter element breaks some of the immersive qualities of a cinematic experience. And this is fine, if interactivity is your goal. In fact, it could be said that part of the goal of an interactive experience is to break the cinematic 'trance' that often comes over audiences and cause them to think, rather than just view.

          That said - I'm a big believer that both interactive and pre-rendered cinematic experiences should be part of our virtual 'bag of tricks' that we can deliver to our audiences. Let's have the best both worlds can deliver.

          Just my $0.02

          Todd


          Todd K. Slisher
          Vice President of Science Programs
          Detroit Science Center
          5020 John R Street
          Detroit, MI 48202
          Tslisher@...
        • david mcconville
          Message 4 of 15 , Aug 18, 2006
            Paul,

            It seems that the true utility of real-time systems is not to re-create the
            cinematic experience but to evolve the immersive mediated experience beyond
            a solely passive one. Granted, simply rendering pre-determined shots and
            camera paths in real-time to replace their pre-rendered equivalent for most
            material is currently not feasible (though many gaming engines are heading
            in that direction). But watch AMNH's Passport to the Universe and then look
            at the same shots rendered in real-time in Uniview - you'll be hard-pressed
            to tell the difference.

            That said, Source, Unreal Tournament, Crytek and others are great examples
            of gaming engines that are increasingly being used to create Machinima
            real-time cinema (see http://www.machinima.com for examples). Any real-time
            engine can theoretically be adapted for use in domes (we've already helped
            create a dome version of Unreal Tournament at http://planetjeff.net), so
            it's really a matter of the value proposition to implement these
            modifications. It is very unlikely that the best real-time fulldome
            engines, especially for non-astronomy applications, will be developed by
            fulldome hardware companies - they will likely be modified engines that are
            already being developed for gaming and visualization applications.
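
            For anyone curious how a flat-screen engine ends up on a dome: a common
            approach (not necessarily the exact method used for the planetjeff.net
            port) is to render the scene into a cubemap and then resample it into an
            angular fisheye "dome master." A rough sketch of the per-pixel mapping:

            import math

            def fisheye_ray(px, py, size, aperture_deg=180.0):
                """Map a pixel of a square fisheye (dome master) image to a 3-D ray.

                px, py are pixel coordinates, size is the image width/height.
                Returns None outside the fisheye circle, otherwise a unit vector
                (x, y, z) with +z pointing toward the dome zenith.
                """
                nx = 2.0 * (px + 0.5) / size - 1.0      # normalize to [-1, 1]
                ny = 2.0 * (py + 0.5) / size - 1.0
                r = math.hypot(nx, ny)
                if r > 1.0:
                    return None
                theta = r * math.radians(aperture_deg) / 2.0   # angle from zenith
                phi = math.atan2(ny, nx)                       # azimuth in the image
                return (math.sin(theta) * math.cos(phi),
                        math.sin(theta) * math.sin(phi),
                        math.cos(theta))

            def cube_face(ray):
                """Pick which face of the rendered cubemap the ray should sample."""
                x, y, z = ray
                ax, ay, az = abs(x), abs(y), abs(z)
                if az >= ax and az >= ay:
                    return "+z" if z > 0 else "-z"
                if ax >= ay:
                    return "+x" if x > 0 else "-x"
                return "+y" if y > 0 else "-y"

            # The centre pixel of a 4096-pixel dome master looks straight up (+z).
            print(cube_face(fisheye_ray(2048, 2048, 4096)))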

            All of this was discussed at length last year at the NEI fulldome meeting
            at Chabot
            (http://www.ips2008.org/partners/images/Chabot_NEI_report.pdf#search=%22%22will%20wright%22%20fulldome%22).
            We theorized and waxed poetic about how we could use Will Wright's new Spore
            game in the dome and how to interface numerous handheld devices with the
            larger display (i.e., everyone having a wireless PSP and playing each other on
            the dome). Do yourself a favor and watch
            http://video.google.com/videoplay?docid=8372603330420559198 - you'll
            quickly get the idea of why real-time environments could be incredible for
            interactive education. CMU Entertainment Technology Center students
            addressed these issues as well with their Interactive Dome Project
            (http://www.etc.cmu.edu/projects/dome/) last year.
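
            As a purely hypothetical illustration of the handheld idea above, each
            audience device only needs to send small input events to whatever machine
            is driving the dome. The address, port, and message format below are
            invented for the sketch; a real system would define its own protocol.

            import json
            import socket

            DOME_HOST, DOME_PORT = "192.168.1.50", 9999   # hypothetical dome server

            def send_player_input(player_id, action):
                """Send one input event from a handheld to the dome host over UDP."""
                msg = json.dumps({"player": player_id, "action": action})
                sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
                sock.sendto(msg.encode("utf-8"), (DOME_HOST, DOME_PORT))
                sock.close()

            # Each audience member's device could call this when a button is pressed.
            send_player_input(player_id=7, action="thrust")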

            And Tom is right - just because it's real-time doesn't mean it's cheap,
            especially for high-quality data collection, modeling, animation,
            simulation, lighting, etc. But serious real-time capabilities do
            dramatically increase possibilities for new forms of interactivity,
            networking, education, exploration, and experimentation - in other words,
            some of the reasons that many of us became interested in immersive virtual
            environments to begin with...

            cheers,
            david

            --------------------------
            david mcconville
            http://www.elumenati.com
            612.605.0826 x5
          • pauldavidbourke
            Message 5 of 15 , Aug 19, 2006
              My quick comment on fulldome resolution... the final resolution, as it relates to the informational content on the final dome surface, is dependent on a whole raft of factors.

              The resolution of the raw frames that make up the content is only one part of the story; it only determines the limit of the visual quality of the final projected result, given a perfect projection system. There are, however, lots of factors that reduce the effective resolution of the result reflected off the dome surface. Some of these are:

              1. Codec choice. It seems many (perhaps most) planetariums are not using lossless encoding/playback, in which case the pixels being sent to the projection hardware are at a lower fidelity than the original rendered material. Personally I've always been surprised that some/many systems even use MPEG and its variants... surely with today's hardware we can move past such compromising technologies.

              2. Projectors don't give a 1:1 representation of pixels on the projection surface; this is especially so for CRT technology but also true for digital projectors. There are all sorts of sources for this, including the use of analog signals, the reality of lens physics, focusing on a curved surface, etc.

              3. Many multiprojector systems use digital warping to correct for the spherical geometry; this lowers the information content... individual pixels get contributions from neighbours, so they are no longer independent (see the sketch after this list).

              4. Edge blending is rarely (if ever) perfect; the result is usually a blurring (often significant) of the image between the projection patches.
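
              A minimal sketch of points 3 and 4 (my own illustration, not any
              particular vendor's warping code): bilinear resampling during the warp
              mixes each output pixel from four source pixels, and an edge-blend ramp
              fades a projector's contribution across the overlap zone.

              import math

              def bilinear_sample(img, u, v):
                  """Fetch a warped pixel by blending its four nearest source pixels.

                  img is a 2-D list of grey values indexed img[row][col]; (u, v) are
                  the non-integer source coordinates produced by the warp. Because
                  every output pixel mixes four inputs, neighbouring pixels are no
                  longer independent and fine detail is softened.
                  """
                  x0, y0 = int(math.floor(u)), int(math.floor(v))
                  fx, fy = u - x0, v - y0
                  top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
                  bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
                  return top * (1 - fy) + bot * fy

              def blend_weight(x, width, overlap):
                  """Edge-blend ramp: fade a projector's output across the overlap."""
                  if x < overlap:
                      return x / overlap                # ramp up on the left edge
                  if x > width - overlap:
                      return (width - x) / overlap      # ramp down on the right edge
                  return 1.0

              # A crisp 2x2 checkerboard sampled half a pixel off-centre: the sharp
              # edge averages away to a uniform grey.
              img = [[0, 255], [255, 0]]
              print(bilinear_sample(img, 0.5, 0.5))     # -> 127.5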

              I'm sure it is easy to find examples of planetariums with significantly higher theoretical projected resolution but where content looks worse than planetariums with lower spec'ed projection hardware.

              One should also note that just because raw pristine frames may have a certain pixel count, that certainly does not mean the content is higher resolution (in the informational sense) than content with a lower pixel count. This is clearly the case for filmed material, but also for CG. The resolution as it relates to quality is dependent on things such as antialiasing settings, quality of the model geometry, texture resolution, and other factors related to rendering technique.
            • Ryan Wyatt
              Message 6 of 15 , Aug 20, 2006
                Posted for Brad Thompson:

                Hello,

                Sorry for the long diatribe, but Paul and David touched lightly on
                something that I've encountered over and over again throughout my
                years sitting under a dome. This is a bit off-topic and my long post
                isn't meant as a retort to anything anyone said here. There is this
                myth about real-time technology that I seem to keep encountering over
                and over, even from people who really should know better. The key to
                Paul's statement is that you "can't re-create a CINEMATIC experience
                this way at the moment." I'm guessing that by "cinematic
                experience" he's referring to passive linear cinematic
                storytelling. Sure, Uniview, Digital Sky, D3, Starry Night, game
                engines, etc. are out there and being used to create great and unique
                experiences. However, they aren't currently the most powerful/
                effective/efficient tools for passive linear cinematic storytelling
                on the dome (a.k.a. a cinematic experience).

                For that to be the case, a system would have to be able to generate
                The Enchanted Reef, Black Holes: The Other Side of Infinity, DarkStar
                Adventure, Astronaut, or (insert name of your favorite prerendered
                dome show here) frames in "real-time" without visual compromises. It
                would have to be a more efficient platform to produce upon than the
                applications that those shows were produced with (3dsMax, Maya,
                custom supercomputer apps, etc.), and the resulting production would
                have to be as portable as prerendered content is now. To my
                knowledge, this doesn't even come close to describing any system that
                currently exists or will exist in the foreseeable future. I'm not
                saying that this can never happen, but ever since high-end graphics
                technology became a commodity item, Nvidia, ATI and the like have
                been trying to convince us that the next generation of their
                technology will bring cinema-quality graphics in real-time. So far,
                they've always been at least 10 years behind. On the tech side, what
                happens is that new methods are pioneered in software labs at
                universities or by the R&D teams at animation or FX studios. These
                methods and tools quickly show up in animation software, then in the
                cinema. People's expectations are raised, and then finally, the
                propeller-heads at the video card companies and the engine
                programmers at game companies figure out ways to accelerate these new
                methods via new hardware. I could mention Blinn's law here as well,
                a long-standing maxim credited to CG pioneer Jim Blinn which states
                that "any renderer, no matter how fast processors get, will always
                take a couple of hours, because that's the tolerance level of artists."

                Furthermore, I find that people often don't recognize that the same
                graphics technology that powers all the cool real-time digital dome
                systems out there also powers 3dsMax, Maya, Final Cut, After Effects,
                etc. I have a real-time viewport where I can spin around, set up my
                animations, press play, and immediately see the results. If I've
                kept things basic, I don't have to wait for a render to know what
                things will look like. It's only when I start taking my visual
                fidelity or complexity beyond what real-time graphics systems are
                capable of that I have to do test renders to see the results. Also,
                what it means to "keep things basic" is evolving at the same pace as
                real-time graphics technology. Finally, these tools have been
                hammered on and optimized for efficiency of production much more, and
                by many, MANY more people, than the latest narrowly focused tool that
                we dome-specific guys/gals invent.

                In the end, it all comes down to the question of which tools will
                help me tell my story the best (highest quality, fastest, most
                efficient, etc.) in the dome. Software applications usually have to
                make a choice between simplicity and power. If I only ever produced
                "the sky tonight" type talks, then I'd probably choose something like
                Starry Night, D3, or one of the other simple, streamlined, focused
                applications for doing that (unless I'm concerned about wide
                distribution). If I wanted to do something truly interactive, then
                obviously a real-time application is the only choice. If I want to
                craft a linear cinematic experience with no limitations except my own
                imagination, budget, and ability, then the hundreds of tools that fall
                into the "prerendered" category are the only logical choice.
                Remember, all of these things are just tools to create the
                experience. Understand and use your tools or else they will use you.

                --
                Brad Thompson - bthompson@...
                Digital Animation & Design - Spitz, Inc.
                http://www.spitzinc.com

                -- "Hush, may I ask you all for silence? The dreamer is still asleep"
              • Ryan Wyatt
                Message 7 of 15 , Aug 22, 2006
                  I'll echo others' sentiments about leaving room for both real-time
                  and rendered technologies in our domes, in particular, Todd Slisher's
                  words:

                  > I'm a big believer that both interactive and pre-rendered cinematic
                  > experiences should be part of our virtual 'bag of tricks' that we
                  > can deliver to our audiences. Let's have the best both worlds can
                  > deliver.

                  Agreed!

                  In many ways, the two are apples and oranges. I think AMNH's attempt
                  to proceed down both routes shows how we value the differing
                  experiences they can offer, even though our 430-seat theater doesn't
                  give us the same kind of opportunities for real-time experimentation
                  that a smaller, classroom-sized theater would (which is why we need
                  to build a second theater, in my opinion, but that's another story).

                  However, the "apples and oranges" mentality doesn't do the
                  relationship justice. Our monthly real-time program, "Virtual
                  Universe," has given us an opportunity to explore topics in the dome
                  that have in turn informed our big-budget "space shows." One huge
                  benefit is that the same people who create the shows (e.g., Carter
                  Emmart and I) also share responsibilities for the monthly program.
                  So we get to experience first-hand audience reactions to our work,
                  tweak and modify the presentation both visually and verbally, and
                  basically play with our ideas before they appear in a more formal,
                  "cinematic" context.

                  (Using my preferred term "narrative journey" for immersive
                  experiences, the difference is a bit like the slick, pre-recorded
                  tour versus the friend pointing out places of interest. A well-
                  informed friend, with enough experience, can provide some insight
                  into improving the pre-produced version.)

                  We also use our dome's real-time capabilities for testing and
                  creating flight paths. In particular, we were able to collaborate
                  with NCSA over the phone as we shared the same virtual environment:
                  they would make a flight path, share it with us, and then we could
                  fly it together through an identical virtual space. But this idea
                  could be carried further... If a production workflow were designed
                  correctly, it seems that one could effectively create a real-time
                  version of a show (or portions thereof), test it with audiences, then
                  use that feedback to inform development of a final production. Seems
                  like it would be worth a try, at any rate. N.B., however, that
                  real-time needs must be considered specifically in advance of a
                  production; done correctly, they can proceed naturally from work done
                  for the rendered version, but that doesn't happen for free or without
                  an investment of time and resources.
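
                  To make the flight-path idea concrete, here is a minimal, purely
                  illustrative sketch; the file format and field names are invented
                  for the example. A handful of keyframes like this is only a few
                  kilobytes, yet both sites can fly the identical path through the
                  same scene.

                  import json

                  # Hypothetical flight path: camera keyframes, each a time (seconds)
                  # and a position in a scene coordinate system both sites share.
                  keyframes = [
                      {"t": 0.0,  "pos": [0.0, 0.0, 1.0]},
                      {"t": 10.0, "pos": [0.0, 5.0, 3.0]},
                      {"t": 30.0, "pos": [8.0, 5.0, 3.0]},
                  ]

                  def camera_position(t, keys):
                      """Linearly interpolate the camera position at time t.

                      A real player would use splines and also interpolate the
                      camera orientation, but the idea is the same.
                      """
                      if t <= keys[0]["t"]:
                          return keys[0]["pos"]
                      for a, b in zip(keys, keys[1:]):
                          if t <= b["t"]:
                              f = (t - a["t"]) / (b["t"] - a["t"])
                              return [pa + f * (pb - pa)
                                      for pa, pb in zip(a["pos"], b["pos"])]
                      return keys[-1]["pos"]

                  # Write the path out for a collaborator, then evaluate it locally.
                  with open("flightpath.json", "w") as fh:
                      json.dump(keyframes, fh)
                  print(camera_position(15.0, keyframes))   # -> [2.0, 5.0, 3.0]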

                  Finally, I see potential for real-time show distribution. Because
                  flightpaths and models could be downloaded off an FTP site or sent on
                  a DVD-R, they represent a much more (and quite literally)
                  "lightweight" means of distributing content than shipping hard drives
                  around the world. As real-time engines continue to improve, there
                  are many things they do just as well as pre-rendered media (e.g.,
                  most solar system shows, some general astronomical content), so why
                  not save on the FedEx bill? Perhaps one could even disassemble the
                  show kits, in fine old planetarium tradition, to create mini-shows or
                  portions of a program that could be used in different contexts?

                  Just my $0.02.


                  Ryan, a.k.a.
                  Ryan Wyatt, Science Visualizer
                  Rose Center for Earth & Space
                  American Museum of Natural History
                  79th Street at Central Park West
                  New York, NY 10024