Re: Highest resolution fulldome projection

  • Paul Mowbray
    Message 1 of 15 , Aug 16, 2006
      From a hardware point of view this is great news as more is always
      better.

      From a production point of view, argh!!

      It is challenging enough working with the currently accepted 3600x3600
      dome master files. The amount of effort required to produce these things
      can be painful, to say the least. The detail required in texturing,
      modeling and rendering sucks up a lot of resources, not to mention
      storing all the files once they are rendered!
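
      (As a back-of-envelope sketch of the data volumes involved - the frame
      rate, bit depth and show length below are my own assumptions rather than
      anyone's quoted figures:

      frame_px = 3600 * 3600      # dome master resolution
      bytes_px = 3                # 8-bit RGB, uncompressed
      fps      = 30               # assumed playback rate
      minutes  = 25               # assumed show length

      frame_mb  = frame_px * bytes_px / 1e6            # ~39 MB per frame
      show_tb   = frame_mb * fps * 60 * minutes / 1e6  # ~1.75 TB per show
      rate_gb_s = frame_mb * fps / 1e3                 # ~1.2 GB/s uncompressed playback

      so even a single uncompressed show is pushing a couple of terabytes, and
      sustaining it at 30 fps needs over a gigabyte per second of throughput.)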

      I am of course talking about pre-rendered shows, I am still yet to see
      evidence (although I was not in attendance at IPS this year) that real
      time is ready for prime time. You just can't re-create a cinematic
      experience this way at the moment.

      Also I'm sure you guys at SEOS will have some spangly playback hardware,
      but our D3 can't even play back 1280x1024 MPEG-2 files consistently, so
      playback hardware is gonna have to be ramped up to handle all the extra
      data.

      The worst bit of this whole equation is that the hardware is getting
      fancier and more expensive, but the budgets for producing fulldome shows
      and the licensing revenue are in decline.

      What good is a dome with detail approaching the level of what the
      human eye can perceive if all you have to show on it is the same shows
      that have been kicking around for the past few years, only available at a
      lower resolution than the system is capable of?

      I know that the production problems can be overcome with enough money
      thrown at them, but when the content producers are facing diminishing
      returns it is hard to justify.

      I love technology and I am excited about the quality of the video stars
      on this and future systems. But at the end of the day there is more to
      life than stars, and for the fulldome medium (which I think is one of the
      few remaining passive experiences with a distinct wow factor) to even
      think about any kind of mainstream penetration we need more happening on
      the content side of things; it is King, after all ;)

      Just my 2 cents

      Paul


      Paul Mowbray

      Digital Animation/Design Artist
      National Space Centre, Exploration Drive, Leicester, LE4 5NS, UK.
      Tel: +44 (0) 116 2582117
      Fax: +44 (0) 116 2582100
      www.spacecentre.co.uk <http://www.spacecentre.co.uk/>
    • Steve Cooper
      Message 2 of 15 , Aug 16, 2006
        Our D3 is handling the 3600X3600 masters well, even swapping between
        that and older 2400X2400 or 2200X2200 masters. The communications
        issues are resolving as quickly as the resolution standards. We got
        ours in May of 2005, if that tells you anything.

        While the theater does have its passive 'WOW' factor, we have been
        using our theater to do more and more interactive approaches. Our
        'Cosmic Jukebox' is doing well (16,071...now 16,072...now 16,073 five to
        seven minute customer-built shows since May 2005), and our skies show is
        the best attended. We even make an effort to make 'Cardboard Rocket'
        sort of interactive by insisting the kids do all of the countdowns as
        loud as they possibly can. After the first 'BLASTOFF' the kids feel
        more involved and are totally engrossed.

        Later this year we will be doing a show in the theater with a 'live
        reporter' who will interact with the audience, the dome visuals, and two
        Real Time robot reporters on the dome.

        This might not work for the more commercial venues, but we do it
        because:
        Guests have responded well to it.
        We want to be as different from our IMAX as we possibly can.
        We wanted to exploit the newer capabilities of the system.
        Because we can (heh! heh!).

        Steve Cooper
        Technical Coordinator
        Science Center of Iowa
        401 W. ML King.
        Des Moines, Iowa 50309
        515-274-6868 X:231
      • Martin Howe
        Message 3 of 15 , Aug 17, 2006
          Good to see more discussion on this point, a few threads are developing;
          hardware and content...

          Since Ed Lantz's first reply commenting on what eye-limiting resolution
          is, I've done a little more research (partly just by visiting our R&D
          guys down the corridor), and as always, things are not what they first
          appear.

          What the eye can resolve varies based on what it is looking at (dots,
          lines, etc), which part of the eye is looking at it, the brightness of
          the object and to some degree the brightness of the environment in which
          the eye is operating.

          I'd also like to raise the point that we're now dealing with square
          pixels, not round ones - more a commentary on technology progress than
          on what we can see (my guess is that E&S laser pixels are squarish -
          care to comment?).

          Generally, though, around 2 arc minutes per pixel is regarded as the
          practical limit today, mainly for reasons of content generation and
          distribution. The only other limit I can think of is budget. I heard
          recently that someone paid $7m for a starball - is this true? That level
          of budget can buy a lot of pixels (for delivery Q1 07!).
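
          (As a quick worked version of that arithmetic - the 180-degree
          aperture is the usual fulldome assumption:

          aperture_arcmin = 180 * 60                  # 180-degree dome, in arc minutes
          for arcmin_per_pixel in (3.0, 2.0, 1.0):
              print(aperture_arcmin / arcmin_per_pixel, "pixels across the dome")

          which gives 3600, 5400 and 10800 pixels across the dome respectively -
          i.e. today's 3600x3600 masters sit at about 3 arc minutes per pixel,
          the ~2 arc minute practical limit implies a ~5400-pixel master, and
          true eye-limiting detail of ~1 arc minute would need ~10800.)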

          Ed goes on to point out the issues of channelis/zation (not a word in
          either language, I'm sure). However, I suspect that a true 4k x 4k
          projector is a while off, whilst the cinema industry's drive is to true
          4k x 2k (the actual native resolution of the imaging device; the current
          offerings still comprise multiple lower-resolution sub-components),
          which should come a lot sooner (within 2 years; installable!).

          Even then, the issue of channelisation stays with us. For instance, the
          Sony 4k projector needs 4 inputs of lower resolutions, each of which has
          to be matched and synchronised, and whilst the point about projector
          matching is true, the issue of matching all the components upstream
          applies equally well. So just because a system may use 'only' two
          projectors, for example, to make up an image (which still need to be
          matched, by the way), there are still eight sets of electronics that
          need to perform identically too; if not, you may see synchronisation
          issues, image tearing, fill rate differences, etc.

          It is definitely the case that whatever approach is taken in the medium
          term, multiple devices will need to be matched very well in a system to
          make the image appear truly coherent.
          The good news is that it is now increasingly possible to do this with
          today's technology, and there are fewer excuses for letting the audience
          see blends on multiple-channel systems (however many) or image
          synchronisation issues.

          To answer Paul Mowbray's points relating to creating content and playing
          back at these resolutions:

          1 - Er, yes, we do have spangly playback hardware that we believe can
          deal with these resolutions :-) (we call it SEOS Media Server).
          Importantly, it is based upon commodity PC hardware (albeit high-end),
          so it should be relatively affordable to acquire and maintain (I say
          relatively because we all have our own benchmarks).
          2 - I was lucky enough to go to IPS and I saw real-time software that I
          thought looked truly excellent. One was on our booth and the other on
          SkySkan's. Both were showing the Digital Universe Atlas
          (http://haydenplanetarium.org/universe/) and whilst I thought one was
          better than the other (and I will be accused of bias), both offer
          compelling audience experiences, I believe.
          3 - I agree, content is still King, I hope we can find a way for that to
          keep up with what the technology is now capable of!

          Oh, & screen technology needs to keep up too!!! (look forward to reading
          those replies)


          Martin



          Martin Howe
          Vice President, Visualization & Operations
          SEOS Ltd
          tel: +44 (0) 1444 870 888
          mob +44 (0) 7793 414 553
          www.seos.com
        • david mcconville
          Message 4 of 15 , Aug 17, 2006
            At 11:33 AM 8/16/2006, Paul Mowbray wrote:
            >I am of course talking about pre-rendered shows, I am still yet to see
            >evidence (although I was not in attendance at IPS this year) that real
            >time is ready for prime time. You just can't re-create a cinematic
            >experience this way at the moment.

            Paul,

            I'm assuming you're talking about real-time astronomical applications
            specific to the system you're running? Uniview is running successfully in
            numerous SEOS installations and SkySkan has Digital Sky (both of which use
            the Digital Universe database). Additionally, there are many, many
            "dome-enabled" real-time applications for single-projector systems,
            including astronomy applications, gaming engines, flight/driving
            simulators, VJ applications, and many more that are permanently installed
            or touring with portable systems. I'm sure I'm missing some, but real-time
            is definitely already prime time and used in many locations. In our
            systems, real-time is often used more than pre-rendered movies.

            cheers,
            david



            --------------------------
            david mcconville
            http://www.elumenati.com
            612.605.0826 x5
          • Paul Mowbray
            Message 5 of 15 , Aug 18, 2006
              David,

              I failed to explain my point sufficiently, I'll have another go.

              The space-based real-time systems and data sets that are available
              are very good and provide a compelling way to explore the universe.
              What I was trying to say is that not all shows are about space or,
              to be more specific, stars, planets and other cool space stuff.

              Our latest show, Astronaut, featured a section explaining the effects
              of microgravity on the human body. We go inside the inner ear, observe
              bone decaying, etc. Another scene features advanced character
              animation containing cloth simulation, particle effects, and soft- and
              hard-body dynamics simulations. To my knowledge it is not possible to
              implement all of those rendering effects and that density of geometry
              in any current fulldome real-time system. Modern graphics cards can do
              some amazing things, but performing indirect lighting, displacement
              mapping, raytracing, etc. at the kind of resolutions we require and
              are looking at for the future is simply not possible as we speak.

              Real-time offers many, many advantages over pre-rendered, opening up a
              new paradigm for storytelling, but you cannot achieve the look and
              feel that people have come to expect from their experiences with the
              world of cinema... yet.

              My personal passion lies in trying to recreate a cinematic experience
              within the unique environment of the dome. I think there is room for
              both pre-rendered and real-time applications for some time to come but
              my initial point was that you can not achieve what I want to put up on
              the dome with current real-time systems. When the real-time systems can
              do what I want and better (which is inevitable... maybe) I will be more
              than happy to transition over!

              Thanks

              Paul
            • ed@visualbandwidth.com
              Message 6 of 15 , Aug 18, 2006
                Clearly there will always be the need for pre-rendered no matter how good real-time gets. Some astrophysical simulations take CPU-years to run. Artist Scott Draves uses 30,000 CPUs to run his particle animations. And live action is captured on an image capture system requiring linear playback.

                I think the most popular use of real-time is to replace (and exceed) planetarium functionality in a digital system. Planetarians have traditionally told stories interactively, reaching down to turn a knob to produce diurnal, latitude, or heading star motion, advance through the annual seasons, display grids, etc. The digital planetarium adds a “z” axis allowing liftoff from the planet and travel through the known universe as well. This adds a tremendous educational dimension - knowledge of our place in the universe.

                Regarding Ken's comments on HYBRID, I greatly respect what GOTO has done here. Many planetarians DO place emphasis on naked eye astronomy, and optomechanical does produce a superior sky simulation. A dedicated “star theater” tasked with naked-eye simulation of the night sky may require an optical projector for the best possible simulation.

                A facility with broader educational goals may not see the value in such an investment, since naked-eye astronomy is a narrow educational goal within the field of astronomy and astrophysics, which is a narrow branch of physics, which is only one of the sciences, which in turn is only one possible curriculum at an educational institution that might include art, drama and other schools of study that would take an interest in a digital theater resource (planetarians may have to learn to share). Still, starballs are kewl and I do hope they don't fade away too soon...

                How a HYBRID system can be cheaper than a digital system alone is beyond me, Ken.

                Ed

                Ed Lantz
                President
                Visual Bandwidth, Inc.
                Tel: 610.590.4269
                Mobile: 484.467.1267
                ed@...
                www.visualbandwidth.com
              • Todd Slisher
                Message 7 of 15 , Aug 18, 2006
                  Just a quick note here. With respect to all the cinematic concerns, (and of course pre-rendered will always be more cinematically beautiful than real-time, as both real-time and rendering capabilities evolve), what people are trying to achieve with real-time systems is NOT a cinematic experience. The most effective use of real-time systems that I've seen are with a live presenter and an interactive program. These will necessarily be different than a cinematic experience. Including the live presenter element breaks some of the immersive qualities of a cinematic experience. And this is fine, if interactivity is your goal. In fact, it could be said that part of the goal of an interactive experience is to break the cinematic 'trance' that often comes over audiences and cause them to think, rather than just view.

                  That said - I'm a big believer that both interactive and pre-rendered cinematic experiences should be part of our virtual 'bag of tricks' that we can deliver to our audiences. Let's have the best both worlds can deliver.

                  Just my $0.02

                  Todd


                  Todd K. Slisher
                  Vice President of Science Programs
                  Detroit Science Center
                  5020 John R Street
                  Detroit, MI 48202
                  Tslisher@...
                • david mcconville
                  Message 8 of 15 , Aug 18, 2006
                    Paul,

                    It seems that the true utility of real-time systems is not to re-create the
                    cinematic experience but to evolve the immersive mediated experience beyond
                    a solely passive one. Granted, simply rendering pre-determined shots and
                    camera paths in real-time to replace their pre-rendered equivalent for most
                    material is currently not feasible (though many gaming engines are heading
                    in that direction). But watch AMNH's Passport to the Universe and then look
                    at the same shots rendered in real-time in Uniview - you'll be hard-pressed
                    to tell the difference.

                    That said, Source, Unreal Tournament, Crytek and others are great examples
                    of gaming engines that are increasingly being used to create Machinima
                    real-time cinema (see http://www.machinima.com for examples). Any real-time
                    engine can theoretically be adapted for use in domes (we've already helped
                    create a dome version of Unreal Tournament at http://planetjeff.net), so
                    it's really a matter of the value proposition to implement these
                    modifications. It is very unlikely that the best real-time fulldome
                    engines, especially for non-astronomy applications, will be developed by
                    fulldome hardware companies - they will likely be modified engines that are
                    already being developed for gaming and visualization applications.
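
                    (For anyone curious how an engine gets "dome-enabled" in
                    practice: one common approach is to render the scene into
                    cube-map faces each frame and then warp those into an angular
                    fisheye dome master. The sketch below only shows the per-pixel
                    mapping from dome master coordinates to a 3D view direction;
                    the resolution, 180-degree aperture and zenith-along-+Z
                    convention are assumptions for illustration, not any
                    particular product's code:

                    import numpy as np

                    def fisheye_directions(size=3600, aperture_deg=180.0):
                        """View direction for each pixel of an angular-fisheye
                        dome master; pixels outside the dome circle become NaN."""
                        x, y = np.meshgrid(np.linspace(-1, 1, size),
                                           np.linspace(-1, 1, size))
                        r = np.sqrt(x**2 + y**2)
                        theta = r * np.radians(aperture_deg) / 2.0  # angle out from the zenith
                        phi = np.arctan2(y, x)                      # azimuth around the zenith
                        dirs = np.stack([np.sin(theta) * np.cos(phi),
                                         np.sin(theta) * np.sin(phi),
                                         np.cos(theta)], axis=-1)   # zenith assumed along +Z
                        dirs[r > 1.0] = np.nan
                        return dirs

                    # each direction is then used to look up the matching texel in
                    # the cube map the engine rendered for that frame

                    That lookup is cheap on modern graphics hardware, which is part
                    of why existing engines can be pressed into dome service.)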

                    All of this was discussed at length last year at the NEI fulldome meeting
                    at Chabot
                    (http://www.ips2008.org/partners/images/Chabot_NEI_report.pdf#search=%22%22will%20wright%22%20fulldome%22).
                    We theorized and waxed poetic about how we could use Will Wright's new Spore
                    game in the dome and how to interface numerous handheld devices with the
                    larger display (ie everyone having a wireless PSP and playing each other on
                    the dome). Do yourself a favor and watch
                    http://video.google.com/videoplay?docid=8372603330420559198 - you'll
                    quickly get the idea of why real-time environments could be incredible for
                    interactive education. CMU Entertainment Technology Center students
                    addressed these issues as well with their Interactive Dome Project
                    (http://www.etc.cmu.edu/projects/dome/) last year.

                    And Tom is right - just because it's real-time doesn't mean it's cheap,
                    especially for high-quality data collection, modeling, animation,
                    simulation, lighting, etc. But serious real-time capabilities do
                    dramatically increase possibilities for new forms of interactivity,
                    networking, education, exploration, and experimentation - in other words,
                    some of the reasons that many of us became interested in immersive virtual
                    environments to begin with...

                    cheers,
                    david

                    --------------------------
                    david mcconville
                    http://www.elumenati.com
                    612.605.0826 x5
                  • pauldavidbourke
                    Message 9 of 15 , Aug 19, 2006
                      My quick comment on full dome resolution ..... the final resolution as it relates to the informational content on the final dome surface is dependent on a whole raft of factors.

                      The resolution of the raw frames that make up the content is only one part of the story; it only determines the limit of the visual quality of the final projected result, given a perfect projection system. There are, however, lots of factors that reduce the effective resolution of the result reflected off the dome surface, some of which are:

                      1. Codec choice. It seems many (perhaps most) planetariums are not using lossless encoding/playback, in which case the pixels being sent to the projection hardware are at a lower fidelity than the original rendered material. Personally I've always been surprised that some/many systems even use MPEG and variants .... surely with today's hardware we can move past such compromising technologies.

                      2. Projectors don't give a 1:1 representation of pixels on the projection surface; this is especially so for CRT technology but also true for digital projectors. There are all sorts of sources for this, including the use of analog signals, the reality of lens physics, focusing on a curved surface, etc.

                      3. Many multi-projector systems use digital warping to correct for the spherical geometry; this lowers the information content ..... individual pixels get contributions from neighbours so are no longer independent (see the sketch after this list).

                      4. Edge blending is rarely (if ever) perfect; the result is usually a blurring (often significant) of the image between the projection patches.
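
                      (The sketch promised under point 3 - a minimal, generic
                      bilinear resample, not any vendor's warp code. When a warped
                      output pixel lands between source pixels, the four neighbours
                      are mixed, so output pixels are no longer independent samples
                      of the rendered frame:

                      import numpy as np

                      def bilinear_sample(img, x, y):
                          """Sample a 2-D image at a fractional (x, y) position."""
                          x0, y0 = int(np.floor(x)), int(np.floor(y))
                          fx, fy = x - x0, y - y0
                          # all four surrounding source pixels contribute
                          return ((1 - fx) * (1 - fy) * img[y0,     x0    ] +
                                  fx       * (1 - fy) * img[y0,     x0 + 1] +
                                  (1 - fx) * fy       * img[y0 + 1, x0    ] +
                                  fx       * fy       * img[y0 + 1, x0 + 1])

                      src = np.zeros((4, 4))
                      src[1, 1] = 1.0                        # one bright pixel
                      print(bilinear_sample(src, 1.5, 1.5))  # 0.25 - energy shared with neighbours

                      Blend regions compound this, since two such resampled images
                      are then averaged on top of each other.)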

                      I'm sure it is easy to find examples of planetariums with significantly higher theoretical projected resolution where the content nonetheless looks worse than in planetariums with lower-spec'd projection hardware.

                      One should also note that just because raw, pristine frames have a certain pixel count, that certainly does not mean the content is higher resolution (in the informational sense) than content with a lower pixel count. This is clearly the case for filmed material, but also for CG. The resolution as it relates to quality depends on things such as antialiasing settings, quality of the model geometry, texture resolution, and other factors related to rendering technique.
                    • Ryan Wyatt
                      Message 10 of 15 , Aug 20, 2006
                        Posted for Brad Thompson:

                        Hello,

                        Sorry for the long diatribe but Paul and David touched lightly on
                        something that I've encountered over and over again throughout my
                        years sitting under a dome. This is a bit off-topic and my long post
                        isn't meant as a retort to anything anyone said here. There is this
                        myth about real-time technology that I seem to keep encountering over
                        and over, even from people who really should know better. The key to
                        Paul's statement is that you "can't re-create a CINEMATIC experience
                        this way at the moment." I'm guessing that by "cinematic
                        experience", he's referring to passive linear cinematic
                        storytelling. Sure, Uniview, Digital Sky, D3, Starry Night, game
                        engines, etc. are out there and being used to create great and unique
                        experiences. However, they aren't currently the most powerful/
                        effective/efficient tools for passive linear cinematic storytelling
                        on the dome (a.k.a. a cinematic experience).

                        For that to be the case, a system would have to be able to generate
                        The Enchanted Reef, Black Holes: The Other Side of Infinity, DarkStar
                        Adventure, Astronaut, or (insert name of your favorite prerendered
                        dome show here) frames in "real-time" without visual compromises. It
                        would have to be a more efficient platform to produce upon than the
                        applications that those shows were produced with (3dsMax, Maya,
                        custom supercomputer apps, etc.), and the resulting production would
                        have to be as portable as prerendered content is now. To my
                        knowledge, this doesn't even come close to describing any system that
                        currently exists or will exist in the foreseeable future. I'm not
                        saying that this can never happen, but ever since high-end graphics
                        technology became a commodity item, Nvidia, ATI and the like have
                        been trying to convince us that the next generation of their
                        technology will bring cinema quality graphics in real-time. So far,
                        they've always been at least 10 years behind. On the tech side, what
                        happens is that new methods are pioneered in software labs at
                        universities or the R&D teams at animation or FX studios. These
                        methods and tools quickly show up in animation software, then in the
                        cinema. People's expectations are raised, and then finally, the
                        propeller-heads at the video card companies and the engine
                        programmers at game companies figure out ways to accelerate these new
                        methods via new hardware. I could mention Blinn's law here as well,
                        which is a long-standing maxim credited to CG pioneer Jim Blinn that
                        states “any renderer, no matter how fast processors get, will always
                        take a couple of hours, because that's the tolerance level of artists."

                        Furthermore, I find that people often don't recognize that the same
                        graphics technology that powers all the cool real-time digital dome
                        systems out there, also powers 3dsMax, Maya, FinalCut, AfterEffects,
                        etc. I have a real-time viewport where I can spin around, set up my
                        animations, press play, and immediately see the results. If I've
                        kept things basic, I don't have to wait for a render to know what
                        things will look like. It's only when I start taking my visual
                        fidelity or complexity beyond what real-time graphics systems are
                        capable of that I have to do test renders to see the results. Also,
                        what it means to "keep things basic" is evolving at the same pace as
                        real-time graphics technology. Finally, these tools have been
                        hammered on and optimized for efficiency of production much more, and
                        by many, MANY more people than the latest narrowly focused tool that
                        we dome-specific guys/gals invent.

                        In the end, it all comes down to the question of which tools will
                        help me tell my story the best (highest quality, fastest, most
                        efficient, etc.) in the dome. Software applications usually have to
                        make a choice between simplicity and power. If I only ever produced
                        "the sky tonight" type talks, then I'd probably choose something like
                        Starry Night, D3, or one of the other simple streamlined focused
                        applications for doing that (unless I'm concerned about wide
                        distribution.) If I wanted to do something truly interactive, then
                        obviously a real-time application is the only choice. If I want to
                        craft a linear cinematic experience with no limitations except my own
                        imagination, budget and ability, then the hundreds of tools that fall
                        into the "prerendered" category are the only logical choice.
                        Remember, all of these things are just tools to create the
                        experience. Understand and use your tools or else they will use you.

                        --
                        Brad Thompson - bthompson@...
                        Digital Animation & Design - Spitz, Inc.
                        http://www.spitzinc.com

                        -- "Hush, may I ask you all for silence? The dreamer is still asleep"
                      • Ryan Wyatt
                        Message 11 of 15 , Aug 22, 2006
                          I'll echo others' sentiments about leaving room for both real-time
                          and rendered technologies in our domes, in particular, Todd Slisher's
                          words:

                          > I'm a big believer that both interactive and pre-rendered cinematic
                          > experiences should be part of our virtual 'bag of tricks' that we
                          > can deliver to our audiences. Let's have the best both worlds can
                          > deliver.

                          Agreed!

                          In many ways, the two are apples and oranges. I think AMNH's attempt
                          to proceed down both routes shows how we value the differing
                          experiences they can offer, even though our 430-seat theater doesn't
                          give us the same kind of opportunities for real-time experimentation
                          that a smaller, classroom-sized theater would (which is why we need
                          to build a second theater, in my opinion, but that's another story).

                          However, the "apples and oranges" mentality doesn't do the
                          relationship justice. Our monthly real-time program, "Virtual
                          Universe," has given us an opportunity to explore topics in the dome
                          that have in turn informed our big-budget "space shows." One huge
                          benefit is that the same people who create the shows (e.g., Carter
                          Emmart and I) also share responsibilities for the monthly program.
                          So we get to experience first-hand audience reactions to our work,
                          tweak and modify the presentation both visually and verbally, and
                          basically play with our ideas before they appear in a more formal,
                          "cinematic" context.

                          (Using my preferred term "narrative journey" for immersive
                          experiences, the difference is a bit like the slick, pre-recorded
                          tour versus the friend pointing out places of interest. A well-
                          informed friend, with enough experience, can provide some insight
                          into improving the pre-produced version.)

                          We also use our dome's real-time capabilities for testing and
                          creating flight paths. In particular, we were able to collaborate
                          with NCSA over the phone as we shared the same virtual environment:
                          they would make a flight path, share it with us, then we could fly it
                          together through an identical virtual space. But this idea could be
                          carried further... If a production work flow were designed
                          correctly, it seems that one could effectively create a real-time
                          version of a show (or portions thereof), test it with audiences, then
                          use that feedback to inform development of a final production. Seems
                          like it would be worth a try, at any rate. N.B., however, that real-
                          time needs must be considered specifically in advance of a
                          production; done correctly, they can proceed naturally from work done
                          for the rendered work, but that doesn't happen for free or without an
                          investment of time and resources.

                          Finally, I see potential for real-time show distribution. Because
                          flightpaths and models could be downloaded off an FTP site or sent on
                          a DVD-R, they represent a much more (and quite literally)
                          "lightweight" means of distributing content than shipping hard drives
                          around the world. As real-time engines continue to improve, there
                          are many things they do just as well as pre-rendered media (e.g.,
                          most solar system shows, some general astronomical content), so why
                          not save on the FedEx bill? Perhaps one could even disassemble the
                          show kits, in fine old planetarium tradition, to create mini-shows or
                          portions of a program that could be used in different contexts?
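
                          (Purely as a hypothetical sketch of how small such a
                          "show kit" could be - the field names and format below
                          are invented for illustration and are not any existing
                          system's file format:

                          import json

                          show_kit = {
                              "title": "Solar System Tour (demo segment)",
                              "datasets": ["digital_universe"],   # shared data assumed already on-site
                              "flightpath": [
                                  {"t": 0.0,  "position_au": [0.0, 0.0, 1.0], "look_at": "Earth"},
                                  {"t": 30.0, "position_au": [1.5, 0.0, 0.5], "look_at": "Mars"},
                              ],
                              "narration": "segment01.mp3",
                          }

                          print(json.dumps(show_kit, indent=2))

                          A file like that is a few kilobytes, versus terabytes of
                          rendered frames on hard drives.)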

                          Just my $0.02.


                          Ryan, a.k.a.
                          Ryan Wyatt, Science Visualizer
                          Rose Center for Earth & Space
                          American Museum of Natural History
                          79th Street at Central Park West
                          New York, NY 10024