
[PanoToolsNG] Re: Sony announce 25Mp 35mm sensor

  • Keith Martin
    Message 1 of 15, Feb 1 9:19 AM
      Sometime around 1/2/08 (at 13:35 +0100) Erik Krause said:

      >close to the image circle there is a lower resolution, not
      >only due to lens design flaws but due to the fisheye mapping.

      Got it. Although really it is lower *quality* that we're talking
      about. Resolution, although related in a sense, means something
      slightly different. At least, with digital images it is used to refer
      to the sensors and the final pixels.

      Thanks for the further info and the DoF link! I was thinking in terms
      of individual shots, but that's interesting data on that wiki. Stuff
      for me to ponder. :-)

      k
    • mrjimbo
      Message 2 of 15, Feb 1 9:20 AM
        Keith,
        I'm not a rocket scientist, but I did learn that the delay in introducing the new Betterlight 10k scan back was all about the fact that no lenses resolved what it could do properly. Somehow they reached a compromise and released the back. In my conversations with them at the time, they spoke of the issues related to resolving at that level with optics. Further, we must realize that today's optics are multi-element, so it's probably not just a matter of saying "make another one that does it". In the smaller sensors they have been packing in more and more pixels, but as noted in these posts, that has come at a price. So it makes sense to make larger sensors, so the captured information isn't shrunk as much. The optics are actually doing a conversion: making a big image fit on a small sensor. It appears that what we are experiencing is degradation once we pass a certain threshold with our current optical technology.

        The present answer seems to be larger cameras. So tomorrow's Nikon may look like my Pentax 6x7 with a face lift and a Nikon logo on it (hopefully a little lighter too), or a new version of a Sinar 8x10 with a fixed sensor in the back of it, with large pixel sizes... Whooo Hooo.

        jimbo


        ----- Original Message -----
        From: Keith Martin
        To: PanoToolsNG@yahoogroups.com
        Sent: Friday, February 01, 2008 3:41 AM
        Subject: RE: [PanoToolsNG] Re: Sony announce 25Mp 35mm sensor


        Sometime around 1/2/08 (at 01:42 -0800) Paul D. DeRocco said:

        > > From: Erik Krause
        >>
        >> ...then 10MP on a crop factor 1.6 sensor is beyond the resolution of
        >> any fisheye, too. 25MP on a full frame sensor has the same absolute
        >> resolution (pixel density) as a 10MP sensor at crop factor 1.6
        >
        >Probably. Even on my 6Mp 10D, which is a 1.6x crop sensor, I can see a lot
        >of CA near the edges, so it's obvious that I wouldn't be getting any more
        >sharpness if I stuck it on my 10Mp 40D. And a 25MP FF sensor would probably
        >be even worse, because it reaches into the worst part of the lens.

        But then, a full-frame sensor camera does mean working with different
        lenses to get the equivalent effect. So a rough equivalent of the
        10.5mm would be 16mm, wouldn't it? Something like the old 16mm
        fisheye that I used briefly on my old Canon A1. Assuming the
        manufacturing and glass quality was similar, that would give
        approximately the same view but reduced chromatic aberration.
        (Slightly reduced depth of field too, but that's physics for ya!)
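        The crop-factor equivalence being discussed here is just area arithmetic, and can be sketched as follows (illustrative figures only, assuming a 1.6x crop factor):

```python
# Pixel-density and field-of-view equivalence between a 1.6x crop
# sensor and a full-frame sensor (illustrative arithmetic only).

crop_factor = 1.6

# Full frame has crop_factor**2 times the area of the crop sensor.
area_ratio = crop_factor ** 2                # 2.56

# At the same pixel density (pixel pitch), a 10 MP crop sensor
# corresponds to roughly a 25.6 MP full-frame sensor -- close to
# the 25 MP figure under discussion.
crop_mp = 10
ff_mp_same_density = crop_mp * area_ratio

# Equivalent focal length: a 10.5 mm fisheye on a 1.6x body frames
# roughly like a 16.8 mm lens on full frame, near the classic 16 mm.
equiv_focal = 10.5 * crop_factor

print(f"{ff_mp_same_density:.1f} MP, {equiv_focal:.1f} mm")
```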

        I don't think it is really a matter of being beyond the resolution of
        a fisheye, as that's just analog-world optics. The Sigma 8mm and
        Nikon 10.5mm fisheyes are designed to produce acceptable images on a
        cropped-area sensor, and trying to capture an image using a broader
        part of the image means going beyond the design intentions. So...
        isn't the important thing simply using a lens that is actually meant
        to cover a full-frame sensor?

        (I think that's what you meant in your first post, but I wasn't sure...)

        k




      • Fabio Bustamante
        Message 3 of 15, Feb 2 4:05 PM
          Hey Carel,

          So in theory it would be possible to develop a ~4mp image from a 12.8mp
          RAW file, is it right? Have you ever tried this? How different in
          practice would that be from reducing a 12.8mp picture to a 4mp size with
          a good interpolator?

          I can hardly imagine a 4mp image that surpasses its 12.8mp version in
          any way...

          Is this really used in star photography?
          >
          > One could use all these pixels in combination with DCRaw in the Super-pixel
          > mode to circumvent the Bayer matrix induced artifacts. The main disadvantage
          > of the super-pixel method is that you end up with an image that is only
          > 1/4 the size of the original, so with this sensor you would end up with a
          > 6Mpixel image, but a more detailed one.
          > http://deepskystacker.free.fr/english/technical.htm#rawdecod
          >
          > Carel
          >
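          The super-pixel decode described in the quote can be sketched with NumPy. This is an illustrative binning of an RGGB Bayer mosaic, not DCRaw's actual code:

```python
import numpy as np

def superpixel(bayer):
    """Collapse an RGGB Bayer mosaic into a half-width, half-height
    RGB image: each 2x2 quad (R, G / G, B) becomes one RGB pixel,
    so no demosaicing interpolation (and none of its artifacts) is
    needed. Illustrative sketch only."""
    r  = bayer[0::2, 0::2]          # top-left of each quad
    g1 = bayer[0::2, 1::2]          # top-right
    g2 = bayer[1::2, 0::2]          # bottom-left
    b  = bayer[1::2, 1::2]          # bottom-right
    g  = (g1 + g2) / 2.0            # average the two green samples
    return np.dstack([r, g, b])

# e.g. a 24 MP mosaic (4000 x 6000) would yield a 6 MP RGB image.
mosaic = np.random.rand(8, 12)      # small stand-in for a real mosaic
rgb = superpixel(mosaic)
print(rgb.shape)                    # (4, 6, 3)
```

          The trade-off is exactly the one Carel quotes: a quarter of the pixel count, but each output pixel is built from real measured samples rather than interpolated neighbours.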
        • Carel
          Message 4 of 15, Feb 2 6:05 PM
            Fabio Bustamante-2 wrote:
            >
            > Hey Carel,
            >
            > So in theory it would be possible to develop a ~4mp image from a 12.8mp
            > RAW file, is it right? Have you ever tried this? How different in
            > practice would that be from reducing a 12.8mp picture to a 4mp size with
            > a good interpolator?
            >
            > I can hardly imagine a 4mp image that surpasses it's 12.8mp version in
            > any way...
            >
            > Is this really used in star photography?
            >
            >

            No, it would not surpass the quality of the original-size image, but it
            might be an interesting way to use this overkill of 24Mpixels for web
            purposes. I only have 5D images, but a test is on the way: I will use the
            super-pixel method with DCRaw versus ACR. My reasoning was along the lines
            of Bernhard Vogl's complaints about the shortcomings of the Bayer array,
            and his observation that one retains more detail when downsizing an image
            taken with a longer lens to the size of the same image taken with a
            shorter lens. But maybe that is not a good analogy.

            >Is this really used in star photography?

            The inclusion of this method in DeepSkyStacker would indicate so. When I
            asked about it on my recent visit to the Mt Wilson Observatory, it did not
            seem to ring a bell, while the recently discussed method of getting a
            sharper image by using a slightly misaligned stack of images ("drizzling")
            did.

            Carel
            --
            View this message in context: http://www.nabble.com/Sony-announce-25Mp-35mm-sensor-tp15207204p15249800.html
            Sent from the PanoToolsNG mailing list archive at Nabble.com.