
Re: [genphoto] Scanning Res

  • Chuck Linsley
    Message 1 of 5, Apr 3, 2006
      I don't see any other answer to this yet, so I'll give it a try.

      IowaGob1@... wrote:
      >
      > I'm confused, is not a dot a pixel? I guess a dot is on a printer while
      > a pixel is on a monitor. Wrong?

      That is more-or-less right. "Pixel" is short for "picture element," and
      is, basically, a dot. You can use the two terms pretty much interchangeably
      for scanners and monitors. However, some printers may put more than one
      ink dot on the paper for each pixel they print, in order to give better
      control over the final color. For example, my Epson printer is able to
      print 720 dpi in one direction (across the paper?) and 1440 dpi in the
      other. Since neither of these is a commonly used pixel resolution, I
      assume it is printing more than one dot per pixel.
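
      For illustration, suppose the image is printed at 360 ppi (an assumed
      figure, not one taken from the printer's manual); the ratio of ink
      dots to image pixels then works out like this:

          # Rough sketch of dots-per-pixel for the hypothetical 360 ppi case.
          printer_dpi_x = 720    # ink dots per inch across the paper
          printer_dpi_y = 1440   # ink dots per inch along the paper
          image_ppi = 360        # assumed pixel resolution of the print

          dots_per_pixel = (printer_dpi_x // image_ppi) * (printer_dpi_y // image_ppi)
          print(dots_per_pixel)  # 2 * 4 = 8 ink dots per pixel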

      > Pixels have different numbers of bits according to the type of scan,
      > 2 pixels per dot for black and white which I never use,
      > 8 pixels per dot up to what, 256 pixels per dot for grey scale which
      > works best for newspapers, magazines and most all non color
      > documents.
      > 16 to ?? pixels per dot for RGB (red green blue) color or True Color

      Here, you are confusing pixels per dot, color (or gray) shades per
      pixel, and bits per pixel. Black and white has two shades, black and
      white, per pixel; this requires one bit.

      Gray scale usually uses 8 bits per pixel to give 256 shades of gray.
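
      In every case the pattern is the same: n bits per pixel give 2^n
      possible shades. A quick sketch:

          # Distinct shades representable at a given bit depth.
          for bits in (1, 8):
              print(bits, "bit(s) ->", 2 ** bits, "shades")
          # 1 bit(s) -> 2 shades     (black and white)
          # 8 bit(s) -> 256 shades   (gray scale)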

      Full color images usually use 8 bits per color to give 256 shades each
      of red, green and blue, for 16 million possible colors. Some scanners
      can scan at more than 8 bits per color, in which case the image is
      stored with 16 bits per color (even if the scanner is really capable
      of only 10 or 12 bits -- computers process data in multiples of 8 bits),
      for 281 trillion possible colors. This is far more colors than the
      human eye is capable of distinguishing, but the electronics of the
      scanner unavoidably introduce some small, random inaccuracy ("noise" --
      like the static on a radio tuned to a weak station) into the scanned image;
      using the extra bits makes the noise small enough that it becomes
      invisible.
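
      The color counts fall straight out of the bit depths:

          # Total colors = shades per channel, raised to the number of channels.
          for bits_per_channel in (8, 16):
              colors = (2 ** bits_per_channel) ** 3  # three channels: R, G, B
              print(bits_per_channel, "bits/channel ->", colors, "colors")
          # 8  bits/channel -> 16,777,216           (~16 million)
          # 16 bits/channel -> 281,474,976,710,656  (~281 trillion)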

      > Again, nobody has answered my question, why does my scanner
      > software not mention PPI, only DPI?

      Because the people who wrote the documentation were sloppy. Or the
      marketing droids who reviewed and approved the documentation didn't
      understand the difference, and made the writers use the wrong
      terminology.

      > How do I make the conversion
      > from DPI to PPI on my scanner software?

      Change the "D" to "P".

      --
      Chuck Linsley
      linsley@...
    • IowaGob1@juno.com
      Message 2 of 5, Apr 3, 2006
        Thank you Chuck, that makes sense and I begin to understand.
        Pretty complicated.

        Jerry Hale
        Deltona, FL