
Re: Lossy v. Lossless Compression

  • p83822
    Message 1 of 12, Aug 31, 2010
      Only to say that jpegtran can sometimes squeeze an extra handful of percent of lossless compression. Not really what you were asking, I know, but I mention it because, AFAIK, it's 'safe', so it can run as a transparent task on the build server. (I'm sure this is one of the 'other' tools you mention and that you've already tried it.)
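A minimal sketch of what such a transparent build-server pass might look like, assuming the jpegtran binary from libjpeg is on PATH (the helper names here are mine, not from any particular tool):

```python
import os
import shutil
import subprocess
import tempfile

def savings_percent(before: int, after: int) -> float:
    """Percent of bytes saved by re-encoding (0 if nothing was saved)."""
    return max(0.0, (before - after) * 100 / before) if before else 0.0

def jpegtran_optimize(path: str) -> float:
    """Losslessly optimize one JPEG in place; returns percent saved."""
    if shutil.which("jpegtran") is None:
        return 0.0  # tool not installed; leave the file untouched
    fd, tmp = tempfile.mkstemp(suffix=".jpg")
    os.close(fd)
    # -optimize recomputes Huffman tables; -copy none drops metadata.
    subprocess.run(["jpegtran", "-optimize", "-progressive",
                    "-copy", "none", "-outfile", tmp, path], check=True)
    before, after = os.path.getsize(path), os.path.getsize(tmp)
    if after < before:          # only keep the result if it actually shrank
        shutil.move(tmp, path)
    else:
        os.remove(tmp)
    return savings_percent(before, after)
```

Because the transform is lossless, keeping the smaller of the two files is always safe, which is what makes it suitable as an unattended build step.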

      If you crack the generally applicable part below, I for one would love to know, please. :-) BTW: are you talking PC only or hand-held as well? I've found I can get away with dreadful images on those small, low-res screens which is, of course, a win over the wire and also a win in local resources used. But again, this has always involved people actually looking at images.

      Neil

      P.S. Do you crush gifs, if you have any kicking around your site, that is? I've noticed editors put in huge headers which say little more than 'I was created using...'. As gifs are often small, the proportional weight can be high - often enough to force a second roundtrip. Easily removed using a text editor or something like http://chemware.co.nz/tgo.htm
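Those editor 'headers' are typically GIF comment extension blocks (introducer 0x21, label 0xFE), and the block structure is simple enough to strip them programmatically. A rough pure-Python sketch, handling only the common block types and not intended as a substitute for a dedicated tool:

```python
def strip_gif_comments(data: bytes) -> bytes:
    """Return a copy of a GIF with comment extension blocks removed."""
    if data[:3] != b"GIF":
        raise ValueError("not a GIF")
    out = bytearray(data[:6])          # header: GIF87a / GIF89a
    out += data[6:13]                  # logical screen descriptor
    pos = 13
    packed = data[10]                  # LSD packed-fields byte
    if packed & 0x80:                  # global color table present
        n = 3 * (2 ** ((packed & 0x07) + 1))
        out += data[pos:pos + n]
        pos += n

    def skip_subblocks(p: int) -> int:
        while data[p] != 0:            # sub-blocks end with a 0-length block
            p += 1 + data[p]
        return p + 1

    while pos < len(data):
        block = data[pos]
        if block == 0x3B:              # trailer: end of stream
            out.append(block)
            break
        if block == 0x21:              # extension block
            label = data[pos + 1]
            end = skip_subblocks(pos + 2)
            if label != 0xFE:          # keep everything except comments
                out += data[pos:end]
            pos = end
        elif block == 0x2C:            # image descriptor
            start = pos
            lpacked = data[pos + 9]
            pos += 10
            if lpacked & 0x80:         # local color table
                pos += 3 * (2 ** ((lpacked & 0x07) + 1))
            pos += 1                   # LZW minimum code size
            pos = skip_subblocks(pos)  # image data sub-blocks
            out += data[start:pos]
        else:
            raise ValueError(f"unexpected block 0x{block:02x}")
    return bytes(out)
```

On a tiny GIF the comment block can indeed dominate the file, which is the "proportional weight" problem above.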



      --- In exceptional-performance@yahoogroups.com, Sergey Chernyshev <sergey.chernyshev@...> wrote:
      >
      > The trick here is how to measure when to stop without involving a user...
      >
      > As for the general jpeg quality, it's definitely part of image
      > post-processing that needs to happen in editorial or design teams. It is not
      > that easy to enforce though...
      >
      > Sergey
      >
      >
      > On Tue, Aug 31, 2010 at 3:41 PM, Matthew <mattroche@...> wrote:
      >
      > >
      > >
      > > I have been using Smush.it and a few other lossless compression approaches
      > > and wanted to survey experiences with lossy algorithms.
      > >
      > > To be frank, some images show relatively little size improvement with
      > > lossless compression, making it hardly worth the effort. However, in a few
      > > cases I have just used iPhoto to test lossy compression and have gotten
      > > 90-95% improvements with little apparent degradation.
      > >
      > > Are there others here who have experimented with this? Are there algorithms
      > > or libraries that are better or worse? Could you recommend a target quality
      > > that would be generally applicable?
      > >
      > > Thanks,
      > > Matt
      > >
      > > --- In exceptional-performance@yahoogroups.com, "daveartz" <daveartz@> wrote:
      > > >
      > > > Stoyan/Pat, you guys forgot caching :) Probably the single biggest thing
      > > I'm trying to get folks at AOL to do, with the single biggest ROI (on repeat
      > > view). And Pat, don't say you covered this in "Reduce HTTP Requests" :P
      > > >
      > > > (and that's why I don't like that rule, too vague, might as well say
      > > Reduce KB and be done with our 2 rules :)
      > > >
      > > > Stoyan - Why stop at "lossless" image compression? "Lossy" image
      > > compression is a big issue for us. Check out this baby from AOL Shopping:
      > > >
      > > >
      > > http://ah.pricegrabber.com/product_image.php?masterid=37223072&width=400&height=400
      > > >
      > > > Coming from a feed in PNG format (134K), it should be JPEG (18K).
      > > >
      > > > An optimizer's work is never done...
      > > >
      > > > While I think Pat & Stoyan's rules are all must-dos from my perspective,
      > > allow me to play devil's advocate on why it's good to at least be aware of
      > > the others.
      > > >
      > > > In one recent case, jQuery selectors slowed down a particular site by 20%
      > > and another by 10%. Thus, rules were born:
      > > http://www.artzstudio.com/2009/04/jquery-performance-rules/
      > > >
      > > > Another example would be stripping comments/obfuscating JS. Depending on
      > > how happy the developer is to comment their code, this can be a big issue
      > > and a large % of weight/time.
      > > >
      > > > I agree that too many rules may have diminishing returns, but depending
      > > on the site (I should say developer or designer), one of those rules we
      > > leave off our checks could actually sting the most and potentially be a big
      > > source of the problem.
      > > >
      > > > So I'd be sure to include rules that have historically caused at least a
      > > 10% load time penalty in whatever "list" is created.
      > > >
      > > > Dave
      > > >
      > >
      > >
      > >
      >