
Re: [RSS2-Support] Number of items and bandwidth.

  • Bill Kearney
    Message 1 of 10, Nov 13, 2002
      The trouble, of course, is telling who got the old items.

      Partial delivery of XML fragments is one possibility, but there's nothing that
      supports it.

      Using static RSS files that are 'published' makes it easy to share the data with
      a lot of people without requiring significant server-side processing.

      Sending redundant data is a sacrifice worth making for that simplicity.

      -Bill

      ----- Original Message -----
      From: "jystervinou" <jy@...>
      To: <RSS2-Support@yahoogroups.com>
      Sent: Wednesday, November 13, 2002 10:55 AM
      Subject: [RSS2-Support] Number of items and bandwidth.


      > Hi,
      >
      > I was wondering (tell me if I'm totally missing something):
      >
      > All the feeds seem to have 15 or more items.
      >
      > This means that if a Weblogger posts one new item, his RSS feed will
      > be updated with only one new item, and 14 "old" items.
      >
      > So the news aggregators will fetch the feed, and only take the new
      > item.
      > So at each request, the aggregator fetches items it already got
      > earlier.
      >
      >
      >
      > Why not put in the feeds only the new items?
      > RSS feeds would be a lot smaller and the amount of bandwidth would be
      > reduced a lot.
      >
      >
      > JY.
    • Dave Winer
      Message 2 of 10, Nov 13, 2002
        Morbus got it right. That would require a dynamic feed that
        produces a custom feed for each reader, and would be a tradeoff of
        CPU performance for net bandwidth. I think CPUs are still more dear
        than net bandwidth, and aside from that the RSS network is designed
        to be static, not dynamic, so this is not an option, imho.
      • Phil Ringnalda
        Message 3 of 10, Nov 13, 2002
          jystervinou wrote:
          > Why not put in the feeds only the new items?
          > RSS feeds would be a lot smaller and the amount of bandwidth would be
          > reduced a lot.

          What is new? *washes hands*

          Not every aggregator runs on the same schedule, or on a schedule at all. To
          deliver all new items to each aggregator, you would have to make your feed
          dynamic, rather than serving a simple static file, and then either have a
          login system to track what items had been delivered to what aggregators, or
          misuse HTTP by delivering a partial document in response to an
          If-Modified-Since request. While that might not be a bad idea for feeds that
          are already dynamically generated, I doubt that there are many people in the
          situation of having all the CPU in the world to generate their feed anew
          each time it's requested, but without enough bandwidth to serve it up.

          Phil Ringnalda
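
          A minimal sketch of the conditional GET Phil contrasts with that
          misuse, assuming a Python 3 aggregator; the URL and function name
          here are illustrative, not anything from this thread:

              # Standard If-Modified-Since: the server answers 304 with no body
              # when nothing changed, or 200 with the whole feed otherwise.
              import urllib.request
              import urllib.error

              FEED_URL = "http://example.com/index.xml"  # hypothetical feed

              def fetch_feed(last_modified=None):
                  req = urllib.request.Request(FEED_URL)
                  if last_modified:
                      req.add_header("If-Modified-Since", last_modified)
                  try:
                      resp = urllib.request.urlopen(req)
                  except urllib.error.HTTPError as err:
                      if err.code == 304:  # nothing new, no items transferred
                          return None, last_modified
                      raise
                  # 200: the full document, old items included, as Phil notes
                  return resp.read(), resp.headers.get("Last-Modified",
                                                       last_modified)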
        • Eric Vitiello
          Message 4 of 10, Nov 13, 2002
            >Morbus got it right. That would require a dynamic feed that
            >produces a custom feed for each reader, and would be a tradeoff of
            >CPU performance for net bandwidth. I think CPUs are still more dear
            >than net bandwidth, and aside from that the RSS network is designed
            >to be static, not dynamic, so this is not an option, imho.

            Of course, it depends on the circumstances as well. I have a fairly beefy machine, with CPU utilization hovering at 10%. That means I have 90% of my CPU free for use on average... however, bandwidth costs me money. I've begun work on dynamically generating the RSS files based on the user and conditional GET.

            some people have extra bandwidth, some have extra CPU.

            --Eric

            --
            Eric Vitiello, Perceive Designs <http://www.perceive.net>
            Got Geek? Shirts, etc. <http://www.cafepress.com/got_geek>
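
            A hypothetical sketch of the server side Eric describes: check the
            client's If-Modified-Since before spending any CPU on generation.
            The names below are assumptions, not Eric's actual code:

                # Short-circuit with 304 before regenerating the feed, so CPU
                # is spent only when the client is actually out of date.
                from email.utils import format_datetime, parsedate_to_datetime

                def handle_feed_request(headers, feed_updated_at, build_feed):
                    # headers: request headers (dict); feed_updated_at: aware
                    # datetime of the newest post; build_feed: callable that
                    # renders the (possibly per-user) RSS document as a string.
                    ims = headers.get("If-Modified-Since")
                    if ims:
                        try:
                            if parsedate_to_datetime(ims) >= feed_updated_at:
                                return 304, {}, b""  # no body: bandwidth saved
                        except (TypeError, ValueError):
                            pass  # unparseable date: fall through and rebuild
                    body = build_feed().encode("utf-8")  # CPU spent only here
                    out = {"Last-Modified": format_datetime(feed_updated_at)}
                    return 200, out, body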
          • Sjoerd Visscher
            Message 5 of 10, Nov 13, 2002
              jystervinou wrote:

              > Hi,
              >
              > I was wondering (tell me if I'm totally missing something):
              >
              > All the feeds seem to have 15 or more items.
              >
              > This means that if a Weblogger posts one new item, his RSS feed will
              > be updated with only one new item, and 14 "old" items.
              >
              > So the news aggregators will fetch the feed, and only take the new
              > item.
              > So at each request, the aggregator fetches items it already got
              > earlier.
              >
              > Why not put in the feeds only the new items?
              > RSS feeds would be a lot smaller and the amount of bandwidth would be
              > reduced a lot.

              Not everyone (esp. people with a dial-up connection) has the
              aggregator running daily.
              So if you were away for a weekend and return, you want your
              aggregator to get all the posts from the last two days.

              Sjoerd
            • Willem Broekema
              Message 6 of 10, Nov 13, 2002
                Phil Ringnalda wrote:

                > To deliver all new items to each aggregator, you would have to make
                > your feed dynamic, rather than serving a simple static file, and then
                > either have a login system to track what items had been delivered to
                > what aggregators, or misuse HTTP by delivering a partial document in
                > response to an If-Modified-Since request.

                Another option is using standard HTTP cookies, analogous to
                community sites that state "There have been 13 new comments
                since your last visit."

                The server may set a cookie whose value is, for example, the
                current date+time, or the internal database record id of the
                most recent item in the feed it is about to return to the
                client, or any other data the server can use to *easily* (not
                very CPU-intensively) determine which items are new relative
                to the previous fetch.

                The client should simply accept the cookies it receives and
                send them with the next request, without trying to parse or
                interpret their values.

                This adds one more header field for aggregators to store, besides the
                ETag and Last-Modified.


                - Willem
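
                A rough sketch of this cookie scheme (all names hypothetical);
                finding the items newer than the remembered id is assumed to
                be a cheap indexed lookup, per the "not very CPU-intensive"
                requirement above:

                    # Server side: remember the newest item id in a cookie
                    # that the client echoes back untouched, and return only
                    # items newer than that id on the next fetch.
                    def handle_request(cookies, all_items):
                        # cookies: dict from the request; all_items: list of
                        # (item_id, item_xml) pairs ordered oldest-to-newest.
                        last_seen = int(cookies.get("last_item", 0))
                        new_items = [xml for item_id, xml in all_items
                                     if item_id > last_seen]
                        newest = all_items[-1][0] if all_items else last_seen
                        set_cookie = {"last_item": str(newest)}  # opaque value
                        return new_items, set_cookie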
              • Gerhard Poul
                Message 7 of 10, Nov 26, 2002
                  Here is a comment about this topic, on kuro5hin, that you may
                  find interesting:

                  http://www.kuro5hin.org/comments/2002/11/10/122820/97/34#34

                  "Willem Broekema" <willem@...> wrote in message
                  news:3DD2D809.8020706@......
                  > Phil Ringnalda wrote:
                  >
                  > > To deliver all new items to each aggregator, you would have to make
                  > > your feed dynamic, rather than serving a simple static file, and then
                  > > either have a login system to track what items had been delivered to
                  > > what aggregators, or misuse HTTP by delivering a partial document in
                  > > response to an If-Modified-Since request.
                  >
                  > Another option is using standard HTTP cookies, analogous to
                  > community sites that state "There have been 13 new comments
                  > since your last visit."
                  >
                  > The server may set a cookie whose value is, for example, the
                  > current date+time, or the internal database record id of the
                  > most recent item in the feed it is about to return to the
                  > client, or any other data the server can use to *easily*
                  > (not very CPU-intensively) determine which items are new
                  > relative to the previous fetch.
                  >
                  > The client should simply accept the cookies it receives and
                  > send them with the next request, without trying to parse or
                  > interpret their values.
                  >
                  > This adds one more header field for aggregators to store, besides the
                  > ETag and Last-Modified.
                  >
                  >
                  > - Willem
                • Gerhard Poul
                  Message 8 of 10, Nov 26, 2002
                    > > All the feeds seem to have 15 or more items.
                    >
                    > Not everyone (esp. people with a dial-up connection) has the
                    > aggregator running daily.
                    > So if you were away for a weekend and return, you want your
                    > aggregator to get all the posts from the last two days.

                    Exactly, but what happens if you're away for two days and
                    there were more than 15 newly posted items?