Re: peer browser for artists

  • m.friedman@cruxus.com
    Message 1 of 26, Oct 7, 2001
      The dilemma is how to reduce demand on bandwidth (and wallet) while
      keeping control over the resource. To put it another way, how can
      you decentralize the resource and at the same time retain centralized
      control over its content?

      Various P2P models wherein all users are peers could certainly
      decentralize the resource, although control of content goes out the
      window. If the resource is a dynamic one (i.e., changing content), P2P
      solutions might exacerbate or create new distribution and demand
      problems. For example, P2P search models would have to deal with a
      constantly changing/updating resource, while those depending on
      synchronized distribution or using mediated P2P could create
      bandwidth problems for all peers or for the mediating server.

      A compromise solution for the site's owner would be to distribute
      content among a small number of trusted peers who are willing to
      volunteer bandwidth (as suggested above). Each could host all the
      contents or various portions of it, while decisions about inclusion
      of content would remain with the current site's owner. A small,
      private P2P network among the trusted peers could be used to
      coordinate/update/synchronize content. For download, a link on the
      original site would randomly direct users to one of the trusted sites
      if each held total content, or to specific sites for different
      classes of content. All this would be transparent to users.
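
      A minimal sketch of that redirecting download link, written here as a
      Python CGI; the mirror URLs are hypothetical placeholders, and a real
      deployment would add health checks:

      #!/usr/bin/env python
      # redirect.cgi - send each download to a randomly chosen trusted mirror.
      # The mirror URLs below are hypothetical placeholders.
      import os, random

      MIRRORS = [
          "http://mirror1.example.net/zuadobank",
          "http://mirror2.example.net/zuadobank",
          "http://mirror3.example.net/zuadobank",
      ]

      path = os.environ.get("PATH_INFO", "")       # e.g. /images/foo.jpg
      print("Status: 302 Found")
      print("Location: " + random.choice(MIRRORS) + path)
      print()                                      # blank line ends the CGI headers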
    • Justin Chapweske
      Message 2 of 26, Oct 7, 2001
        >Various P2P models wherein all users are peers could certainly
        >decentralize the resource, although control of content goes out the
        >window. If the resource is a dynamic one (ie, changing content), P2P
        >solutions might exacerbate or create new distribution and demand
        >problems. For example, P2P search models would have to deal with a
        >constantly changing/updating resource, while those depending on
        >synchronized distribution or using mediated P2P could create
        >bandwidth problems for all peers or for the mediating server.
        >
        I disagree. A Swarmcast or BitTorrent solution causes no such problems.
        The content provider still maintains total control over the
        presentation of their content through the web site, and at least
        Swarmcast automatically falls back on a regular HTTP download if there
        are no peers available, so except for the overhead of installing the
        client, it is never any worse than a regular HTTP download.
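
        The pattern, roughly (a sketch for illustration, not Swarmcast's
        actual client code; the peer list is an assumed input):

        # Sketch of the fallback pattern, not Swarmcast's actual client code.
        import urllib.request

        def fetch(path, origin, peers):
            # `peers` is an assumed list of mirror base URLs; the origin
            # server is tried last, so with no peers available this is
            # just a regular HTTP download.
            for base in peers + [origin]:
                try:
                    return urllib.request.urlopen(base + path, timeout=5).read()
                except OSError:
                    continue             # this source is down; try the next
            raise IOError("no source could serve " + path)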

        --
        Justin Chapweske, Onion Networks
        http://onionnetworks.com/
      • Tony Kimball
        Message 3 of 26, Oct 7, 2001
          I think any disagreement stems from dissimilar views of resource
          identity. m.friedman's view seems to be location-centric, while that
          of Swarmcast is content-centric. The content-centric view obviates
          the problems described, which derive from a location-centric view: In
          Swarmcast, a resource which has 'changed' is simply a different
          resource. The constant location is just a rendezvous point which
          links to different meshes over time.

          I only mention this because I don't think Justin's reply made this
          point clear for the original poster.
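
          To make the content-centric view concrete, here is a toy sketch;
          the hash choice and the catalog are illustrative, not Swarmcast's
          actual mechanism:

          # A resource's identity is the hash of its bytes; the stable name
          # is only a rendezvous point linking to whichever hash is current.
          import hashlib

          catalog = {}                     # stable name -> current content id

          def publish(name, data):
              cid = hashlib.sha1(data).hexdigest()
              catalog[name] = cid          # the rendezvous now points here
              return cid

          publish("zuadobank/gallery", b"monday's images")    # one resource...
          publish("zuadobank/gallery", b"tuesday's images")   # ...a 'change' is
                                                              # a new resource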

          Quoth Justin Chapweske on Sunday, 7 October:
          : >
          : >Various P2P models wherein all users are peers could certainly
          : >decentralize the resource, although control of content goes out the
          : >window. If the resource is a dynamic one (ie, changing content), P2P
          : >solutions might exacerbate or create new distribution and demand
          : >problems. For example, P2P search models would have to deal with a
          : >constantly changing/updating resource, while those depending on
          : >synchronized distribution or using mediated P2P could create
          : >bandwidth problems for all peers or for the mediating server.
          : >
          : I disagree. A Swarmcast or Bittorrent solution causes no such problems.
          : The content provider still maintains total control over the
          : presentation of their content through the web site, and at least
          : Swarmcast automatically falls back on a regular HTTP download if there
          : are no peers available, so except for the overhead of installing the
          : client, it is never any worse than a regular HTTP download.
        • Bram Cohen
          Message 4 of 26, Oct 7, 2001
            On Sun, 7 Oct 2001, Justin Chapweske wrote:

            > Swarmcast automatically falls back on a regular HTTP download if there
            > are no peers available, so except for the overhead of installing the
            > client, it is never any worse than a regular HTTP download.

            BitTorrent also has the 'no worse than HTTP' property, although it always
            uses its own protocol, so you do need to install client software (a very
            simple step, which is only done once).

            -Bram Cohen

            "Markets can remain irrational longer than you can remain solvent"
            -- John Maynard Keynes
          • m.friedman@cruxus.com
            Message 5 of 26, Oct 7, 2001
              --- In decentralization@y..., Tony Kimball <alk@p...> wrote:

              > I think any disagreement stems from dissimilar views of resource
              > identity. m.friedman's view seems to be location-centric, while
              > that of Swarmcast is content-centric.

              Mine is neither a location- nor a content-centric view - I'm being more
              practical than that. I was simply suggesting what I thought was a
              relatively simple, low- or no-cost solution to the problem at hand.
              Is there any reason why my solution wouldn't work to solve the site
              owner's problem?

              I need to be educated about Swarmcast. My understanding is that its
              advantage is speed of download accomplished by distributing the
              source of downloadable content among many sites. How does it deal
              with changing content? To say a resource that changes is a different
              resource makes sense (albeit tautologically), but if the owner wanted
              to add, delete or alter content, how is this change transmitted among
              clients so that the advantage of Swarmcast is preserved?
            • Kelly Abbott
              Message 6 of 26, Oct 7, 2001
                > From: Bram Cohen <BRAM@...>
                > Subject: Re: [decentralization] peer browser for artists
                >
                > On Sat, 6 Oct 2001, Kelly Abbott wrote:
                >
                >> So my questions are:
                >> 1. Is the last mile solution possible?
                >
                > It's more suited to very large files, but my project, BitTorrent, does
                > exactly that -
                >
                > http://bitconjurer.org/BitTorrent/
                >
                > there's a quickie demo of it here -
                >
                > http://bitconjurer.org/BitTorrent/demo.html
                >
                Thanks for the link. Do you plan on developing a version for Macs? The
                audience for zuadobank is 75% Mac-based.

                Also, I got a 404 when I clicked on that link.



                > thanks for spoke-and-axle.com - I'll write all those people once
                > BitTorrent is ready.

                Better just work through me. They won't know what you're talking about.

                Peace,
                K
              • Kelly Abbott
                Message 7 of 26, Oct 7, 2001
                  > From: Aaron Swartz <aswartz@...>
                  > Subject: [decentralization] Re: peer browser for artists
                  >
                  > On Saturday, October 6, 2001, at 09:50 PM, Kelly Abbott wrote:
                  >
                  >> So the donator would have to install the cgi only? That is,
                  >> they put the script in the appropriate place and when it is
                  >> called it grabs the image from zuadobank, downloads it to the
                  >> donator's server, then once it is cached it immediately begins
                  >> downloading from the donator's server to the client who
                  >> requested it?
                  >
                  > Yep. If they were tight on disk space, they would also have to
                  > set limits on how large their cache could grow.
                  Do you know what such a script would look like? Where I could find one to
                  start with?
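
                  One possible starting point, sketched in Python with placeholder
                  paths and URLs; a real script would also enforce the cache-size
                  limit mentioned above and detect the content type:

                  #!/usr/bin/env python
                  # cache.cgi - pull-through cache sketch; URL and paths are
                  # placeholders. The first request fetches an image from
                  # zuadobank and stores a copy; every later request is served
                  # straight from the donator's own disk.
                  import os, sys, urllib.request

                  ORIGIN = "http://www.zuadobank.com/images/"   # assumed URL
                  CACHE = "/var/cache/zuadobank/"

                  # basename() strips directory parts, so "../" tricks fail
                  name = os.path.basename(os.environ.get("PATH_INFO", ""))
                  local = os.path.join(CACHE, name)

                  if not os.path.exists(local):     # cache miss: fetch, store
                      data = urllib.request.urlopen(ORIGIN + name).read()
                      with open(local, "wb") as f:
                          f.write(data)

                  sys.stdout.write("Content-Type: image/jpeg\r\n\r\n")
                  sys.stdout.flush()
                  with open(local, "rb") as f:
                      sys.stdout.buffer.write(f.read())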

                  >
                  >> Does the client experience any lag?
                  >
                  > The first time an image is requested (or after it's deleted if
                  > the donator needs the disk space) the client experiences some
                  > lag since downloads need to go thru two servers (ideally, it'd
                  > be set up so the whole image didn't have to be downloaded to the
                  > donator before it was sent out to the client). Afterwards, the
                  > donator caches the file on their machine and the lag goes away.
                  >
                  Sure, and one could even set up a schedule to download new images
                  between servers, say, every Sunday at midnight.
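
                  That schedule could be one crontab line on each donator box;
                  the rsync module name here is illustrative, not a real service:

                  # min hour day month weekday -- midnight every Sunday
                  0 0 * * 0  rsync -a www.zuadobank.com::images/ /var/cache/zuadobank/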


                  >>> You could even replace the donator1 scheme with a round-robin
                  >>> DNS system, having images.zuadobank.com point to a series of
                  >>> servers that agreed to donate bandwidth.
                  >> How would one do that?
                  >
                  > You'd have to set up multiple A records with whoever provides
                  > the DNS for zuadobank.com (in this case HOST4U.net).
                  Oh, duh. I knew that.
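
                  For the record, the round-robin in a BIND-style zone file would
                  look roughly like this, with placeholder addresses:

                  images.zuadobank.com.  300  IN  A  192.0.2.10   ; donator 1
                  images.zuadobank.com.  300  IN  A  192.0.2.20   ; donator 2
                  images.zuadobank.com.  300  IN  A  192.0.2.30   ; donator 3
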
                  >
                  >> How could I make it dynamic? That is, when someone signs on as
                  >> a donator, what process would they have to go through to get in
                  >> the system? And if they stopped hosting would they have to
                  >> 'sign out', as it were?
                  >
                  > Yeah -- this'd be more difficult since you'd need to interface
                  > with whatever DNS system was set up. I'd wait until you got the
                  > above part working first.
                  I guess we could set up a robot to check each donator site every five
                  minutes, drop the ones that are down for that five-minute span and route all
                  requests to the live ones.
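
                  A rough sketch of that robot, with hypothetical donator URLs;
                  a real version would push the live list into the DNS or the
                  redirect script:

                  # Poll every donator and keep only the ones that answer.
                  import time, urllib.request

                  DONATORS = ["http://donator1.example.net/",   # hypothetical
                              "http://donator2.example.net/"]   # donator URLs

                  def live_donators():
                      live = []
                      for url in DONATORS:
                          try:
                              urllib.request.urlopen(url, timeout=10)
                              live.append(url)
                          except OSError:
                              pass        # down for this five-minute span
                      return live

                  while True:
                      print("routing to:", live_donators())
                      time.sleep(300)     # five minutes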

                  >
                  >>> Of course this would require getting a number of servers to
                  >>> participate in the scheme, but if the site is popular, I don't
                  >>> think it'd be too hard.
                  >> Sounds a lot like spoke-and-axle.com .
                  >
                  > Wow, did you set this up while I was sleeping? That's an awesome site.
                  Yeah, I've been working on it since May. Thanks for the props.


                  K
                • Brandon Wiley
                  Message 8 of 26, Oct 7, 2001
                    > The php.net site mentions GTK http://gtk.php.net/ for writing client
                    > side apps in php. I haven't looked at it and it may not be appropriate
                    > here.

                    I don't think it's necessary or useful to switch the site from PHP-driven
                    HTML to PHP-driven GTK widgets. If the users are used to HTML then you can
                    keep the same interface but just have it rendered on the client instead of
                    the server. The benefit of switching to GTK would be to get rid of the
                    need for and limitations of the web browser, but the browser seems an
                    appropriate interface for this situation.

                    > >Luckily, I think that the real problem will be resolved simply by putting
                    > >the image on a cluster of mirror servers run by volunteers rather than on
                    > >the central host. I think that downloads consume so much bandwidth that
                    > >without them there will be plenty of room for browsing and feedback.
                    >
                    > Isn't this exactly the problem that Akamai have solved for large sites?
                    > I wonder if there's an open source clone of this? A swarmcast proxy
                    > cloud sounds like it would be at least similar.

                    Yes, this is what Akamai does. Akamai isn't code, it's a service. They
                    just mirror your files for you. Whatever code they have is for
                    guaranteeing quality of service for transfer rates and high availability of
                    files. I don't know of an OSS project to provide such things, but it's not
                    really necessary for this application.

                    Akamai mirrors static content. So you could get the same thing without
                    guaranteed quality of service by just setting up some mirror sites. A
                    Swarmcast cloud is like a confederation of mirror sites, except that it
                    mirrors content dynamically, based on what people are downloading, rather
                    than mirroring a fixed set of files. Sometimes this is desirable and
                    sometimes it is silly.
                  • Brandon Wiley
                      Message 9 of 26, Oct 7, 2001
                      > A compromise solution for the site's owner would be to distribute
                      > content among a small number of trusted peers who are willing to
                      > volunteer bandwidth (as suggested above). Each could host all the
                      > contents or various portions of it, while decisions about inclusion
                      > of content would remain with the current site's owner. A small,
                      > private P2P network among the trusted peers could be used to
                      > coordinate/update/synchronize content. For download, a link on the
                      > original site would randomly direct users to one of the trusted sites
                      > if each held total content, or to specific sites for different
                      > classes of content. All this would be transparent to users.

                      I think that this is a very well articulated statement of what I consider
                      to be the best solution for this particular application.