Decentralization and exhaustive searches: mutually exclusive?
Okay, this is a theoretical question that I'd like to ask without
getting bogged down in politics. I'm using Napster and Gnutella as
convenient examples and not because I approve or disapprove of either.
Gnutella: It's distributed and therefore robust. You cannot shut down
gnutellanet by shutting down one, or ten, or a thousand servers.
However, you cannot do exhaustive searches on it: the content you're
looking for might be out there and yet not be guaranteed to show up in
a search. Another disadvantage is that the searches make inefficient
use of bandwidth.
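Concretely, that style of search is a TTL-limited flood: each query is
relayed to all neighbors and dies after a fixed number of hops, so any
host past that horizon never even sees it, and every reachable peer is
bothered whether or not it has the file. A toy sketch (the function
name and the seven-hop default are my illustration, not Gnutella's
actual wire protocol):

    def flood_query(start, neighbors, files, keyword, ttl=7):
        """Breadth-first flood of a query, one TTL tick per hop."""
        seen = {start}
        frontier = [start]
        hits = []
        for _ in range(ttl):                  # TTL caps the search horizon
            next_frontier = []
            for node in frontier:
                for peer in neighbors[node]:
                    if peer in seen:
                        continue              # suppress duplicate queries
                    seen.add(peer)
                    if keyword in files.get(peer, ()):
                        hits.append(peer)
                    next_frontier.append(peer)
            frontier = next_frontier
        return hits                           # peers past the horizon are missed

    peers = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A"], "D": ["B"]}
    shared = {"D": ["Michael_Jackson_Thriller.mp3"]}
    # ttl=1 finds nothing: D is two hops from A, beyond the horizon.
    print(flood_query("A", peers, shared, "Michael_Jackson_Thriller.mp3", ttl=1))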
Napster: It is exhaustive. For better or worse, you'll find every
single instance of Michael_Jackson_Thriller.mp3 that anybody on the
network is serving. However, the server(s) that store the content
listings and the corresponding locations of the content are all under
the control of one company, which means they are vulnerable to legal
action, censorship, company-wide technical failure, corporate abuse,
and attack by hackers.
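The central-index model is trivial to make exhaustive, which is
exactly why it is fragile. A toy version (class and method names are
mine, not Napster's protocol):

    class CentralIndex:
        """One company's box: exhaustive search, single point of failure."""
        def __init__(self):
            self.listings = {}                # filename -> set of host addresses

        def register(self, host, filenames):
            for name in filenames:
                self.listings.setdefault(name, set()).add(host)

        def search(self, name):
            return self.listings.get(name, set())   # every known copy, every time

    index = CentralIndex()
    index.register("10.0.0.5", ["Michael_Jackson_Thriller.mp3"])
    index.register("10.0.0.9", ["Michael_Jackson_Thriller.mp3"])
    print(index.search("Michael_Jackson_Thriller.mp3"))   # both hosts

Take that one box away (subpoena, crash, buyout) and searching dies
with it, even though the files themselves are still out on the hosts.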
So, what about combining the best of the two? Decentralization and
exhaustive searching? Is it a logical impossibility, or is it merely
something that hasn't been done yet?
> One solution is to decouple the searching data-space from the content
> data-space and have separate logical networks for them (separate
> logical networks, even if many physical nodes do both).

The most efficient way to implement exhaustive searches on a
decentralized network is to decouple the data space and the index
space in time. The problem is the "time to live" of the search
packets.
The compromise is to insulate the user from the network, so searches
are no longer network traffic. (Ex: Fidonet and BBSes)
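In that compromise the index is replicated out to the user in batches
and searched offline, so the only network cost is the periodic
transfer. A sketch of the idea (names illustrative; real Fidonet moved
file lists as store-and-forward mail):

    import time

    local_index = {}       # filename -> hosts, held on the user's own machine
    last_sync = 0.0

    def sync(remote_listings):
        """Periodic batch transfer, like a nightly Fidonet mail run."""
        global last_sync
        local_index.update(remote_listings)
        last_sync = time.time()

    def local_search(keyword):
        # No packets leave the machine; results are only as fresh as last_sync.
        return {name: hosts for name, hosts in local_index.items()
                if keyword in name}

    sync({"Michael_Jackson_Thriller.mp3": {"10.0.0.5"}})
    print(local_search("Thriller"))

The search is exhaustive over everything the last sync carried and
costs no live traffic; staleness is the price you pay.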