Some time ago I posted Just a thought: free distributed search?, suggesting that relying on the centralized approach of search-engine companies like Google was unwise, and that some kind of decentralized approach might work better for searching. Recently I was pointed to an actual attempt to implement this strategy, called Majestic-12: a UK-based project that applies the distributed computing model made famous by SETI@home to the problem. Isn't that amazing?
Every once in a while, I get a hare-brained notion. Today's was: why do we use a central website for internet searches at all? Why Google?
Consider the success of the Planetary Society's distributed SETI project, and the distributed computing architecture that resulted from it. Consider the success of swarming download technology like BitTorrent. Consider how simple a basic web spider could be. Consider the efficiency of spidering networks locally. Consider the architecture of DNS.
See a pattern?
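To back up the claim that a basic web spider really is simple: here's a minimal sketch in Python, using only the standard library. It fetches a page, pulls out its links, and crawls breadth-first up to a fixed page limit. This is just an illustration of the idea, not any particular project's crawler; the function names and the page limit are my own invention, and a real spider would also need politeness delays, robots.txt handling, and duplicate-content detection.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag seen while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl from start_url, returning the set of URLs visited."""
    seen = set()
    queue = deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page: skip it and move on
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            # Resolve relative links against the page they came from
            queue.append(urljoin(url, link))
    return seen
```

A few dozen lines, and the fetch-parse-enqueue loop is the whole trick; everything hard about search lives in what you do with the pages afterward, which is exactly the part a distributed network could share out.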