Roy Schestowitz wrote:
> Replication can be done more efficiently than that. Since much of the
> content (that you care about) is textual, one could compress the content
> and set it aside. Compression algorithms can reduce natural text to about
> 10-20% of its original size.
I forgot to mention one thing: Googlebot already accepts gzipped data if
your server can send it. So, to my mind, that strongly suggests this is
also how they store the data.
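To make the quoted 10-20% figure concrete, here is a minimal sketch using
Python's standard gzip module on a sample of repetitive page text. The
sample string is invented for illustration; real-world ratios vary with the
content, and highly repetitive text compresses far better than prose.

```python
import gzip

# Hypothetical sample of textual page content; repeated to give the
# compressor enough data to work with (very short inputs barely compress).
paragraph = (
    "Replication can be done more efficiently by compressing the "
    "textual content of each page before setting it aside. "
)
text = (paragraph * 50).encode("utf-8")

compressed = gzip.compress(text)
ratio = len(compressed) / len(text)
print(f"original: {len(text)} bytes, "
      f"gzipped: {len(compressed)} bytes, "
      f"ratio: {ratio:.1%}")
```

This is the same gzip format Googlebot negotiates via the
`Accept-Encoding: gzip` request header, so a crawler can receive the
compressed bytes directly from a cooperating server.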
--
Cheers,
Dmitri
See Site Sig Below
--
##-----------------------------------------------##
Article posted with Web Developer's USENET Archive
http://www.1-script.com/forums/
Web and RSS gateway to your favorite newsgroup -
alt.internet.search-engines - 30882 messages and counting!
##-----------------------------------------------##