I thought the following was quite interesting. If true, Google will be able
to crawl and index far more content.
===========================================================================
beat.net writes "Robert X. Cringely details the plan for all the dark fiber
Google has been buying up: "The probable answer lies in one of Google's
underground parking garages in Mountain View. There, in a secret area
off-limits even to regular GoogleFolk, is a shipping container. But it isn't
just any shipping container. This shipping container is a prototype data
center. Google hired a pair of very bright industrial designers to figure
out how to cram the greatest number of CPUs, the most storage, memory and
power support into a 20- or 40-foot box. We're talking about 5000 Opteron
processors and 3.5 petabytes of disk storage that can be dropped off
overnight by a tractor-trailer rig. The idea is to plant one of these
puppies anywhere Google owns access to fiber, basically turning the entire
Internet into a giant processing and storage grid. While Google could put
these containers anywhere, it makes the most sense to place them at Internet
peering points, of which there are about 300 worldwide.""
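For a sense of scale, a quick back-of-envelope division of the quoted figures
(assuming, purely for illustration, that the disk is spread evenly across the
CPUs and using decimal units, 1 PB = 1,000,000 GB):

```python
# Figures quoted in the article above.
cpus = 5000
storage_pb = 3.5

# Convert petabytes to gigabytes (decimal convention).
storage_gb = storage_pb * 1_000_000

# Even split across processors -- an illustrative assumption only.
gb_per_cpu = storage_gb / cpus

print(f"{gb_per_cpu:.0f} GB of disk per CPU")  # prints "700 GB of disk per CPU"
```

That works out to roughly 700 GB per processor per container, which in 2005
terms would mean a couple of large commodity drives per node.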
http://slashdot.org/article.pl?sid=05/11/20/1514244&tid=217