Wednesday, April 11th, 2007, 5:22 am

Search Engine Downtime Has a High Cost

[Image: server stack]

BACK when I was in contact with Google, I came to realise that they have engineers whose main purpose is to ensure the site stays online at all times. A few days ago I had another realisation, perhaps an obvious one: to search engines, downtime is hugely damaging. If people are unable to search for something immediately, they will choose a different tool; they must. Downtime forces them to test the water elsewhere, and that trial can encourage them to switch to a rival for good.

Ordinary sites, unlike such complex tools, do not have this problem. How many of us use a single search engine exclusively? What would happen if, one day, we found that the grass is greener elsewhere? Search, unlike a flow of information, tends to involve an immediate need; it cannot be deferred until the favourite site returns. Defection, then, can be a matter of availability, and its impact should not be underestimated. Downtime on a corporate network rarely has any long-term impact, but for search tools, whose perceived quality is subjective, it can.

Comments are closed.

Original styles created by Ian Main (all acknowledgements) • PHP scripts and styles later modified by Roy Schestowitz • Help yourself to a GPL'd copy
Proudly powered by WordPress — based on a heavily-hacked version 1.2.1 (Mingus) installation