
Re: Page Rank is back

  • Subject: Re: Page Rank is back
  • From: Roy Schestowitz <newsgroups@schestowitz.com>
  • Date: Wed, 01 Jun 2005 15:47:30 +0100
  • Newsgroups: alt.internet.search-engines
  • References: <9g1n91hhhjtslvpg1cl4npil8rseud2jb2@4ax.com> <d7g2fl$3c6$0@pita.alt.net> <r8on91tdao278lrqiua4kahr9c1k7d5f5g@4ax.com> <d7h40u$iib$1@godfrey.mcc.ac.uk> <hv6o91hgkmmpsgajjrs5dmj2ls7op3oo1k@4ax.com> <mn0q911tuvbmehvc0kq7pshfbi4ef8vq8p@4ax.com> <grmq919ld9ih25muvlhss47fqot8ai2a1i@4ax.com> <Eidne.24179$Fv.15054@lakeread01> <p03r91hfpcrp4cl9a28oefmlnvgl98dnr7@4ax.com> <d7kdbl$2kpj$2@godfrey.mcc.ac.uk> <5qfr91pqnmn5lmkjo5hvkakal91mflnki4@4ax.com> <imgr91l5kt1ndlkvk1hhndogdf4uruleg3@4ax.com>
  • User-agent: KNode/0.7.2
SEO Dave wrote:

> On Wed, 01 Jun 2005 14:01:23 GMT, Big Bill <kruse@cityscape.co.uk>
> wrote:
> 
>>I wonder how he doesn't get penalised for duplicate content. Someone's
>>probably been through that here and I didn't read it though.
> 
> It's not duplicate content because it's not used in exactly the same
> way as others use it. Recall discussions about breaking the public
> domain books into chapters (most do it that way). I create random
> sized pages that are highly unlikely to be the same as others have
> created. If someone used the same concept, as long as their pages are
> a different size to mine there won't be a duplicate content problem.

Can you not see that this is grey/black-hat SEO? It begins to remind me of
spammers who argue that some proportion of people will be interested in
their 'offers' and that they are only trying to help small businesses. The
victims here are people who go after meaningful, reliable, /contextual/
content. (A rough sketch of what that random-sized splitting amounts to
follows below.)
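For the record, here is a rough sketch in Python of what such random-sized
splitting boils down to, as I read Dave's description; the word-count limits
and the file name are my own guesses, not anything he has stated:

    import random

    def split_into_random_pages(text, min_words=300, max_words=800, seed=None):
        # Split a long public-domain text into pages of random length.
        # Varying the page boundaries means two sites built from the same
        # source text are unlikely to produce byte-identical pages.
        rng = random.Random(seed)
        words = text.split()
        pages = []
        i = 0
        while i < len(words):
            size = rng.randint(min_words, max_words)
            pages.append(" ".join(words[i:i + size]))
            i += size
        return pages

    # Hypothetical usage: a book pulled from Project Gutenberg.
    with open("pride_and_prejudice.txt", encoding="utf-8") as f:
        book = f.read()
    for n, page in enumerate(split_into_random_pages(book), start=1):
        with open("page_%04d.html" % n, "w", encoding="utf-8") as out:
            out.write("<html><body><p>" + page + "</p></body></html>")

A few lines of code, and out come thousands of 'unique' pages that say
nothing new.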

>>I suppose
>>there's a difference between us, I do write my own stuff, and I write
>>my own blog too.
>>
>>BB
> 
> And you get 40 visitors a day and I get over 30,000 visitors a day
> over all (most of it is due to the last 15 months work). If I offered
> you access to a script you could use to create a 10,000 page site
> every day would you say no thanks?

And this statement reminds me of the dodgy guy who offers you a $10,000
Rolex for a tenner. What if each of the 40 million (plus roughly 10 million
regional) domains had 10,000 pages? That is on the order of half a trillion
pages. We'd be in total chaos. How can anyone handle so much text?

It is problematic when a site's size is limited only by the capacity of your
CPU, which churns out sparse sites.

> Bill, I bet I've written more unique content than you overall as well.
> Just because I focus my efforts on existing public domain and affiliate
> content doesn't mean I don't create as well. The difference is I realised
> early on that I can't possibly create enough content in a short period of
> time to do what I have planned long term.

That's where the problem lies. You find shortcuts to achieving your goals
while stomping on other people's necks. I am worried that more SEOs like you
are out there, draining /real/ sites of visitors.

> Good luck to you though Bill, I hope you do better long term.

Not nice...

Roy

-- 
Roy S. Schestowitz
http://Schestowitz.com
