trevor wrote:
> in all, web stat analysis can be like SEO: you have to know where to draw
> the line and not obsess about it. i'd recommend pulling your weblogs down
> locally then using something like WUsage or other freeware to analyze
> them yourself. i haven't been this obsessive in years, so i'm out of date
> on the cool software. when you run it locally you get to configure for
> the whole feature set of reporting, and believe me, most inexpensive ISPs
> don't give you anything like the full feature reporting. you should be
> able to follow detail down to the path of a single IP ADDR through your
> whole site and out again.
>
> but again, frankly i only encounter that level of analysis where there's
> a marketing team that's separate from the development team, which is
> separate from the webmaster/admin crew. nobody has time.
I agree with your point. I found that too much time can be spent tracing
sources, looking at SERPs, and so on. It sure is helpful to know some facts
though, for example (a rough script for digging these out of the logs
follows the list):
- Who references me? Should I back-reference?
- What are the most popular pages? Should I polish these?
- How often is a piece of software/code being downloaded?
- Are any hidden pages being penetrated?
- Are ratbots spidering, thereby stealing much traffic?
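
For what it's worth, most of those questions fall straight out of a local
log copy with a few lines of scripting, along the lines trevor describes.
A minimal Python sketch, assuming Apache's combined log format; the log
path, the download name and the referrer test against my own domain are
all placeholders to adjust:

#!/usr/bin/env python3
# Rough sketch, not a polished tool: tallies answers to the questions
# above from a local copy of the access log.
import re
from collections import Counter

LOG_PATH = "access.log"          # assumed: weblog pulled down locally
DOWNLOAD = "/files/mytool.zip"   # hypothetical file whose downloads we count

# combined format:
# ip - - [date] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE = re.compile(r'(\S+) \S+ \S+ \[[^\]]+\] '
                  r'"(\S+) (\S+)[^"]*" (\d{3}) \S+ "([^"]*)" "([^"]*)"')

pages, referrers, robots = Counter(), Counter(), Counter()
downloads = 0

with open(LOG_PATH) as log:
    for entry in log:
        m = LINE.match(entry)
        if not m:
            continue
        _ip, method, path, status, referrer, agent = m.groups()
        pages[path] += 1                          # most popular pages
        if method == "GET" and path == DOWNLOAD and status == "200":
            downloads += 1                        # download tally
        if referrer not in ("-", "") and "schestowitz.com" not in referrer.lower():
            referrers[referrer] += 1              # who references me?
        if any(tag in agent.lower() for tag in ("bot", "spider", "crawl")):
            robots[agent] += 1                    # ratbots spidering?

print("Most requested pages:", pages.most_common(10))
print("Downloads of %s: %d" % (DOWNLOAD, downloads))
print("Top external referrers:", referrers.most_common(10))
print("Suspected robots:", robots.most_common(10))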
I guess it is yet more proof that gigahertz technology churns through more
information than we, insignificant minds, can cope with. The best we can do
is employ better tools, like error logs and scripts that filter the usage
logs.
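
As for trevor's closing tip, following a single IP through the whole site
and out again: once the logs are local that too is only a handful of lines.
Same assumed combined format; the address below is made up:

#!/usr/bin/env python3
# Print one visitor's route through the site, first hit to last.
import re

LOG_PATH = "access.log"
VISITOR = "192.0.2.17"   # made-up address; substitute one from your own log

# we only need the IP, the timestamp and the requested path
LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "\S+ (\S+)[^"]*"')

with open(LOG_PATH) as log:
    for entry in log:
        m = LINE.match(entry)
        if m and m.group(1) == VISITOR:
            # log entries are already in time order, so this is the
            # visitor's path through the site
            print(m.group(2), m.group(3))

Run it against yesterday's log and you can watch a single session unfold,
which is exactly the detail the cheap ISP reports never give you.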
Roy
--
Roy S. Schestowitz
http://Schestowitz.com