Re: [webalizer] filtering out indexing bots

  • Bob
    Message 1 of 16, Mar 28, 2004
      I have a friend who said his organization's site was
      receiving 5000 hits a day. It took a while, but I
      whittled that down to fewer than 800 human visitors,
      without myself having access to the workings or logs
      of his host's log analyzer. The 800 probably does not
      include script kiddies on a fishing expedition, since
      404s are filtered by any sane log analyzer, but I'm
      sure there are more indexing bots puffing up my
      friend's self-esteem, and I'm helping him face
      reality.

      Beyond that, there is much more reason to care about
      shaping content to cater to visitor interests,
      successful search-engine keywords, and real human
      visits than to worry whether a few thousand hits a
      day might necessitate a move to greater bandwidth.
      Though, as I said, there is no zero-sum mutual
      exclusivity: the more feedback the better, and the
      more tools the better! In addition to webalizer, I
      have sed scripts filtering raw logs, which I then
      read in the pager "less"; those are hard enough to
      use anyway without null referrers causing info glut.
      Or I'll look at only the null referrers, to see what
      bots are indexing!
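      A sed pipeline along those lines might look like the
      following sketch. The log path, the bot names, and the
      assumption of Apache "combined" log format (referrer is
      the fourth double-quoted field) are all illustrative,
      not from the original post:

      ```shell
      # Drop hits from a few well-known indexing bots, then keep
      # only lines whose referrer field is not "-" (null referrer),
      # writing the rest to a file to browse in less.
      sed -e '/Googlebot/d' -e '/msnbot/d' -e '/Slurp/d' access.log |
        awk -F'"' '$4 != "-"' > human-hits.log
      ```

      Inverting the awk test ($4 == "-") gives the opposite
      view: only the null-referrer lines, which is where most
      of the bot traffic shows up.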

      I appreciate someone pointing out that webalizer
      doesn't use regular expressions for User Agents or
      Referrers. I'll change my webalizer.conf accordingly,
      to just use a sub-string.
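      For what it's worth, a sub-string-based fragment might
      look something like this (directive names as in the
      webalizer 2.x sample webalizer.conf; the particular bot
      strings here are just examples, not a complete list):

      ```
      # Drop hits from common indexing bots entirely
      # (plain sub-string match against the User Agent)
      IgnoreAgent     Googlebot
      IgnoreAgent     msnbot
      IgnoreAgent     Slurp

      # Hide, rather than drop, null referrers in the
      # referrer report (webalizer reports them as
      # "Direct Request")
      HideReferrer    Direct Request
      ```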

      -Bob D