
Need to cronjob the webalizer

  • Mark
    Message 1 of 3, Apr 6 12:13 PM
      Hi,

      Before anyone jumps the gun and points me to the FAQ: the hosting provider does not give access to the command line of the server.

      So all the FAQ material on setting up webalizer as a cron job goes straight out the window.

      What the host has provided is the ability to set up a cron job via the extendcp user panel, which lets you configure jobs through multiple-choice options. Using this, I set one up to run every hour to make webalizer run... but the index.cgi script that does the work fails at line 15, and line 15 is simply "die".

      I attempted to use the PHP http_get(), but they do not have the PECL libraries installed on the server. I assume this would have solved my problem, as it issues an HTTP request.

      The PHP script I wrote that scrapes the history file works fine. The problem is that the stats only get updated when I or the editor invoke an update of the figures. I wanted this to be a cron job, but it's just not happening.

      This is the part that it is failing on:

      if($cwd=~m#^(/home/(?:sites|cluster-sites/\d+)/[\w\.\-]+/)#) {

      and I guess that this is where the problem is.

      The question is: what do I need to change or add to get around this issue? Please note that I have never used or scripted in Perl, so I am at a complete loss, and the syntax isn't exactly logical.
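A minimal sketch of what that check appears to do (an assumption pieced together from the quoted regex and the "Died at ... line 15" error): accept a working directory under /home/sites/... or /home/cluster-sites/<number>/..., and die otherwise. Cron jobs typically start in the account's home directory, which would not match.

```shell
#!/bin/sh
# Simulate the cwd check from index.cgi (a sketch, not the real
# script): paths under /home/sites/<site>/ or
# /home/cluster-sites/<number>/<site>/ pass, everything else "dies".
check_cwd() {
  case "$1" in
    /home/sites/*/*) echo "ok" ;;
    /home/cluster-sites/[0-9]*/*) echo "ok" ;;
    *) echo "die" ;;
  esac
}
check_cwd "/home/sites/tgnc.org.uk/public_html/stats"   # prints "ok"
check_cwd "/home"                                       # prints "die"
```

If that reading is right, making the cron job change into the stats directory before invoking the script should get past the check.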

      The cron job path is as follows:

      /home/sites/tgnc.org.uk/public_html/stats/index.cgi

      and the email I get says:

      Died at /home/sites/tgnc.org.uk/public_html/stats/index.cgi line 15.

      The web host does not provide software support, but they confirm that the cron job is running as expected, and I agree with them.

      Can anyone provide me with the changes, or can anyone help me write a hard-coded version to get this running?

      Help is appreciated.

      Regards,
      Mark.
    • Mark
      Message 2 of 3, Apr 7 1:06 AM
        I forgot to mention that webalizer runs fine when requested as an HTTP request but falls flat on its face when run as a cron job.

        Something is clearly preventing a server-invoked cron job from completing: if you visit the site URL for /stats/, you get a web page with all the stats; via the cron job, however, you get an error.
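A quick way to confirm that (a sketch; cron-env.txt is a name I made up): point the panel's cron entry at a tiny script like this once, and it records the environment the job actually runs under. Comparing that file with what a CGI request sees usually shows the culprit: a different working directory, a minimal PATH, or missing variables.

```shell
#!/bin/sh
# Record the working directory and environment the cron job actually
# runs under, for comparison with the web-server environment.
{
  echo "cwd: $(pwd)"
  echo "PATH: $PATH"
  env | sort
} > cron-env.txt 2>&1
```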

        So some help is really appreciated.

        Cheers,
        Mark.

      • Mark
        Message 3 of 3, Apr 8 12:30 AM
          Well, after a rather uppity email from Brad: had he pointed out that webalizer does not come with any external components such as .cgi scripts, this situation would have been resolved faster.


          The answer came from my host, who eventually resolved the issue; they had thought that the issues I was raising were with webalizer itself rather than with the scripting.

          They said that I needed to write a shell script like the one below:

          #!/bin/bash
          # Run webalizer from the stats directory so its relative paths resolve:
          # -D dns.cache = DNS cache file, -n localhost = report hostname,
          # -p = preserve state (incremental), -o . = output to current directory.
          cd /home/sites/******/public_html/stats
          /usr/bin/webalizer -D dns.cache -n localhost -p -o . /home/sites/******/logs/******-access_log*

          where ****** is the domain name

          This is then uploaded to the home directory (the one that contains the public_html folder), and then you either chmod the file using a PHP script or use an FTP client like FileZilla to set the permissions to 755 so the file becomes executable.

          When that is done, set a cron job to run the file.
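The cd line is the important part: webalizer's dns.cache and "-o ." arguments are relative paths, so the command has to run from the stats directory. The same guard can be written as a small reusable helper (a sketch; run_in_dir is my name for it, not something the host provided):

```shell
#!/bin/sh
# run_in_dir DIR CMD...: change into DIR before running CMD, failing
# loudly if DIR is missing, so the command's relative paths resolve
# under DIR instead of wherever cron happened to start the job.
run_in_dir() {
  dir=$1
  shift
  cd "$dir" || { echo "cannot cd to $dir" >&2; return 1; }
  "$@"
}
# Example: run pwd from /tmp rather than the current directory.
run_in_dir /tmp pwd
```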

          Job done.

          So for anyone who comes up against this problem with a hosting provider that does not allow webalizer to be invoked in the way I was trying, this may be the answer.

          So it is best to ask the host what the issue is and whether a shell script will solve it.

          Thank you for your time.
