Re: [PBML] Get function.

  • Tom Barron
    Message 1 of 3, Jun 3, 2000
      Gordon Stewart wrote:
      > ...
      > I've been trying to figure out how to make an HTTP request for a page
      > in order to scan it, etc.
      > ...
      > Any ideas?

      Hi, Gordon. Here's how I do it:

      #!/usr/bin/perl
      use strict;
      use warnings;
      use HTML::Parser;
      use WebFS::FileCopy;

      # get_urls() fetches each URL given and returns one response object per URL.
      my @it = get_urls("http://www.somewhere.com/index.html");

      # Register handlers for comments and start tags (version-3 API).
      my $prs = HTML::Parser->new(
          api_version => 3,
          comment_h   => [ \&handle_comment, "text" ],
          start_h     => [ \&handle_stag, "tagname,attr" ],
      );

      $prs->parse($it[0]->content);
      $prs->eof;    # flush any buffered text through the handlers

      sub handle_comment {
          # code to deal with a comment
      }

      sub handle_stag {
          # code to deal with a start tag
      }

      Other callbacks may be set up -- the various possibilities are
      documented in HTML::Parser. HTML::Parser and WebFS::FileCopy depend on
      various other things, so you'll have to be sure you've downloaded and
      installed all the dependencies -- just get HTML::Parser and
      WebFS::FileCopy and read their documentation and README files. It's all
      pretty straightforward.
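
      For instance (a sketch only -- handle_etag and handle_text are made-up
      names here, but the argspec strings are the documented ones), end-tag and
      text handlers are registered the same way:

      my $prs = HTML::Parser->new(
          api_version => 3,
          start_h => [ \&handle_stag, "tagname,attr" ],
          end_h   => [ \&handle_etag, "tagname" ],   # called for closing tags
          text_h  => [ \&handle_text, "dtext" ],     # text, with entities decoded
      );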

      hth...
      Tom
    • Gordon Stewart
      Message 2 of 3, Jun 3, 2000
        Thanks for the quick reply.

        I've tried "use HTML::Parser;" but it's not supported.
        I'm using http://netfirms.com

        but their http://netfirms.com/support/cgi.html page
        doesn't support it (as yet) - I'll e-mail them to ask
        which modules I can 'use' - it will take a few days
        or so for them to reply.

        But I've found a lib.com.pl file at:
        http://cgi.resourceindex.com/Programs_and_Scripts/Perl/Libraries_and_Modules/

        (near the bottom)

        Will that file work suitably?
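
        (For comparison, a module-free HTTP GET can be sketched with just
        IO::Socket::INET, which ships with Perl -- a rough sketch only, with no
        redirect or error handling:)

        #!/usr/bin/perl
        use strict;
        use IO::Socket::INET;

        my $host = "www.somewhere.com";
        my $sock = IO::Socket::INET->new(PeerAddr => $host, PeerPort => 80)
            or die "can't connect: $!";

        # Plain HTTP/1.0 request; the server closes the connection when done.
        print $sock "GET /index.html HTTP/1.0\r\nHost: $host\r\n\r\n";
        my $reply = join "", <$sock>;    # headers plus body
        close $sock;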

        PS - netfirms.com is the only site I know of that
        supports CGI/Perl and is free. Any others I should
        know of?


        Thanks.
        Gordon.
      • Nikhil Gupta
        Message 3 of 3, Jun 3, 2000
          One is:
          http://www.virtualave.net

          But they don't support the LWP module (or the GET func, as you could say)
          because it consumes their system resources, and they also don't support
          heavy-duty CGIs.
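
          For reference, the "GET func" presumably means LWP::Simple's get();
          assuming LWP were installed, a minimal fetch would look like this:

          #!/usr/bin/perl
          # Minimal page fetch with LWP::Simple (only works if LWP is installed).
          use LWP::Simple;

          my $html = get("http://www.somewhere.com/index.html");
          defined $html or die "couldn't fetch the page\n";
          print "got ", length($html), " bytes\n";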

          Can anyone tell me a host which supports CGI scripts (free) and has the
          LWP module installed (or any other module I can use to retrieve web pages
          from other sites)? I'll only need the host for my CGIs. Or, if any of you
          has his own server, could he please give me some free webspace?

          Thanks in advance.

          Reply ASAP.
          Bye!

          -=:| Nikhil Gupta |:=-
          Don't forget to visit [ India's Top Sites ] at :
          http://itplanet.virtualave.net