
RE: Trying to get File and Directory info off of external server quickly

  • Goddard Lee
    Message 1 of 16 , Aug 2, 2005
      > From: Philip M. Gollucci [mailto:pgollucci@...]

      > I didn't write this, but ...[it]... makes us millions.

      I trust "you" are putting thousands into perl, mod_perl and other good
      things, then ;)
    • Torsten Foertsch
      Message 2 of 16 , Aug 2, 2005
        On Monday 01 August 2005 23:12, Boysenberry Payne wrote:
        > Hello All,
        >
        > I've got a two-server platform: one is a static server that serves the
        > files and runs the mysql server, and the other runs mod_perl. I'm
        > trying to figure out the fastest way to get info on directories and
        > files from the static server to the mod_perl server. Right now I'm
        > using Net::FTP, which is really slow, especially when there are a lot
        > of files. Unfortunately, I need to check the file info quite
        > frequently. I was wondering if anyone knew the fastest way to get this
        > info: LDAP, SSH, etc.?

        mod_dav may be an option.

        Torsten
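
With mod_dav the static server's tree becomes browsable over HTTP, so a single PROPFIND request with `Depth: 1` returns names and sizes for a whole directory at once instead of an FTP round trip per file. A minimal sketch of the static server's httpd.conf, with the location and lock-file path as hypothetical placeholders:

```apache
# Load mod_dav and its filesystem provider (Apache 2.x)
LoadModule dav_module    modules/mod_dav.so
LoadModule dav_fs_module modules/mod_dav_fs.so

# Lock database required by mod_dav_fs (path is a placeholder)
DavLockDB /var/lock/apache/DavLock

<Location /files>
    Dav On
    # Read-only access is enough for listings: allow only the
    # methods needed to browse, deny everything else
    <LimitExcept GET OPTIONS PROPFIND>
        Order allow,deny
        Deny from all
    </LimitExcept>
</Location>
```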
      • Boysenberry Payne
        Message 3 of 16 , Aug 2, 2005
          Thank You Everyone,

          I think now that I know I can use $ftp->ls( "-lR" ), which I couldn't
          find anywhere in the Net::FTP docs or other O'Reilly books I have, I
          can stick to Net::FTP without it being slow. What was causing my
          script to take so long was the multiple $ftp->cwd( $directory ),
          $ftp->ls() and $ftp->dir( $directory . $file ) calls for each
          directory in my directory loop.

          Now I use one cwd and ls("-lR") from my public html area and then
          process the returned array, which is a lot faster. It would be nice
          to be able to specify the directory as well as the "-lR" without
          using cwd( $directory ); does anyone know how to do it?

          Thanks for the tips on making my code more efficient too.

          Boysenberry
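
The single-listing approach described above can be sketched as follows. The parser makes the usual `ls -l` column assumptions (size in the fifth field, name in the ninth); the sample data stands in for a live listing, which would come from one `$ftp->ls('-lR')` call after a single `cwd()`.

```perl
use strict;
use warnings;

# Parse "ls -lR" output into [directory, name, size] triples, skipping
# zero-size entries. In the live script the listing would come from a
# single call after one cwd():
#     my @listing = $ftp->ls('-lR');
sub parse_lslr {
    my @listing = @_;
    my $dir = '';
    my @files;
    for my $line (@listing) {
        if ($line =~ /^(.+):$/) { $dir = $1; next }   # directory header
        next unless $line =~ /^-/;                    # plain files only
        my ($size, $name) = (split ' ', $line, 9)[4, 8];
        next unless $size;                            # drop zero-size files
        push @files, [$dir, $name, $size];
    }
    return @files;
}

# Sample data in the shape "ls -lR" produces.
my @sample = (
    'images:',
    'total 12',
    '-rw-r--r-- 1 www www 4096 Aug  1  2005 logo.gif',
    '-rw-r--r-- 1 www www    0 Aug  1  2005 empty.gif',
    '',
    'images/icons:',
    'total 4',
    '-rw-r--r-- 1 www www 1024 Aug  1  2005 home.png',
);
print join("\t", @$_), "\n" for parse_lslr(@sample);
```

As for skipping cwd() entirely: Net::FTP passes the ls() argument string straight to the server, so `$ftp->ls('-lR ' . $directory)` may work, but only on servers whose ftpd hands options through to ls; the FTP protocol itself does not guarantee it.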

          This message contains information that is confidential
          and proprietary to Humaniteque and / or its affiliates.
          It is intended only for the recipient named and for
          the express purpose(s) described therein.
          Any other use is prohibited.

          http://www.habitatlife.com
          The World's Best Site Builder
          On Aug 1, 2005, at 6:28 PM, Randy Kobes wrote:

          > On Mon, 1 Aug 2005, Boysenberry Payne wrote:
          >
          >> I'm not sure if HEAD would work.
          >> Basically, I'm trying to read a directory's files.
          >> After I confirm a file exists and doesn't have zero
          >> size I check that it has the appropriate extension
          >> for the directory then I add the directory address,
          >> file name and extension to a table in our database.
          >
          > Can you get someone on the remote server to do a
          > cd top_level_directory
          > ls -lR > ls-lR # or find -fls find-ls
          > gzip ls-lR # or gzip find-ls
          > periodically, and then you can grab and parse ls-lR.gz or find-ls.gz?
          >
          > --
          > best regards,
          > randy kobes
          >
          >
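
Randy's pre-built ls-lR.gz could then be consumed on the mod_perl side roughly like this. The download step itself (e.g. LWP::Simple's mirror() against a hypothetical URL) is left out so the sketch stays self-contained; a gzip round-trip stands in for the fetched file.

```perl
use strict;
use warnings;
use Compress::Zlib qw(memGzip memGunzip);

# Decompress a fetched ls-lR.gz in memory and split it into lines for
# the same parsing pass used on a live FTP listing.
sub gunzip_listing {
    my ($gz) = @_;
    my $text = memGunzip($gz);
    die "not a gzip stream\n" unless defined $text;
    return split /\n/, $text;
}

# Round-trip demo standing in for a real downloaded ls-lR.gz.
my $gz = memGzip("images:\ntotal 4\n-rw-r--r-- 1 www www 42 Aug  1  2005 a.gif\n");
my @lines = gunzip_listing($gz);
print scalar(@lines), " lines to parse\n";
```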