
Re: [Clip] Copying files via http using NoteTab

  • Art Kocsis
    Message 1 of 7, Sep 22, 2013
      At 9/21/2013 09:51 AM, you wrote:
      I host some text files on my web page, and update them regularly. I would like to be able to copy from the web page to the appropriate folder on the user's computer automatically. No user GUI - just a stealthy download. I will take care of backing up the old file, etc. From my reading of the help files, this is not possible using clips, other than implementing some sort of an FTP downloader. These are simply short text files, although there may also be some zip files to download. If this can't be done using clips, how would you go about using NoteTab and maybe some other small program to download a file from a known site?

      http://gnuwin32.sourceforge.net/packages/wget.htm
      http://curl.haxx.se/

      Trivial task:

      ^!Set %wget%=^$GetShort("G:\Leech Apps\WGet\wget.exe")$
      ^!Set %url%="http://recipetools.gotdns.com/files/prodnames.txt"
      ^!Set %s%=^$GetOutput(^%wget% -O - "^%url%")$  [DLs to clip variable]

      ^!DOS "^%wget%" -O - "^%url%"                  [DLs to Std Out]
      ^!DOS "^%wget%" "^%url%"                       [DLs to current folder as prodnames.txt]
      ^!DOS "^%wget%" -O Newname.txt "^%url%"        [DLs to current folder as Newname.txt]
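For anyone without wget on the PATH, the "run a command and capture its output" idea behind ^$GetOutput can be sketched with Python's standard library; the helper name below is made up for illustration, not part of NoteTab or wget:

```python
# Rough Python analogue of ^$GetOutput: run a command, capture its stdout.
import subprocess
import sys

def get_output(cmd):
    """Run a command (list of arguments) and return its stdout as text."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# Demonstrated with the Python interpreter itself so the example runs
# anywhere; substitute ["wget", "-O", "-", url] if wget is installed.
print(get_output([sys.executable, "-c", "print('downloaded text')"]))
```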

      WGET is probably the best-known and most versatile DL utility available. It is an extremely powerful leech tool that can DL anything from a single file to an entire web site, filtering by an almost unlimited set of criteria and optionally converting the web links to local links. It is open source with a long history, is stable, is fast, and is in no danger of disappearing.

      From the WGET readme file:

         It can follow links in HTML pages and create local versions of remote
         web sites, fully recreating the directory structure of the original
         site.  This is sometimes referred to as "recursive downloading."
         While doing that, Wget respects the Robot Exclusion Standard
         (/robots.txt).  Wget can be instructed to convert the links in
         downloaded HTML files to the local files for offline viewing.

         Recursive downloading also works with FTP, where Wget can retrieve a
         hierarchy of directories and files.

         With both HTTP and FTP, Wget can check whether a remote file has
         changed on the server since the previous run, and only download the
         newer files.

         Wget has been designed for robustness over slow or unstable network
         connections; if a download fails due to a network problem, it will
         keep retrying until the whole file has been retrieved.  If the server
         supports regetting, it will instruct the server to continue the
         download from where it left off.
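The backup-then-replace workflow John describes (keep the old copy, fetch the new one) can also be done with no external downloader at all. This is a standard-library Python sketch; the function name and paths are placeholders, not anything from the thread:

```python
# Back up the existing file, then download a fresh copy in its place.
import shutil
import urllib.request
from pathlib import Path

def update_file(url, dest):
    """Fetch url into dest, keeping any previous copy as dest + '.bak'."""
    dest = Path(dest)
    if dest.exists():
        # Preserve the old version before overwriting it.
        shutil.copy2(dest, dest.with_name(dest.name + ".bak"))
    with urllib.request.urlopen(url) as response, open(dest, "wb") as out:
        shutil.copyfileobj(response, out)
```

The same call works for http://, https:// and file:// URLs, so it can be tested against a local file before pointing it at a live server.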

      Another popular and powerful alternative is cURL:

         cURL is a tool to transfer data from or to a server, using one of the
         supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP,
         IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS,
         TELNET and TFTP). The command is designed to work without user
         interaction.

         cURL offers a busload of useful tricks like proxy support, user
         authentication, FTP upload, HTTP post, SSL connections, cookies,
         file transfer resume, Metalink, and more.
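The "only download the newer files" behaviour quoted from the wget readme can be approximated over plain HTTP with an If-Modified-Since request header; the server answers 304 Not Modified when the file is unchanged. A standard-library Python sketch of the idea, not wget's actual implementation:

```python
# Download url to dest only if the server's copy is newer than ours.
import email.utils
import os
import urllib.error
import urllib.request

def fetch_if_newer(url, dest):
    """Return True if a download happened, False on 304 Not Modified."""
    request = urllib.request.Request(url)
    if os.path.exists(dest):
        # Tell the server how old our local copy is.
        stamp = email.utils.formatdate(os.path.getmtime(dest), usegmt=True)
        request.add_header("If-Modified-Since", stamp)
    try:
        with urllib.request.urlopen(request) as response:
            with open(dest, "wb") as out:
                out.write(response.read())
        return True
    except urllib.error.HTTPError as err:
        if err.code == 304:  # unchanged on the server
            return False
        raise
```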

      Art
    • Larry Hamilton
      Message 2 of 7, Sep 23, 2013
        FTP.exe is included in all versions of Windows since Win95, I think.

        I have a NoteTab OTL file that you can download from here: http://www.kairoscomputers.com/notetab/FTP_Notes.php.

        Or you can just review it online.

        Eric has good support for basic FTP in the FTP library that comes with NoteTab. At least I think it is still one of the default libraries.

        FTP is good because it is free and requires no download of another program.

        Wget and curl are also free and work with NoteTab.

        I used to use ftp.exe at work all the time, until they went to a secure ftp server, and the basic ftp.exe does not support sftp. I am not sure about the versions in Win7 and newer. We use FileZilla since it is free. I have not looked into its command-line options for a couple of years, so it may not be any better than it was back then. It would be nice if it had all the bells and whistles of the M$ ftp.exe for easier scripting.

        ~ Larry


        On Sat, Sep 21, 2013 at 12:51 PM, John Shotsky <jshotsky@...> wrote:


        I host some text files on my web page, and update them regularly. I would like to be able to copy from the web page to the appropriate folder on the user's computer automatically. No user GUI - just a stealthy download. I will take care of backing up the old file, etc. From my reading of the help files, this is not possible using clips, other than implementing some sort of an FTP downloader. These are simply short text files, although there may also be some zip files to download. If this can't be done using clips, how would you go about using NoteTab and maybe some other small program to download a file from a known site?


        One file to be downloaded:

        http://recipetools.gotdns.com/files/prodnames.txt


        Thanks,
        John




