
Re: [Clip] Copying files via http using NoteTab

  • Art Kocsis
    Sep 22, 2013
      At 9/21/2013 09:51 AM, you wrote:
      I host some text files on my web page, and update them regularly. I would like to be able to copy from the web page to the appropriate folder on the user's computer automatically. No user GUI - just a stealthy download. I will take care of backing up the old file, etc. From my reading of the help files, this is not possible using clips, other than implementing some sort of an FTP downloader. These are simply short text files, although there may also be some zip files to download. If this can't be done using clips, how would you go about using NoteTab and maybe some other small program to download a file from a known site?


      Trivial task:

      ^!Set %wget%=^$GetShort("G:\Leech Apps\WGet\wget.exe")$
      ^!Set %url%="http://recipetools.gotdns.com/files/prodnames.txt"
      ^!Set %s%=^$GetOutput(^%wget% -O - "^%url%")$  [DLs to clip variable]

      ^!DOS "^%wget%" -O - "^%url%"                  [DLs to Std Out]
      ^!DOS "^%wget%" "^%url%"                       [DLs to current folder as prodnames.txt]
      ^!DOS "^%wget%" -O Newname.txt "^%url%"        [DLs to current folder as Newname.txt]
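The three ^!DOS lines boil down to plain wget command lines. Here is a sketch of those same calls, exercised against a throwaway local HTTP server so it runs without touching the real site; it assumes wget and python3 are on the PATH, and the port and file names are placeholders:

```shell
# The same three wget invocations the clip runs, against a local server.
# Assumes wget and python3 on PATH; port/file names are placeholders.
set -e
work=$(mktemp -d)
cd "$work"
mkdir site
printf 'Product One\n' > site/prodnames.txt

# Serve the folder in the background, give it a moment to start.
python3 -m http.server 8731 --directory site >/dev/null 2>&1 &
srv=$!
sleep 1
url="http://127.0.0.1:8731/prodnames.txt"

wget -q -t 3 --retry-connrefused -O - "$url"   # to stdout (what ^$GetOutput()$ captures)
wget -q "$url"                                 # saved as prodnames.txt in the current folder
wget -q -O Newname.txt "$url"                  # saved under a chosen name

kill "$srv"
```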

      WGET is probably the best-known and most versatile DL utility available. It is an extremely powerful leech tool that can DL anything from a single file to an entire web site, filtering by an almost unlimited set of criteria and optionally converting the web links to local links. It is open source with a long history, is stable, is fast, and is in no danger of disappearing.

      From the WGET readme file:

         It can follow links in HTML pages and create local versions of remote
         web sites, fully recreating the directory structure of the original
         site.  This is sometimes referred to as "recursive downloading."
         While doing that, Wget respects the Robot Exclusion Standard
         (/robots.txt).  Wget can be instructed to convert the links in
         downloaded HTML files to the local files for offline viewing.

         Recursive downloading also works with FTP, where Wget can retrieve a
         hierarchy of directories and files.

         With both HTTP and FTP, Wget can check whether a remote file has
         changed on the server since the previous run, and only download the
         newer files.

         Wget has been designed for robustness over slow or unstable network
         connections; if a download fails due to a network problem, it will
         keep retrying until the whole file has been retrieved.  If the server
         supports regetting, it will instruct the server to continue the
         download from where it left off.
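The "only download the newer files" behaviour above is the -N (timestamping) flag. A small offline sketch, again assuming wget and python3 on PATH and using placeholder names:

```shell
# Timestamping (-N): fetch only when the server copy is newer than the
# local one. Sketch against a throwaway local server; names/port are
# placeholders, and wget/python3 are assumed to be on PATH.
set -e
work=$(mktemp -d)
cd "$work"
mkdir site
printf 'v1\n' > site/prodnames.txt

python3 -m http.server 8732 --directory site >/dev/null 2>&1 &
srv=$!
sleep 1
url="http://127.0.0.1:8732/prodnames.txt"

wget -q -t 3 --retry-connrefused -N "$url"  # first run: no local copy, downloads
wget -q -N "$url"                           # second run: local copy current, skipped
# wget -c "$url"   # would resume a partial download (server must support ranges)
kill "$srv"
```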

      Another popular and powerful alternative is cURL:

         cURL  is  a tool to transfer data from or to a server, using one of the
         supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS,  IMAP,
         TELNET and TFTP).  The command is designed to work without user
         interaction.

         cURL offers a busload of useful tricks like proxy support, user
         authentication, FTP upload, HTTP post, SSL connections, cookies,
         file transfer resume, Metalink, and more.
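For comparison, the cURL counterparts of the wget calls above; a sketch assuming curl is on PATH, using a file:// URL so it runs without a network (an actual clip would pass the real http:// address instead):

```shell
# curl counterparts: -s silences progress output, -o names the saved
# file. file:// is used here only so the sketch runs offline; swap in
# the real http:// URL in practice. Assumes curl is on PATH.
set -e
work=$(mktemp -d)
cd "$work"
printf 'Product One\n' > source.txt

curl -s "file://$work/source.txt"                   # to stdout
curl -s -o prodnames.txt "file://$work/source.txt"  # saved under a chosen name
```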
