
Re: [Clip] Download photos via links?

  • Alan C
    Jan 28, 2009

      If you have a Linux box, or Cygwin (with wget installed) on your
      Windows machine, then you can run my Perl script (in Cygwin).

      The Perl script reads a text file of URLs and then calls wget on
      each URL.

      Each downloaded file is essentially "as if mirrored", IOW yes, it
      keeps the long filename from the website URL (and wget sometimes
      creates folders under the current directory as well).

      dl.pl file_with_urls.txt

      Call it like that, i.e.: ScriptName file_with_urls.txt

      cat prints the script to STDOUT (the console screen). The script
      itself begins on the second line below; the first line is the cat
      command that I copied/pasted from my terminal/console.

      al@p3srv:~$ cat ~/bin/dl
      #!/usr/bin/perl -w
      use strict;

      # http://www.gnome.org/projects/garnome/
      # http://download.kde.org/download.php
      # Examples of base URLs that can be prepended to each line:
      # my $base = 'http://mondorescue.muskokamug.org/slackware/11.0/';
      # my $base = 'ftp://mirrors.kernel.org/fedora/core/5/i386/iso/';
      # my $base = 'ftp://mirrors.kernel.org/opensuse/distribution/SL-10.1/iso/';
      # my $base = 'http://slackbuilds.rlworkman.net/openoffice.org/';
      # my $base = 'http://www.slackware.com/~alien/slackbuilds/qemu/build/';
      # my $base = 'http://ahinea.com/tech/uni/';

      my $old = shift;
      open(OLD, "<", $old) or die "can't open $old: $!";
      my @lines = <OLD>;
      close(OLD) or die "can't close $old: $!";

      foreach my $line ( @lines ) {
          chomp $line;                  # strip the trailing newline
          next unless $line =~ /\S/;    # skip blank lines
      #   system("wget $base$line");    # prepend a base URL
          system("wget $line");
      #   system("wget --limit-rate=285K $line");  # throttle the rate
      #   system("md5sum $line");
      #   system("filmodsl $line");
      #   system("upgradepkg $line");
      #   system("removepkg $line");
      #   system("cp $line /home/al/mbg/$line");
      #   $line =~ s/\.tgz$//;
      #   system("ls $line");
      }
      # end of Perl script
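
      Since John's list pairs each link with the text to use as the photo
      name, here is a sketch of a variation that saves each file under a
      chosen name via wget's -O option. The tab-separated input format
      (URL, tab, filename) and the build_cmd helper are assumptions made
      here for illustration, not part of the script above.

      ```perl
      #!/usr/bin/perl
      use strict;
      use warnings;

      # Build the wget command for one input line of the (assumed)
      # form "URL<TAB>name". With a name present, wget -O saves the
      # file under that name; otherwise wget's default naming is used.
      sub build_cmd {
          my ($line) = @_;
          chomp $line;
          my ($url, $name) = split /\t/, $line, 2;
          return (defined $name && length $name)
              ? "wget -O '$name' '$url'"
              : "wget '$url'";
      }

      if (@ARGV) {
          my $file = shift;
          open(my $fh, '<', $file) or die "can't open $file: $!";
          while (my $line = <$fh>) {
              next unless $line =~ /\S/;   # skip blank lines
              system(build_cmd($line));
          }
          close($fh) or die "can't close $file: $!";
      }
      ```

      Call it the same way as before: ScriptName file_with_urls.txt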


      On Tue, Jan 27, 2009 at 7:40 PM, Don - HtmlFixIt.com <don@...> wrote:

      > John Shotsky wrote:
      > > I have several hundred (thousands, actually, but in discrete groups)
      > hyperlinks that go directly to photos on the web that I'd like
      > > to download.
      > >
      > > The Notetab file is simply a list of hyperlinks, where the text of the
      > hyperlink is to be used as the photo name.
      > >
      > > Is this something that can be done using NoteTab? If so, how would I go
      > about it?
      > >
      > > Thanks,
      > > John
      > >
      > It may not be the right forum, but clips is pretty sturdy ... maybe off
      > topic eventually as we talk about other programs, so I'll copy it over
      > there.
      > You can create a clip to do it, but I think you will find that other
      > tools are better.
      > If the links are on a page, you can make a local copy of the site and
      > links using HTTrack; it will actually make a complete archive with
      > relative links and clickable hyperlinks.
      > However I think even better is this:
      > http://www.ababasoft.com/utilityadvance/downloadurls.html
      > I recently used it to download a list of urls with great success.
