
Re: Size limit listing files in a directory with ls from cron but not command line

  • rolandkbs
    Message 1 of 5, Jan 14, 2011
      I suppose you have installed the coreutils. When you are logged in, your PATH finds /opt/bin/ls first, but cron uses /bin/ls, which has the 2GB limit.
      Simply change your script to use '/opt/bin/ls -l' instead of just 'ls -l'.
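
      As a sketch of what that change might look like in the script (the paths, recipient address, and the 'mail' command are placeholders for whatever your script actually uses):

          #!/bin/sh
          # Call the Optware coreutils ls explicitly, so the script behaves the
          # same from cron as from an interactive login (where /opt/bin is
          # already first in PATH).
          LS=/opt/bin/ls

          SRC=/path/to/pvr/files       # placeholder: source directory
          DST=/path/to/nas/target      # placeholder: target directory on the NAS

          {
              echo "Source:"
              "$LS" -l "$SRC"
              echo
              echo "Target:"
              "$LS" -l "$DST"
          } | mail -s "Copy report" you@example.com   # placeholder mail command/recipient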

      Regards
      Roland

      --- In nslu2-general@yahoogroups.com, "redgshost" <newsletters@...> wrote:
      >
      > This is a follow-on from a post I made on linuxquestions.org. It has evolved into what I think is an Unslung-specific question rather than the general Linux question I thought it was. The original posting is at this URL if you are interested.
      >
      > http://www.linuxquestions.org/questions/linux-general-1/partial-list-with-ls-l-in-bash-script-run-in-cron-but-full-list-run-from-command-line-851649/
      >
      > I am repeating some of what I said on the post above to make the problem clear.
      >
      > I have an Unslung 6.8 NSLU2 (slug) on which I have a script that copies large video files to a NAS from a Topfield TF5800 PVR connected to the slug. The script then lists the files in the source and target directories and emails the listing to me so that I can check the files are being copied across properly. The script is a simple bash script.
      >
      > The Topfield part works perfectly and the copy works too. In the script I list the files in the NAS directory using ls and dir, and it is the listing with ls that is giving problems.
      >
      > When I run the script on the slug from the command line, logged on as root, it works perfectly and emails me a complete, formatted list of the files in the target directory using either ls -l or dir -l. ls -l is better because dir -l produces lots of escape characters. However, if the script is run from cron, also as root, then dir -l still lists all of the files but ls -l produces a list that is size-limited. The biggest file I can get to show is 2137460736 bytes, which is probably on the slightly small side for a video file from my PVR. On linuxquestions.org a user called catkin pointed out the size limit (I thought it was displaying files randomly) and also observed that the cutoff appears to be at 2^31 bytes (2^31 = 2147483648): only files of that size or less are listed. After some testing with dir, echo *, ls -LS and ls -lSr it seems that ls is not working properly on an Unslung 6.8 slug when used from cron; the problem is the size limit described above. Catkin suggested that this may be related to isatty(). I have to admit I am beyond the limits of my knowledge with Linux here, so I am posting this as a bug report in case anyone gets round to fixing it, or can explain how I can get around it.
      >
      > Thank you
      >
      > red
      >
    • redgshost
      Message 2 of 5, Jan 16, 2011
        Thank you very much. This works perfectly.

        If you know, I'd be interested to learn why the /bin/ls utility is crippled in this way, but either way you have answered a question that has troubled me for a long time, and done so very quickly. Thank you very much.

        Red



        --- In nslu2-general@yahoogroups.com, "rolandkbs" <roland.krebs@...> wrote:
        >
        > I suppose you have installed the coreutils. When you are logged in, your PATH finds /opt/bin/ls first, but cron uses /bin/ls, which has the 2GB limit.
        > Simply change your script to use '/opt/bin/ls -l' instead of just 'ls -l'.
        >
        > Regards
        > Roland
        >
      • Carl Lowenstein
        Message 3 of 5, Jan 16, 2011
          On Sun, Jan 16, 2011 at 2:02 AM, redgshost <redgshost@...> wrote:
          >
          > Thank you very much. This works perfectly.
          >
          > If you know, I'd be interested to learn why the /bin/ls utility is crippled in this way, but either way you have answered a question that has troubled me for a long time, and done so very quickly. Thank you very much.
          >
          > Red
          >
          > --- In nslu2-general@yahoogroups.com, "rolandkbs" <roland.krebs@...> wrote:
          > >
          > > I suppose you have installed the coreutils. When you are logged in, your PATH finds /opt/bin/ls first, but cron uses /bin/ls, which has the 2GB limit.
          > > Simply change your script to use '/opt/bin/ls -l' instead of just 'ls -l'.
          > >

          I'm sure that the answer is that /bin/ls is just a link to busybox,
          and /opt/bin/ls is the real thing. Busybox is somewhat crippled to
          keep its size down.
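
          If you want to confirm that on the slug, something along these lines should show it (a sketch; the exact symlink target and version strings will vary with the install):

              # On Unslung, /bin/ls is normally a symlink into busybox, while
              # /opt/bin/ls is the GNU coreutils binary installed by Optware.
              ls -l /bin/ls /opt/bin/ls
              /opt/bin/ls --version | head -n 1   # coreutils ls reports "ls (GNU coreutils) ..."
              type ls                             # shows which ls the current PATH resolves to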

          carl
          --
          carl lowenstein  <clowenstein@...>
        • Mike Westerhof
          Message 4 of 5, Jan 16, 2011
            redgshost wrote:
            > Thank you very much. This works perfectly.
            >
            > If you know I'd be interested to know why the /bin/ls utility is crippled in this way

            One of the coolest tools in the embedded and small-system Linux space is
            busybox -- it combines in a single executable a lot of the basic Linux
            utilities. Busybox has had full support for files > 2GB in size for a
            long time, but not back at the time Linksys was creating the NSLU2. In
            order to maintain full compatibility with the Linksys firmware (and fit
            in the same small flash space), Unslung keeps some of those same basic
            Busybox limitations. Optware doesn't have to live in the onboard flash
            of the NSLU2, and is how you get around the limitations of the built-in
            utilities.
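
            A sketch of the PATH-based alternative to spelling out /opt/bin/ls everywhere (assuming the usual Optware layout under /opt/bin and /opt/sbin):

                #!/bin/sh
                # Put the Optware binaries ahead of the built-in busybox ones, so
                # 'ls' (and any other utility) resolves to the full-featured version
                # even when the script is started by cron.
                PATH=/opt/bin:/opt/sbin:$PATH
                export PATH

                ls -l /path/to/nas/target   # placeholder path; now runs /opt/bin/ls

            Setting PATH inside the script itself is the safest route, since not every cron daemon honours variable assignments placed in the crontab.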

            As the adage goes, the wonder is not in how *well* the bear dances, but
            rather that it dances at all.

            -Mike (mwester)