  • Scott_Knight@ssmhc.com
    Message 1 of 1 , Jan 5, 2001
      I am writing some Perl scripts that need to open large files that are
      being updated every minute by another continually running process. I
      actually only need a relatively small portion of the data (perhaps 100k
      of 2M), but it can take a while to retrieve. Is there any danger of
      losing data if I have one of those files open (via a filehandle created
      with Perl's "open" function) when the other process goes to update that
      file? I will never need data at the end of the file (it's in there by
      date, and I'll never need today's data), so I don't need to seek() to
      the end of the file or anything like that.
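      The approach described above can be sketched roughly as follows. This is
      a minimal illustration, not the poster's actual script: it assumes each
      record begins with an ISO-8601 date field, and the file name, cutoff
      value, and the `read_before` helper are all hypothetical. The filehandle
      is opened read-only, and reading stops before the cutoff date, so the
      tail that the other process may be appending to is never touched.

```perl
use strict;
use warnings;

# Hypothetical helper: read records older than $cutoff from a file that
# another process may be appending to. Assumes each line starts with an
# ISO-8601 date (e.g. "2001-01-04 ..."), so records are in date order.
sub read_before {
    my ($path, $cutoff) = @_;
    open my $fh, '<', $path or die "Cannot open $path: $!";
    my @records;
    while (my $line = <$fh>) {
        my ($date) = split ' ', $line;      # first whitespace-separated field
        last if defined $date && $date ge $cutoff;   # stop before today's data
        push @records, $line;
    }
    close $fh;                               # release the handle promptly
    return \@records;
}
```

      On typical Unix filesystems, a read-only open like this does not block
      or corrupt a writer that is appending; the worst case is simply not
      seeing the newest lines, which is exactly the behavior wanted here.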
      Thanks...

      --
      Scott Knight, Network Analyst - SSM Health Care, Information Center
      email: scott_knight@... + phone: 314.644.7344 + fax: 314.647.1037
      "Dad, when you come home with only shattered pieces of your dreams, your
      little one can mend them like new with two magic words - 'Hi Dad!'"
      - Alan Beck in "Fathers and Sons" -