Hi people,
I am hoping that someone can answer a question for me,
or tell me there is no answer, in which case I will at least know.
I am working with a CGI program on Unix, and when traffic
is heavy I have problems with exceeding the system's open-file
limit. Unix opens the file, then says "oops! I have exceeded
my limit!" and drops it, effectively truncating the file. These are
simple text files...
I am opening with +<, but even then, good-bye data.
Is there any way in Perl to protect text files that are
opened for writing, so that the data is not lost when the file
limit is reached, without using flock()?
I had hoped that by changing from
open(F, ">$data") or die "can't open file : $data $!\n";
to
open(F, "+<$data") or die "can't open file : $data $!\n";
and then seeking and truncating, I would solve the data-loss
problem. There is obviously (I think) something
that I am missing here. I have tried everything I can
think of or find, all to no avail.
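For what it's worth, here is a minimal, self-contained sketch of that seek-and-truncate cycle; the file name and the counter edit are invented for illustration, not taken from the actual program:

```perl
#!/usr/bin/perl -w
use strict;

my $data = "demo.txt";   # hypothetical path for this sketch

# Seed the file so the example is self-contained.
open(F, ">$data") or die "can't create $data: $!\n";
print F "count: 1\n";
close(F);

# The update cycle: open read/write without clobbering.  If the
# process is out of file descriptors, open() fails *here* and the
# existing contents are untouched -- unlike ">", which truncates
# the file at open time, before any failure can be detected.
open(F, "+<$data") or die "can't open file : $data $!\n";
my @lines = <F>;                                    # read current contents
s/count: (\d+)/"count: " . ($1 + 1)/e for @lines;   # sample in-place edit
seek(F, 0, 0)        or die "seek: $!\n";
print F @lines       or die "write: $!\n";
truncate(F, tell(F)) or die "truncate: $!\n";       # trim any leftover tail
close(F)             or die "close: $!\n";
```

The point of the "+<" mode is exactly what you describe: a failed open leaves the old data intact, whereas ">" destroys it on the spot. Checking the return value of print, truncate, and close (not just open) should also surface errors that would otherwise look like silent data loss.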
Any advice would be appreciated.
PS: It's Unix SunOS 5.6; I have a site hosted there.
Perl's version is 5.004_04, built for sun4-solaris.
I've run several flock() programs I made to check and see if it locked, but it
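For reference, a minimal flock() check along the lines described might look like this; the file name is invented for the sketch, and Fcntl should export the LOCK_* constants on 5.004:

```perl
#!/usr/bin/perl -w
use strict;
use Fcntl qw(:flock);   # LOCK_EX, LOCK_UN

my $data = "demo.lock.txt";   # hypothetical path for this sketch

open(F, ">$data")  or die "can't open $data: $!\n";
flock(F, LOCK_EX)  or die "can't lock $data: $!\n";  # exclusive lock
print F "locked write\n";                            # write while locked
flock(F, LOCK_UN);                                   # release the lock
close(F)           or die "close: $!\n";
```

Note that flock() only coordinates writers with each other; it does nothing about the descriptor limit itself, which is why a failed open with ">" still clobbers the file whether or not locking is in use.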