I am populating an array from disk. Each value is a single byte.
A file of around 4 MB appears to gobble about 60 MB of RAM.
I have to keep the whole array in RAM for sequential and random
access during later processing.
This may not sound serious but much larger files are possible.
Using ActiveState Perl 5.8.4 on a 686.
Does anyone know of a method of minimising an array's RAM usage
without having to cache the array/file in and out during processing?
Here is a cut-down version of the sub I use to load the file.
I show a Tk progress bar while loading; that is why I use read to
load the file in blocks, so that the loop can break off to refresh the bar.
# Get a binary file into an array, showing a progress bar while loading.
# Arguments (names are placeholders for the cut-down version):
#   $path      - the file path
#   $arrayRef  - reference to the array the bytes should be loaded into
#   $label     - the text to be displayed in the progress bar
#   $blockSize - the block size in bytes; the larger the block size the
#                quicker it will load, but the steps of the progress bar
#                will be coarser
# Returns 1 if the file was opened OK.
sub LoadFileToArray {
    my ($path, $arrayRef, $label, $blockSize) = @_;
    my $blockCount = 0;  # count of blocks read so far; drives the progress bar
    my $temp;
    open INPUT, '<' . $path or return 0;
    binmode INPUT;
    &MakeProgressPopUp($label);  # this just instantiates the progress bar
    while (read(INPUT, $temp, $blockSize) > 0) {
        push @$arrayRef, unpack 'C*', $temp;  # one array element per byte
        $blockCount++;
        # update the progress bar
        # ($barWidth, $fileSize, $progress, $prog are globals set up elsewhere)
        $progress = $barWidth * $blockCount / $fileSize;
        $prog->update();  # refresh the progress-bar widget
    }
    close INPUT;
    $tl->withdraw;  # close the progress bar's parent dialog
    return 1;
}
The subsequent processing of the array is too complex to include here,
but it breaks down to simple assignments and tests. Since the array is
fully populated there would, I assume, be no advantage to loading the
file into a hash.