
sybperl-l Archive


From: Michael Peppler <mpeppler at peppler dot org>
Subject: Re: out of memory -- CTlib
Date: Feb 8 2001 4:14PM

Cox, Mark writes:
 > Any suggestions or help would be welcome.
 > I am using ct_lib to select large look-up tables from the database for feed
 > processing.  I tend to assign all of the info in the database into a hash
 > keyed on a specific value in the database and then read the file line by
 > line, using the key as a quick lookup. What I am running into, however, is that
 > if I try to read in more than 100,000 records or so I get an 'Out of
 > Memory!' error.  Is there a more efficient way to read a large number of
 > records into a hash table?  Any help or suggestions would be most welcome.

100,000 records in a hash table is quite a lot. Have you checked with
ps or top to see how much memory the process is actually using? Do you
have a limit/ulimit set?
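For example, something along these lines (a sketch; the ps output flags below are the usual ones on Linux, and $$ stands in for the pid of your perl process):

```shell
# Show the current resource limits -- an "Out of Memory!" at around
# 100,000 rows can simply be a low data-segment or address-space limit.
ulimit -a

# Show resident (RSS) and virtual (VSZ) memory for a process, in KB.
# $$ is this shell's own pid; substitute the pid of your perl process.
ps -o pid,rss,vsz,cmd -p $$
```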

I don't see any obvious problems with your code.
 > 			if (!($y % 10000) && ($y !=0)) {
 > 				print "$y Records processed at " , `date`;
 > 			}

You can use scalar(localtime) instead of `date`, which avoids a
fork()/exec() for every call and should speed things up.
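A minimal sketch of that change (the 10000-row progress report is from your code; the foreach loop here is just a stand-in for your ct_fetch loop):

```perl
use strict;
use warnings;

my $y = 0;
foreach my $row (1 .. 25000) {      # stand-in for your ct_fetch loop
    ++$y;
    if (!($y % 10000)) {
        # scalar(localtime) formats the timestamp in-process, so there
        # is no fork()/exec() of /bin/date on every report.
        print "$y records processed at ", scalar(localtime), "\n";
    }
}
```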

Michael Peppler - Data Migrations Inc. - -
International Sybase User Group -
Sybase on Linux mailing list: