From: "Gilmore-Baldwin, John" <jBaldwin at desmoine dot gannett dot com>
Subject: RE: memory problems
Date: May 3 2002 2:16PM
Here are some tips that have helped me in similar situations:
1. I believe the way to empty a hash is ( %hash = (); ).
2. Emptying the hash doesn't release its storage, as far as I know. To free the hash's storage, I think you want ( undef %hash; ) -- not ( %hash = undef; ), which just leaves you with a one-entry hash. Even then, Perl generally reuses freed memory within the process rather than returning it to the OS.
3. If there's any possible way, I'd avoid loading a bunch of blob data items into memory and processing them all at once. I process one blob at a time.
4. Disconnecting from Sybase will probably only free the Sybase data structures. It won't free Perl's internal variables.
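To illustrate the difference between tips 1 and 2, here is a small self-contained sketch (plain Perl, no Sybase needed; the 1 KB strings just stand in for fetched text/blob values):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a hash holding some largish values, as if each were a fetched blob.
my %hash = map { $_ => 'x' x 1024 } 1 .. 100;

# Empty the hash: all keys and values are removed, but Perl keeps the
# hash's internal buckets allocated so they can be reused on the next fetch.
%hash = ();
print scalar(keys %hash), " keys after emptying\n";

# Free the hash's storage back to Perl's allocator. Note it is
# ( undef %hash; ), not ( %hash = undef; ) -- the latter warns about an
# odd number of elements and leaves a one-entry hash behind.
my %hash2 = map { $_ => 'x' x 1024 } 1 .. 100;
undef %hash2;
print scalar(keys %hash2), " keys after undef\n";
```

Either way the memory stays inside the perl process for reuse; reinitializing with ( %hash = ""; ) does neither, which is likely why it didn't help below.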
> From: JMiller@pressherald.com
> Reply To: SYBPERL-L@list.cren.net
> Sent: Friday, May 3, 2002 8:16 AM
> To: SybPerl Discussion List
> Subject: memory problems
> I have a sybperl (ver 2.06) script that chews up gobs of memory
> on my Solaris system. I am retrieving text fields from Sybase.
> I read each result into a hash and process it. My script will
> reach 250+ MB before it runs out of memory. This was
> exacerbated when I had to bump up the textsize on the query
> to 2048000. I have tried to reinitialize the hash ( %hash = ""; )
> after each fetch, but it doesn't help.
> Any ideas how to reclaim/reuse this memory?
> One idea I had was to just disconnect/reconnect to Sybase
> before I retrieve each field - this would work for me - but
> HOW do I disconnect from Sybase without exiting the script?
> And will this release the memory?
> John J Miller