
sybperl-l Archive


From: "Gilmore-Baldwin, John" <jBaldwin at desmoine dot gannett dot com>
Subject: RE: memory problems
Date: May 3 2002 2:16PM

Here are some tips that have helped me with similar situations:

1. The way to empty a hash is (%hash = ();).
2. Emptying the hash doesn't shrink the process, as far as I know: perl keeps the freed memory for reuse rather than returning it to the OS. To also release the hash's internal storage back to perl, use (undef %hash;). Note that (%hash = undef;) is not the same thing - it actually assigns a one-element list to the hash.
3. If there's any possible way, I'd avoid loading up a bunch of blob data items into memory and processing them all at once. I process one blob at a time.
4. Disconnecting from Sybase will probably only free up the Sybase client data structures. This won't free up perl internal variables.
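A minimal sketch of the two idioms from points 1 and 2 (the hash contents here are made up purely for illustration):

```perl
use strict;
use warnings;

# Build a hash holding some largish values (made-up data).
my %hash = map { $_ => 'x' x 10_000 } 1 .. 100;

# Empty the hash: the keys and values are released back to
# perl's internal allocator, but the process size does not shrink.
%hash = ();

# Also release the hash's internal storage (bucket array, etc.).
# perl keeps this memory for reuse; it is not returned to the OS.
undef %hash;

# The hash is still a normal, usable hash afterwards.
$hash{status} = 'reusable';
```

Either way, the freed memory should get reused by later fetches instead of growing the process further.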

> ----------
> From:
> Reply To:
> Sent: 	Friday, May 3, 2002 8:16 AM
> To: 	SybPerl Discussion List
> Subject: 	memory problems
> Hi, 
> I have a sybperl (ver 2.06) script that chews up gobs of memory 
> on my Solaris system.  I am retrieving text fields from Sybase.
> I read each result into a hash and process it.  My script will
> reach 250+ MB before it runs out of memory.  This was
> exacerbated when I had to bump up the textsize on the query
> to 2048000.  I have tried to reinitialize the hash ( %hash = ""; ) 
> after each fetch but it doesn't help.
> Any ideas how to reclaim/reuse this memory?
> One idea I had was to just disconnect/reconnect to Sybase 
> before I retrieve each field - this would work for me - but 
> HOW do I disconnect from Sybase without exiting the script?
> And will this release the memory?
> Thanks,
> John J Miller
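For the disconnect/reconnect question, here is a sketch assuming the script uses Sybase::DBlib (the server name and credentials are placeholders, and this of course needs a live Sybase server to run). Per tip 4 above, this releases the client-library structures but not perl's own memory:

```perl
use strict;
use warnings;
use Sybase::DBlib;

# Placeholder credentials for illustration only.
my ($user, $pass, $server) = ('sa', 'secret', 'SYBASE');

my $dbh = new Sybase::DBlib $user, $pass, $server;

# ... fetch and process one text field ...

# Close this connection without exiting the script.
$dbh->dbclose;

# Reconnect before retrieving the next field.
$dbh = new Sybase::DBlib $user, $pass, $server;
```

Letting `$dbh` go out of scope (or assigning `undef` to it) also closes the connection, since sybperl cleans up the handle when its last reference disappears.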