
sybperl-l Archive


From: Stephen dot Sprague at morganstanley dot com
Subject: blk_rowxfer and leakage
Date: Sep 28 2002 9:01PM

Has anybody experienced severe memory leakage when using the
blk_rowxfer routine? It is particularly bad when the row contains
image data.

I have a test script that copies data from one Sybase table
(containing one image column) and bulk inserts it into another table.

It clearly shows that with the blk_rowxfer call commented out, the
memory size tops out (in my case at 43MB). With it reinstated, the
process just keeps ballooning until I kill it. To me, that pretty much
isolates the problem to that routine. My guess is that something
inside it is not deallocating memory it should.
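
(To watch the memory size from inside the script rather than from top,
a helper along the lines of the sketch below can be dropped in. It is
only a sketch and it is Linux-specific, since it just reads the VmSize
line out of /proc; the call site shown in the comment is a suggestion.)

    # Rough Linux-only helper: report the current process size by
    # reading the VmSize line from /proc/$$/status.
    sub vmsize {
        open( my $fh, '<', "/proc/$$/status" ) or return 'unknown';
        while ( my $line = <$fh> ) {
            return $1 if $line =~ /^VmSize:\s*(\d+\s*kB)/;
        }
        return 'unknown';
    }

    # e.g. inside the fetch callback:
    #   print "ROWNUM=$ROWNUM VmSize=", vmsize(), "\n" unless $ROWNUM % 100;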

There is a routine, blk_drop, for deallocating bulk-copy data
structures, but according to the CTlib doc it should be the last blk
routine called, so it's not as if I can call it between blk_rowxfer
calls.
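
(One experiment I can think of, sketched below using the same
variables as the script further down, is to finish the bulk operation
completely every batch and re-initialize it, instead of just batching,
and see whether that releases whatever is piling up. I haven't tried
this, so treat it as a sketch only.)

    # Sketch of an experiment: instead of blk_done(CS_BLK_BATCH, ...),
    # close the bulk operation down every $BATCHSIZE rows and start a
    # fresh one on the same connection.
    if ( $ROWNUM % $BATCHSIZE == 0 ) {
        my $na;
        $TARGET_DBH->blk_done( CS_BLK_ALL, $na );
        my $rc = $TARGET_DBH->blk_init( $target_table, $num_cols );
        die "re-blk_init failed" if $rc == CS_FAIL;
    }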

I'm open to suggestions. OC lib version is 12.0.


Thanks,
Steve
PS here's the script:


#!/ms/dist/perl5/bin/perl5.6
$|++;

    use strict;

    our $ROWNUM = 0;
    our $TARGET_DBH = 0;
    our $DB_ERROR = 0;
    our $BATCHSIZE = 10;

    my ($fs, $fu, $fp, $fd) = qw/MYSERVER MYUSER MYPASS MYDB/;
    my $from_sql="select word_doc from statements"; #image column

#target server (12.5.0.1)
    my ($ts, $tu, $tp, $td) = qw/MYSERVER MYUSER MYPASS MYDB/;
    my $target_table ="$td..xfer12"; #table with one image column
    my $num_cols = 1;


#modules for fetching the data
    use DBI;
    use DBD::Sybase;

#BLK routines for bulk inserting the data
    use Sybase::CTlib;


#connect to 'from' server
    my $dsn = "dbi:Sybase:server=$fs;database=$fd";
    my $from = DBI->connect( $dsn, $fu, $fp, { RaiseError => 0, PrintError => 1 } );

#set error handler
    $from->{syb_err_handler} = \&errHandler;

#set max blob size
    $from->{LongReadLen} = 15_000_000;




#connect to 'target' server with a bulk-copy-enabled login
#("sjs" is just the application name)
    $TARGET_DBH = Sybase::CTlib->new( $tu, $tp, $ts, "sjs",
                                      {CON_PROPS => {CS_BULK_LOGIN => CS_TRUE}} );
    $TARGET_DBH || die "login error:(User=$tu, Server=$ts)\n";

#specific error handlers for CTlib
    my $scb = ct_callback(CS_SERVERMSG_CB, \&errHandler);
    my $ccb = ct_callback(CS_CLIENTMSG_CB, \&msgHandler);

#set the maximum text/image size on the target connection
    my $imagesize = 15_000_000;
    $TARGET_DBH->ct_options(CS_SET, CS_OPT_TEXTSIZE, $imagesize, CS_INT_TYPE);

#initialize
    my $rc = $TARGET_DBH->blk_init( $target_table, $num_cols);
    die "fail on blk_init" if $rc == CS_FAIL;

#get the data - nsql calls getOneRow once per fetched row,
#passing the column values in @_
    my $cb = \&getOneRow;
    eval {
      $from->func($from_sql, [], $cb, 'nsql');
    };
    print "** eval: [$@] **\n" if $@;

    my $na = 0;
    $rc = $TARGET_DBH->blk_done( CS_BLK_ALL, $na );   # flush the final partial batch
    die "fail on blk_done" if $rc == CS_FAIL;

exit;



    sub getOneRow {
      $ROWNUM++;
      print "ROWNUM=$ROWNUM\n";

      # hand the fetched row (column values in @_) straight to the bulk copy
      my $rc = $TARGET_DBH->blk_rowxfer( \@_ );
      die "blk_rowxfer failed" if $rc == CS_FAIL;

      # send a batch to the server every $BATCHSIZE rows
      if ($ROWNUM % $BATCHSIZE == 0) {
          print "committing\n";
          my $na;
          $TARGET_DBH->blk_done( CS_BLK_BATCH, $na );
      }

      return 1;
    }

    # stand-in error/message handlers: record and echo whatever
    # Open Client passes in, then carry on
    sub errHandler { $DB_ERROR = "@_"; print STDERR "errHandler: @_\n"; return CS_SUCCEED; }
    sub msgHandler { $DB_ERROR = "@_"; print STDERR "msgHandler: @_\n"; return CS_SUCCEED; }
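
PPS: in case it helps anyone reproduce this, a stripped-down loop along
these lines (server, login and table names are made up) takes DBI out
of the picture entirely and just pushes the same synthetic image row
through blk_rowxfer:

    #!/ms/dist/perl5/bin/perl5.6
    # Stripped-down sketch: no DBI fetch, just one synthetic row pushed
    # through blk_rowxfer repeatedly. Server, login and table names are
    # placeholders.
    use strict;
    use Sybase::CTlib;

    my $dbh = Sybase::CTlib->new( 'MYUSER', 'MYPASS', 'MYSERVER', 'sjs',
                                  {CON_PROPS => {CS_BULK_LOGIN => CS_TRUE}} );
    $dbh || die "login failed";

    $dbh->ct_options( CS_SET, CS_OPT_TEXTSIZE, 15_000_000, CS_INT_TYPE );

    die "blk_init failed"
        if $dbh->blk_init( 'MYDB..xfer12', 1 ) == CS_FAIL;

    my $na;
    my @row = ( 'x' x 1_000_000 );      # fake 1MB "image" value
    for my $i ( 1 .. 1000 ) {
        die "blk_rowxfer failed" if $dbh->blk_rowxfer( \@row ) == CS_FAIL;
        $dbh->blk_done( CS_BLK_BATCH, $na ) if $i % 10 == 0;
    }
    $dbh->blk_done( CS_BLK_ALL, $na );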