From: collardp at rabo-bank dot com
Subject: RE: How can I get the speed I need?
Date: Jul 20 1999 12:35PM
how about select into?
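Roughly along these lines (table and column names here are made up, since the real schemas aren't shown in the thread). Note that select into is minimally logged, so the target database needs the 'select into/bulkcopy' option enabled via sp_dboption:

```sql
-- Hypothetical names: sourcedb..old_table and targetdb..new_table
-- stand in for the real tables, which aren't given in the post.
select
    new_col1 = convert(varchar(30), o.old_col1),
    new_col2 = o.old_col2 * 100
into   targetdb..new_table
from   sourcedb..old_table o
```

If the target tables already exist, the equivalent is insert ... select, which is fully logged and correspondingly slower.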
Peter Collard (x3863)
Senior Sybase DBA
The views expressed in this correspondence are those of the author and
do not necessarily represent those of Rabobank International
From: [mailto:owner-SYBPERL-L@listproc.net] On Behalf Of AGHZHONG@aol.com
Sent: Tuesday, 20 July 1999 13:18
To: SybPerl Discussion List
Subject: How can I get the speed I need?
I need to move a massive amount of data (7 million rows) from one database to
another. The two databases have different tables and different columns, so I
cannot do a straightforward bcp in.
The only approved method right now is to bcp out from one database, read
the data file row by row, and then insert into various tables in the other
database. This has to be done as fast as possible.
For a ballpark estimate of how long the process will take, I tried a simple
case. A 2000-row data file is read row by row, converted column by column, and
finally inserted into a (one for now) table using the SybTools (built on top of
Sybase::CTlib) ExecSql function call, one row at a time. The result is very
disappointing: I can only insert about 20 rows a second. (Most of the time is
indeed spent on insertion, not on the string manipulations at all.)
Then I tried grouping a batch (2000 rows) of records together and wrapping the
batch with "begin transaction" and "commit transaction" using the same
function call. The result is slightly better: I can now insert
about 30 rows a second. This is still too slow for what I need to do.
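For reference, the transaction wrapping described above can be sketched as a small pure-Perl helper (make_batch is my name for it, not anything from SybTools); the resulting string would be handed to ExecSql once per batch, so the server commits once per 2000 rows instead of once per row:

```perl
use strict;

# Wrap a group of INSERT statements in a single transaction so the
# server logs one commit per batch rather than one per row.
sub make_batch {
    my @inserts = @_;
    return join("\n",
                "begin transaction",
                @inserts,
                "commit transaction");
}

# Illustrative rows only -- the real columns aren't shown in the post.
my @rows = (
    "insert into t (a, b) values (1, 'x')",
    "insert into t (a, b) values (2, 'y')",
);
my $sql = make_batch(@rows);
print $sql, "\n";
```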
The next thing I want to try is to build a bcp-able array in memory as I read
the input file and bcp the array into the table in batches using the BCP tools
that come with Sybperl. I don't know whether this will be any faster either, as
BCP is built on top of DBlib and is probably not much faster than the CTlib calls.
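A sketch of that plan using Sybase::BCP, assuming its documented config interface (INPUT, OUTPUT, SEPARATOR, BATCH_SIZE, CALLBACK); the server, login, and table names are placeholders, and the CALLBACK is where the column-by-column conversion would go, which avoids building the intermediate array by hand:

```perl
use strict;
use Sybase::BCP;

# Placeholders -- real server/login/table names are not in the thread.
my $bcp = new Sybase::BCP 'sa', 'secret', 'SYBASE';

$bcp->config(
    INPUT      => 'rows.bcp',               # file produced by bcp out
    OUTPUT     => 'targetdb.dbo.new_table',
    SEPARATOR  => '|',
    BATCH_SIZE => 1000,                     # commit every 1000 rows
    # CALLBACK receives a reference to the array of fields for each
    # row; return the (possibly modified) reference, or undef to skip.
    CALLBACK   => sub {
        my $row = shift;
        # ...transform @$row in place here...
        return $row;
    },
);
$bcp->run;
```

Whether DBlib underneath matters less here than the fact that bulk-copy inserts bypass normal per-row logging, so this path is usually much faster than repeated language-level inserts; but that is worth measuring rather than assuming.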
Any suggestions on how I can speed this up? If I write a C program that targets
exactly what I need to do, how much faster can I go? Does anyone have any
experience with that?