sybperl-l Archive


From: Pochet Frédéric <Frederic dot Pochet at cockerill-sambre dot com>
Subject: RE: How can I get the speed I need?
Date: Jul 22 1999 12:12PM

Two other possibilities:
	1.A) bcp out the data to flat files, merge/split them in Perl to
produce new flat files, then bcp those flat files into the target database
(see the first sketch below),
	or 1.B) select into a work table in tempdb, bcp out, then bcp into
the target db.
	2) Simply create a stored proc that does the insert and fire it for
each data row. That is 10 to 20 times faster than a plain insert statement,
but still slower than bcp (with no indexes on the target tables); see the
second sketch below.
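
A rough sketch of the merge/split step in 1.A, assuming '|'-separated
bcp-out files and a made-up column layout (all file and table names here
are hypothetical):

    # read the bcp-out file and split each source row across two
    # target-shaped flat files, ready to bcp in
    open(IN,   "source_tab.bcp")   || die "source: $!";
    open(OUT1, ">target_tab1.bcp") || die "target 1: $!";
    open(OUT2, ">target_tab2.bcp") || die "target 2: $!";
    while (<IN>) {
        chomp;
        my ($id, $name, $addr, $amount) = split /\|/;
        print OUT1 join('|', $id, $name, $addr), "\n";
        print OUT2 join('|', $id, $amount), "\n";
    }
    close IN; close OUT1; close OUT2;

And a sketch of option 2, firing a hypothetical insert proc once per row
with Sybase::CTlib (connection details made up):

    use Sybase::CTlib;

    # the proc would be something like:
    #   create proc ins_target @id int, @name varchar(40) as
    #       insert target_tab values (@id, @name)
    my $dbh = new Sybase::CTlib 'user', 'password', 'SERVER';
    while (<>) {
        chomp;
        my ($id, $name) = split /\|/;
        $dbh->ct_sql("exec ins_target $id, '$name'");
    }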

-----Original Message-----
From: AGHZHONG@aol.com [mailto:AGHZHONG@aol.com]
Sent: Tuesday, July 20, 1999 14:18
To: SybPerl Discussion List
Subject: How can I get the speed I need?


I need to move a massive amount of data (7 million rows) from one database
to another. The two databases have different tables and different columns,
so I cannot do a straightforward bcp in.

The only approved method right now is to bcp out from one database, read
the data file row by row, and then insert into various tables in the other
database. This has to be done as fast as possible.

To get a rough estimate of how long the process will take, I tried a simple
case: a 2000-row data file is read row by row, converted column by column,
and finally inserted into one table (for now) using the SybTools ExecSql
function (built on top of Sybase::CTlib), one row at a time. The result is
very disappointing: I can only insert about 20 rows a second. (Most of the
time is indeed spent on the inserts, not on the string manipulation.)

Then I tried to group a batch of 2000 rows together and wrap the batch in
"begin transaction" and "commit transaction" using the same function call.
The result is only slightly better: I can now insert about 30 rows a
second. This is still too slow for what I need to do.
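
In outline, the batched version looks like this (a minimal sketch using
Sybase::CTlib's ct_sql directly rather than our in-house ExecSql wrapper;
the connection details, separator, and table are made up):

    use Sybase::CTlib;

    my $dbh = new Sybase::CTlib 'user', 'password', 'SERVER';
    open(IN, "source_tab.bcp") || die $!;
    my $rows = 0;
    $dbh->ct_sql("begin transaction");
    while (<IN>) {
        chomp;
        my ($id, $name) = split /\|/;
        $dbh->ct_sql("insert target_tab values ($id, '$name')");
        if (++$rows % 2000 == 0) {      # commit every 2000 rows
            $dbh->ct_sql("commit transaction");
            $dbh->ct_sql("begin transaction");
        }
    }
    $dbh->ct_sql("commit transaction");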

The next thing I want to try is to build a bcp-able array in memory as I
read the input file and bcp the array into the table in batches using the
BCP module that comes with Sybperl. I don't know whether this can be any
faster either, as the BCP functions are built on top of DBlib and are
probably not much faster than the CTlib calls.
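
For the plain file-to-table case, the Sybase::BCP front end that ships with
sybperl would look roughly like this (a sketch based on its man page; the
server, file, and table names are made up):

    use Sybase::BCP;

    my $bcp = new Sybase::BCP 'user', 'password', 'SERVER';
    $bcp->config(INPUT      => 'source_tab.bcp',         # bcp-out flat file
                 OUTPUT     => 'targetdb.dbo.target_tab',
                 SEPARATOR  => '|',
                 BATCH_SIZE => 500);                      # rows per batch
    $bcp->run;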

Any suggestions on how I can speed this up? If I write a C program that
does exactly what I need, how much faster can I go? Does anyone have any
experience with that?

Thank you
Heather