
sybperl-l Archive

From: "Khait, Yuri" <yurkhait at Mobility dot com>
Subject: RE: How can I get the speed I need?
Date: Jul 20 1999 1:20PM

How about splitting your bcp out file into multiple ASCII files and then
bcp'ing them in separately? bcp is still the fastest way to load large
volumes of data.

Regards,
Yuri.
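
A minimal sketch of that approach, assuming a pipe-separated bcp out
file and placeholder database, table, and login names (the real column
mapping goes where the comments indicate):

    #!/usr/bin/perl -w
    # Split a bcp out file into per-table ASCII files, then load each
    # one with the command-line bcp utility.
    use strict;

    my $sep = '|';
    open(IN, "source.bcp")   or die "source.bcp: $!";
    open(T1, "> table1.bcp") or die "table1.bcp: $!";
    open(T2, "> table2.bcp") or die "table2.bcp: $!";

    while (<IN>) {
        chomp;
        my @f = split(/\Q$sep\E/, $_, -1);
        # Rearrange / convert the columns for each target table here.
        print T1 join($sep, @f[0, 2, 3]), "\n";
        print T2 join($sep, @f[1, 4]), "\n";
    }
    close(IN); close(T1); close(T2);

    # Load each file separately; -b commits every 1000 rows.
    system("bcp targetdb..table1 in table1.bcp -c -t '$sep' -b 1000 -Uuser -Ppwd -SSERVER");
    system("bcp targetdb..table2 in table2.bcp -c -t '$sep' -b 1000 -Uuser -Ppwd -SSERVER");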

> -----Original Message-----
> From:	AGHZHONG@aol.com [SMTP:AGHZHONG@aol.com]
> Sent:	Tuesday, July 20, 1999 8:18 AM
> To:	SybPerl Discussion List
> Subject:	How can I get the speed I need?
> 
> I need to move a massive amount of data (7 million rows) from one
> database to another.
> The two databases have different tables and different columns, so I
> cannot do a straightforward bcp in.
> 
> The only approved method right now is to bcp out from one database,
> read the data file row by row, and then insert into various tables in
> the other database.
> This has to be done as fast as possible.
> 
> For a rough estimate of how long the process will take, I tried a
> simple case.  A 2000-row data file is read row by row, converted
> column by column, and finally inserted into a single table (one for
> now) using the ExecSql function from SybTools (built on top of
> Sybase::CTlib), one row at a time.  The result is very disappointing:
> I can only insert about 20 rows a second.  (Most of the time is indeed
> spent on the insertion, not on the string manipulation.)
> 
> I then tried grouping a batch of records (2000 rows) together and
> wrapping the batch in "begin transaction" and "commit transaction"
> using the same function call.  The result is slightly better: I can
> now insert about 30 rows a second.  This is still too slow for what I
> need to do.
> 
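For reference, the kind of loop being described, with placeholder login
details and table name, and with plain Sybase::CTlib ct_sql() standing
in for SybTools' ExecSql, would look roughly like this:

    use Sybase::CTlib;

    # Placeholder credentials; error handling omitted for brevity.
    my $dbh = new Sybase::CTlib 'user', 'password', 'SERVER';
    open(DATA_FILE, "data.out") or die "data.out: $!";

    my $batch_size = 2000;
    my $count      = 0;

    $dbh->ct_sql("begin transaction");
    while (<DATA_FILE>) {
        chomp;
        my @col = split /\|/;
        # One SQL insert per row -- this is the slow part.
        $dbh->ct_sql("insert into target_table values ('$col[0]', '$col[1]', $col[2])");
        if (++$count % $batch_size == 0) {
            $dbh->ct_sql("commit transaction");
            $dbh->ct_sql("begin transaction");
        }
    }
    $dbh->ct_sql("commit transaction");

Wrapping the inserts in a transaction mainly reduces the per-commit log
flushes; each row is still parsed and sent to the server individually,
which is why the gain is modest.
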
> The next thing I want to try is to build a bcp-able array in memory as
> I read the input file, and to bcp the array into the table in batches
> using the BCP routines that come with Sybperl.  I don't know whether
> this will be any faster, as the BCP functions are built on top of
> DBlib and are probably not much faster than the CTlib calls.
> 
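For what it's worth, a rough sketch of that route using sybperl's
Sybase::DBlib bcp_*() calls (placeholder login, table name, and column
count; error checking omitted):

    use Sybase::DBlib;

    &BCP_SETL(TRUE);                    # enable bulk copy before connecting
    my $dbh = new Sybase::DBlib 'user', 'password', 'SERVER';
    $dbh->bcp_init("targetdb..target_table", undef, undef, DB_IN);
    $dbh->bcp_meminit(3);               # copy 3 columns from Perl variables

    open(IN, "data.out") or die "data.out: $!";
    my $count = 0;
    while (<IN>) {
        chomp;
        my @data = split /\|/;
        # Convert / rearrange the columns here, then send the row.
        $dbh->bcp_sendrow(@data);
        $dbh->bcp_batch if ++$count % 1000 == 0;   # commit every 1000 rows
    }
    $dbh->bcp_done;

Because the rows go through the bulk copy interface rather than
individual insert statements, this usually gets much closer to
command-line bcp speed than row-by-row inserts do.
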
> Any suggestions on how I can speed this up?  If I write a C program
> tailored to what I need to do, how much faster can I go?  Does anyone
> have experience with that?
> 
> Thank you
> Heather