
sybperl-l Archive

From: bruno dot georges at barclayscapital dot com
Subject: RE: How can I get the speed I need?
Date: Jul 20 1999 1:04PM

I'd call a stored proc that creates a kind of queue table with the rows in
the order you want, then bcp out and bcp in to the other server.
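
Something along these lines would do it. A minimal sketch (the proc, table,
file, login and server names are all made up, and character-mode bcp is
assumed):

    #!/usr/bin/perl -w
    # Stage the rows in a queue table, bcp them out, bcp them in.
    use strict;
    use Sybase::CTlib;

    my ($user, $pw) = ('user', 'password');      # hypothetical login
    my $dbh = new Sybase::CTlib $user, $pw, 'SOURCE_SRV';

    # Hypothetical proc that fills srcdb..load_queue in the order you want.
    $dbh->ct_sql("exec srcdb..sp_build_load_queue");

    # Character-mode bcp out of the queue table, then in to the target,
    # committing every 1000 rows (-b) on the way in.
    system("bcp srcdb..load_queue out queue.dat -c -U $user -P $pw -S SOURCE_SRV") == 0
        or die "bcp out failed";
    system("bcp tgtdb..target_tbl in queue.dat -c -b 1000 -U $user -P $pw -S TARGET_SRV") == 0
        or die "bcp in failed";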

Another alternative would be a Perl script to process the rows (I'm not sure
about the memory requirements), but Sybase is probably the best place to
arrange your data.

I always use stored procs and queue tables; it's easier to consolidate the
data and manage exceptions in case something goes wrong.

I recently wrote a data loader which compares the bcp out file with a .fmt
file before bcp'ing into the target db.
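
The comparison can be as simple as checking that every data row carries the
number of fields the format file promises. A minimal sketch, under
simplifying assumptions (character-mode bcp and one field terminator for the
whole row; a real .fmt file specifies per-column terminators, so this only
approximates a full loader):

    #!/usr/bin/perl -w
    # Check a bcp data file against the column count in its .fmt file.
    use strict;

    my ($fmt_file, $dat_file, $sep) = ('target.fmt', 'queue.dat', "\t");

    open(FMT, $fmt_file) or die "can't open $fmt_file: $!";
    my $version = <FMT>;             # line 1: bcp version, e.g. "10.0"
    chomp(my $ncols = <FMT>);        # line 2: number of columns
    close(FMT);

    open(DAT, $dat_file) or die "can't open $dat_file: $!";
    while (<DAT>) {
        chomp;
        my @f = split(/\Q$sep\E/, $_, -1);   # -1 keeps trailing empty fields
        die "line $.: got " . scalar(@f) . " fields, expected $ncols\n"
            unless @f == $ncols;
    }
    close(DAT);
    print "all rows match the $ncols-column format\n";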

Hope this helps.
Bruno 

> -----Original Message-----
> From:	AGHZHONG@aol.com [SMTP:AGHZHONG@aol.com]
> Sent:	20 July 1999 13:18
> To:	SybPerl Discussion List
> Subject:	How can I get the speed I need?
> 
> I need to move a massive amount of data (7 million rows) from one
> database to another.
> The two databases have different tables and different columns, so I
> cannot do a straightforward bcp in.
> 
> The only approved method right now is to bcp out from one database, read
> the data file row by row, and then insert into various tables in the
> other database.
> This has to be done as fast as possible.
> 
> For a ballpark estimate of how long the process will take, I tried a
> simple case. A 2000-row data file is read row by row, converted column
> by column, and finally inserted into a (one for now) table using the
> SybTools ExecSql function (built on top of Sybase::CTlib), one row at a
> time. The result is very disappointing: I can only insert about 20 rows
> a second. (Most of the time is indeed spent on the insertion, not on the
> string manipulation at all.)
> Then I tried to group a batch (2000 rows) of records together and wrap
> the batch in "begin transaction" and "commit transaction" using the same
> function call. The result is slightly better: I can now insert about 30
> rows a second. This is still too slow for what I need to do.
> 
> The next thing I want to try is to build a bcp-able array in memory as I
> read the input file, and bcp the array into the table in batches using
> the BCP tools that came with Sybperl. I don't know whether this will be
> any faster, as the BCP functions are built on top of DBlib and are
> probably not much faster than the CTlib calls.
> 
> Any suggestions on how I can speed this up? If I write a C program that
> does exactly what I need, how much faster can I go? Does anyone have any
> experience with that?
> 
> Thank you
> Heather
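
For reference, the batched-insert variant Heather describes looks roughly
like this in plain Sybase::CTlib (her SybTools ExecSql call presumably wraps
something similar; the table layout and file name here are invented):

    use strict;
    use Sybase::CTlib;

    my $dbh = new Sybase::CTlib 'user', 'password', 'TARGET_SRV';
    my $batch = 2000;
    my $n = 0;

    open(IN, 'queue.dat') or die "can't open queue.dat: $!";
    $dbh->ct_sql("begin transaction");
    while (<IN>) {
        chomp;
        my ($id, $name) = split(/\t/);   # invented two-column layout;
                                         # real code must escape quotes in $name
        $dbh->ct_sql("insert tgtdb..target_tbl values ($id, '$name')");
        if (++$n % $batch == 0) {        # commit every $batch rows
            $dbh->ct_sql("commit transaction");
            $dbh->ct_sql("begin transaction");
        }
    }
    $dbh->ct_sql("commit transaction");
    close(IN);

Every row still costs a server round trip plus a parse and compile of the
insert, which is why batching the transactions only moved the needle from
20 to 30 rows a second.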
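
The usual answer for raw speed from Perl is the Sybase::BCP module that
ships with sybperl. Being built on DBlib does not make it comparable to
per-row inserts: bulk copy bypasses the SQL parser entirely, so it is
normally much faster than inserts through either DBlib or CTlib. A minimal
sketch (file, table and server names invented):

    use strict;
    use Sybase::BCP;

    # Bulk copy queue.dat into the target table, committing every 500 rows.
    my $bcp = new Sybase::BCP 'user', 'password', 'TARGET_SRV';
    $bcp->config(INPUT      => 'queue.dat',
                 OUTPUT     => 'tgtdb..target_tbl',
                 BATCH_SIZE => 500,
                 SEPARATOR  => "\t");
    $bcp->run;

If the columns need reworking on the way through, config() also accepts a
CALLBACK routine, which could cover the column-by-column conversions.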