April 12, 2012 at 4:49 pm
How can one optimize a full extraction from an SAP system? I am doing a full (SELECT *) extraction from SAP (ADO.NET source) and dumping everything into a SQL table using an OLE DB destination. Is there a way to tune this extraction? Maybe increase the packet size, adjust the batch size, add a SQL hint, anything?
April 12, 2012 at 6:44 pm
Yes.
Oh, alright... 😛
How many rows are we talking about? How wide?
Is the destination on the same server as the source?
Some say that dumping the table into a flat file from the source, moving the file to the destination, then doing a bulk load is the best approach.
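If you go that route, the load side could look roughly like this; a minimal sketch, assuming a pipe-delimited file and placeholder table and file names:

-- Bulk load a pipe-delimited flat file into the destination table.
-- TABLOCK takes a bulk-update lock, which (with the SIMPLE or BULK_LOGGED
-- recovery model and no indexes on the target) allows minimal logging.
BULK INSERT dbo.TargetTable
FROM 'D:\Staging\sap_extract.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    TABLOCK
);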
There are also options for parallelism. http://henkvandervalk.com/speeding-up-ssis-bulk-inserts-into-sql-server <-- something like this. A similar approach might work for the source file dump if you decide to go that way.
I'm afraid I haven't specifically worked with SAP though.
Anyway, some of this might help.
April 13, 2012 at 12:10 am
Try to implement as many of these as possible:
Top 10 SQL Server Integration Services Best Practices
And do not use SELECT *. 🙂
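For instance, list only the columns the load actually needs (the table and column names below are just placeholders):

-- An explicit column list keeps unused columns off the network
-- and out of the data flow buffers.
SELECT MaterialNumber, PlantCode, PostingDate, Quantity
FROM   SapExtractTable;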
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 13, 2012 at 2:41 pm
Thank you. I do have another question too. I was trying to move one of my SSIS packages (not deploy, just move it) from one server to another and continue building that package, but I couldn't do a copy and paste. If it were on the same system, I could just copy the package and paste it into the SSIS node in BIDS, but I can't do that across different servers. How can I move my half-built package to another server and continue building it there?
April 13, 2012 at 2:52 pm
You can copy packages from server to server with DTUTIL.
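For example, a package stored in msdb can be copied between servers from a command prompt like this, and a package saved as a .dtsx file can be copied with the FILE variant (server, path, and package names are placeholders):

REM Copy a package stored in msdb on SRCSERVER to msdb on DESTSERVER
dtutil /SourceServer SRCSERVER /SQL MyPackage /COPY SQL;MyPackage /DestServer DESTSERVER

REM Copy a package saved as a .dtsx file to a new file location
dtutil /FILE "C:\Packages\MyPackage.dtsx" /COPY FILE;"\\DESTSERVER\Packages\MyPackage.dtsx"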
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 16, 2012 at 12:32 pm
OK, thanks.
I have a whole lot of big tables to dump from SAP to SQL Server using the SSIS ADO.NET SAP data loader. One of the tables, with around 2.5 million records, I was able to load in around 1.19 minutes, which is good. But other tables, with some 16 million records, are taking hours. Here's my environment:
1. The source and the target are remote (not the same machine, obviously).
2. In the source adapter I am just doing a simple SELECT statement of the fields I need to pull.
3. In the destination, I am doing a table or view fast load with TABLOCK, leaving everything else as is (maximum insert commit size is 2147483647).
4. The target database has been changed to bulk-logged recovery mode (see the sketch at the end of this post).
5. The target table does not have any indexes or constraints.
It's painfully slow for some tables. Please advise. TIA.
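For reference, the recovery-model change in point 4 amounts to something like this (the database name is a placeholder):

-- Bulk-logged recovery lets the TABLOCK fast load be minimally logged.
-- Log backups taken while bulk operations run in this mode cannot be
-- restored to a point in time, so switch back afterwards if needed.
ALTER DATABASE TargetDB SET RECOVERY BULK_LOGGED;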
April 16, 2012 at 1:22 pm
The SSIS package design looks OK: there are no indexes, no constraints, and no excessive logging, so that leaves the disks (storage), the network, and maybe the I/O bus.
Maybe there are other loads running on the destination server at the same time, so the CPU is too busy or the I/O has become a bottleneck?
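If you want to rule the storage in or out, one option is to watch per-file latency on the destination server while the load runs; a rough sketch, not a definitive diagnosis:

-- Average stall per read/write for each database file since startup.
-- Consistently high write latency on the target data files points at
-- the storage; low latency points back at the network or the source.
SELECT  DB_NAME(vfs.database_id)                              AS database_name,
        mf.physical_name,
        vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)   AS avg_read_ms,
        vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0)  AS avg_write_ms
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN    sys.master_files AS mf
        ON mf.database_id = vfs.database_id
       AND mf.file_id     = vfs.file_id
ORDER BY avg_write_ms DESC;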
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
April 16, 2012 at 1:51 pm
Koen Verbeeck (4/16/2012)
The SSIS package design looks OK: there are no indexes, no constraints, and no excessive logging, so that leaves the disks (storage), the network, and maybe the I/O bus. Maybe there are other loads running on the destination server at the same time, so the CPU is too busy or the I/O has become a bottleneck?
I don't think it's an I/O bottleneck because, as I mentioned, one table with some 2 million records was very fast. Some tables dump faster, and it keeps getting slower as more records are transferred. Can I specify a batch size in the SELECT statement at the source, or is batch size only for bcp and BULK INSERT? If I can, how? I have a package that has been running for two hours with some 26 million records. Please help me figure out how to diagnose the various possible bottlenecks in my situation. Thanks.
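On the batch-size question: it is not something you put in the source SELECT. For bcp it is the -b switch, for BULK INSERT it is the BATCHSIZE option, and in the OLE DB destination the closest equivalents are the "Rows per batch" and "Maximum insert commit size" settings you already have. A sketch of the BULK INSERT form, with placeholder names and an arbitrary batch size:

-- Commit every 100,000 rows instead of the whole load in one transaction;
-- a failure then only rolls back the current batch.
BULK INSERT dbo.TargetTable
FROM 'D:\Staging\sap_extract.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', BATCHSIZE = 100000, TABLOCK);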
April 16, 2012 at 2:27 pm
How many rows are there already in the destination table? Maybe you should consider partition switching.
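Partition switching is a metadata-only operation: you bulk load into an empty staging table that matches the target's structure and sits on the same filegroup as the target partition, with a check constraint confining its rows to that partition's boundary, and then switch it in (the target partition must be empty). A minimal sketch with placeholder names:

-- Instantly moves the freshly loaded rows into partition 3 of the
-- partitioned destination table; no data is physically copied.
ALTER TABLE dbo.SalesStaging
SWITCH TO dbo.Sales PARTITION 3;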
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP