July 6, 2006 at 9:16 am
Using the Import/Export Wizard, it is easy to set up SSIS jobs that import multiple tables in one task. However, SSIS appears to grab a connection to the source database and to the destination database for each table in the task at some point before the actual data transfer begins (I haven't identified whether these connections are made during the Validate, Prepare for Execute, or Pre-Execute phase). Even though the transfer itself is done a few tables at a time, all of the connections are held open for the duration of the task.
This has two effects:
1. The task may fail if the number of connections exceeds the user connections limit configured on the source database server.
2. There is a severe impact on the other users of the source database (in my case, a production system).
Is there any way to control the number of simultaneous connections that SSIS data transfer packages initiate?
I can, of course, define each package to include only a small number of tables (thus limiting the connections), but in real life I need to transfer almost 700 tables, so capping each task at 30 or 40 tables produces an awkwardly large, and error-prone, number of tasks. And since this definition must be repeated for multiple (not quite identical) databases, the extra effort of defining small collections of tables is significant.
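For reference, this is roughly how I've been watching the connection count while the package runs (just a sketch against the SQL Server 2005 DMVs; the program_name grouping is only a guess at how to spot the package's sessions, check what sp_who2 reports in your case):

-- Configured cap on user connections (0 means the server maximum, up to 32767).
SELECT value_in_use
FROM sys.configurations
WHERE name = 'user connections';

-- Open user sessions grouped by connecting program, to see how many the package is holding.
SELECT program_name, COUNT(*) AS open_sessions
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
GROUP BY program_name
ORDER BY open_sessions DESC;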
July 10, 2006 at 10:32 pm
I don't know of a way to control the number of connections in the Import/Export Wizard, but I think you can do the same thing in SSIS directly: put a Foreach Loop container in the Control Flow and, inside it, an Execute SQL task or a Data Flow task, whichever fits best, so that it executes once per table. I think that would resolve the problem.
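To sketch what I mean (the variable name User::TableList is just an example, not anything SSIS gives you by default): an Execute SQL task set to return a full result set can load the table list into an object variable, and a Foreach Loop with the Foreach ADO enumerator then walks that list one row at a time, mapping the schema and table name to variables that drive the Data Flow's source and destination. That way only one table's connections are open at any moment.

-- Execute SQL task, ResultSet = Full result set, stored in User::TableList (example name).
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;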