February 13, 2015 at 3:45 am
See, I'm not stuck on this question; I have something that works, I just want a second, third ... opinion on possibly improving it.
I have the following scenario, which caused a substantial memory issue a few months ago.
Records from an external non-database source are read in; they contain data about various work orders.
These must then be checked against the existing data: completely new records need to be added, and any changes need to result in the existing records being updated.
The orders are sorted by client and type and need to end up in tables that follow this naming convention:
dbo.workOrder_Type_ClientName
This means that in the package the destination is dynamic and stored in a variable. The name of the client and the location of the data files are stored in a master_parameter table, so the first step in the package is getting all the data file locations for a type for all the clients.
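For illustration, that first lookup is along these lines (the column names are simplified placeholders, not the actual master_parameter layout):

-- Simplified placeholder columns; the real master_parameter table holds more detail.
DECLARE @Type sysname = N'Maintenance';   -- example work-order type

SELECT ClientName,
       DataFileLocation
FROM   dbo.master_parameter
WHERE  OrderType = @Type;                 -- one row per client for the requested type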
It then loads the data into temp tables (tmp.workOrder_Type_ClientName), and from there the data needs to be transferred to the dbo tables.
Now, I had a merge join to do this; however, while it worked fine in development, it caused an issue in our test environment: as the existing dataset got larger it started generating these cache files (and I did check running the package from BIDS and could not see that behaviour).
I replaced this with a stored procedure to do the merge; these are humongous dynamic SQL statements (and yes, I put in checks to prevent SQL injection).
This dynamic SQL is sometimes hard to read, and I was wondering if there was a better way to do this.
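To make that concrete, here is a minimal sketch of the kind of dynamic MERGE procedure I mean (the procedure name and columns are made up: a key column OrderID and a single data column Status; the real procedures handle far more columns and also validate @Type and @ClientName against master_parameter):

CREATE PROCEDURE dbo.usp_MergeWorkOrders
    @Type       sysname,
    @ClientName sysname
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the target and source object names; QUOTENAME is the injection guard.
    DECLARE @target nvarchar(300) = N'dbo.' + QUOTENAME(N'workOrder_' + @Type + N'_' + @ClientName);
    DECLARE @source nvarchar(300) = N'tmp.' + QUOTENAME(N'workOrder_' + @Type + N'_' + @ClientName);

    -- Update changed rows and insert new ones from the staging table.
    DECLARE @sql nvarchar(max) = N'
        MERGE ' + @target + N' AS tgt
        USING ' + @source + N' AS src
            ON tgt.OrderID = src.OrderID
        WHEN MATCHED AND tgt.Status <> src.Status THEN
            UPDATE SET tgt.Status = src.Status
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (OrderID, Status)
            VALUES (src.OrderID, src.Status);';

    EXEC sys.sp_executesql @sql;
END;

Only the object names vary per type and client, so the statement itself stays fixed; the question is whether there is a cleaner pattern than building these strings.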
February 13, 2015 at 2:01 pm
How many different end tables are there, and do they need to be created dynamically, or are you just switching dynamically between existing ones?
February 16, 2015 at 3:36 am
The end tables will already be present; at the moment only 1 exists.
There needs to be dynamic switching, because the estimate is that about 20-30 clients will initially use the system.