May 24, 2022 at 5:00 pm
Hi
I have a stored procedure that writes rows to a number of tables by repeatedly calling another SP and passing it the values to write.
It does this by first building up some table detail in a temp table - such as the table name, an id, plus some other bits.
It then loops through this temp table with a WHILE loop, calling the SP and passing some of the temp table's values each time.
That could be anywhere between 100 and about 2,000 calls to the SP.
So each call is made serially, and each call writes data to a different table.
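Roughly, the shape of it is something like this (a simplified sketch - the table, column, and proc names are made up just for illustration):

CREATE TABLE #WorkList
(
    RowNum      INT IDENTITY(1,1) PRIMARY KEY,
    TargetTable SYSNAME NOT NULL,   -- name of the table to write to
    SourceId    INT     NOT NULL    -- one of the values passed to the SP
);

-- ... #WorkList gets populated here (100 to ~2,000 rows) ...

DECLARE @RowNum      INT = 1,
        @MaxRow      INT,
        @TargetTable SYSNAME,
        @SourceId    INT;

SELECT @MaxRow = MAX(RowNum) FROM #WorkList;

WHILE @RowNum <= @MaxRow
BEGIN
    SELECT @TargetTable = TargetTable,
           @SourceId    = SourceId
      FROM #WorkList
     WHERE RowNum = @RowNum;

    EXEC dbo.WriteDetailRow @TargetTable, @SourceId;  -- one serial call per row

    SET @RowNum += 1;
END;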
I'm just wondering if this could run faster if I could somehow make multiple calls in parallel.
Is that possible?
I feel like it needs either a restructure of the SP or a fundamental change of approach, e.g. SSIS.
Thanks
- Damian
May 24, 2022 at 5:19 pm
What would be better would be to handle all of the rows in the temp table that are going to the same table at once, as a single set-based write, instead of one call per row. That would require a change to the stored proc. Since that stored proc may be used by other things, a new stored proc would probably be in order.
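Just to show the shape of it, something like this (a rough sketch with made-up names - it only carries one value column, which your real proc obviously won't be limited to):

-- One set-based INSERT per distinct target table instead of one call per row.
DECLARE @TargetTable SYSNAME,
        @sql         NVARCHAR(MAX);

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT TargetTable FROM #WorkList;

OPEN cur;
FETCH NEXT FROM cur INTO @TargetTable;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one INSERT...SELECT that writes every row destined for this table.
    SET @sql = N'INSERT INTO ' + QUOTENAME(@TargetTable) + N' (SourceId)'
             + N' SELECT w.SourceId FROM #WorkList w'
             + N' WHERE w.TargetTable = @pTargetTable;';

    EXEC sys.sp_executesql @sql,
                           N'@pTargetTable SYSNAME',
                           @pTargetTable = @TargetTable;

    FETCH NEXT FROM cur INTO @TargetTable;
END;

CLOSE cur;
DEALLOCATE cur;

That takes you from up to ~2,000 calls down to one write per distinct target table.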
--Jeff Moden
Change is inevitable... Change for the better is not.
May 24, 2022 at 5:21 pm
An even easier way for the future might be to make an updatable partitioned view that would auto-magically evaluate the data from the temp table and auto-magically write it to the correct tables.
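Just as a rough sketch with made-up names - this assumes the temp table carries a column value (rather than just a table name) that determines which table each row belongs to:

-- Member tables: the partitioning column has a CHECK constraint and is part of the PK,
-- which is what lets SQL Server route inserts through the view automatically.
CREATE TABLE dbo.Detail_A
(
    RegionCode CHAR(1) NOT NULL CHECK (RegionCode = 'A'),
    SourceId   INT     NOT NULL,
    CONSTRAINT PK_Detail_A PRIMARY KEY CLUSTERED (RegionCode, SourceId)
);

CREATE TABLE dbo.Detail_B
(
    RegionCode CHAR(1) NOT NULL CHECK (RegionCode = 'B'),
    SourceId   INT     NOT NULL,
    CONSTRAINT PK_Detail_B PRIMARY KEY CLUSTERED (RegionCode, SourceId)
);
GO

CREATE VIEW dbo.Detail_All
AS
 SELECT RegionCode, SourceId FROM dbo.Detail_A
  UNION ALL
 SELECT RegionCode, SourceId FROM dbo.Detail_B;
GO

-- One set-based insert; each row lands in the correct member table based on RegionCode.
INSERT INTO dbo.Detail_All (RegionCode, SourceId)
SELECT w.RegionCode, w.SourceId
  FROM #WorkList w;   -- assumes #WorkList carries the partitioning value, not just a table name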
--Jeff Moden
Change is inevitable... Change for the better is not.
May 24, 2022 at 5:25 pm
That's an interesting thought.
Thanks Jeff - I'll try out these approaches
- Damian
May 24, 2022 at 7:21 pm
As an alternative, this could probably be handled better in SSIS than by your current approach.
Instead of writing to a temp table, just pass the data to a Conditional Split component, split on the table name, and route the rows to different output destinations. This will in effect make your write process parallel (batching is included automatically on output). Of course, it won't be a fully dynamic solution, since you have to define a destination connection for each possible output table in advance. But unless you need a fully dynamic number of tables, SSIS may be a viable solution.
May 25, 2022 at 2:56 am
That would also work... it's kind of like the partitioned view except done graphically.
--Jeff Moden
Change is inevitable... Change for the better is not.