May 21, 2021 at 1:20 am
I process some 300,000,000 input records and store the resulting 90,000,000 records during the course of a run.
When the resulting table resides in the same database as the input, it takes ~20 seconds per day's worth of input data.
When I use a different database on the same server, on the same physical drive, the very same process takes ~150 seconds per day.
Neither database is in use by anyone else.
Any hint on how to improve the situation would be appreciated.
May 21, 2021 at 5:37 pm
This is how I solved the issue:
use InputDatabase;
run my scripts where my inserts look like this:
insert into OutputDatabase.Scheme.Table
I still don't understand why it matters so much when the current database is not the same as the input one.
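Spelled out, the fix described above reads something like the following sketch. Only the USE statement and the three-part target name come from the post; the column list and the source table name are placeholders:

```sql
USE InputDatabase;  -- make the input database the current database

-- Writes go to the output database via a three-part name,
-- while reads resolve against the current (input) database.
INSERT INTO OutputDatabase.Scheme.[Table] (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable;  -- hypothetical source table
```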
May 21, 2021 at 6:34 pm
It's possible that the two databases are at different compatibility levels.
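That is quick to check against the standard `sys.databases` catalog view (the database names below stand in for the poster's actual databases):

```sql
-- Compare the compatibility levels of the two databases.
SELECT name, compatibility_level
FROM sys.databases
WHERE name IN (N'InputDatabase', N'OutputDatabase');
```

If the levels differ, `ALTER DATABASE ... SET COMPATIBILITY_LEVEL` can align them, though that should be tested first since it changes query-optimizer behavior.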
May 22, 2021 at 2:45 am
This is how I solved the issue:
use InputDatabase;
run my scripts where my inserts look like this:
insert into OutputDatabase.Scheme.Table
I still don't understand why it matters so much when the current database is not the same as the input one.
What did the scripts look like before?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 25, 2021 at 5:03 pm
On top of what Jeff was asking (what the scripts looked like before), I would be curious to see the execution plans for a slow run vs a fast run.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.