July 9, 2014 at 3:06 pm
So we have a very large SSIS package that inserts records from a table in one database into the same table in another database. It is just a simple select query with one parameter that filters it. We have been running this package for weeks without a problem. Today we ran it against one database with only 20,000 records, and the tempdb files grew from 2.5 GB to 28 GB. When it failed, the error said it had run out of disk space for the sort. The destination is running in fast load mode. Why would it be sorting on a simple insert? Could indexes on the table we are inserting into be the issue? And if so, why wouldn't this have happened before?
We are using SQL Server 2012 SP1 CU5.
If anyone has any thoughts, or needs more information, please let me know. I am extremely frustrated and not sure where to go next.
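While the package runs, a query along these lines against the standard tempdb DMVs is one way to confirm what is eating the space (a generic sketch, not output from our system; internal objects are sorts, hashes, and spools):

-- Run in the context of tempdb: which sessions are allocating tempdb pages?
-- internal_objects = sorts, hash joins, spools; user_objects = temp tables/variables.
SELECT  su.session_id,
        su.internal_objects_alloc_page_count * 8 / 1024 AS internal_alloc_mb,
        su.user_objects_alloc_page_count * 8 / 1024     AS user_alloc_mb,
        r.command,
        r.status
FROM    sys.dm_db_session_space_usage AS su
LEFT JOIN sys.dm_exec_requests AS r
        ON r.session_id = su.session_id
WHERE   su.internal_objects_alloc_page_count > 0
     OR su.user_objects_alloc_page_count > 0
ORDER BY su.internal_objects_alloc_page_count DESC;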
July 10, 2014 at 1:13 am
Take a look at the actual execution plan of the query to see if tempdb is used somewhere.
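For example, something like this can find cached plans for the insert that contain a Sort operator (a sketch; the text filter on the table name is a placeholder you would adjust):

-- Find cached plans matching the insert statement that include a Sort operator.
SELECT  qs.plan_handle,
        st.text,
        qp.query_plan
FROM    sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)  AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE   st.text LIKE '%INSERT%YourTargetTable%'   -- hypothetical table name
  AND   CAST(qp.query_plan AS nvarchar(max)) LIKE '%PhysicalOp="Sort"%';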
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 11, 2014 at 11:29 am
I have looked at the execution plan; that is not the issue. I have attached a picture of the tasks.
July 11, 2014 at 3:16 pm
So it looks like the issue is the actual bulk insert: it won't complete. But if we use an Execute SQL Task doing a regular insert, it works. Any thoughts?
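For reference, the regular insert in the Execute SQL Task is along these lines (table, column, and database names here are placeholders, not the actual ones):

-- Plain INSERT ... SELECT, as run from an Execute SQL Task (OLE DB connection).
INSERT INTO dbo.TargetTable (Col1, Col2, Col3)   -- placeholder names
SELECT Col1, Col2, Col3
FROM   SourceDB.dbo.TargetTable                  -- source database on the same instance
WHERE  FilterColumn = ?;                         -- the single package parameter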
July 12, 2014 at 10:00 am
What if you disable check constraints and enable table lock in the fast load options?
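In the OLE DB Destination those are the "Table lock" and "Check constraints" checkboxes, which end up in the FastLoadOptions property. As a rough T-SQL analogue (the table name and file path are made up), BULK INSERT behaves the same way:

-- TABLOCK takes a table lock; leaving CHECK_CONSTRAINTS out means check
-- constraints are not validated during the load.
BULK INSERT dbo.TargetTable
FROM 'C:\staging\rows.dat'   -- placeholder path
WITH (TABLOCK);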
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
July 18, 2014 at 3:17 pm
So it turns out it has something to do with the indexed views associated with the table. The workaround was to drop the views, do the bulk insert, and then recreate the views. The part that makes no sense is why this wasn't an issue from the beginning.
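For anyone who lands here later, the shape of that workaround is roughly the following (the view definition is a made-up example, not the real one). It would also explain the sort: with an indexed view on the target table, the insert has to maintain the view's unique clustered index, and that maintenance can introduce the sort operators that were spilling to tempdb.

-- 1. Drop the indexed view (dropping the view drops its indexes with it).
DROP VIEW dbo.vSalesSummary;   -- hypothetical view name
GO

-- 2. Run the SSIS bulk insert here.

-- 3. Recreate the view and its unique clustered index.
CREATE VIEW dbo.vSalesSummary
WITH SCHEMABINDING
AS
SELECT  CustomerID,
        COUNT_BIG(*) AS RowCnt,
        SUM(Amount)  AS TotalAmount  -- assumes Amount is NOT NULL (required for SUM in an indexed view)
FROM    dbo.TargetTable
GROUP BY CustomerID;
GO

CREATE UNIQUE CLUSTERED INDEX IX_vSalesSummary
    ON dbo.vSalesSummary (CustomerID);
GO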