December 11, 2013 at 7:16 am
Hi All,
I am having a problem with bulk inserts through an SSIS package.
When a large batch of records is being inserted into a particular table from Excel via the SSIS package, other users who want to read the data at the same time
are unable to do so because of blocking.
Can anyone please provide me with a script to resolve this?
Thanks...
December 11, 2013 at 7:47 am
I think it is quite normal that readers are blocked when a bulk insert operation is going on. This is to prevent dirty reads.
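For what it's worth, if your readers can tolerate dirty reads, they can ask for them explicitly instead of waiting on the insert's locks. Purely for illustration - the table and columns below are made up, not from your package:

-- Hypothetical reader query; dbo.SalesData and its columns are placeholders.
-- WITH (NOLOCK) reads at READ UNCOMMITTED: the query does not wait on the
-- bulk insert's locks, but it may see uncommitted (dirty) rows.
SELECT OrderID, OrderDate, Amount
FROM dbo.SalesData WITH (NOLOCK)
WHERE OrderDate >= '20131201';

Whether dirty reads are acceptable is a business decision, of course.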
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
December 11, 2013 at 8:01 am
Koen Verbeeck (12/11/2013)
I think it is quite normal that readers are blocked when a bulk insert operation is going on. This is to prevent dirty reads.
+1 - agreed, this is sensible behaviour.
Maybe your package can be optimised - typically how many rows are you inserting and how long are the locks lasting?
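To put numbers on that last question, you could run something like this against the DMVs while the load and the readers are both active:

-- Shows which sessions are blocked, by whom, and for how long (ms).
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time AS wait_time_ms,
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;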
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
December 11, 2013 at 10:39 pm
Thanks for the reply...
Let me clarify my question; I think this will make it easier to understand.
We have a production DB. The client keeps sending millions of rows of data, and we have created an SSIS package to copy the data from Excel into the DB.
While the data is loading into the DB, the end users also want to query it, because sometimes the load runs during working hours and the end users cannot sit idle. That is when the deadlock occurs.
For that reason I am looking for a script or any alternative solution so that I can resolve the issue.
Thanks.
December 11, 2013 at 11:08 pm
arooj300 (12/11/2013)
Thanks for the reply... Let me clarify my question; I think this will make it easier to understand.
We have a production DB. The client keeps sending millions of rows of data, and we have created an SSIS package to copy the data from Excel into the DB.
While the data is loading into the DB, the end users also want to query it, because sometimes the load runs during working hours and the end users cannot sit idle. That is when the deadlock occurs.
For that reason I am looking for a script or any alternative solution so that I can resolve the issue.
Thanks.
Your client is sending 'millions of rows' ... in Excel? As the latest version of Excel has a max number of rows per worksheet of 1,048,576, this sounds like a cumbersome multi-worksheet solution and I'd seriously be looking into moving to a CSV load instead. Anyway, I am getting sidetracked.
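If you do go the flat-file route, BULK INSERT handles CSVs natively. A rough sketch only - the file path and staging table below are placeholders:

-- Placeholder names throughout; adjust to your own schema and paths.
BULK INSERT dbo.SalesData_Staging
FROM 'C:\loads\sales_20131211.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,   -- skip the header row
    TABLOCK                -- allows a minimally logged load where possible
);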
Are we dealing with updates, inserts or both?
Some sort of partitioning solution may be possible, and there would be almost zero deadlocks - how is the data organised? Is it by some sort of transaction date?
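If it is, a date-based partitioning sketch might look something like this. All of the names here are invented and the boundary dates are just examples:

-- Monthly partitions; RANGE RIGHT means each boundary date starts a partition.
CREATE PARTITION FUNCTION pf_SalesByMonth (date)
AS RANGE RIGHT FOR VALUES ('20131101', '20131201', '20140101');

-- Map every partition to PRIMARY for simplicity.
CREATE PARTITION SCHEME ps_SalesByMonth
AS PARTITION pf_SalesByMonth ALL TO ([PRIMARY]);

-- The partitioning column must be part of the clustered key.
CREATE TABLE dbo.SalesData
(
    OrderID   bigint        NOT NULL,
    OrderDate date          NOT NULL,
    Amount    decimal(18,2) NOT NULL,
    CONSTRAINT PK_SalesData PRIMARY KEY CLUSTERED (OrderDate, OrderID)
)
ON ps_SalesByMonth (OrderDate);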
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
December 11, 2013 at 11:55 pm
arooj300 (12/11/2013)
Thanks for the reply... Let me clarify my question; I think this will make it easier to understand.
We have a production DB. The client keeps sending millions of rows of data, and we have created an SSIS package to copy the data from Excel into the DB.
While the data is loading into the DB, the end users also want to query it, because sometimes the load runs during working hours and the end users cannot sit idle. That is when the deadlock occurs.
For that reason I am looking for a script or any alternative solution so that I can resolve the issue.
Thanks.
Insert into a staging table and then partition switch the new data into the final destination table.
(if you have Enterprise Edition...)
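For illustration, the pattern looks roughly like this, building on a partitioned table like the one sketched above (all names invented; the target partition must be empty for the switch to succeed):

-- 1. Staging table: same structure and filegroup as the target partition,
--    plus a check constraint proving every row belongs in that partition.
CREATE TABLE dbo.SalesData_Staging
(
    OrderID   bigint        NOT NULL,
    OrderDate date          NOT NULL,
    Amount    decimal(18,2) NOT NULL,
    CONSTRAINT PK_SalesData_Staging PRIMARY KEY CLUSTERED (OrderDate, OrderID),
    CONSTRAINT CK_SalesData_Staging_Date
        CHECK (OrderDate >= '20131201' AND OrderDate < '20140101')
);

-- 2. The SSIS package loads dbo.SalesData_Staging here; readers of the
--    final table are never blocked by this load.

-- 3. Metadata-only switch into the matching partition - near-instant,
--    so readers are blocked only momentarily.
ALTER TABLE dbo.SalesData_Staging
    SWITCH TO dbo.SalesData PARTITION 3;  -- the Dec 2013 partition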
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP