November 23, 2010 at 1:17 pm
We have a problem with one process. We receive a file full of INSERT, UPDATE, and DELETE statements. The problem is that the file is too big: it can grow to 5 GB. Executing that file against SQL Server from Management Studio is not possible because the transaction is too big.
Executing it through SQLCMD, with the file as input, also fails because the transaction is too large.
Question:
How can I split that transaction into many smaller ones without breaking the file itself? (Note: I can't change the way the file is produced.)
I appreciate any help.
thanks,
Boris.
if one wants it.. one will justify it.
November 23, 2010 at 2:11 pm
<<We get the file full of INSERT UPDATE and DELETE>>
Is this a transit file? If not, what is it? Can you post a few lines?
For better assistance in answering your questions, please read this.
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop
Jeff Moden
December 3, 2010 at 8:54 pm
Rather than executing straight from the file, you could import the file into a table and then execute the statements dynamically from the table. With SSIS, the 5 GB file would not be a problem.
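Something along these lines would handle the "replay from the table" half, assuming the file has already been bulk-loaded (for example by SSIS or BULK INSERT) into a staging table with one statement per row. This is only a sketch: the table name dbo.StagedStatements, the connection string, and the batch size are illustrative assumptions, not anything from this thread.

```python
# Sketch only: replay statements staged in dbo.StagedStatements
# (Id INT IDENTITY, Stmt NVARCHAR(MAX)), committing every BATCH_SIZE
# statements so no single transaction covers the whole 5 GB of changes.
# Table name, connection string, and batch size are assumptions.
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")
BATCH_SIZE = 300  # statements per transaction


def replay_staged_statements(conn_str=CONN_STR, batch_size=BATCH_SIZE):
    conn = pyodbc.connect(conn_str, autocommit=False)
    try:
        cur = conn.cursor()
        cur.execute("SELECT MAX(Id) FROM dbo.StagedStatements")
        max_id = cur.fetchone()[0] or 0
        # Walk the staged rows one Id range at a time; commit after each range.
        for start in range(1, max_id + 1, batch_size):
            cur.execute(
                "SELECT Stmt FROM dbo.StagedStatements "
                "WHERE Id BETWEEN ? AND ? ORDER BY Id",
                start, start + batch_size - 1)
            for (stmt,) in cur.fetchall():
                cur.execute(stmt)
            conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    replay_staged_statements()
```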
December 4, 2010 at 8:10 am
🙂 Interesting, but that would take me twice as long. I ran out of patience and just told the devs not to send this crap any more. Instead we are now getting normal delimited files, which take no time at all to import.
BUT the only workable way was to actually parse the file and insert BEGIN TRAN / COMMIT TRAN after every n lines; around 300 turned out to be optimal. Then again, this is the wrong way to import data, and one should always raise a red flag at that point.
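A rough sketch of that kind of splitter is below. It assumes one statement per line and no existing GO separators in the file; the file names and batch size are placeholders for illustration, not the actual process.

```python
# Sketch only: rewrite a big script of INSERT/UPDATE/DELETE statements so that
# an explicit transaction wraps every BATCH_SIZE statements. Assumes one
# statement per line and no GO separators already in the source file;
# input/output file names are placeholders.

BATCH_SIZE = 300  # roughly the sweet spot mentioned above


def split_into_transactions(src_path, dst_path, batch_size=BATCH_SIZE):
    with open(src_path, "r", encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        in_batch = 0
        dst.write("BEGIN TRAN;\n")
        for line in src:
            dst.write(line)
            if line.strip():          # count non-blank lines as statements
                in_batch += 1
            if in_batch >= batch_size:
                # Close the current transaction and start a new batch.
                dst.write("COMMIT TRAN;\nGO\nBEGIN TRAN;\n")
                in_batch = 0
        dst.write("COMMIT TRAN;\nGO\n")


if __name__ == "__main__":
    split_into_transactions("changes.sql", "changes_batched.sql")
```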
I would agree with your suggestion if the data files were 5-6 MB, but not in our case, where files can reach 5-6 GB. That's just my opinion, which is of course subjective, as usual. Merry Christmas and Happy Hanukkah, y'all!!!
if one wants it.. one will justify it.