November 23, 2010 at 1:15 pm
We have a problem with one process. We receive a file full of INSERT, UPDATE, and DELETE statements. The problem is that the file is too big; it can reach 5 GB. Executing that file against SQL Server from Management Studio is not possible because the transaction is too big.
Executing SQLCMD with this file as input also fails, because the transaction is too large.
Question:
How can I split that transaction into many smaller ones without breaking the file itself? (Note: I can't change the way the file is generated.)
I appreciate any of your help.
thanks,
Boris.
if one wants it.. one will justify it.
November 23, 2010 at 1:21 pm
November 23, 2010 at 1:24 pm
Here are a few suggestions:
Check the disk space on the disk that stores the DB and log file.
Check the Files page of the database properties to see if the database or log file size is restricted. If it is, change it to allow for growth. You may have it set to grow, but with an upper limit that someone thought would be large enough and is not.
Truncate the log file. If it continues to grow and you cannot truncate it, check which recovery model is being used. If the Full model is used, check for (or set up) a backup scheme that includes backing up the log file periodically; only transaction log data that has already been backed up can be truncated from the log file.
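If it helps, here is a rough sketch of those last two checks in PowerShell (untested; it assumes Invoke-Sqlcmd from the SQL Server provider is available, and the server, database, and backup path are placeholders):

# Check the recovery model of the database (server and database names are placeholders)
$query = "SELECT name, recovery_model_desc FROM sys.databases WHERE name = 'MyDatabase';"
Invoke-Sqlcmd -ServerInstance "MyServer" -Query $query

# If the database uses the Full recovery model, back up the log so inactive log records can be truncated
$backup = "BACKUP LOG [MyDatabase] TO DISK = N'C:\Backups\MyDatabase_log.trn';"
Invoke-Sqlcmd -ServerInstance "MyServer" -Query $backup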
November 23, 2010 at 1:33 pm
Jerry, thanks for the kind words, but the transaction log file has nothing to do with it.. and increasing the packet size would slow down everything, including this process...
It is the compiler, I believe, that chokes on the huge transaction. I wrote a little Perl script that parses the original file and, after every X lines of inserts or updates, wraps them in a BEGIN TRANSACTION ... COMMIT TRANSACTION block, writing the result to a new file.
It generates an even larger file, which I am able to execute successfully. The reason I am here is that the script doesn't always handle format conversions properly, either because of ActivePerl or Windows or something else. Is there a more intelligent way out there?
I am sure that I am not the only one dealing with this.. c'mon SQL monsters.. help me out here..
if one wants it.. one will justify it.
November 23, 2010 at 1:34 pm
Roy, my apologies. I already started it here.. It is rather a gray-area topic..
if one wants it.. one will justify it.
November 23, 2010 at 1:40 pm
Why not use PowerShell? Read the file line by line and execute each line. Then you do not have to modify the file at all.
Just thinking out loud.
-Roy
November 23, 2010 at 2:00 pm
I'd go with Roy's idea. Run each command individually. There are a number of ways to do that, with the command shell probably being the easiest.
Alternatively, strangle the developer who came up with the idea, and switch to having them output the table contents into a text file, then use normal old ETL to process the thing. On a reasonable server (or my home PC), a multi-gig text file with tabular data in it won't be any problem at all for SQL Server to digest, either through SSIS or through any number of other ETL methods.
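For example, something like this hypothetical sketch would do the load (it assumes the dump is a tab-delimited file readable from the server, a matching staging table already exists, and Invoke-Sqlcmd is available; every name and path here is made up):

# Hypothetical bulk load of a tab-delimited dump into a staging table.
# Server, database, table, and file path are all placeholders.
$bulkLoad = @"
BULK INSERT dbo.StagingTable
FROM 'C:\data\table_dump.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', BATCHSIZE = 50000);
"@
Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" -Query $bulkLoad -QueryTimeout 65535

SSIS or bcp would do the same job; the point is that tabular data can be loaded in batches instead of as one giant scripted transaction.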
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
November 23, 2010 at 2:22 pm
GSquared (11/23/2010)
Alternatively, strangle the developer who came up with the idea
Always a viable option!
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
November 23, 2010 at 2:27 pm
GSquared, I have said that many times.. now I am told that it brings only negativity.. 🙂 So.. stop being negative or I will let my boss read this thread 🙂
Roy, I am not familiar with PowerShell just yet. I am sure you have done this already.. can you post a PowerShell sample script that does this kind of operation?
thanks guys!
if one wants it.. one will justify it.
November 23, 2010 at 2:29 pm
GSquared, if I had a quarter for every bad developer... I wouldn't be posting this. I have loaded gigabytes of tabular data into SQL Server in the past, and I know that it isn't an issue at all.
if one wants it.. one will justify it.
November 23, 2010 at 2:43 pm
I do not have a ready-made script for doing this kind of thing; I only have one for backups. But reading a file line by line is very simple.
foreach ($line in Get-Content "C:\Test.txt")
{
    # Execute each statement; the server and database names here are placeholders.
    Invoke-Sqlcmd -ServerInstance "MyServer" -Database "MyDatabase" -Query $line
}
But I think you should look at some of the PowerShell samples on the net to understand the requirements.
Maybe you should look here to understand more.
-Roy
November 23, 2010 at 2:49 pm
That does look like something worth looking at. I knew that VBScript had issues with large files; with x64 Windows things may be different now. Has anyone tried large files full of SQL (more than 2 GB)?
Wayne,
Imagine an insert statement. Now multiply that imagined line by 5 million or more. That's our SQL file (data file).
if one wants it.. one will justify it.