January 26, 2010 at 11:30 am
I have a large script file, around 200MB in size. SQL Server tells me there is not enough storage to complete the operation if I try to open it in Management Studio. Does anybody have a good solution to run this, other than opening the .SQL file and clipping out blocks of SQL statements at a time to run?
January 26, 2010 at 11:48 am
Try using sqlcmd or the earlier command-line tools (osql).
http://msdn.microsoft.com/en-us/library/ms180944.aspx
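For example, something along these lines should run the file without SSMS ever having to load it (a hypothetical invocation - the server, database, and file names here are placeholders):

```shell
# Run the whole script straight from the command line.
# -S server, -d database, -E Windows authentication,
# -i input script file, -o file to capture the output
sqlcmd -S MyServer -d MyDatabase -E -i "C:\scripts\big_script.sql" -o "C:\scripts\big_script_out.txt"
```

sqlcmd reads the file batch by batch, so it also copes with GO separators that SSMS would otherwise have to hold in memory all at once.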
Andrew
January 26, 2010 at 11:49 am
Is this a bulk load script, or is the script file made up of various SQL statements and commands?
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 26, 2010 at 12:06 pm
CirquedeSQLeil (1/26/2010)
Is this a bulk load script, or is the script file made up of various SQL statements and commands?
Various SQL statements.
January 26, 2010 at 12:14 pm
With the contents being various different sql commands and statements, I would opt for breaking the file out. 200MB seems somewhat large for a script file - unless it is a data load / update script.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 26, 2010 at 8:14 pm
Just curious, Jason. Why would data loads be so different as to require such a large script? 200 MB seems high for such a thing. And, no... not challenging you on this... I'm just curious as to what kind of data loads would require so much code.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 26, 2010 at 8:56 pm
Jeff Moden (1/26/2010)
Just curious, Jason. Why would data loads be so different as to require such a large script? 200 MB seems high for such a thing. And, no... not challenging you on this... I'm just curious as to what kind of data loads would require so much code.
I've seen the occasional flat file that was that large. Not T-SQL per se, just a flat file full of data that needed to be imported. In any scenario, it is not fun and not something that I would want to do on a regular basis.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 26, 2010 at 9:33 pm
Ah... got it. Thanks.
Heh... it's odd what someone gets used to... I love to automate "heavy metal ETL". A lot of the work I do involves Call Detail Records and "click" records for large companies like Expedia.com, and most of them are deemed "impossible to import in T-SQL or SSIS" which, of course, I don't listen to :-P. I can't get enough of it.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 26, 2010 at 9:41 pm
The point of the question was to determine which answer to provide. If it's a data load, then an ETL-type process. If it was just a bunch of SQL scripts - well, you saw my answer.
And yes, it is amazing what one gets used to doing. I wouldn't tell somebody that this kind of load is impossible - big, yes, but that's a matter of perspective. We run a bunch of smaller ETL-type jobs throughout the day (<= 100MB each).
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 26, 2010 at 11:06 pm
Shifting gears, I don't believe the OP was talking about data files, though... It sounds like the OP has a T-SQL script that's 200MB long and is trying to execute it.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 27, 2010 at 12:12 am
That's what I understand to be the current issue as well. Do you have any other ideas to run that script than to break it out into multiple scripts?
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 27, 2010 at 11:06 am
No... heh... I've never seen a single script so large. As a sidebar, you've just got to ask how someone tested it, if it can even be executed as a single script.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 27, 2010 at 11:10 am
Jeff Moden (1/27/2010)
No... heh... I've never seen a single script so large. As a sidebar, you've just got to ask how someone tested it, if it can even be executed as a single script.
Good point. That would be interesting.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
April 19, 2010 at 11:30 am
I do happen to have a data script that is 1.2 GB; no file editor can open it. It was generated using Red Gate SQL Data Compare. The problem is that I need to tweak it a little bit - for example, I need to specify the name of the database where the commands should run - but no editor can handle this monster. Any ideas? And no, the Red Gate tool does not let you create several script files; it only allows you to split the script into transactions larger than a given size, but they all remain in the same file...
April 19, 2010 at 11:58 am
You could use the method on the attached link, provided your script file does not contain any batch separators (GO).
Execute Script from a File
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=99084
However, my guess is that with a script that size it will contain batch separators.
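If the script does contain batch separators, one workaround is to split it into one file per batch before running anything. A rough sketch (it assumes GO always appears alone on its own line, and the file names are made up):

```shell
# Split big_script.sql into numbered batch files, breaking on lines
# that contain only GO (case-insensitive, optional trailing whitespace).
awk 'BEGIN { n = 0 }
     /^[Gg][Oo][[:space:]]*$/ { n++; next }   # GO line: start a new output file
     { print > sprintf("batch_%04d.sql", n) } # anything else: append to current batch
' big_script.sql
```

Each resulting batch_NNNN.sql can then be run individually (or fed to sqlcmd, which also understands GO natively if you'd rather run the whole file in one go).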