January 15, 2013 at 8:44 am
To keep this simple I'm going to include an example, though there are actually several scripts I need to do this to.
I have a manual process with, say, 30 scripts. Each one drops and creates all sorts of different SQL objects: functions, tables, indexes, etc.
I want to be able to execute these scripts as a chunk, like wrapping each script in a stored procedure. My problem is that each one needs the "GO" to separate the IF EXISTS ... DROP from the CREATE. Is there another way to wrap these scripts so I can make one call and the script runs?
Please let me know if you have any questions. For some reason I don't feel I'm explaining it very well.
For example, is there something I can wrap the script below in so that I can make one call?
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[udf_GetCleanUsername]') AND type IN (N'FN', N'IF', N'TF', N'FS', N'FT'))
    DROP FUNCTION [dbo].[udf_GetCleanUsername]
GO
CREATE FUNCTION [dbo].[udf_GetCleanUsername] (@Data VarChar(50))
RETURNS VarChar(50)
AS
BEGIN
    DECLARE @Temp VarChar(50)
    DECLARE @i Integer
    DECLARE @Output VarChar(50)

    SET @i = 1
    SET @Output = ''

    -- Walk the string one character at a time, keeping only digits and uppercase letters
    WHILE @i <= LEN(@Data)
    BEGIN
        SET @Temp = SUBSTRING(@Data, @i, 1)
        IF @Temp IN ('0','1','2','3','4','5','6','7','8','9') OR
           @Temp IN ('A','B','C','D','E','F','G','H','I','J','K','L','M','N','O','P','Q','R','S','T','U','V','W','X','Y','Z')
            SET @Output = @Output + @Temp
        SET @i = @i + 1
    END

    RETURN @Output
END
January 15, 2013 at 8:56 am
Why don't you just combine all the scripts into a single file? Then you just add GO where needed and it can all run at once.
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Modens splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
January 15, 2013 at 9:17 am
I suppose I could if I had to, but some of them are dependent on each other, so I'd have to modify the contents or add some sort of TRY/CATCH or error logic. These scripts are several thousand lines long and extremely complex. I was hoping I could just call each .sql file somehow and wrap the error logic around each call, something along those lines. I don't really have the time to modify the contents of these scripts, nor the testing cycles. I would have thought SQL 2008 would have a way to wrap scripts with GO in them.
January 15, 2013 at 9:19 am
I suppose I could use sqlcmd to call each .sql file and put them in sequential steps in a job. That would also let me track runtime and whether each step failed or succeeded.
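A sketch of what one such job step (or command line) might look like, assuming sqlcmd is available on the server; the server, database, and file names below are placeholders:

```cmd
sqlcmd -S MyServer -d MyDatabase -E -b -i "C:\Deploy\01_CreateObjects.sql" -o "C:\Deploy\Logs\01_CreateObjects.log"
```

The -b switch makes sqlcmd exit with a non-zero return code when a script raises an error, so an Agent CmdExec step (or a calling batch file) can detect the failure instead of silently moving on.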
January 15, 2013 at 10:01 am
GO is a batch separator, not a T-SQL command, so it really doesn't work that way. That's why you can change to something other than "GO" in your SSMS preferences, but you can't change something like "SELECT" there.
I'm used to combining scripts and building in the necessary error handling. Makes it a lot easier to manage the whole thing when I'm ready to migrate from my dev environment to my test environment, and then again to production, if it's one file. Can't accidentally run the third file second that way, or anything like that.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
January 15, 2013 at 10:28 am
SSIS package? Have a table with a list of .sql files to run, and a loop container to run each?
---------------------------------------------------------------
Mike Hahn - MCSomething someday:-)
Right way to ask for help!!
http://www.sqlservercentral.com/articles/Best+Practices/61537/
I post so I can see my avatar :hehe:
I want a personal webpage 😎
I want to win the lotto 😀
I want a gf like Tiffa :w00t: Oh wait I'm married!:-D
January 15, 2013 at 11:40 am
lol. CELKO. Easy there. Not only is this not my code, but there's darn near 200,000 lines of it across 30 scripts that have to run next week. I can't make changes to the scripts themselves without sending it through QA again. No can do. This code is a run-once set that will be trashed after completion. It's part of a process that will move data from one place to another; actually it is one small step in a much larger process, all of which I will be managing. When I'm done, the new db will be created and loaded with 4.7 terabytes of data. To be honest, I just grabbed the first thing I saw in the first file I clicked on and put it in the ticket. I've never actually seen that function before. As I mentioned before, I don't have the time nor testing cycles to rewrite what was written. However, I do find it compelling that you would make such assumptions and feel it is your place in this world to spew that nonsense on others, as if we are privileged to hear that drivel. Perhaps I should have made myself clearer in my original post and then you wouldn't have made so many illogical assumptions. I don't know. What I do know is someone needs a girlfriend, or woman rather. Hit me up. I'm buying, because you, my friend, are treading a very fine line.
Please think about that.
January 15, 2013 at 11:45 am
CELKO
You do not know how to program in SQL to the point you are danger to your employer and the public. Please think about that.
Joe, you do not know how to read and interpret a question. The OP was not asking how to improve the function, nor was there anything in there remotely asking for your opinion. Do you really think the use of a UDF qualifies someone as a "danger" to the public at large? And if your nostalgia is so easily coerced by simple things like @i, why don't you find a FORTRAN or COBOL forum to annoy, where you can re-live your presently irrelevant glory days?
Greg
_________________________________________________________________________________________________
The glass is at one half capacity: nothing more, nothing less.
January 15, 2013 at 12:45 pm
I agree with GSquared - combine things logically into groups. I've had deployment scripts of several thousand lines that I've run to release things to production before. When something gets really big, I have a master SQL file where I reference the other SQL files to run in order. When deployment time arrives, I simply follow the instructions I've written for myself. If I've practiced properly (run through it in order in test) then there shouldn't be any problems.
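One way to sketch such a master file is with sqlcmd's :r directive, which pulls in another script when the file is run through sqlcmd (or SSMS in SQLCMD Mode); the file names and paths below are placeholders:

```sql
-- Deploy_Master.sql : run via sqlcmd, or in SSMS with SQLCMD Mode enabled.
-- Each referenced file runs in order, GO separators and all.
:r C:\Deploy\01_Functions.sql
:r C:\Deploy\02_Tables.sql
:r C:\Deploy\03_LoadData.sql
```

This keeps the individual scripts untouched while still giving you a single file to execute at deployment time.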
Good luck with your big deployment.
January 15, 2013 at 1:10 pm
I'd recommend using SSIS as a previous poster suggested, allowing you to run each SQL file individually.
I also recommend someone stop pissing in Celko's cheerios. Damn, that was harsh.
January 15, 2013 at 1:18 pm
Agreed. Not everyone is obsessed with ISO standards.
January 15, 2013 at 1:39 pm
Haha... I made the effort to read some of Mr Celko's previous posts a while ago... Lol, he really lets loose sometimes, but you know, he does make some valid points. It's just that his style is so damn rude!!
Take it with a shovel of salt. 😀
Anyway, as GSquared said, GO is not a TSQL command, it's just a word that SSMS uses to know that you want to send a bunch of code in one batch.
You say you have 30 scripts and some are dependent on others, so you must also know the order they must run in, yes?
If so, then why not:
a) Open a new .sql file, copy and paste the contents of the first script, add a GO, then paste in the second script, and so on.
b) Schedule a SQL Agent job and place each script in its own step.
c) Create an SSIS package with a series of Execute T-SQL components.
d) Use a batch file.
Thanks.
---------------------------------------------------------
It takes a minimal capacity for rational thought to see that the corporate 'free press' is a structurally irrational and biased, and extremely violent, system of elite propaganda.
David Edwards - Media Lens
Society has varying and conflicting interests; what is called objectivity is the disguise of one of these interests - that of neutrality. But neutrality is a fiction in an unneutral world. There are victims, there are executioners, and there are bystanders... and the 'objectivity' of the bystander calls for inaction while other heads fall.
Howard Zinn
January 15, 2013 at 2:17 pm
Yeah, it's pretty safe to ignore the rabid weasel Joe in this instance. Nothing he wrote here actually has any value to the discussion. (Sometimes it does, and it's just poorly communicated. But not this time. Waste of magnetic dots on a platter this time.)
If you have 30 files to run, I don't see copying those into a single file as being a realistic option. Too cumbersome to copy-paste.
A batch file that runs the files in order, using sqlcmd, is probably the best bet in that case.
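A minimal sketch of such a batch file, assuming sqlcmd is installed; the server, database, and script names are placeholders, and the -b switch makes sqlcmd return a non-zero exit code on error so the batch can stop at the first failure:

```cmd
@echo off
REM Placeholder connection details -- adjust to your environment.
set SQLOPTS=-S MyServer -d MyDatabase -E -b

sqlcmd %SQLOPTS% -i "C:\Deploy\01_Functions.sql"
if errorlevel 1 goto failed
sqlcmd %SQLOPTS% -i "C:\Deploy\02_Tables.sql"
if errorlevel 1 goto failed
sqlcmd %SQLOPTS% -i "C:\Deploy\03_LoadData.sql"
if errorlevel 1 goto failed

echo All scripts completed successfully.
exit /b 0

:failed
echo Deployment stopped: the last script returned an error.
exit /b 1
```

Because each file runs as its own sqlcmd invocation, a failure in file three stops the run before file four, which matches the "don't get too far if something fails" requirement described later in the thread.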
Are you in a position where you can take over a test server, run the scripts (by batch file or otherwise), and do a before/after check? If so, set up a test server just like the server you will eventually run the scripts on, back everything up, and run the scripts. Then you can fix/tweak them if they throw errors, then restore the test server and run them again.
Rinse ... repeat ...
Once they run without error, then compare what you end up with against your expected end result, and you'll know pretty well what to expect from your go-live.
If you can't test beforehand in that manner, just assume the whole thing will blow up in production, and go from there. Have triage and emergency crews on hand, expect problems, and make sure you have restorable backups before you run it. Plan for about 10 times the downtime that would be reasonable. Sometimes you just have to do a "damn the torpedoes, full speed ahead" type of thing, and make sure your damage-control team is on the ball. But avoid that if you can at all do so, and do a dry run on a test server.
January 15, 2013 at 3:25 pm
The "sqlcmd" utility does recognize "GO", as well as many other things, including invoking OS commands.
I think you should be able to use "sqlcmd" to run the scripts.
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
January 16, 2013 at 7:48 am
1) xp_cmdshell is disabled on all prod SQL instances due to security concerns.
2) I can't add them all to one script because it's dropping and creating objects as well as loading data. Because the scripts are laden with "GO" throughout, if the portion before a GO fails, the portion after it will run regardless. So, for example: say the second batch is an insert that fails, and say 400 batches later another insert loads a table that depends on the table whose insert failed. That insert would load 0 records, and so on. I can't have that.
So the separation of the scripts is a kind of failsafe to ensure the process doesn't get too far if something fails. Again, I didn't write this; just stating the facts. So combining it all into one file is not an option at this point. If file three fails, I need to rerun it until it succeeds before I run file four. Each script is written so that it's self-healing in a sense: if a script fails halfway, I can rerun the whole script and everything is dropped and recreated. Which is actually part of the problem; it takes around 18 hours to run all the files, and I have a short window of time to get it done once we do the final push.
The rub is you can't drop and create certain objects without a GO in between, as my original example shows. You can't drop a function and then create a function without running them in separate batches, unless I'm missing something. Is there some other method besides using GO? To be honest, it shouldn't blindly drop and recreate; it should check for existence and create only if the object doesn't exist.
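Strictly speaking, GO is not the only way to get separate batches: executing a dynamic SQL string with EXEC runs that string as its own batch, so a drop/create pair can live in one GO-free script. A minimal sketch, reusing the function name from the example earlier in the thread (the body here is a stand-in, not the real function):

```sql
-- Drop and recreate without GO: each EXEC string is its own batch,
-- which satisfies CREATE FUNCTION's "first statement in the batch" rule.
IF OBJECT_ID(N'[dbo].[udf_GetCleanUsername]', N'FN') IS NOT NULL
    EXEC (N'DROP FUNCTION [dbo].[udf_GetCleanUsername]');

EXEC (N'CREATE FUNCTION [dbo].[udf_GetCleanUsername] (@Data VarChar(50))
RETURNS VarChar(50)
AS
BEGIN
    RETURN @Data  -- placeholder body for the sketch
END');
```

The trade-off is that the object code becomes a quoted string (doubling embedded quotes, losing syntax highlighting), which is why this is usually reserved for small drop/create pairs rather than thousand-line scripts.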
There should be a drop-and-create-objects script, then a data load script.
Keep in mind the current process works. It's a bit manual, but it works, so it's not imperative that I find a solution; it's just been on my mind lately. So worst case, I kick one off, then 1.32 hours later, when I expect it's done, I manually check. If all is good, I kick off the next script. I was just hoping to automate it and have it send me success or failure emails.
Actually, at the very least I could have it send me a completion email at the end of each script. I think I'll add that now. Just thinking out loud. Again, I wouldn't waste too much mental energy on this because I do have a working method.
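A completion email at the end of each script could be a call to msdb.dbo.sp_send_dbmail, assuming Database Mail is configured on the instance; the profile name and recipient address below are placeholders:

```sql
-- sp_send_dbmail arguments must be variables or constants, hence @msg.
DECLARE @msg NVARCHAR(255) =
    N'Script completed at ' + CONVERT(NVARCHAR(30), SYSDATETIME(), 120);

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DeployMailProfile',   -- placeholder Database Mail profile
    @recipients   = 'dba@example.com',     -- placeholder address
    @subject      = 'Deployment script 03 finished',
    @body         = @msg;
```

Appended as the last statement of each script, this would give a timestamped trail without touching the scripts' existing logic.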