October 23, 2005 at 1:35 am
Hi, I plan to create a program that lets users update the database by double-clicking it to run batch scripts, instead of executing every script through Query Analyzer. At first I tried to use ADODB's Execute method to run the scripts, but I get the error "Incorrect syntax near 'go'" because the method does not support the 'go' command. After searching online for about 4 hours I still cannot find a good solution to my problem, so I am posting to the forum to see whether anyone can help me solve it.
Actually, I have found some possible methods to solve this problem, but I am not confident they are the best solutions. The first method is to write a function that removes the 'go' commands before I execute the script. That looks very dangerous, because removing the 'go' commands may generate other errors. The second method is to use the isql utility, but I do not have much experience with it.
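Here is a rough sketch of the first method in Python (the sample script and names are made up; GO is treated as a separator only when it sits alone on a line, which is how the command line tools interpret it, so the dangerous cases should be avoided):

```python
import re

def split_batches(script):
    # GO is only a batch separator when it appears alone on a line,
    # so split on whole lines rather than on every occurrence of "go"
    parts = re.split(r"(?im)^\s*GO\s*$", script)
    return [p.strip() for p in parts if p.strip()]

script = """\
CREATE TABLE #t (id INT)
GO
INSERT INTO #t VALUES (1)
INSERT INTO #t VALUES (2)
GO
DROP TABLE #t
"""

batches = split_batches(script)
# each batch would then be passed to the ADODB Execute method separately
```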
Thank you.
October 23, 2005 at 1:05 pm
I just tried to reply to this and everything disappeared so here it is again!
I have tried a couple of methods using the ADODB Command Execute method, and found that the simplest is to use one Execute step to create a stored procedure containing your batch updates, and then execute the newly created stored procedure.
You can store the whole stored procedure on the server and then simply execute that, rather than recreating it each time, unless you have to.
The other methods are to use a file or a recordset to hold each step of the batch and run them individually. I do not like the file method much, so I will just say: use a unique string like |go| instead of just go, then use the Instr() and Mid() functions to identify each step and run it.
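As a sketch, that Instr()/Mid() loop translates to something like this in Python (the |go| delimiter and the sample batch string are just illustrations):

```python
def next_step(batch, start, delim="|go|"):
    # Instr(): find the next delimiter from position `start`
    pos = batch.find(delim, start)
    if pos == -1:
        # Mid() to the end of the string: this is the last step
        return batch[start:].strip(), len(batch)
    # Mid(): the text between `start` and the delimiter
    return batch[start:pos].strip(), pos + len(delim)

batch = "UPDATE t SET x = 1|go|DELETE FROM log|go|INSERT INTO audit VALUES (1)"

steps, i = [], 0
while i < len(batch):
    step, i = next_step(batch, i)
    if step:
        steps.append(step)   # each step would be run with an Execute call
```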
The method I use most often (apart from the one at the top) is to create a table containing each step of the batch: a field for the step title, one for the SQL itself, and then a few others if required (e.g. Has Parameters, plus fields to store those or flags to prompt users for input). Also store a field for the step sequence so that the steps run in order. Lastly, I have a field that flags whether to stop all processing of the batch, and what to do if there is an error, rather than running each step blindly.
I would then open the recordset in step order and process an Execute for each step. If there is an error, I can be clear about which step caused it and carry out the corrective action in the application. I found processing multi-step batches without the GO statement (which ADODB does not recognise) to be problematic when it came to identifying errors.
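Here is a minimal sketch of that table-driven idea, using Python's built-in sqlite3 as a stand-in for SQL Server and ADODB (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE BatchStep (
        StepSeq     INTEGER,   -- run order
        StepTitle   TEXT,      -- used when reporting errors
        StepSQL     TEXT,      -- the statement itself
        StopOnError INTEGER    -- 1 = abort the rest of the batch on failure
    );
    CREATE TABLE Target (id INTEGER, val TEXT);
""")
conn.executemany(
    "INSERT INTO BatchStep VALUES (?, ?, ?, ?)",
    [(1, "load row",   "INSERT INTO Target VALUES (1, 'a')",       0),
     (2, "bad step",   "UPDATE NoSuchTable SET val = 'x'",         0),
     (3, "update row", "UPDATE Target SET val = 'b' WHERE id = 1", 0)])

errors = []
steps = conn.execute("SELECT StepSeq, StepTitle, StepSQL, StopOnError "
                     "FROM BatchStep ORDER BY StepSeq").fetchall()
for seq, title, sql, stop in steps:
    try:
        conn.execute(sql)
    except sqlite3.Error as exc:
        errors.append((title, str(exc)))   # we know exactly which step failed
        if stop:
            break
```

Because each step runs on its own, the error report names the failing step instead of pointing somewhere inside one huge batch.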
Hope that helps
David
October 23, 2005 at 6:43 pm
Dear David,
Thank you for your reply. If I am not mistaken, the maximum varchar length in SQL Server 2000 is 8000, and the scripts I have may exceed this limit. How do you cater for the situation when a script is more than 8000 characters?
Thank you.
October 24, 2005 at 12:51 am
If you are referring to storing the scripts in a table as above, use a text column (Memo in MS Access) for the script.
Text: variable-length non-Unicode data with a maximum length of 2^31 - 1 (2,147,483,647) characters. That should be big enough.
If you are reading the column containing the script into a procedure that will run it with the EXECUTE command in SQL (see Books Online), you would have to select the column with the script, split it into reasonable-size chunks (4000 characters or less) in varchar variables, and then run the EXECUTE, such as:
EXEC (@Varstring1 + @Varstring2)
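To illustrate just the chunking arithmetic, here is a sketch in Python (the 4000 limit mirrors the varchar(4000) variables; the sample statement is made up):

```python
def chunk_script(script, size=4000):
    # one varchar(4000) variable's worth per chunk; EXEC reassembles
    # them in order, so a boundary falling mid-word does not matter
    return [script[i:i + size] for i in range(0, len(script), size)]

script = "UPDATE t SET col = 'x' WHERE id = 1 " * 300   # well over 8000 chars
chunks = chunk_script(script)
# each chunk would be assigned to @Varstring1, @Varstring2, ... before
# running EXEC (@Varstring1 + @Varstring2 + ...)
```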
October 24, 2005 at 1:03 am
My team uses the osql tool to solve this problem.
OSQL is a command line tool which allows you to execute, among other things, T-SQL scripts.
The parameters included in my sample are:
osql -S localhost -d DevDBName -E -b -i "ClearData.sql"
Just use your favourite programming language to shell out and call this command. It will execute as if being run through Query Analyzer, avoiding any problems associated with ADO.
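As a sketch, shelling out from Python looks something like this (server and file names are the ones from the sample above; any language with a shell-out facility works the same way):

```python
import subprocess

def osql_command(server, database, script_path):
    # osql understands GO natively, so the script runs exactly as it
    # would in Query Analyzer
    return ["osql",
            "-S", server,       # server to connect to
            "-d", database,     # database name
            "-E",               # trusted (Windows) authentication
            "-b",               # exit with an error code if a batch fails
            "-i", script_path]  # input script file

def run_script(server, database, script_path):
    # check=True turns a non-zero osql exit code into an exception the
    # calling program can react to
    return subprocess.run(osql_command(server, database, script_path),
                          check=True)

cmd = osql_command("localhost", "DevDBName", "ClearData.sql")
```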
October 24, 2005 at 12:10 pm
"go" is not part of T-SQL. It is an instruction to command line tools (such as QA or osql) indicating "this batch is complete, go execute it."
However, you should code a stored procedure to do the job of your script rather than passing an 8000 character script to SQL server. This will work *much* faster because the server will precompile the script so it doesn't have to syntax check as large a script and will build an execution plan for it at compile time (when the procedure is defined) rather than at execution time.
It is also much more secure as it is harder to pass parameters that hijack the code you're getting the server to compile preexecution. Allowing users to pass general T-SQL batches to a server is a security disaster area.
Simply set up a string that looks like "exec sp_my_procedure_name parameters" and call the database object's "Execute" method on that string. Look up "create procedure" for help on setting up the procedures.
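As a sketch of keeping that call parameterised rather than concatenating user input into the string (the function and procedure names here are made up; with ADODB you would use a Command object with CommandType set to adCmdStoredProc and append Parameter objects):

```python
def exec_call(proc_name, *params):
    # build "EXEC proc ?, ?" with one placeholder per parameter; the
    # actual values travel separately, so they can never be parsed as SQL
    placeholders = ", ".join("?" for _ in params)
    return f"EXEC {proc_name} {placeholders}".strip(), list(params)

sql, args = exec_call("sp_my_procedure_name", 12345, "2005-10-24")
# sql  -> "EXEC sp_my_procedure_name ?, ?"
# args -> [12345, "2005-10-24"]
```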
October 24, 2005 at 12:52 pm
[Please accept my apologies for being blunt.]
Reading through this again, it seems like you have major problems with the way you're using your database. I can only think of one scenario in which you need to pass in T-SQL batches that can be up to 8000 characters long and that is creating or altering the stored procedures that are used as production job steps.
If you are processing existing data, you should be calling user stored procedures that do that job. If you're not using stored procedures, you're using your server insecurely and inefficiently.
If you are loading new data, you should either use "bulk insert" or "bcp" or "DTS". Passing in hundreds of statements that look like "insert into Table values (x,y,z)" etc. is a very inefficient use of everybody's time.
What's to stop your users making a mistake and sending in something like "delete from Employees" instead of "delete from Employees where EmployeeId=12345"? It happens.
October 28, 2005 at 7:16 am
Thank you for the help.