April 24, 2014 at 2:33 am
I have about 1,200 SQL files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run any of these scripts, I log the file name into a log table in my SQL Server, including the execution time. Since running the 1,000+ remaining files manually takes a lot of time, I want to automate running these files through a batch file, but I also want to filter out the files that have already been run.
My file list looks like follows.
InsertToOrderTypes.sql
UpdateClientAddress.sql
DeleteDuplicateOrders.sql
InsertToEmailAddress.sql
ConsolidateBrokerData.sql
UpdateInventory.sql
EliminateInvalidOfficeLocations.sql
My log table in the database looks like this.
select * from sqlfileexecutionlog
FileName                  RunTime                 Result
------------------------- ----------------------- -------
DeleteDuplicateOrders.sql 03/12/2014 14:23:45:091 Success
UpdateInventory.sql       04/06/2014 08:44:17:176 Success
Now I want to create a batch file to run the remaining files from my directory against my SQL Server. I also want to wrap each of these SQL file executions in a transaction and log success/failure along with the runtime and file name into the sqlfileexecutionlog table. As I add new SQL files into this directory, I should be able to run the same batch file and execute only the SQL files that have not been run.
It would be great if someone could give me inputs or sample code to achieve this via the regular command shell as well as PowerShell.
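As a starting point, the "skip the files already logged" part can be sketched in a POSIX shell. This is only the filtering logic: `executed.txt` stands in for an export of the file names from sqlfileexecutionlog (one name per line), the file names are the ones listed in the post, and the actual sqlcmd call is left as a comment.

```shell
# Sketch of the filtering step, assuming the already-run file names from
# sqlfileexecutionlog have been exported to executed.txt (one per line).
cd "$(mktemp -d)"
printf 'DeleteDuplicateOrders.sql\nUpdateInventory.sql\n' > executed.txt
touch InsertToOrderTypes.sql UpdateClientAddress.sql \
      DeleteDuplicateOrders.sql UpdateInventory.sql

: > torun.txt
for f in *.sql; do
    # -x: match the whole line, -F: literal string, -q: quiet
    grep -qxF "$f" executed.txt && continue   # already logged, skip it
    echo "$f" >> torun.txt
    # the real execution step would be something like:
    #   sqlcmd -b -i "$f" -S server -d db -E
done
cat torun.txt
```

On Windows the same check could be done with findstr against an exported list, or in PowerShell by querying the log table directly before each run.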
April 24, 2014 at 4:32 am
SQLCurious (4/24/2014)
I have about 1,200 SQL files in one of my folders. Almost all of these files do data inserts and updates, so they should be run only once. As and when required, I have manually run around 150 of them already. Whenever I run any of these scripts, I log the file name into a log table in my SQL Server, including the execution time. Since running the 1,000+ remaining files manually takes a lot of time, I want to automate running these files through a batch file, but I also want to filter out the files that have already been run.
I am just wondering whether this is the right approach; it looks to me both error prone and very time consuming. Have you looked into implementing this in an SSIS package/framework? If the content of the files is loaded into a "command" table, a single package could run all the scripts and utilize the logging framework in SSIS.
😎
April 24, 2014 at 10:20 am
This should work if you run it from the folder where your files are, quick and dirty:
for %%R in (*.sql) do sqlcmd -i "%%R" -S {server_name} -U {user} -P {pwd} -d {db_name}
This won't stop on any errors, but you can add some handling; sqlcmd's -b switch makes it exit with a non-zero errorlevel when a script fails. (If you type the command directly at the prompt instead of putting it in a batch file, use single % signs: %R.)
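The "add some handling" part could look like the loop below, sketched in a POSIX shell. Here `run_sql` is a hypothetical stand-in for the real `sqlcmd -b -i "%%R" ...` call, and b.sql is forced to fail just so the failure branch is exercised; the loop logs an outcome per file and stops at the first error.

```shell
cd "$(mktemp -d)"
touch a.sql b.sql c.sql

# Hypothetical stand-in for the sqlcmd call; pretend b.sql fails
# so the failure branch runs.
run_sql() { [ "$1" != "b.sql" ]; }

: > run.log
for f in *.sql; do
    if run_sql "$f"; then
        echo "$f Success" >> run.log
    else
        echo "$f Failure" >> run.log
        break                      # stop on the first error
    fi
done
cat run.log
```

In the real batch file the equivalent check is `if errorlevel 1 goto :failed` after each sqlcmd call, which only works when sqlcmd is given -b.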
April 25, 2014 at 5:22 am
Well, try this one. I am not sure exactly what you want to do, but I'll do my best. Create a DOS batch file using the following batch script:
@echo off
rem Builds a single SQLCMD-mode script that includes every .sql file
rem in the current folder. Run the output with sqlcmd afterwards.
echo :ON ERROR Exit>"All in one Script.sql"
echo PRINT 'Start $Sec$: ' + CONVERT(VARCHAR,GETDATE(),108)>>"All in one Script.sql"
echo :setvar path "%CD%">>"All in one Script.sql"
for %%f in (*.sql) do if /i not "%%~nxf"=="All in one Script.sql" call :ACTION "%%~nxf"
goto END
:ACTION
echo.>>"All in one Script.sql"
echo :r "$(path)\%~1">>"All in one Script.sql"
echo GO>>"All in one Script.sql"
echo PRINT '[%~1] Completed Successfully - $Sec$: ' + CONVERT(VARCHAR,GETDATE(),108)>>"All in one Script.sql"
echo GO>>"All in one Script.sql"
echo.>>"All in one Script.sql"
:END
The above code loops through all the SQL files; if you want to change what happens for each file, edit the :ACTION block. Just paste the batch file inside the folder and run it. It only prepares a SQLCMD-mode script file; you need to verify it before you run it.
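For comparison, the same generate-one-big-script idea can be sketched in a POSIX shell. The output name `AllInOne.sqlcmd` and the two file names are illustrative assumptions (and a `.sqlcmd` extension keeps the output out of the `*.sql` glob); `:r`, `:setvar`, and `:ON ERROR` are SQLCMD-mode commands, so the result would still be run with sqlcmd.

```shell
cd "$(mktemp -d)"
touch InsertToOrderTypes.sql UpdateClientAddress.sql

# Build a SQLCMD-mode wrapper that includes every .sql file in the folder.
{
    echo ':ON ERROR Exit'
    echo ":setvar path \"$PWD\""
    for f in *.sql; do
        # $(path) is left literal here: sqlcmd expands it at run time
        printf ':r $(path)/%s\nGO\n' "$f"
    done
} > AllInOne.sqlcmd
cat AllInOne.sqlcmd
# would then be run with something like: sqlcmd -b -i AllInOne.sqlcmd
```

Note the forward slashes in the include paths are for the demo only; on Windows the generated paths would use backslashes, as in the batch version above.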
Every rule in a world of bits and bytes can be bent or eventually be broken
MyBlog About Common dialog control
A Visualizer for viewing SqlCommand object script
April 28, 2014 at 10:47 am
Thanks for all your inputs and suggestions. I will work towards implementing this in SSIS, which seems to offer better control. I appreciate everyone's time and effort in providing valuable suggestions.
April 28, 2014 at 11:41 am
SQLCurious (4/28/2014)
Thanks for all your inputs and suggestions. I will work towards implementing this in SSIS which seems to have better control. I appreciate everyone's time and effort in providing valuable suggestions.
Let us know if you need help with the SSIS implementation, there are quite a few samples available.
😎
April 28, 2014 at 12:15 pm
It would be great if you could provide me some :). I am also looking for a framework that can be generally used to log the execution results (time taken, success, failure, row counts, etc.) for each of the packages executed. Once it is set up, I would like to reuse it for any additional packages I may add in the future. I would greatly appreciate it if you could provide such examples.
April 28, 2014 at 12:42 pm
I'll dig into it and let you know. I'm certain I have something, and probably others have something similar; let's hope they share :-P
😎
April 29, 2014 at 7:59 pm
Why are you running code from files instead of having the code in stored procedures and then executing the stored procedures? Is there an advantage to files?
May 5, 2014 at 5:20 pm
These are mostly for one-time deployment. I wouldn't be reusing them at all.