November 30, 2009 at 7:59 am
Hello All,
I am having a problem in implementing the following scenario:
I have to run scripts against my database using an SSIS package and a SQL Agent job. The package is deployed on the server, and a SQL Agent job is defined which runs every 5 minutes.
I want to create dynamic log output files when a particular script is run against my database, and if the script fails it needs to send an email with the log file attached.
What I have implemented is:
created a Foreach Loop Container -> pointed it to pick up *.sql files, and then defined a Move File System Task to move each file to a different folder, to mark it as run.
This is running fine, but I want to enhance it some more so that I can add dynamic log file generation if the script fails and then email that log file to the user.
Any references or ideas are appreciated!
Thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
November 30, 2009 at 10:14 am
Hello
If the script is on the "Control Flow" design screen, then you could add an Execute SQL Task that uses sp_send_dbmail, with attention paid to the file attachment option. You would need to build a semicolon-separated list of the file names you wanted to attach (that is the format sp_send_dbmail's @file_attachments parameter expects). It would work; it is just a bit of work building the file list (maybe in a Script Task that does some Directory work and outputs the file list through a package variable).
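For the file-list part, here is a rough sketch of such a Script Task. The variable names User::LogFolder and User::AttachmentList are only examples, not anything already in your package; you would pass the resulting variable into sp_send_dbmail's @file_attachments parameter:
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Collect the log files and build the semicolon-delimited list
        ' that sp_send_dbmail expects for @file_attachments.
        Dim logFolder As String = Dts.Variables("LogFolder").Value.ToString()
        Dim files As String() = Directory.GetFiles(logFolder, "*.log")
        Dts.Variables("AttachmentList").Value = String.Join(";", files)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class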
HTH
JB
SQL_Quest-825434 (11/30/2009)
Hello All,...
This is running fine, but I want to enhance it some more so that I can add dynamic log file generation if the script fails and then email that log file to the user.
...
December 1, 2009 at 11:12 am
Hi,
Can you give an example of this, or any reference? I have tried implementing logging, but it does not pick up the name of the file that ran against the database, and somehow it creates 2 log files for each failure!
Any help is appreciated!
thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 1, 2009 at 11:39 am
How are you executing the SQL scripts against the DB?
December 1, 2009 at 1:45 pm
John Rowan (12/1/2009)
How are you executing the SQL scripts against the DB?
Here is what is set up now, and it is working fine.
There is a folder for each DB server, and under that are the DB folders.
The script file is dropped into the directory with the database name against which the script will run, and there is a SQL Agent job that will pick it up and run it. The file has a .sql extension.
The job kicks off every 20 minutes and checks the directory for valid scripts and runs them; once they are complete, it moves the script files to a separate directory.
** I am enhancing this process to include detailed logging for the failed scripts and email the log files to the concerned group.
Thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 2, 2009 at 3:42 pm
What I want to enhance is:
If a user runs 3 files, e.g. update.sql, delete.sql, count.sql,
the log files should be update[datetimestamp].txt, delete[datetimestamp].txt, count[datetimestamp].txt.
I have figured out how to append the [datetimestamp] using an expression, but I need some help with the filename part.
Also, is it possible to send the log files via email? :w00t:
Thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 2, 2009 at 3:49 pm
Sorry for the delay on this, it kind of fell off my radar. The intent of my last question was to learn which control/data flow tasks you've used to do the script execution. Knowing how you've set your package up will help determine where we can plug additional tasks in to do the logging/email work that you want to add.
December 2, 2009 at 6:59 pm
John Rowan (12/2/2009)
Sorry for the delay on this, it kind of fell off my radar. The intent of my last question was to learn which control/data flow tasks you've used to do the script execution. Knowing how you've set your package up will help determine where we can plug additional tasks in to do the logging/email work that you want to add.
Hi,
In my control flow, I have
ForEach Loop container -> Enumerator is -> Foreach File Enumerator
Enumerator configuration:
Folder: G:\SSISTest\ScriptRunner\Files
Files: *.sql (fully qualified)
variable mapping: User::Filename --> Index: 0
Then I have a script task:
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Read the script file picked up by the Foreach loop into a string.
        Dim filename As String = Dts.Variables("filename").Value.ToString()
        Dim objReader As New StreamReader(filename)
        Dim sLine As String = ""
        Dim sql As String = ""
        Do
            sLine = objReader.ReadLine()
            If Not sLine Is Nothing Then
                ' Keep the line breaks; without them a line comment (--)
                ' in the script would swallow everything that follows it.
                sql = sql & sLine & vbCrLf
            End If
        Loop Until sLine Is Nothing
        objReader.Close()
        ' Hand the script text to the Execute SQL Task via a package variable.
        Dts.Variables("filesql").Value = sql
        Dts.TaskResult = Dts.Results.Success
    End Sub
    Protected Overrides Sub Finalize()
        MyBase.Finalize()
    End Sub
End Class
ReadOnlyVariables: filename
ReadWriteVariables: filesql
Then I have an Execute SQL Task:
ResultSet: None
Connection type: OLE DB
Connection: Source_ServerName --> points to the local instance
Source type: Variable
Source variable: User::filesql
Then I have a Move File System Task:
Destination Connection:
Destination variable: User::Destination --> G:\SSISTest\ScriptRunner\RunScripts
Operation: MoveFile
Source Connection:
Source variable: User::filename --> I have given a value = test
Then I have defined a Flat File connection manager for logging.
I have used an expression to append the datetime stamp:
"G:\\SSISTest\\ScriptRunner\\Failed\\" + @[User::filename] +
(DT_STR,4,1252)DATEPART( "yyyy" , @[System::StartTime] ) +
RIGHT("0" + (DT_STR,4,1252)DATEPART( "mm" , @[System::StartTime] ), 2) +
RIGHT("0" + (DT_STR,4,1252)DATEPART( "dd" , @[System::StartTime] ), 2) +
RIGHT("0" + (DT_STR,4,1252)DATEPART( "hh" , @[System::StartTime] ), 2) +
RIGHT("0" + (DT_STR,4,1252)DATEPART( "mi" , @[System::StartTime] ), 2) +
RIGHT("0" + (DT_STR,4,1252)DATEPART( "ss" , @[System::StartTime] ), 2) +
".txt"
but this will output the log file as
G:\SSISTest\ScriptRunner\Failed\test20091202150922.txt
This outputs 2 log files with different timestamps! That's strange, and it's not what I want. I want it as follows:
e.g. if the scripts to be run are update.sql and delete.sql,
the output should be update20091202150922.txt and delete20091202150922.txt,
and these log files should be emailed to the respective group.
I know this is a bit tricky so I need some help if possible.
I hope I am clear about what I want to achieve. I have attached an image of my package as well.
Edit ** There is no Data Flow Task; only the Control Flow is used!
Thanks for all your help,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 3, 2009 at 2:46 am
First off, thanks for the detailed post. It is most helpful to know where you are at so we can make the appropriate adjustments and get you moving in the right direction. With that said, I think you'll either love or hate my reply as I do have some insights on this since I've designed a similar process. It's not exactly what you want to do, but it is similar enough to where you should be able to change it up a touch and end up with what you are looking for.
So, first I'll give you a brief explanation of what my package was designed for and how it works. I have a series of packages that I run that support processes that need to have the ability to call the packages on the fly instead of scheduling the packages. Some of the packages need the ability to invoke the other packages so I developed a Service Broker based package architecture where packages are 'called' by sending a message into a SB queue. Long story short, one of those packages was created to execute SQL against any SQL Server database.
So my package reads a message off of the SB queue that tells it the SQL command, the database instance and name, where to log the results, and where to create the .sql script. Now, my process has to create the SQL script from the xml message. Your process begins here as you've already got the SQL commands in a file. The package then executes the SQL via sqlcmd (more on that later), checks for errors, and sends a success/failure message back to the originating queue.
Now, based on what I've learned in designing this, here's how I suggest setting up your package:
1. Keep your For Each Loop Container set up just as you have it.
2. Drop your Script Task that reads the file contents into a variable for execution. SSIS is pretty quirky when it comes to data types. The Control Flow has different types than Data Flows. The Data Flow types more closely mimic those used by relational databases and therefore support large object data types (varchar(max) = DT_TEXT). The Control Flow does not support large data types. You'll find that the String data type is soon overrun. I think the limit is 8000 characters. This may seem like a lot, but once you build out a process to execute SQL scripts, you'll want to not be limited by how much SQL you can put into a file (at least I would think not).
3. Because you'll not be storing the file contents in a Control Flow string variable, you can drop the Execute SQL Task as well.
4. Create an Execute Process task. This task will invoke SQLCmd, run your script, and allow you to capture errors from the file. Configure the task as such:
-- Executable: C:\WINDOWS\system32\cmd.exe
-- Arguments: set this one up via the expression builder. Here is the expression that mine runs off of. Again, it's not going to be exactly what you want, but you'll get enough out of this to make yours work.
" /C SQLCMD -S " + @[User::SQLInstance] + " -d " + @[User::DBName] + " -o " + @[User::LogFolder] + REPLACE( @[User::QueryName] , " " , "_") + "_sql.log -i " + @[User::ScriptFolder] + REPLACE(@[User::QueryName], " ", "_") + ".sql"
You'll see here that I'm calling SQLCmd, passing in the SQL Server instance name, the database name, the location of the log file, dynamically creating the log file name (this will be where you make yours different), and the location and name of the script to execute. Presto! This expression sets up the script to run via SQLCmd and creates a log file output from the script execution just like you'd typically see in the results pane of SSMS.
You can leave all other settings as default. I played around with using the Standard Error variable, but I scrapped it for the error handling that I'll talk about next.
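On the "this will be where you make yours different" point: if your For Each loop only hands you the fully qualified file name, as in the setup described earlier in this thread, one way to populate a variable like User::QueryName is a small Script Task just before the Execute Process task. This is only a sketch, assuming User::Filename holds the full path from the enumerator and User::QueryName is a String variable you create:
Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Strip the folder and the .sql extension, so that
        ' G:\SSISTest\ScriptRunner\Files\update.sql becomes "update".
        Dim fullPath As String = Dts.Variables("Filename").Value.ToString()
        Dts.Variables("QueryName").Value = Path.GetFileNameWithoutExtension(fullPath)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class
With that in place, the expression above produces a log file named after the script, e.g. update_sql.log.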
5. Create an Event Handler on the above Execute Process task for the OnError event.
6. Inside the OnError Event Handler, create an Execute SQL task and configure it as such:
-- Result Set : Single Row
-- SQL Source Type: Direct Input
-- SQL Statement : SELECT 0 as ErrorCode, '' as ErrorDescription (NOTE: This is a dummy statement used to allow the validation process to succeed, we'll actually use an expression for the source, but this was needed to trick SSIS validations).
-- Result Set Tab:
-- Result Name: 0 / Variable Name: User::ErrorCode (create this as int)
-- Result Name: 1 / Variable Name: User::ErrorDescription (create this as string)
-- Expressions: set up an expression for the SQLStatementSource property. Here's the expression I'm using:
"SELECT " + (DT_STR, 15, 1252)@[System::ErrorCode] + " as ErrorCode, '" + @[System::ErrorDescription] + "' as ErrorDescription"
The System level ErrorCode and ErrorDescription variables are only scoped to the OnError Event Handler so this task is simply getting the system variables and populating user variables so we can use the values outside of the Event Handler. Alternatively, you could use a Script task to do this.
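As a sketch of that Script task alternative (same assumptions as above: User::ErrorCode created as Int32 and User::ErrorDescription as String, with the System variables listed as read-only and the User variables as read-write):
Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Copy the handler-scoped system variables into user variables
        ' so the values remain visible outside the OnError event handler.
        Dts.Variables("User::ErrorCode").Value = Dts.Variables("System::ErrorCode").Value
        Dts.Variables("User::ErrorDescription").Value = Dts.Variables("System::ErrorDescription").Value.ToString()
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class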
One more step while in the Event Handler. I assume that when a script fails, you don't really want the package to fail; rather, you want the email to go out and the package to move on to the next available script. To create this behavior, open up the variables tab and look for the Propagate variable scoped to the OnError object. If you don't see it, you'll have to click on the button that shows System variables. By default, this variable is set to True. This means that when an error occurs on a task, that error is propagated up to the parent container. In this case, a failed SQL script would bubble that error up to the For Each Loop container and stop your package execution. Stopping package execution means that you cannot handle the error, send emails, etc. Set the variable to False. Now, when a script fails, the error will not halt package execution and we can configure an email task to send out the log file.
7. Create a Send Mail task inside the For Each Loop Container, after the Execute Process task. Set the Precedence Constraint as such:
-- Evaluation Operation: Expression and Constraint
-- Value: Completion (not failure, remember- we don't want the Execute Process task to fail the package as we want the loop to move on to the next file)
-- Expression: I have not tested this as my package does not contain this step, but this should work:
@[User::ErrorCode] != 0
The Send Mail task should only execute when the script fails. Set the rest of the task up to find the log file and use it as an attachment.
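For the attachment itself, one option (untested here, but it just reuses the log-file fragment from the step 4 Arguments expression) is a property expression on the Send Mail task's FileAttachments property, so the task always points at the log that was just written:
@[User::LogFolder] + REPLACE(@[User::QueryName], " ", "_") + "_sql.log"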
That about sums it up. I think this encompasses all of your requirements, plus some of the pitfalls that you may or may not have run into yet. Sorry to be so long winded, but I thought this would help, plus I've been waiting for a batch process to complete so I had the time to get into detail. Anyways, I hope this helps. I'll be out of the office on PTO until next Tuesday so if you post questions and don't hear back, please be patient. I imagine it will take you a couple of days to revamp your package and test it out anyhow.
Good luck and please let me know how it goes.
December 3, 2009 at 10:01 pm
Thanks John for your detailed post !
Your approach is very good, but I have a limitation: I cannot modify the package, but I can add functionality to it.
The only point where I am stuck is passing the name of the script that ran into the name of the output or log file.
And sending that file as an attachment.
e.g.
script to be run: update.sql ---> log file should be update[datetimestamp].txt
I don't know how to dynamically pass the name [i.e. in this case, update] to the generated log/output file, meaning: how would SSIS know what file it is executing when it is picked up by the For Each Loop Container? :w00t:
If you can point me in any direction to accomplish this, I will really appreciate your effort!
*** Edit
***************** :-) ***************************************
After reading your post several times and trying to understand it, I was able to generate the output files with the script file name.
Now I need your help with steps 5 through 7:
5. Create an Event Handler on the above Execute Process task for the OnError event.
6. Inside the OnError Event Handler, create an Execute SQL task and configure it as such: ...
7. Create a Send Mail task inside the For Each Loop Container, after the Execute Process task. ...
I am following the exact same steps as you mentioned above, but I am not able to get the error message. The package just executes everything; if a script fails, the only place I can see it is the generated output file.
I have modified the approach as follows:
1. For Each Loop Container
i. Script Task (my original script task)
ii. Execute Process Task (which you mentioned; it works perfectly, and the output files are generated with the script name as well).
I am stuck with the event handler part and the Send Mail task!
Why does the event handler have an Execute SQL Task with a dummy statement?
Also, the variables you mentioned,
User::ErrorCode and User::ErrorDescription,
I created as Int32 and String with values 0 and 1,
but I am stuck: the package just keeps executing, even if the script fails!!!
Please help me out
Many thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 4, 2009 at 11:01 am
I was also interested in this. In case someone is still looking for a solution, have a look here:
http://cognitivecache.blogspot.com/2008/08/log4j-writing-to-dynamic-log-file-for.html
December 4, 2009 at 11:42 am
becklery (12/4/2009)
http://cognitivecache.blogspot.com/2008/08/log4j-writing-to-dynamic-log-file-for.html
Can this be used in SSIS? If yes, can you explain in some detail where to use it, e.g. in a Script Task or any other task?
Thanks,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 8, 2009 at 9:29 am
John,
Any ideas on the issue I am facing?
1. I want the Execute Process Task to fail only if the script fails, and to log the error message.
2. Send email.
You can refer to my previous comments to see where I am facing the issue.
Thanks John for your help,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor
December 8, 2009 at 1:52 pm
SQL_Quest-825434 (12/8/2009)
I am stuck with the event handler part and the Send Mail task!
Why does the event handler have an Execute SQL Task with a dummy statement?
...
The event handler's execute SQL statement uses an expression to build out the SQL Statement. The dummy statement is used to prevent the validation step from failing if the expression evaluates to an invalid SQL Statement.
I am a bit confused as to where this stands. You said the package still executes on error? Isn't that what you want? It should produce an error email and continue package execution with the next script. Are you saying that the email is not being produced? Please clarify as to what you are seeing and what you would like to see.
December 8, 2009 at 8:18 pm
John Rowan (12/8/2009)
...
The event handler's execute SQL statement uses an expression to build out the SQL Statement. The dummy statement is used to prevent the validation step from failing if the expression evaluates to an invalid SQL Statement.
I am a bit confused as to where this stands. You said the package still executes on error? Isn't that what you want? It should produce an error email and continue package execution with the next script. Are you saying that the email is not being produced? Please clarify as to what you are seeing and what you would like to see.
Thanks John for your reply !
Yes, I want the package to keep executing on error, but I want the Execute Process Task itself to fail so I can capture that error.
The reason for this is that, because the Execute Process Task calls SQLCMD, if a script fails I currently have to look at the output files to figure out which script failed. I know it is not feasible to check all the output files to determine script failure.
I want to capture that failure message for any failing script and email that output file to the user group.
I am doing exactly what you mentioned in your previous post for the event handler part, and because of that I am halfway there, so I need some more help to reach the destination! :-)
I hope I am clear about what I want to achieve.
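One thing I am going to check on my side (just a guess at the cause, nothing confirmed yet): by default, SQLCMD returns exit code 0 even when the script it runs raises errors; the -b option ("on error batch abort") makes it return a non-zero ERRORLEVEL when an error occurs. If that is what is happening, the Execute Process Task would always see a success. So I may try adding -b at the start of the Arguments expression:
" /C SQLCMD -b -S " + @[User::SQLInstance] + " -d " + @[User::DBName] + ...
with the rest of the expression unchanged, so that a failed script returns a non-zero exit code and the task's default SuccessValue of 0 treats it as a failure.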
Thanks again,
\\K
______________________________________________________________________________________________________________________________________________________________________________________
HTH !
Kin
MCTS : 2005, 2008
Active SQL Server Community Contributor