January 13, 2004 at 5:52 am
I have a DTS package that loads a SQL table from a CSV file. I can't guarantee that the primary key fields contain no dups, so I've built in steps to load a staging table with no PK/unique constraint, check for dups in the PK column, delete the dups, then load the real table from the cleaned-up table.
My problem comes in when I want to detect whether or not there are duplicates, so I can save that info and email it to the people providing the data and they can make corrections for future runs. The only way I've come up with is an ActiveX task that looks at the duplicate count and Succeeds if it's > 0 and Fails if it's = 0. If it's = 0 I skip the Delete Duplicates step and load the table. If it's > 0 I extract the info on the duplicates, email it, delete the duplicates, then load the table.
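The ActiveX task boils down to something like this (duplicate_count is a global variable I populate in an earlier step):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    ' duplicate_count is a package global variable set by an earlier step
    If CLng(DTSGlobalVariables("duplicate_count").Value) > 0 Then
        ' Duplicates found: take the Success branch (email, then delete dups)
        Main = DTSTaskExecResult_Success
    Else
        ' No duplicates: take the Failure branch straight to the load;
        ' this Failure is what marks the whole package as failed
        Main = DTSTaskExecResult_Failure
    End If
End Function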
This is all working fine except for one thing: The entire package reports a failure because I generated a Failure in the ActiveX to effect the workflow.
Is there a better way to do this? I thought about taking all of the post-ActiveX processing and putting it into sub-packages, then having the ActiveX modify the sub-package name in the Execute Package step. That way the ActiveX always "Succeeds" and no errors are reported, but it seems convoluted and overly complex. Any other suggestions for implementing workflow without generating package-level errors would be greatly appreciated.
January 13, 2004 at 8:07 am
This sounds like it could be handled in a stored procedure. Check for dups in a SQL statement in the procedure. If they exist, call another stored procedure to email them. Off the top of my head, I don't see a need to fail a step, since you want to continue "successfully" either way.
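A rough sketch of what I mean; not tested, and the table, column, database, and address names are all made up (it also assumes SQL Mail is configured for xp_sendmail):
CREATE PROCEDURE dbo.usp_CheckForDups
AS
BEGIN
    -- Flag any key that appears more than once in the staging table
    IF EXISTS (SELECT PKCol FROM Staging GROUP BY PKCol HAVING COUNT(*) > 1)
    BEGIN
        -- Mail the offending keys to the data providers
        EXEC master.dbo.xp_sendmail
            @recipients = 'provider@theirjob.com',
            @subject    = 'Duplicate primary keys in last load',
            @query      = 'SELECT PKCol, COUNT(*) AS Dups FROM MyDB.dbo.Staging GROUP BY PKCol HAVING COUNT(*) > 1'

        -- Drop every copy of a duplicated key (adjust if you want to keep one)
        DELETE Staging
        WHERE PKCol IN (SELECT PKCol FROM Staging GROUP BY PKCol HAVING COUNT(*) > 1)
    END

    -- Load the real table from the cleaned-up staging table
    INSERT RealTable SELECT * FROM Staging
END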
Larry
January 13, 2004 at 8:11 am
I do stuff like this a lot. I get data in from a multitude of outside sources. These sources aren't always reliable, so I have QA checks built into all of my dts packages that check to see what data they sent me and email me an alert telling me if they sent everything or if they left part of it out, and stuff like that.
The way I do it is to send the mail directly from the ActiveX task, then just move on to the next step. Here's an example showing what I mean: it checks an ftp site, reports back what files were sent, and verifies that the mapped drive to our photo server is accessible.
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim objFSO
    Dim objFolder
    Dim objFiles
    Dim objFile
    Dim objDrive
    Dim objConn
    Dim strPath : strPath = "\\FTPServer\ftp$\grar\"
    Dim strConn : strConn = "Driver={SQL Server};Server=MyServer;Database=Master;Trusted_Connection=YES"
    Dim strMessage
    Dim strRecipients : strRecipients = "me@myjob.com"
    Dim strSubject : strSubject = "GRAR FTP Site Data QA"
    Dim strPhotoDrive : strPhotoDrive = "N:"
    Dim strSQL

    ' List every file sitting on the ftp site
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set objFolder = objFSO.GetFolder(strPath)
    Set objFiles = objFolder.Files

    strMessage = "The following files are on the ftp site:" & vbCrLf
    strMessage = strMessage & "File Name" & vbTab & vbTab & "File Size" & vbTab & "Date Created" & vbTab & vbTab & "Date Modified" & vbCrLf
    strMessage = strMessage & String(70, "-") & vbCrLf
    For Each objFile In objFiles
        strMessage = strMessage & objFile.Name & vbTab & objFile.Size & vbTab & objFile.DateCreated & vbTab & objFile.DateLastModified & vbCrLf
    Next
    Set objFile = Nothing
    Set objFiles = Nothing
    strMessage = strMessage & vbCrLf

    ' Verify the mapped drive to the photo server is reachable
    If objFSO.DriveExists(strPhotoDrive) Then
        Set objDrive = objFSO.GetDrive(objFSO.GetDriveName(strPhotoDrive))
        strMessage = strMessage & "Mapped drive ( " & strPhotoDrive & " -- " & objDrive.ShareName & " ) to PhotoServer exists." & vbCrLf
    Else
        strMessage = strMessage & "Mapped drive ( " & strPhotoDrive & " ) to PhotoServer DOES NOT EXIST." & vbCrLf
    End If
    Set objDrive = Nothing
    Set objFSO = Nothing
    strMessage = strMessage & vbCrLf

    ' Mail the report; double up any single quotes so they don't break the SQL string
    Set objConn = CreateObject("ADODB.Connection")
    objConn.Open strConn
    strSQL = "Exec Master.dbo.xp_SendMail @recipients='" & strRecipients & "', @subject='" & strSubject & "', @message='" & Replace(strMessage, "'", "''") & "'"
    objConn.Execute strSQL
    objConn.Close
    Set objConn = Nothing

    Main = DTSTaskExecResult_Success
End Function
January 13, 2004 at 8:18 am
This might work. I suppose I can do all the duplicate record handling in the ActiveX: if duplicate_count > 0, select the duplicate info, build the email, send it, then delete the dups and return Success; otherwise just return Success and keep the "duplicate" code out of the DTS workflow entirely....
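Something along these lines, maybe (the connection string and table/column names are placeholders):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim objConn
    Dim objRS
    Dim lngDups

    Set objConn = CreateObject("ADODB.Connection")
    objConn.Open "Driver={SQL Server};Server=MyServer;Database=MyDB;Trusted_Connection=YES"

    ' Count the keys that appear more than once in the staging table
    Set objRS = objConn.Execute("SELECT COUNT(*) FROM (SELECT PKCol FROM Staging GROUP BY PKCol HAVING COUNT(*) > 1) d")
    lngDups = objRS(0).Value
    objRS.Close

    If lngDups > 0 Then
        ' Mail the offending keys, then remove every row carrying one
        objConn.Execute "EXEC master.dbo.xp_sendmail @recipients='me@myjob.com', " & _
            "@subject='Duplicate PKs in load', " & _
            "@query='SELECT PKCol FROM MyDB.dbo.Staging GROUP BY PKCol HAVING COUNT(*) > 1'"
        objConn.Execute "DELETE Staging WHERE PKCol IN " & _
            "(SELECT PKCol FROM Staging GROUP BY PKCol HAVING COUNT(*) > 1)"
    End If

    objConn.Close
    Set objRS = Nothing
    Set objConn = Nothing
    Main = DTSTaskExecResult_Success
End Function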
January 13, 2004 at 8:27 am
Use an ActiveX workflow script to stop processing when duplicate_count > 0 by setting
Main = DTSStepScriptResult_DontExecuteTask
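For context, a minimal sketch of such a workflow script, attached under the step's Workflow Properties (duplicate_count is assumed to be a package global variable set earlier):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    ' Skip this step entirely when duplicates were found
    If CLng(DTSGlobalVariables("duplicate_count").Value) > 0 Then
        Main = DTSStepScriptResult_DontExecuteTask
    Else
        Main = DTSStepScriptResult_ExecuteTask
    End If
End Function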
Far away is close at hand in the images of elsewhere.
Anon.
January 13, 2004 at 8:38 am
I don't want to stop processing; I want to branch around a set of tasks that run only if duplicates are present, and then resume processing at the next task common to both workflows. Sort of an "if-then-else" capability.
January 22, 2004 at 9:52 pm
Check out this article,
http://www.sqldts.com/default.aspx?218
--------------------
Colt 45 - the original point and click interface
January 23, 2004 at 9:49 am
The suggestion presented would work, but as to your question about conditional logic in DTS: yes, there is a way. Leave the ActiveX step unconnected to the next two possible steps (no success/failure/completion constraints), and have those two steps disabled under Workflow > Workflow Properties. In your conditional ActiveX step, use an If...Then (or whatever conditional logic you need) to dynamically enable one of the two steps. At the end of the package, just have an ActiveX script that cleans up by disabling both steps again. To do this you need script that modifies the properties of the steps.
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim pak
    Dim step1
    Dim prop1

    ' Grab the package, then the step to switch on
    Set pak = DTSGlobalVariables.Parent
    Set step1 = pak.Steps("DTSStep_DTSExecuteSQLTask_6")
    Set prop1 = step1.Properties("DisableStep")
    prop1.Value = False    ' enable the step

    Main = DTSTaskExecResult_Success
End Function
This just shows how to enable the task whose step is named DTSStep_DTSExecuteSQLTask_6. Keep in mind that the step name is different from the task name, and each task has an associated step. If you are unsure what the step name is for a given task, they just go in order, with the first one named DTSStep_DTSType_#, but you can always check properties like the name through the Dynamic Properties task. This might seem cumbersome, but if you are considering using the OLE file object and such, this is no more complicated than that. Besides, being able to dynamically modify the properties of steps and tasks is where the real power of DTS lies.
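The cleanup step at the end just flips the flags back; something like this (the second step name is an example):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim pak

    ' Disable both conditional steps again so the next run starts clean
    Set pak = DTSGlobalVariables.Parent
    pak.Steps("DTSStep_DTSExecuteSQLTask_6").Properties("DisableStep").Value = True
    pak.Steps("DTSStep_DTSExecuteSQLTask_7").Properties("DisableStep").Value = True

    Set pak = Nothing
    Main = DTSTaskExecResult_Success
End Function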
January 23, 2004 at 11:56 am
Interesting way to do it. I don't think it's ideal for this particular instance, but I can think of some instances where it would be. For example, if a data source sends me 5 ftp files every night but occasionally only sends 4, I might want to disable the import step for the missing file so that it doesn't generate an error.
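Something like this, perhaps (the file path and step name are made up):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim objFSO

    Set objFSO = CreateObject("Scripting.FileSystemObject")

    ' If tonight's file never arrived, disable the step that imports it
    If Not objFSO.FileExists("\\FTPServer\ftp$\grar\photos.csv") Then
        DTSGlobalVariables.Parent.Steps("DTSStep_DTSDataPumpTask_1") _
            .Properties("DisableStep").Value = True
    End If

    Set objFSO = Nothing
    Main = DTSTaskExecResult_Success
End Function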
Good tip.
January 23, 2004 at 12:03 pm
Interesting idea Tobie, but it wouldn't really work in my particular situation. I have a package with a workflow like A -> B -> C -> D -> E. At step C I want to look at the number of records in a view and, if it's > 0, run step C1 and then continue with step D. I think the best answer for me in this case is a combination of several of the recommendations above: since step C is already an ActiveX task using VBScript, I can look at the record count and, if it's > 0, execute a child package containing just step C1, then continue on the Success path with no problems.
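Roughly like this, I think (the server name, child package name, and how the count arrives are all placeholders):
'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
Function Main()
    Dim objPkg

    ' record_count stands in for however step C obtains the view's row count
    If CLng(DTSGlobalVariables("record_count").Value) > 0 Then
        ' Run the child package that holds step C1
        Set objPkg = CreateObject("DTS.Package")
        objPkg.LoadFromSQLServer "MyServer", , , _
            DTSSQLStgFlag_UseTrustedConnection, , , , "HandleDuplicates"
        objPkg.Execute
        objPkg.UnInitialize
        Set objPkg = Nothing
    End If

    ' Either way the step succeeds, so the package never reports a failure
    Main = DTSTaskExecResult_Success
End Function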
January 23, 2004 at 1:04 pm
It may not be ideal for your case, but with the logic I proposed it's easy to go from A -> B -> (conditional B1, B2) -> C, where you would skip B1 and B2 if the condition was met. It's just a different paradigm than what you're used to with the task flow connectors.