November 21, 2005 at 3:33 pm
All,
I am attempting to write a DTS package (an ActiveX script and a couple of Execute SQL tasks) that will manage the order of execution of several other DTS packages that have been turned into SQL Server jobs but are not scheduled. The reason I cannot simply schedule the jobs is that their execution depends on the completion of events that take place outside of SQL Server. One such event is the creation of a file. Another is a value being written to a particular column of a table in an Oracle database, indicating that the data warehouse load for a particular subject area is complete.
I have a JobQueue table that contains the name of a job, the date/time it was queued, the date/time it was started, and the date/time it was completed (as well as a meaningless key value for an id column).
The JobQueue table starts each day empty. I want to write an ActiveX script in a DTS package that checks the JobQueue for queued-but-not-started jobs, starts each one via sp_start_job, and keeps looping until all of the queued jobs have completed.
I have the part that manages each individual job complete. Each job updates the corresponding job_started column of the JobQueue table when it begins. Each job updates the corresponding job_completed column of the JobQueue when it is finished running.
The problem comes in at the call to sp_start_job. For some reason, before the job is able to update its row in the JobQueue table to indicate that it has started, the loop comes all the way around to checking for waiting jobs, sees this one as still waiting (because its job_started column is still NULL), and attempts to run sp_start_job on it again, which results in an error because that job is already running.
My latest attempt at the code follows. As you can see, I've tried brute-force loop counting in an effort to slow down the master script and give the jobs it launches time to execute, but that has not worked (a sketch of one alternative follows the code listing).
If anyone has done something like this, or has ideas on how to handle it, please let me know. Also, if there are blanks in my explanation of this project that I should fill in, please ask.
Thanks,
Dan
===== code follows =====
'*********************************************************************
'  Visual Basic ActiveX Script
'*********************************************************************
Function Main()
    Dim thisPackage
    Dim loopCount
    Dim jobsComplete
    Dim jobsCount
    Dim pauseCount
    Dim logText
    Dim loopStr

    loopCount = 0
    Set thisPackage = DTSGlobalVariables.Parent

    'Check whether the jobs are complete
    thisPackage.Steps("DTSStep_DTSExecuteSQLTask_1").Execute
    jobsComplete = DTSGlobalVariables("gvJobsComplete").Value

    'Loop until gvJobsComplete reaches 3
    Do While (jobsComplete <> 3)
        loopCount = loopCount + 1

        'Check JobQueue table for queued but not started jobs
        thisPackage.Steps("DTSStep_DTSExecuteSQLTask_2").Execute
        WriteToLog "Checked for queued jobs"

        'Brute-force pause
        pauseCount = 1
        Do While (pauseCount < 250000)
            pauseCount = pauseCount + 1
        Loop

        'For each of them:
        jobsCount = DTSGlobalVariables("gvJobsCount").Value
        If jobsCount > 0 Then
            WriteToLog "  Found at least one queued job"

            'Determine the next job name
            thisPackage.Steps("DTSStep_DTSExecuteSQLTask_3").Execute

            'Brute-force pause
            pauseCount = 1
            Do While (pauseCount < 250000)
                pauseCount = pauseCount + 1
            Loop

            WriteToLog "  Determined queued job name"
            logText = "  Queued job name is: " & DTSGlobalVariables("gvNextJobName").Value
            WriteToLog logText

            'Execute the next job
            thisPackage.Steps("DTSStep_DTSExecuteSQLTask_4").Execute

            'Brute-force pause
            pauseCount = 1
            Do While (pauseCount < 250000)
                pauseCount = pauseCount + 1
            Loop

            DTSGlobalVariables("gvNextJobName").Value = Null
            WriteToLog "  Executed queued job"

            'Brute-force pause
            pauseCount = 1
            Do While (pauseCount < 250000)
                pauseCount = pauseCount + 1
            Loop

            jobsCount = jobsCount - 1
        End If

        'Check again to see if the jobs are complete
        thisPackage.Steps("DTSStep_DTSExecuteSQLTask_1").Execute
        jobsComplete = DTSGlobalVariables("gvJobsComplete").Value
        WriteToLog "Checked to see if jobs were complete"
        WriteToLog ""
    Loop

    loopStr = "Loop executed " & CStr(loopCount) & " times."
    WriteToLog loopStr

    Main = DTSTaskExecResult_Success
End Function
Sub WriteToLog(logText)
    Const ForAppending = 8
    Dim logFSO
    Dim logFile
    Dim logFileName
    Dim ETLDir

    ETLDir = "\\servername\foldername"
    logFileName = ETLDir & "\JobQueueLog.txt"

    Set logFSO = CreateObject("Scripting.FileSystemObject")
    Set logFile = logFSO.OpenTextFile(logFileName, ForAppending, True)
    logFile.WriteLine logText
    logFile.Close

    Set logFile = Nothing
    Set logFSO = Nothing
End Sub
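Here is a minimal sketch of one way to close that gap: have the master script stamp the JobQueue row itself as soon as it calls sp_start_job, and pause with a server-side WAITFOR DELAY instead of a counting loop. The OLE DB connection details and the job_name column are assumptions and do not appear in the package above.
'===== sketch: mark the job as started from the master script =====
Function StartQueuedJob(jobName)
    Dim conn
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
              "Initial Catalog=ETL_DB;Integrated Security=SSPI;"

    'Ask SQL Agent to start the job
    conn.Execute "EXEC msdb.dbo.sp_start_job @job_name = N'" & jobName & "'"

    'Stamp the row right away so the next pass of the polling query
    'no longer sees this job as queued-but-not-started
    conn.Execute "UPDATE dbo.JobQueue SET job_started = GETDATE() " & _
                 "WHERE job_name = N'" & jobName & "' AND job_started IS NULL"

    'If a pause is still wanted, let the server wait instead of a counting loop
    conn.Execute "WAITFOR DELAY '00:00:05'"

    conn.Close
    Set conn = Nothing
End Function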
November 21, 2005 at 9:59 pm
From my experience, it is much cheaper (in terms of development effort and system resources) to let SQL Agent manage the polling loop. I would create a DTS package implementing your loop body and schedule it as a SQL Agent job running on a frequent schedule.
Regards,
Anatol Romanov
SQL Server MCP
Sydney, Australia
analysts {a t} fitbits.com.au
November 22, 2005 at 8:29 am
Anatol,
Thanks for your suggestion. I think that's what I'm trying to do. I plan on getting that ActiveX script working within a DTS package and then scheduling that job.
Unfortunately, my experience so far is that I am not able to make it work due to the timing issues I described in the original post.
Perhaps if you had an example I could look at to see how your suggestion is different from what I have presented so far, I would be able to implement it as you suggested.
Thanks,
Dan McCue
November 22, 2005 at 9:11 am
Dan, what I suggested is NOT what you described in your first post:
---------------
The JobQueue table starts each day empty. I want to write an ActiveX script in a DTS package that checks the JobQueue for queued-but-not-started jobs, starts each one via sp_start_job, and keeps looping until all of the queued jobs have completed.
I have the part that manages each individual job complete. Each job updates the corresponding job_started column of the JobQueue table when it begins. Each job updates the corresponding job_completed column of the JobQueue when it is finished running.
-------------
Try to simplify your DTS package by removing all the loops, as they are a pure waste of CPU resources. Make a simple package that checks your JobQueue and, if there is a job ready, runs it. That's all the package should do. Then schedule this package to run as a job as frequently as required. To get a frequency of less than a minute, you can use multiple schedules with an offset. For instance, to run the polling job every 30 seconds, set up two schedules with a one-minute frequency: one starts on the minute, the other 30 seconds later.
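As a rough sketch of that simplified package, reusing the step and global variable names from the earlier listing (the rest of the package layout is assumed), the ActiveX task could be as small as this:
'One polling pass: check the queue, start at most one ready job, then exit.
'SQL Agent's schedule provides the loop; nothing waits inside the package.
Function Main()
    Dim pkg
    Set pkg = DTSGlobalVariables.Parent

    'Count queued-but-not-started jobs (same task as DTSStep_DTSExecuteSQLTask_2)
    pkg.Steps("DTSStep_DTSExecuteSQLTask_2").Execute

    If DTSGlobalVariables("gvJobsCount").Value > 0 Then
        'Resolve the next job name and start it (tasks 3 and 4 from the original package)
        pkg.Steps("DTSStep_DTSExecuteSQLTask_3").Execute
        pkg.Steps("DTSStep_DTSExecuteSQLTask_4").Execute
    End If

    Main = DTSTaskExecResult_Success
End Function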
Hope this helps,
Anatol
November 22, 2005 at 9:16 am
Thanks for the clarification. I understand how your suggestion is different now. I will give it a shot and let you know.
Thanks,
Dan
November 23, 2005 at 1:32 am
I think you might want to look at enabling/disabling package steps in ActiveX scripts dotted about the package, e.g. at the beginning...
set pkg = DTSGlobalVariables.Parent
set stpFinished = pkg.Steps("DTSStep_DTSActiveScriptTask_3")
stpFinished.DisableStep = True
Then later on...
'The trick to looping in DTS is to set the step at the start of the loop to an execution status of waiting
stpbegin.ExecutionStatus = DTSStepExecStat_Waiting
Then comes the real test of whether we should loop. In this case the function ShouldILoop (not listed) tests for Excel files in a given directory, and I count the number of files processed...
set pkg = DTSGlobalVariables.Parent
set stpEnterLoop = pkg.Steps("DTSStep_DTSDynamicPropertiesTask_1")
if ShouldILoop = true then
    DTSGlobalVariables("gv_Count").Value = DTSGlobalVariables("gv_Count").Value + 1
    fn = DTSGlobalVariables("gv_FileFullName").Value
    fn = fso.getfilename(fn)
    DTSGlobalVariables("gv_ListOfFiles").Value = DTSGlobalVariables("gv_ListOfFiles").Value & " " & fn
    stpEnterLoop.DisableStep = False
    stpEnterLoop.ExecutionStatus = DTSStepExecStat_Waiting
else
    set stpFinished = pkg.Steps("DTSStep_DTSActiveScriptTask_3")
    stpEnterLoop.DisableStep = True
    stpFinished.DisableStep = False
    stpFinished.ExecutionStatus = DTSStepExecStat_Waiting
end if
Main = DTSTaskExecResult_Success
I'm sorry if this isn't too clear, I cannot release the full code set, but I hope it's enough to put you on the right track.
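A self-contained sketch of the same pattern, with the pieces that could not be posted filled in: the step names come from the fragments above, while the drop directory and the simple file-count test stand in for ShouldILoop and are assumptions only.
'Loop-via-steps sketch: re-arm the first step of the loop while files remain
Function Main()
    Dim pkg, fso, folder, stpEnterLoop, stpFinished
    Set pkg = DTSGlobalVariables.Parent
    Set fso = CreateObject("Scripting.FileSystemObject")

    Set stpEnterLoop = pkg.Steps("DTSStep_DTSDynamicPropertiesTask_1")
    Set stpFinished = pkg.Steps("DTSStep_DTSActiveScriptTask_3")

    'Stand-in for ShouldILoop: keep looping while any file waits in the drop directory
    Set folder = fso.GetFolder("\\servername\dropfolder")
    If folder.Files.Count > 0 Then
        'Re-arm the step at the start of the loop so the package runs it again
        stpEnterLoop.DisableStep = False
        stpEnterLoop.ExecutionStatus = DTSStepExecStat_Waiting
    Else
        'Nothing left: retire the loop entry step and release the finish step
        stpEnterLoop.DisableStep = True
        stpFinished.DisableStep = False
        stpFinished.ExecutionStatus = DTSStepExecStat_Waiting
    End If

    Main = DTSTaskExecResult_Success
End Function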
December 28, 2005 at 5:44 pm
To Anatol -- I finally got some time to implement your suggestion above. It is working almost flawlessly. I believe the problems (which required the "almost") are due to what Derek suggested in his post above. I have implemented Derek's disable/enable step suggestions also and will be testing them during tomorrow's run (Dec 29 05).
Thank you for your suggestion. Once I am finished I will publish the ActiveX script and the details about my multiple schedules.
Regards,
Dan McCue
December 28, 2005 at 7:29 pm
You could also implement this totally using Anatol's scheduled job suggestion and forget using a DTS package.
A scheduled job has the basic looping and/or retry mechanism that's needed, e.g. On Failure Go To Step X; or, to retry a step, specify the retry attempts and retry interval on the Advanced tab of the step.
--------------------
Colt 45 - the original point and click interface
December 30, 2005 at 1:04 pm
Forget managing this in DTS. This is what the job scheduler already excels at! Viewing the job's history makes it really easy to understand your progress/retry status, how much time each step takes, and where your bottlenecks are.
I hope this gives y'all some ideas for organizing future processes:
Processes that run in logical sequence, each dependent on the previous process, can be a single job with the sequential processes as job steps. Each step is told to retry x times at x intervals if the step fails. You force-fail the step if its dependency criteria are not met. Is the file there? No? Fail me. Yes? Carry on. Is that Oracle record there? No? Fail me. Yes? Carry on.

If your step is a DTS package, this is extremely easy to implement using the workflow properties inside your package: force-fail the package on workflow failure where appropriate. This bubbles the failure up to the job step. An easy way to force failure due to a job step dependency is to check your step criteria and write pass/fail to a global variable. For example, write FAIL as an output parameter from an Execute SQL task when myRecord does not exist. Then read the variable in an ActiveX task: if pass, the ActiveX function returns success; if fail, it returns failure. Then your package fails, so your job step fails. Since the job step fails, it waits and retries! Voila! That's all there is to it!
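A minimal sketch of that ActiveX step, assuming the preceding Execute SQL task writes PASS or FAIL into a global variable called gvLoadStatus (the variable name is made up for illustration):
'Translate the dependency check into package success or failure so the
'job step can use its retry attempts / retry interval settings
Function Main()
    Dim loadStatus
    'The & "" guards against a Null value in the global variable
    loadStatus = UCase(DTSGlobalVariables("gvLoadStatus").Value & "")

    If loadStatus = "PASS" Then
        Main = DTSTaskExecResult_Success
    Else
        'Criteria not met yet (file missing, Oracle row absent, etc.):
        'fail the package so the job step fails, waits, and retries
        Main = DTSTaskExecResult_Failure
    End If
End Function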
Since the scheduler runs in sequence, if you have two steps that can run simultaneously you can save time by doing this: code them as dts packages, put them inside a mother dts and execute them in parallel.
If you modularize your job thoughtfully into atomic steps, when a step gives up retrying and stops the job, it is easy to simply restart the job on that step when the data is ready!
Have fun!
ZenDada