Job Scheduling

  • I have a job that is now approaching 20 steps. If I were to place each functionally grouped set of steps in its own job, I would have 3 or 4 jobs, with 4 or 5 steps each. I'd love to do that, but the problem is that each step is dependent on the one before it. I could set them up to run on a schedule, trying to leave enough time for one to end before the next starts, but that's risky. Some of these jobs can fluctuate by quite a bit based on the amount of data that comes in that day.

    Is it bad practice to use sp_start_job as the last step in a job? I'd like to set it up so that if the job reaches the end successfully, it kicks off the next job. Is there a better way to do that? Is the very idea anathema to SQL Server? Do I need to keep it all as one job?

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams

  • This may be a job for Service Broker. Earlier steps place a message (or messages) on a Service Broker queue, and the subsequent steps process those messages as they arrive.

    An alternative solution is to just insert rows into a table at each stage. Each subsequent stage processes those rows as necessary, then updates a status or writes them to another table for the next stage to pick up.
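
    As an illustration of that table-per-stage hand-off, a minimal T-SQL sketch follows; the table, columns, and status values are hypothetical, not anything from this thread:

        -- Hypothetical hand-off table shared by the stages
        CREATE TABLE dbo.ProcessQueue
        (
            ProcessQueueID int IDENTITY(1,1) PRIMARY KEY,
            SourceKey      int         NOT NULL,
            Stage          tinyint     NOT NULL,   -- which stage should pick this row up next
            StatusCode     varchar(20) NOT NULL,   -- 'Pending', 'Complete', 'Error'
            UpdatedAt      datetime    NOT NULL DEFAULT (GETDATE())
        );

        -- Stage 1 records that a unit of work is ready for stage 2
        INSERT dbo.ProcessQueue (SourceKey, Stage, StatusCode)
        VALUES (42, 2, 'Pending');

        -- Stage 2 processes its pending rows, then hands them on to stage 3
        UPDATE dbo.ProcessQueue
        SET    Stage = 3, UpdatedAt = GETDATE()
        WHERE  Stage = 2 AND StatusCode = 'Pending';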

    If you could provide more specific details I may be able to provide additional suggestions.

    The probability of survival is inversely proportional to the angle of arrival.

  • Take a look at this article: http://www.mssqltips.com/tip.asp?tip=2167

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Stefan Krzywicki (12/6/2010)


    I have a job that is now approaching 20 steps. If I were to place each functionally grouped set of steps in its own job, I would have 3 or 4 jobs, with 4 or 5 steps each. I'd love to do that, but the problem is that each step is dependent on the one before it. I could set them up to run on a schedule, trying to leave enough time for one to end before the next starts, but that's risky. Some of these jobs can fluctuate by quite a bit based on the amount of data that comes in that day.

    Is it bad practice to use sp_start_job as the last step in a job? I'd like to set it up so that if the job reaches the end successfully, it kicks off the next job. Is there a better way to do that? Is the very idea anathema to SQL Server? Do I need to keep it all as one job?

    So, what's the problem with having a single job with 20+ steps? (I have one with > 60, so if there's a problem, I'd like to know why!)

    But to answer your questions

    Is it bad practice to use sp_start_job as the last step in a job?

    and

    Is the very idea anathema to SQL Server?

    sp_start_job is a documented procedure, so I see no problems in using it.
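
    For reference, chaining jobs that way is just a call from the last step of the first job; the job name below is invented for the example:

        -- Final step of the first job: ask SQL Server Agent to start the next job.
        -- Note that sp_start_job returns as soon as the start request is queued;
        -- it does not wait for the started job to finish.
        EXEC msdb.dbo.sp_start_job @job_name = N'Nightly Load - Phase 2';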

    Is there a better way to do that?

    Not necessarily better, but have you considered an SSIS package that runs all of these steps?

    Do I need to keep it all as one job?

    Again, I see no problems with having 20 steps in a job.

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • WayneS (12/6/2010)


    Stefan Krzywicki (12/6/2010)


    I have a job that is now approaching 20 steps. If I were to place each functionally grouped set of steps in its own job, I would have 3 or 4 jobs, with 4 or 5 steps each. I'd love to do that, but the problem is that each step is dependent on the one before it. I could set them up to run on a schedule, trying to leave enough time for one to end before the next starts, but that's risky. Some of these jobs can fluctuate by quite a bit based on the amount of data that comes in that day.

    Is it bad practice to use sp_start_job as the last step in a job? I'd like to set it up so that if the job reaches the end successfully, it kicks off the next job. Is there a better way to do that? Is the very idea anathema to SQL Server? Do I need to keep it all as one job?

    So, what's the problem with having a single job with 20+ steps? (I have one with > 60, so if there's a problem, I'd like to know why!)

    But to answer your questions

    Is it bad practice to use sp_start_job as the last step in a job?

    and

    Is the very idea anathema to SQL Server?

    sp_start_job is a documented procedure, so I see no problems in using it.

    Is there a better way to do that?

    Not necessarily better, but have you considered an SSIS package that runs all of these steps?

    Do I need to keep it all as one job?

    Again, I see no problems with having 20 steps in a job.

    I was worried that the more steps in a job, the shorter the stored job history.

    While each step relies on the one before it, the steps aren't necessarily logically related. There are some that could be run at other times. I just think logically grouped tasks should each be in their own jobs. I'll have to look at the various jobs again and see if I should combine them all into one 30-step job.

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams

  • Stefan Krzywicki (12/6/2010)


    I was worried that the more steps in a job, the shorter the stored job history.

    You could always increase the number of rows saved in the job history.
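
    For example (this uses msdb.dbo.sp_set_sqlagent_properties, which is undocumented, so treat it as a sketch; the same limits are exposed on the History page of the SQL Server Agent properties in SSMS, and the row counts below are arbitrary):

        -- Raise both the overall cap and the per-job cap on msdb job history rows
        EXEC msdb.dbo.sp_set_sqlagent_properties
             @jobhistory_max_rows         = 50000,
             @jobhistory_max_rows_per_job = 1000;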

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • WayneS (12/6/2010)


    Stefan Krzywicki (12/6/2010)


    I was worried that the more steps in a job, the shorter the stored job history.

    You could always increase the number of rows saved in the job history.

    Huh, I didn't know you could do that. Thanks!

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams

  • WayneS (12/6/2010)


    You could always increase the number of rows saved in the job history.

    I did this on my SQL2K server, which is OK most of the time, but IIRC the setting is the max for all jobs, not per job. Occasionally I have trouble accessing job history or EM dies. I have yet to see if it causes problems with 2008.

    In my jobs I set the output to go to one or more log files and use PRINT in the T-SQL to output processing info.
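
    For example, a job step body along these lines, with the step's "Output file" option (Advanced tab) pointed at a log file; the step name and messages are made up for the sketch:

        -- Timestamped progress markers; they land in the step's output file
        PRINT CONVERT(varchar(23), GETDATE(), 121) + ' Step 3: starting customer load';

        -- ... the step's real work goes here ...

        PRINT CONVERT(varchar(23), GETDATE(), 121) + ' Step 3: finished';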

    Far away is close at hand in the images of elsewhere.
    Anon.

  • As for your original question: I would leave all 20 steps in one job and make sure the process flow/dependencies (success/failure actions) are set right.

    Having separate jobs means having a communication method between them, which is another point of failure to contend with (what happens if the job succeeds but the storing of the result fails? Would the job be rerunnable, or would all the following jobs have to be handled manually?).

    I have one system with four separate jobs in and out of SQL on different servers, all running on staggered times, and whilst most of the time it works OK, it can be a pain if one job has a problem, so you have to code very carefully to allow for it.

    Far away is close at hand in the images of elsewhere.
    Anon.

  • Stefan Krzywicki (12/6/2010)


    I have a job that is now approaching 20 steps. If I were to place each functionally grouped set of steps in its own job, I would have 3 or 4 jobs, with 4 or 5 steps each. I'd love to do that, but the problem is that each step is dependent on the one before it. I could set them up to run on a schedule, trying to leave enough time for one to end before the next starts, but that's risky. Some of these jobs can fluctuate by quite a bit based on the amount of data that comes in that day.

    I'd like to set it up so that if the job reaches the end successfully, it kicks off the next job.

    Understand that the way the sp_start_job proc works is that once it kicks off the job in question, the original job will continue to process. The step does not wait for the nested job to finish; it only waits for the sp_start_job proc itself to execute. So if you're going to have steps after your "sp_start_job" step that need to wait until the nested job finishes before going on, then you should either move those steps into the job you just kicked off, or create a whole other job that is then kicked off in turn by the finish of the nested job.

    Or just keep everything in one job so you know the job steps before have completed.
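
    If the calling job really must wait, a crude polling sketch is one option; the job name is hypothetical, and note that msdb.dbo.sysjobactivity can lag for a moment right after the start request:

        DECLARE @job_id uniqueidentifier =
            (SELECT job_id FROM msdb.dbo.sysjobs WHERE name = N'Nightly Load - Phase 2');

        EXEC msdb.dbo.sp_start_job @job_name = N'Nightly Load - Phase 2';

        WAITFOR DELAY '00:00:10';   -- give Agent a moment to register the run

        -- Poll until the current activity row for the job shows a stop time
        WHILE EXISTS (SELECT 1
                      FROM msdb.dbo.sysjobactivity
                      WHERE job_id = @job_id
                        AND start_execution_date IS NOT NULL
                        AND stop_execution_date IS NULL)
        BEGIN
            WAITFOR DELAY '00:00:30';
        END;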

    Brandie Tarvin, MCITP Database Administrator
    LiveJournal Blog: http://brandietarvin.livejournal.com/
    On LinkedIn!, Google+, and Twitter.
    Freelance Writer: Shadowrun
    Latchkeys: Nevermore, Latchkeys: The Bootleg War, and Latchkeys: Roscoes in the Night are now available on Nook and Kindle.

  • Stefan Krzywicki (12/6/2010)


    I have a job that is now approaching 20 steps. If I were to place each functionally grouped set of steps in its own job, I would have 3 or 4 jobs, with 4 or 5 steps each. I'd love to do that, but the problem is that each step is dependent on the one before it. I could set them up to run on a schedule, trying to leave enough time for one to end before the next starts, but that's risky. Some of these jobs can fluctuate by quite a bit based on the amount of data that comes in that day.

    Is it bad practice to use sp_start_job as the last step in a job? I'd like to set it up so that if the job reaches the end successfully, it kicks off the next job. Is there a better way to do that? Is the very idea anathema to SQL Server? Do I need to keep it all as one job?

    There is no bad practice in using sp_start_job at the end of a job - or even in the middle, if either the rest of the first job has no dependency on the second job or you are willing to poll for completion. There are times when there's no better option than to use sp_start_job this way (when the second job runs under a different security principal, for example). You can also use sp_update_job or sp_update_job_schedule to enable the second job if it doesn't have to run immediately (I've made jobs self-disabling when I've done that).
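
    A rough sketch of that enable/self-disable pattern with sp_update_job (job names are invented for the example):

        -- Last step of the first job: enable the second job so its own schedule can fire
        EXEC msdb.dbo.sp_update_job @job_name = N'Nightly Load - Phase 2', @enabled = 1;

        -- First (or last) step of the second job: turn itself back off
        EXEC msdb.dbo.sp_update_job @job_name = N'Nightly Load - Phase 2', @enabled = 0;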

    Of course there's also nothing wrong with jobs with many steps; you'll probably need to include some logging, though, rather than relying on job history (how many hundreds of rows do you want to allow all jobs to accumulate if you rely on job history for jobs with many steps?), and that means you have to manage the log files (deleting old ones).

    Tom

  • I prefer a bigger job with some custom logging over splitting it up. It makes it more obvious that it's all part of the same sequence. More "ACID".
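
    One common shape for that kind of custom logging is a small log table written to by each step; the poster doesn't say which approach he uses, so the table and names below are purely an illustrative sketch:

        CREATE TABLE dbo.JobStepLog
        (
            JobStepLogID int IDENTITY(1,1) PRIMARY KEY,
            JobName      sysname      NOT NULL,
            StepName     sysname      NOT NULL,
            LogMessage   varchar(500) NOT NULL,
            LoggedAt     datetime     NOT NULL DEFAULT (GETDATE())
        );

        -- Each job step writes its own start/finish rows instead of relying on msdb history
        INSERT dbo.JobStepLog (JobName, StepName, LogMessage)
        VALUES (N'Nightly Load', N'Step 03 - Load customers', 'Started');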

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • GSquared (12/7/2010)


    I prefer a bigger job with some custom logging over splitting it up. It makes it more obvious that it's all part of the same sequence. More "ACID".

    Until you get to a job that takes more than 3 days to complete - and steps in that job are not dependent upon previous steps to complete.

    I have a compile job like that from a vendor - where say step 15 takes 7 hours to run. This step is dependent on step 3 completing successfully. And, we have step 10 that takes 3 hours, step 11 takes 4 hours, etc...

    I say, move this kind of thing to SSIS and let it parallel process where it can and build multiple dependencies where needed, instead of taking 3 days to run.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Jeffrey Williams-493691 (12/7/2010)


    GSquared (12/7/2010)


    I prefer a bigger job with some custom logging over splitting it up. It makes it more obvious that it's all part of the same sequence. More "ACID".

    Until you get to a job that takes more than 3 days to complete - and steps in that job are not dependent upon previous steps to complete.

    I have a compile job like that from a vendor - where say step 15 takes 7 hours to run. This step is dependent on step 3 completing successfully. And, we have step 10 that takes 3 hours, step 11 takes 4 hours, etc...

    I say, move this kind of thing to SSIS and let it parallel process where it can and build multiple dependencies where needed, instead of taking 3 days to run.

    Absolutely.

    I was speaking (writing) in general terms. There are always valid exceptions.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • GSquared (12/8/2010)


    Absolutely.

    I was speaking (writing) in general terms. There are always valid exceptions.

    I knew that 🙂

    Now, if I could only convince the vendor that this job needs to be rewritten...

    And I agree, I prefer having all the steps in a related process in the same job also.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
