Passing data between job steps with a global temp table?

  • Hello,

    I have created several stored procedures to find customers' products that are overdue on payments and update them to a "Suspend" status. Here's an overview (with a simplified sketch of Sproc1 after the list):

    Sproc1: Find overdue customers' products and insert them into a global temp table, ##Overdue.

    Sproc2: From ##Overdue, get all rows that meet the suspension requirement and insert them into another global temp table, ##Suspend.

    Sproc3: From ##Overdue, get all rows that meet the "suspension warning" criteria and insert them into ##Warning.

    Sproc4: Loop through ##Suspend and update the status in multiple tables.

    Sproc5: Loop through ##Warning and update the status in multiple tables.

    Sproc6: Get certain info from all the products listed in ##Suspend and ##Warning. (Need to output this to a flat file.)
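
    For illustration, Sproc1 is shaped roughly like this (the table and column names here are simplified placeholders, not the real schema):

    -- Simplified sketch of Sproc1; all table/column names are placeholders.
    CREATE PROCEDURE dbo.Sproc1
    AS
    BEGIN
        SET NOCOUNT ON
        -- SELECT ... INTO creates the global temp table on the fly; it
        -- survives the proc and lives until the creating session ends.
        SELECT c.CustomerID, p.ProductID, p.DueDate
        INTO   ##Overdue
        FROM   dbo.CustomerProducts p
        JOIN   dbo.Customers c ON c.CustomerID = p.CustomerID
        WHERE  p.DueDate < GETDATE()
          AND  p.Status = 'Active'
    END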

    OK, I have 2 questions.

    1) I tried to put each of those sprocs into a separate step in a SQL Server Agent job. I thought it would be better because if one step fails, I can more easily tell which one. However, it gives me the error "Invalid object name '##Overdue'." So obviously the second step can't see the global temp table created in step 1. Is there no way to do this?

    2) I was counting on separating the steps so that when I get to Sproc6, I could just use SQL Server Agent to output the result to a file. If I were to mesh all the sprocs into one step, how do I output the result of one sproc using SQL? (BCP?)
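
    (For reference, meshing them would just mean a single T-SQL step along these lines, using the proc names from my list above and assuming they all live in the same database:)

    -- One job step = one session, so the ## tables survive between calls.
    EXEC dbo.Sproc1   -- builds ##Overdue
    IF @@ERROR <> 0 RETURN   -- bail out rather than run later steps on bad data
    EXEC dbo.Sproc2   -- builds ##Suspend from ##Overdue
    EXEC dbo.Sproc3   -- builds ##Warning from ##Overdue
    EXEC dbo.Sproc4   -- updates status for ##Suspend rows
    EXEC dbo.Sproc5   -- updates status for ##Warning rows
    EXEC dbo.Sproc6   -- returns the result set to export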

  • I'm sure you'll get more and perhaps better ideas, but here are a couple.

    To get the last sproc to output to a flat file, you can use xp_cmdshell and OSQL.
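
    For example, something like this (server, database, and path are placeholders; -E uses a trusted connection, -o writes the results to a file):

    exec master..xp_cmdshell 'osql -S MyServer -E -d MyDb -Q "EXEC dbo.Sproc6" -o D:\export\suspend.txt'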

    Another thought is to use permanent tables instead of global temp tables.
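
    For instance, something as simple as this (names made up), with a RunID or date column if you need to keep runs apart:

    -- Permanent work table standing in for ##Overdue; names are examples.
    CREATE TABLE dbo.OverdueWork (
        RunID      int      NOT NULL,
        CustomerID int      NOT NULL,
        ProductID  int      NOT NULL,
        DueDate    datetime NOT NULL
    )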

    Steve

  • Permanent tables are the way to go if you're going to keep track of status/completion info...

    The ## tables have worked for me in multi-step DTS packages as long as each step uses the same connection object (in the DTS package). BUT this is really not a good way of doing things.

    The best way is to put it all in procs/sub-procs and make your process restartable (which may mean using permanent tables for status, if you can't make the proc intelligent about selecting rows for processing at each step).
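
    e.g., have each step mark rows as it finishes and select only unmarked rows, so a rerun skips completed work (table and column names below are just for illustration):

    -- Each step processes only unfinished rows, then marks them done,
    -- so a restarted job skips work that already completed.
    SELECT CustomerID, ProductID
    FROM   dbo.SuspendWork
    WHERE  Completed = 0

    UPDATE dbo.SuspendWork
    SET    Completed = 1
    WHERE  Completed = 0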

    You should have error checking/auditing that prevents uncontrolled SQL errors (like rule failures) prior to or during your process...

    You can have your proc return a result set as its last step, and then execute and export all in one DTS step: a Transform Data task that execs your proc and writes the results to a text file destination.
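
    i.e., shaped something like this (columns invented for illustration; the point is that the proc ends with a single SELECT the task can consume):

    -- Hypothetical shape of Sproc6 for a Transform Data task (or osql):
    CREATE PROCEDURE dbo.Sproc6
    AS
    BEGIN
        -- keep "(n rows affected)" messages out of the exported stream
        SET NOCOUNT ON
        SELECT CustomerID, ProductID, 'Suspend' AS Action FROM ##Suspend
        UNION ALL
        SELECT CustomerID, ProductID, 'Warning' AS Action FROM ##Warning
    END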

  • This process is going to run on a production server that is being replicated, and the DBA doesn't like to have extra tables created that don't need replication. So, I have to do this with temp tables.

    I've bunched them up into one step in the job, and it works fine, so I'm leaving it at that.

    I have also changed the "export file" step to be the final step. It will now look for all rows inside the permanent "Email" table that will be created on the production reporting server. So now I can separate that step out and have the job write a file for me.

    However, I have another question: can I write the file to yet another server? The job step only allows me to write to the local hard drives. Is there any way to export the txt file to another server without mapping a drive?

    Thanks!

  • Provided there is a share set up on the network resource, you can use the UNC path. Again, using xp_cmdshell, execute the "DOS" copy command:

    exec master..xp_cmdshell 'copy d:\test.txt \\servername\sharename\filename'
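
    Or, since your rows end up in the permanent Email table, bcp can write straight to the UNC path in one go (server, database, and share names below are placeholders; -T uses a trusted connection, and the account SQL Server runs under needs write access to the share):

    exec master..xp_cmdshell 'bcp "SELECT * FROM MyDb.dbo.Email" queryout \\servername\sharename\suspend.txt -c -T -S MyServer'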

    Steve

  • Can you use another DATABASE, i.e. one that isn't being replicated, instead of temp tables?

    HTH

    * Noel
