September 8, 2020 at 1:04 pm
The requirement is to export rows to a text file in a given path when they meet a condition. For example, when there is a transaction for a particular store, that row has to be written out as a text file. If there are 1000 rows for that particular store, I need 1000 text files created in that path, each containing only one row. Each text file should contain only the latest row inserted for that particular store. The text files will be read by other software.
Bloody RBAR 3rd party software. Who comes up with this stuff?
Anyway, IMHO, the thing to do is to have a trigger that simply captures the PK of the table and writes it to a staging table. The staging table should have an IDENTITY column. Then, have a per-minute job that checks the staging table to see if any rows are present. If there are, do the xp_CmdShell thing once for each row in order by the IDENTITY column, deleting each row from the staging table as soon as it has been processed.
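Something like the following as a minimal sketch. All of the names here (dbo.StoreTransaction, TransactionID, YourDB, YourServer, the C:\Export path) are placeholders for whatever the OP actually has, and the job step shells out to BCP through xp_CmdShell to write each row to its own file.

    --Staging table: one row per PK waiting to be exported.
    CREATE TABLE dbo.ExportStaging
    (
         StagingID     INT IDENTITY(1,1) NOT NULL PRIMARY KEY
        ,TransactionID INT NOT NULL --PK of the row to be exported
    )
    ;
    GO
    CREATE TRIGGER dbo.trg_StoreTransaction_Export
    ON dbo.StoreTransaction
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;
        --Capture just the PKs; correctly handles multi-row inserts.
        INSERT INTO dbo.ExportStaging (TransactionID)
        SELECT TransactionID FROM inserted;
    END
    ;
    GO
    --Per-minute job step: export each staged row to its own file, oldest first.
    DECLARE  @StagingID     INT
            ,@TransactionID INT
            ,@Cmd           VARCHAR(4000)
    ;
    WHILE EXISTS (SELECT 1 FROM dbo.ExportStaging)
    BEGIN
        SELECT TOP (1)
                 @StagingID     = StagingID
                ,@TransactionID = TransactionID
        FROM dbo.ExportStaging
        ORDER BY StagingID --Process in the order the rows arrived.
        ;
        SELECT @Cmd = 'BCP "SELECT * FROM YourDB.dbo.StoreTransaction WHERE TransactionID = '
                    + CONVERT(VARCHAR(10),@TransactionID)
                    + '" QUERYOUT "C:\Export\Txn_'
                    + CONVERT(VARCHAR(10),@TransactionID)
                    + '.txt" -c -T -S YourServer'
        ;
        EXEC xp_CmdShell @Cmd, NO_OUTPUT;

        --Only delete the staging row once it has actually been processed.
        DELETE FROM dbo.ExportStaging WHERE StagingID = @StagingID;
    END
    ;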
If you need to keep track of which rows were sent, add a DateSent column to the original table and update it at the same time you do the xp_CmdShell thing for each staging-table row.
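In the loop above, that's just one more statement right after the EXEC (again, the names are placeholders):

    --Mark the source row as sent as soon as its file has been written.
    UPDATE dbo.StoreTransaction
       SET DateSent = GETDATE()
     WHERE TransactionID = @TransactionID;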
As for the per-minute job-history PITA that Sergiy talked about, create another job that scans for jobs that run on a per-minute schedule and automatically deletes their job history after, say, 24 hours or so.
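A sketch of that cleanup step, assuming the per-minute jobs are on an "every 1 minute" schedule (in msdb.dbo.sysschedules, freq_subday_type = 4 means "minutes"); it uses the system proc sp_purge_jobhistory to do the actual deletes.

    --Purge history older than 24 hours for every job on a 1-minute schedule.
    DECLARE  @JobID  UNIQUEIDENTIFIER
            ,@Oldest DATETIME = DATEADD(hh,-24,GETDATE())
    ;
    DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT DISTINCT js.job_id
        FROM msdb.dbo.sysjobschedules js
        JOIN msdb.dbo.sysschedules   s  ON s.schedule_id = js.schedule_id
        WHERE s.freq_subday_type     = 4 --schedule unit is minutes
          AND s.freq_subday_interval = 1 --every 1 minute
    ;
    OPEN cur;
    FETCH NEXT FROM cur INTO @JobID;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC msdb.dbo.sp_purge_jobhistory @job_id = @JobID, @oldest_date = @Oldest;
        FETCH NEXT FROM cur INTO @JobID;
    END;
    CLOSE cur;
    DEALLOCATE cur;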
--Jeff Moden
Change is inevitable... Change for the better is not.