xp_cmdshell and transactions

  • Hi all,

    Got a situation which I'll simplify into the following question. Let's say I have this block of code:

    CREATE TABLE TestTable (ID INT, Val VARCHAR(20))

    BEGIN TRANSACTION

    INSERT INTO TestTable (ID, Val) VALUES (1, 'Hi')

    EXEC xp_cmdshell 'SomeFile'

    WAITFOR DELAY '00:20:00'

    COMMIT TRANSACTION

    And now, in the xp_cmdshell command, I'm executing a VB.NET script which does some processing. The VB.NET script requires the value that was inserted into TestTable.

    If I do a SELECT against TestTable from another session, the SELECT blocks until the transaction is committed.

    I've tried using (NOLOCK) to read the records, but that doesn't return them either. Is there any way I can get them?

    I understand the whole "dirty reads" thing, but in this situation I do actually want the records inserted during that transaction, because I have no other way of communicating them to that .NET file.

    If I can't find another solution, I have one fallback - write the records into a CSV file using xp_cmdshell, and then read from that CSV file within the .NET app. I'd rather stick to just DB stuff if I can though. Any ideas?
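    For reference, the dirty read being attempted would look roughly like this (a sketch; whether it actually sees the uncommitted row depends on what locks the open transaction holds - NOLOCK readers still block on schema-modification locks, e.g. if the table had been created inside the same transaction):

```sql
-- Sketch: reading uncommitted rows from another session.
-- READ UNCOMMITTED / NOLOCK will normally see rows inserted by an
-- open transaction, but it still blocks on schema locks.
SELECT ID, Val
FROM TestTable WITH (NOLOCK);

-- Equivalent at session level:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT ID, Val FROM TestTable;
```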

  • kramaswamy (2/27/2012)


    Hi all,

    Got a situation which I'll simplify into the following question. Let's say I have this block of code:

    CREATE TABLE TestTable (ID INT, Val VARCHAR(20))

    BEGIN TRANSACTION

    INSERT INTO TestTable (ID, Val) VALUES (1, 'Hi')

    EXEC xp_cmdshell 'SomeFile'

    WAITFOR DELAY '00:20:00'

    COMMIT TRANSACTION

    And now, in the xp_cmdshell command, I'm executing a VB.NET script which does some processing. The VB.NET script requires the value that was inserted into TestTable.

    If I do a SELECT against the records from TestTable, the SELECT will not work until after the transaction is committed.

    I've tried using (NOLOCK) to read the records, but that doesn't allow me to obtain them. Is there any way I can get them?

    I understand the whole "dirty reads" thing, but in this situation I do actually want the records inserted during that transaction, because I have no other way of communicating them to that .NET file.

    If I can't find another solution, I have one fallback - write the records into a CSV file using xp_cmdshell, and then read from that CSV file within the .NET app. I'd rather stick to just DB stuff if I can though. Any ideas?

    Can you move the work into VB.NET instead? VB.NET can start a transaction, load a local DataTable with the values, do the work you need to do in VB, then have VB.NET insert into TestTable, and finally commit the transaction.

    Not sure what you are doing from your example, but I think it's better to do all the work in a single place - all T-SQL or all VB. Jumping back and forth from one to the other is not something I'd try to do if I could avoid it.

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • The reason why it's split like that is because I want to do everything asynchronously. I'm using Service Broker for that purpose - the VB.NET application queues up a process to execute, which then launches a series of stored procs that populate data and finally launch a separate VB.NET application to create the final report.

    In an ideal world I'd be using Reporting Services for the report, but since we don't have a machine for that, I need to use Excel. My .NET app is thus using the interop Excel library to build the files.

    OK, so if it's async, I'm assuming the data values have been inserted or updated in some table, and then Service Broker is getting called... in a trigger, probably?

    So why is the async process doing a transaction that is inserting into another table? Does it really need to insert into the table BEFORE the xp_cmdshell?

    Is the xp_cmdshell process accessing that table? (It wasn't obvious from the pared-down example, but that's the core issue, right? The blocking you described?)

    Maybe a little more detail on the process is in order, I think...

    I'm thinking you are blocking yourself with the transaction, and you probably don't need to do that.

    You might be able to drop the explicit transaction completely, or maybe you could feed the procedure a temp table, and insert/commit only if the xp_cmdshell call does not error out.
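    Checking the xp_cmdshell result before committing could be sketched like this (xp_cmdshell returns 0 on success, 1 on failure):

```sql
-- Sketch: commit only if the shelled-out command succeeds.
DECLARE @rc INT;

BEGIN TRANSACTION;

INSERT INTO TestTable (ID, Val) VALUES (1, 'Hi');

-- Capture the return code; 0 = success, 1 = failure.
EXEC @rc = master..xp_cmdshell 'SomeFile', no_output;

IF @rc = 0
    COMMIT TRANSACTION;
ELSE
    ROLLBACK TRANSACTION;
```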

    Lowell



  • Okay, fair enough, I'll go into a bit more detail.

    My .NET app, when the user requests a report for a given day, fires off a stored procedure. The stored procedure then sends a message to the service broker to start the process. The message that it sends contains the type of the report, the date of the report, and some other necessary info.

    When the message is received by the broker at the receiving end, it fires off an activation procedure that handles processing of all the info.

    The activation procedure runs several procedures to populate the underlying data behind the report. Once all the procedures are done, it fires off another step which launches the .NET script that creates the Excel report. Finally, once the Excel report is ready, it sends the termination message indicating the end of the process.
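    A rough sketch of what such an activation procedure might look like (queue, procedure, and executable names here are hypothetical, not the poster's actual objects):

```sql
-- Sketch: Service Broker activation procedure that receives the
-- report request, populates the data, then shells out to the
-- external report builder.
CREATE PROCEDURE dbo.ReportActivation
AS
BEGIN
    DECLARE @handle UNIQUEIDENTIFIER,
            @body   VARBINARY(MAX);

    -- Pull one queued message (wait up to 5 seconds).
    WAITFOR (
        RECEIVE TOP (1)
            @handle = conversation_handle,
            @body   = message_body
        FROM dbo.ReportRequestQueue
    ), TIMEOUT 5000;

    IF @handle IS NOT NULL
    BEGIN
        -- Run the data-population procedures here...
        -- EXEC dbo.PopulateReportData ...;

        -- ...then launch the external report builder.
        EXEC master..xp_cmdshell 'ReportBuilder.exe', no_output;

        END CONVERSATION @handle;
    END
END;
```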

    The problem is that the .NET script needs to know what report it is creating.

    Actually though, now that I think about it, you've given me an idea. Initially I was concerned about putting the record into a table before the process begins, because I wasn't sure what the .NET script should do if it found multiple records in that table - but now I realize I can just create as many reports as there are records in the table, and then remove those records after I process them. Going to try this and see if it works...
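    That "process everything currently in the table, then remove it" pattern can be sketched in one statement (PendingReports and its columns are hypothetical; DELETE ... OUTPUT removes the rows and returns them atomically, so nothing is lost between the read and the delete):

```sql
-- Sketch: drain a hypothetical queue table in one atomic step.
DELETE FROM PendingReports
OUTPUT deleted.ReportType, deleted.ReportDate;
```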

  • Never mind - that solution has the same problem. I can't process the report until I know all the background records are loaded, but I can't know they're loaded unless I write something into a table, which won't get read until after the transaction is complete.

  • For the time being, I've resorted to using the file system approach - in my transaction I use an xp_cmdshell with an echo > file.csv command, which stores the information into a flat file, then I load that file in my VB.NET script to create the Excel report.

    If anyone has a better suggestion that avoids the need of using the file system I'm still open to suggestions though.
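    The flat-file workaround described above can be sketched like this (the path and values are hypothetical; the SQL Server service account needs write access to the target folder). Because the echo runs inside the shelled-out cmd.exe, the file is visible to the .NET script even while the surrounding transaction is still open:

```sql
-- Sketch: hand parameters to the external process via a flat file.
DECLARE @cmd VARCHAR(500);

-- Build "echo <type>,<date> > file" for cmd.exe; style 120 yields
-- an ISO-style yyyy-mm-dd date.
SET @cmd = 'echo DailySales,'
         + CONVERT(VARCHAR(10), GETDATE(), 120)
         + ' > C:\Reports\report_params.csv';

EXEC master..xp_cmdshell @cmd, no_output;
```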
