DELETE ... OUTPUT INTO OPENROWSET()?

  • Evening All,

    I have an archive routine that runs every night: it inserts some rows and then performs a delete. Basic stuff.

    I have had to move the archive database to another server, and was hoping to use DELETE ... OUTPUT INTO to archive these rows and remove them from the source table.

    Is it possible to use OUTPUT INTO with an OPENROWSET?

    I can successfully do an INSERT to an OPENROWSET (see the sketch at the end of this post), but I can't quite get the DELETE ... OUTPUT INTO syntax to work with OPENROWSET.

    Any thoughts, ideally without using a linked server.

    Cheers

    Alex
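
    P.S. For reference, a minimal sketch of the kind of INSERT to an OPENROWSET that works for me -- the server, security, table and column names below are placeholders, and ad hoc distributed queries have to be enabled on the source server:

    -- Push the rows straight to the remote archive table over an ad hoc connection.
    -- 'ArchiveServer', the column list and the WHERE clause are placeholders.
    INSERT INTO OPENROWSET(
        'SQLNCLI',
        'Server=ArchiveServer;Trusted_Connection=yes;',
        'SELECT col1, col2 FROM ArchiveDb.dbo.ArchiveTable'
    )
    SELECT col1, col2
    FROM dbo.SourceTable
    WHERE ArchiveDate < DATEADD(DAY, -90, GETDATE());

    It's the equivalent DELETE ... OUTPUT INTO OPENROWSET(...) that I can't get to parse.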

  • alex.sqldba (12/8/2016)

    You could use SSIS.

    Assuming your delete proc is of the form

    DELETE x
    OUTPUT deleted.col1, deleted.col2, ...
    FROM tab1 x;

    You can set this proc as a data source and feed the returned data rows into your destination table; see the sketch below this post.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin
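
    For illustration, a minimal sketch of such a proc (the proc, table and column names are placeholders):

    -- Hypothetical archive proc: deletes the rows and returns them as its result set.
    CREATE PROCEDURE dbo.usp_ArchiveDelete
    AS
    BEGIN
        SET NOCOUNT ON;

        DELETE x
        OUTPUT deleted.col1, deleted.col2
        FROM dbo.SourceTable AS x
        WHERE x.ArchiveDate < DATEADD(DAY, -90, GETDATE());
    END;

    The SQL command for the SSIS source component is then just EXEC dbo.usp_ArchiveDelete.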

  • Hiya, thanks! I like this approach of using the delete/output as an SSIS Source.

    However.

    I am using the ADO.NET provider, and it's working fine, except in one particular case: if I stop the package whilst it's running, I see that rows have been appended to the archive table from the OUTPUT portion -- but they have not been deleted from the source.

    Is there a way to make it so that the rows are ALL copied AND deleted, or rolled back, as one batch?

    I am concerned that if someone kills the job, it will duplicate those records the next time it succeeds.

    Cheers!

    Alex

  • alex.sqldba (12/9/2016)

    It's rather surprising to hear that the OUTPUT is sending rows which have not actually been deleted.

    A less elegant, but more bulletproof, way of doing this would be to have the OUTPUT write to a temp table instead, and then use a SELECT from the temp table as your data source (you'd have to use EXECUTE ... WITH RESULT SETS to 'tell' SSIS what result-set structure to expect). There's a sketch of this below this post.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin
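
    A minimal sketch of that temp-table variant, again with placeholder names and column types:

    -- Hypothetical proc: capture the deleted rows in a temp table, then return them.
    CREATE PROCEDURE dbo.usp_ArchiveDeleteToTemp
    AS
    BEGIN
        SET NOCOUNT, XACT_ABORT ON;

        CREATE TABLE #deleted (col1 INT, col2 VARCHAR(50));

        DELETE x
        OUTPUT deleted.col1, deleted.col2 INTO #deleted (col1, col2)
        FROM dbo.SourceTable AS x
        WHERE x.ArchiveDate < DATEADD(DAY, -90, GETDATE());

        -- Return the archived rows for SSIS to consume.
        SELECT col1, col2 FROM #deleted;
    END;

    and the SQL command in the SSIS source becomes something like:

    EXEC dbo.usp_ArchiveDeleteToTemp
    WITH RESULT SETS ((col1 INT, col2 VARCHAR(50)));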

  • I've double-checked and confirmed that this is the case using the OLE DB provider as well as ADO.NET.

    It's annoying!

  • alex.sqldba (12/9/2016)

    I had always assumed that the whole thing was a single implicit transaction. I'm not sure where to look for an explanation of this behaviour.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Aye. Well, I've been experimenting.

    If I set the TransactionOption to 'Required' rather than 'Supported', I get an error about being unable to start a distributed transaction.

    Edit: This is on the Data Flow Task [container] Properties.

    Which I think is because the DTC here is restricted.

    But now I've hit a bit of a brick wall, as changing that option requires restarting MSDTC, and we have no test environment for this, only a live old-style cluster.

    So I am a bit nervous about stopping and starting the DTC on a cluster.

  • alex.sqldba (12/9/2016)

    I would also be nervous. Then again, I'm just a lowly developer 🙂

    Have you tried putting an explicit transaction in the proc, just in case that changes things? This sort of thing:

    SET NOCOUNT, XACT_ABORT ON;

    BEGIN TRAN;

    DELETE x
    OUTPUT deleted.col1, deleted.col2, ...
    FROM tab1 x;

    COMMIT TRAN;

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin
