Problem with log shipping

  • Hi,

    I'm trying to set up manual log shipping (I've got SQL Server 2000 Standard). I've managed to create the SQL script that restores the database in order to have a warm standby server. When I run the script in Query Analyzer, it works perfectly, but when I put it in a DTS package and run it from there, it just won't work. It always fails with: "Exclusive access could not be obtained because the database is in use."

    I've closed everything that could have a connection to the database (EM, QA), and it still gives the same message. I've checked with sp_who2 to see who's connected, and nobody is in this database...

    I guess this is a pretty simple question, but I can't find the answer.

    Thanks,

    Eric


  • My guess is that DTS is grabbing a connection before attempting to execute the script.  I've found that, generally, when I get all users out of a database for an offline operation and it throws that error, I'm the one who is in the database.

    Curiosity: why are you putting the script into DTS?  I've set up homegrown log shipping as well, and I'm not using DTS at all. I'm not saying it's wrong, just wanting to understand so that we might be able to help you better.

    My setup involves a job that does the shipping, which is not scheduled.  The tlog backup job executes the shipping job.  Currently, the file is shipped using a 'DOS' COPY command, issued via xp_cmdshell in a stored procedure, though a colleague has promised to help me set up FTP.  I have a scheduled job on the other end that does the restore, also handled within a stored procedure.  I have SQL tables in my dba database on each end with control/tracking data.  Hopefully this can give you some ideas.
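    The copy step Steve describes might be sketched roughly like this; the paths, share, and file names below are made up for illustration:

    ```sql
    -- Hypothetical sketch: ship the latest tlog backup to the standby,
    -- called from the tlog backup job via a stored procedure.
    DECLARE @cmd VARCHAR(255)
    SET @cmd = 'COPY D:\Backup\MyDB_tlog.trn \\StandbyServer\LogShip\ /Y'
    EXEC master..xp_cmdshell @cmd
    ```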

    Steve

    (edited to get my first person/ second person right)

  • I agree with Steve.  Check the database in your connection object in the DTS package.  Recently, in another post, someone had a script to restore database X that was failing with the same error, and the first line in the script was "USE X".  Make sure the database in the connection object is master or msdb.

    Aunt Kathi Data Platform MVP
    Author of Expert T-SQL Window Functions
    Simple-Talk Editor
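    A sketch of the point above: the connection running the restore must not start from (or switch into) the database being restored.  Database and file names here are placeholders:

    ```sql
    -- The connection's default database should be master or msdb, not X.
    USE master
    GO
    -- A "USE X" here would hold this connection inside X and block the restore.
    RESTORE DATABASE X
    FROM DISK = 'D:\LogShip\X_full.bak'
    WITH NORECOVERY
    GO
    ```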

  • If you're having problems with DTS not being able to lock the database ready for log shipping, I'd really suggest that you use the regular backups (full, differential, and transaction log).

    That way, if your backups work (and you have real trouble if they don't), then your log shipping will too (if the network is up).

    And even if the transfer of your latest backups fails (network down, etc.), at least you have a copy of your latest files.

    XCOPY with the /D flag takes care of moving only the latest files, not the whole shebang.
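    As a sketch, that XCOPY step could be issued from a job via xp_cmdshell; /D with no date copies only source files newer than the destination copies.  Paths here are assumptions:

    ```sql
    -- Copy only backup files newer than what is already on the standby share.
    EXEC master..xp_cmdshell 'XCOPY D:\Backup\*.trn \\Standby\LogShip\ /D /Y'
    ```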

    A script I've been tinkering with in Perl does a few checks on all the candidate files in the repository for the logs to be imported on the warm standby (date saved, LSN sequences) to make sure all's in order, then automatically restores with no recovery.
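    A similar LSN sanity check can also be done from T-SQL before restoring; the file name below is a placeholder:

    ```sql
    -- FirstLSN/LastLSN in the result set must chain from one file
    -- to the next; a gap means a missing log backup.
    RESTORE HEADERONLY
    FROM DISK = '\\Standby\LogShip\MyDB_tlog.trn'
    ```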

    So far, no failures or errors, and tying it into the job scheme gives nice messages just in case.

    I also run another local Perl script regularly to generate an HTML status page, so I know the schedule's running properly (that's the one thing SQL Server doesn't alert you to: when jobs hang or don't run!).

  • That was the problem: my connection was not set to master!  It's working now!

    For your information, here's what I did:

    1) Backup DB + Log from server X (sql server) to server Y (just another server)

    2) Every night, I backup from server Y to a tape backup

    3) My DTS package takes the logs on server Y and restores them to server W (warm server)

    My proc is not finished yet, so I can't give more details, but the plan is to run my DTS package every hour to restore the logs.
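    The hourly restore step on the warm server might be sketched like this (server, database, and file names are placeholders):

    ```sql
    -- Apply a shipped log on server W; NORECOVERY keeps the database
    -- ready to accept the next log (use WITH STANDBY for read access).
    RESTORE LOG MyDB
    FROM DISK = '\\ServerY\LogShip\MyDB_tlog.trn'
    WITH NORECOVERY
    ```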

    Thanks everyone for your help. Much appreciated!!


