February 8, 2011 at 9:33 pm
Hi all,
I need to restore a group of .bak files at regular intervals and also generate a log of this. Which is the best method, and could I have the script as well?
February 8, 2011 at 9:38 pm
Schedule the restore as a job and, in the job step properties, specify the location and name of the output file.
The output file will be your log file for that particular job in that specific location.
Plenty of scripts are available on this website for restore jobs.
----------
Ashish
February 8, 2011 at 9:46 pm
You can make use of SQL Agent jobs for this: put your multiple database restore scripts in a job step and direct the step output to a file (you can treat that file as the restore log, written to any location on your server). You can then schedule the jobs at your convenience.
A sample script that creates a job for backing up a single database is attached for your reference:
USE [msdb]
GO
-- Create the job itself
DECLARE @jobId BINARY(16)
EXEC msdb.dbo.sp_add_job @job_name=N'backup',
    @enabled=1,
    @notify_level_eventlog=0,
    @notify_level_email=2,
    @notify_level_netsend=2,
    @notify_level_page=2,
    @delete_level=0,
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=N'sa', @job_id = @jobId OUTPUT
SELECT @jobId
GO
-- Target the job at the local server
EXEC msdb.dbo.sp_add_jobserver @job_name=N'backup', @server_name = N'BIPIN-EASI-A10'
GO
USE [msdb]
GO
-- Add the T-SQL step; @output_file_name is the log file and @flags=2 appends to it on each run
EXEC msdb.dbo.sp_add_jobstep @job_name=N'backup', @step_name=N'backup',
    @step_id=1,
    @cmdexec_success_code=0,
    @on_success_action=1,
    @on_fail_action=2,
    @retry_attempts=0,
    @retry_interval=0,
    @os_run_priority=0, @subsystem=N'TSQL',
    @command=N'backup database master to disk=''d:\master.bak''',
    @database_name=N'master',
    @output_file_name=N'D:\backuplog',
    @flags=2
GO
USE [msdb]
GO
-- Set the remaining job properties (start step, notification levels, owner)
EXEC msdb.dbo.sp_update_job @job_name=N'backup',
    @enabled=1,
    @start_step_id=1,
    @notify_level_eventlog=0,
    @notify_level_email=2,
    @notify_level_netsend=2,
    @notify_level_page=2,
    @delete_level=0,
    @description=N'',
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=N'sa',
    @notify_email_operator_name=N'',
    @notify_netsend_operator_name=N'',
    @notify_page_operator_name=N''
GO
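Since the actual goal here is restores rather than backups, the job step command would hold RESTORE statements instead of BACKUP. A minimal sketch of such a step, assuming a job named 'restore' has already been created with sp_add_job as above, and using a hypothetical backup path D:\testdb.bak and log path D:\restorelog.txt:

-- Hypothetical restore step: the job name, database name, and paths are assumptions.
EXEC msdb.dbo.sp_add_jobstep @job_name=N'restore', @step_name=N'restore testdb',
    @subsystem=N'TSQL',
    @command=N'RESTORE DATABASE [testdb] FROM DISK = N''D:\testdb.bak'' WITH REPLACE, STATS = 10',
    @database_name=N'master',
    @output_file_name=N'D:\restorelog.txt',    -- this file becomes the restore log
    @flags=2                                    -- append to the output file on every run
GO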
February 8, 2011 at 9:58 pm
Thanks to all.
Here each .bak file is for a different database, for example:
1. testdb.bak for testdb
2. storedb.bak for storedb
3. silksdb.bak for silksdb, etc.
All of these should be restored. How can I read each backup file name through a scheduled job?
February 8, 2011 at 11:10 pm
Use T-SQL RESTORE commands in the SQL editor of the job creation step. If you have trouble writing these scripts, you can use SSMS to generate them for you: restore any single database through SSMS, then click the Script option at the top left of the restore dialog box.
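For reference, a script generated that way typically looks like the one below; the database name, logical file names, and paths are placeholders rather than values taken from your server:

-- Placeholder example of an SSMS-generated restore script; adjust names and paths.
RESTORE DATABASE [testdb]
FROM DISK = N'D:\backups\testdb.bak'
WITH FILE = 1,
    MOVE N'testdb' TO N'D:\Data\testdb.mdf',           -- only needed if relocating the data file
    MOVE N'testdb_log' TO N'D:\Data\testdb_log.ldf',   -- only needed if relocating the log file
    NOUNLOAD, REPLACE, STATS = 10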
February 8, 2011 at 11:58 pm
hi
In my case, a folder contains n files; the file count is not fixed, and they are backups of n different databases. Sometimes there may be 10 files, sometimes 1 or 2. What I want is to restore all n backup files.
A fixed script like "RESTORE DATABASE [archive] FROM DISK = N'c:\archive_23nov20101000.bak' WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10" will restore a single database only.
How do I need to change this script?
February 9, 2011 at 12:11 am
Since your backup files are not fixed and the count of these .bak files changes every time, a fixed script will not cover it.
We can achieve this if you have knowledge of Java, C++, or DOS shell commands.
-> You could write and deploy a Java program in the SQL backup directory that uses Java's file-handling objects and methods to count the .bak files, put the file names into an array, and execute the SQL restore commands at runtime.
-> The command prompt also has a whole set of commands that can do the same task as the Java program, and you can then use the osql command-line utility to restore the databases (a T-SQL-only sketch along the same lines appears after this list).
-> If you can provide clarity on the points below, I can suggest a different way of doing this:
What are the sources of these backup files? Are these backup files FTPed from other servers to the destination server?
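As a related T-SQL-only option (an alternative to the Java/batch approaches above, not the only way), below is a minimal sketch that lists the folder and restores each file. It assumes xp_cmdshell is enabled, the backups sit in a hypothetical folder D:\backups, and each file is named <databasename>.bak as in your earlier example:

-- Minimal sketch: restore every .bak file found in a folder.
-- Assumptions: xp_cmdshell is enabled, backups live in D:\backups (hypothetical path),
-- and each file is named <databasename>.bak, e.g. testdb.bak restores database testdb.
DECLARE @files TABLE (fname NVARCHAR(260))
INSERT INTO @files (fname)
EXEC master.dbo.xp_cmdshell 'dir /b D:\backups\*.bak'

DECLARE @fname NVARCHAR(260), @db SYSNAME, @sql NVARCHAR(MAX)
DECLARE c CURSOR FOR SELECT fname FROM @files WHERE fname LIKE '%.bak'
OPEN c
FETCH NEXT FROM c INTO @fname
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @db = LEFT(@fname, LEN(@fname) - 4)     -- strip the .bak extension to get the database name
    SET @sql = N'RESTORE DATABASE [' + @db + N'] FROM DISK = N''D:\backups\' + @fname
             + N''' WITH REPLACE, STATS = 10'
    PRINT @sql                                  -- lands in the job step output file, i.e. the restore log
    EXEC (@sql)
    FETCH NEXT FROM c INTO @fname
END
CLOSE c
DEALLOCATE c

Run from a scheduled job step with an output file set, this keeps the whole loop inside T-SQL and writes each restore statement and its messages to the same log file discussed earlier.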
regards
Bipan
February 9, 2011 at 1:38 am
Thanks, Bipan.
The source of the backup files is the live server.
I will try the DOS script approach.