The story came from a question someone asked me:
Does the master database support the full recovery model?
As I remembered, the master database is in the simple recovery model by default. I had never tried to put it into the full recovery model, because master is not updated frequently and is very small; I only take a daily backup of it.
Here is the database properties page of the master database on SQL Server 2008 R2.
So we can change its recovery model to "FULL", but does it really work? I tried to back it up through SSMS.
Well, I can only select the "Full" backup type. How about running the backup command?
BACKUP DATABASE [Master] TO DISK = N'C:\master.dif' WITH DIFFERENTIAL
GO
I got an error:
Msg 3024, Level 16, State 0, Line 1
You can only perform a full backup of the master database. Use BACKUP DATABASE to back up the entire master database.
Msg 3013, Level 16, State 1, Line 1
BACKUP DATABASE is terminating abnormally.
In SQL BOL, http://msdn.microsoft.com/en-us/library/ms365937(v=sql.105).aspx, it says this about the master database:
For backwards compatibility with earlier versions of Microsoft SQL Server, the recovery model of master can be set to FULL or BULK_LOGGED. However, BACKUP LOG is not supported for master. Therefore, even if the recovery model of master is changed to full or bulk-logged, the database continues to operate as if it were using the simple recovery model.
Not to mention that neither transaction log backups nor differential backups can be run against the master database. Why? Because you cannot restore the master database WITH NORECOVERY, and without NORECOVERY there is no point in keeping differential and log backups. 🙂
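Just to confirm the BOL note, a transaction log backup attempt is rejected as well; for example (the target path below is only an example):
================================
BACKUP LOG [master] TO DISK = N'C:\master.trn'
GO
================================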
The master database contains the data below:
1. Login account information
2. System parameters (sp_configure)
3. Database information
4. Server objects, for instance linked servers and server-level triggers
5. Other server-level security information
So apart from the regularly scheduled backup, you need to take a full backup of the master database whenever you change any of the data above. However, if you maintain the system for a customer, you will not be notified when the customer changes the master database. If the server is corrupted, then because there is no differential or log backup for master, you cannot restore the database to a point in time, and there will be data loss.
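For reference, a full backup is the only supported form for master; here is a minimal sketch (the folder and the date-stamped file name are my own convention, adjust as needed):
================================
-- Full backup of master; adjust the path and naming to your environment
DECLARE @file nvarchar(260) =
    N'C:\Backup\master_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';
BACKUP DATABASE [master] TO DISK = @file WITH INIT;
================================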
I was wondering whether there is a good way to back up the master database automatically after any change. The first thing we need is to capture the change event.
1. Use a system trigger to monitor the event
A DDL trigger can monitor server-level DDL. The script below lists all the DDL trigger events hierarchically:
================================
;WITH EVENTCATALOG([TYPE],[TYPE_NAME],[PARENT_NAME],[LEVEL])
AS
(
    -- Anchor: the two root server-level event groups
    -- (DDL_SERVER_LEVEL_EVENTS and ALTER_SERVER_CONFIGURATION)
    SELECT [TYPE]
          ,[TYPE_NAME]
          ,[TYPE_NAME] AS PARENT_NAME
          ,0 AS [LEVEL]
    FROM SYS.trigger_event_types
    WHERE TYPE = 10002 OR TYPE = 296
    UNION ALL
    -- Recursive part: walk down to the child events of each group
    SELECT TET.TYPE
          ,TET.TYPE_NAME
          ,EC.TYPE_NAME AS PARENT_NAME
          ,EC.[LEVEL] + 1 AS [LEVEL]
    FROM SYS.trigger_event_types TET
    INNER JOIN EVENTCATALOG EC ON TET.parent_type = EC.TYPE
)
SELECT [TYPE],[TYPE_NAME],[PARENT_NAME],[LEVEL]
FROM EVENTCATALOG
ORDER BY [LEVEL], TYPE_NAME
================================
So we just need to create a DDL trigger for the root events ALTER_SERVER_CONFIGURATION and DDL_SERVER_LEVEL_EVENTS. However, the ALTER_SERVER_CONFIGURATION event cannot be monitored by a DDL trigger. Fortunately, we do not run "ALTER SERVER CONFIGURATION" or sp_configure frequently, so monitoring DDL_SERVER_LEVEL_EVENTS is enough. Here is the sample code:
================================
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TRIGGER [SystemTrigger] ON ALL SERVER
FOR DDL_SERVER_LEVEL_EVENTS
AS
DECLARE @data XML;
DECLARE @eventtype sysname;
DECLARE @PostTime sysname;
DECLARE @LoginName sysname;
DECLARE @UserName sysname;
DECLARE @object sysname;
DECLARE @tsql nvarchar(max); -- sysname is too short and would truncate long commands
DECLARE @message varchar(max);
DECLARE @path varchar(200);
SET @data = EVENTDATA();
SET @eventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'sysname')
SET @PostTime = @data.value('(/EVENT_INSTANCE/PostTime)[1]','sysname')
SET @LoginName = @data.value('(/EVENT_INSTANCE/LoginName)[1]','sysname')
SET @UserName = @data.value('(/EVENT_INSTANCE/UserName)[1]','sysname')
SET @object = @data.value('(/EVENT_INSTANCE/DatabaseName)[1]','sysname')
SET @tsql=EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
SET @message=' On server '+ @@servername+ ' login name '+ isnull(@LoginName,'UNKNOWN')+',user '
+ISNULL(@UserName,'UNKNOWN')+' run the command '+ isnull(@eventType, 'Unknown Database Operation')
+ ' on database '+ISNULL(@object,'UNKNOWN')+' at '+ISNULL(@PostTime,'UNKNOWN') + '. '
SET @message=@message + 'SQL Command is:'
SET @message=@message + isnull(@tsql,'')
raiserror (@message, 0, 0) with log
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ENABLE TRIGGER [SystemTrigger] ON ALL SERVER
================================
If we run any server-level DDL, the command will be logged to the SQL Server errorlog. Here I dropped one database and detached another, and both commands showed up as messages in the errorlog.
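If you prefer a query window to opening the log file, something like this can filter the errorlog for the trigger's messages (xp_readerrorlog is undocumented, so treat it as a convenience only; the search string matches the text built in @message above):
================================
-- 0 = current log, 1 = SQL Server errorlog, then a search string
EXEC sys.xp_readerrorlog 0, 1, N'run the command';
================================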
2. Back up the master database
In the first step we captured the server-level event; now we need to back up the master database automatically. That is a challenge, because you cannot back up a database inside a DDL trigger, so we need to find other solutions:
a) Call a SQL Agent job from the DDL trigger
You can create a SQL Agent job that only backs up the master database, with no schedule, and then start the job from the DDL trigger.
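Here is a minimal sketch of such a job; the job name 'backupmaster' matches the one started by the trigger below, and the backup path is just an example:
================================
USE msdb;
GO
-- Unscheduled job that only backs up master; the DDL trigger starts it on demand
EXEC dbo.sp_add_job
     @job_name = N'backupmaster',
     @enabled  = 1;
EXEC dbo.sp_add_jobstep
     @job_name      = N'backupmaster',
     @step_name     = N'backup master',
     @subsystem     = N'TSQL',
     @database_name = N'master',
     @command       = N'BACKUP DATABASE [master] TO DISK = N''C:\Backup\master_ddl.bak'' WITH INIT';
-- Attach the job to the local server so SQL Agent will run it
EXEC dbo.sp_add_jobserver
     @job_name    = N'backupmaster',
     @server_name = N'(local)';
GO
================================
Then alter the DDL trigger to start this job after logging the event: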
================================
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER TRIGGER [SystemTrigger] ON ALL SERVER
FOR DDL_SERVER_LEVEL_EVENTS
AS
DECLARE @data XML;
DECLARE @eventtype sysname;
DECLARE @PostTime sysname;
DECLARE @LoginName sysname;
DECLARE @UserName sysname;
DECLARE @object sysname;
DECLARE @tsql nvarchar(max); -- sysname is too short and would truncate long commands
DECLARE @message varchar(max);
DECLARE @path varchar(200);
SET @data = EVENTDATA();
SET @eventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'sysname')
SET @PostTime = @data.value('(/EVENT_INSTANCE/PostTime)[1]','sysname')
SET @LoginName = @data.value('(/EVENT_INSTANCE/LoginName)[1]','sysname')
SET @UserName = @data.value('(/EVENT_INSTANCE/UserName)[1]','sysname')
SET @object = @data.value('(/EVENT_INSTANCE/DatabaseName)[1]','sysname')
SET @tsql=EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)')
SET @message=' On server '+ @@servername+ ' login name '+ isnull(@LoginName,'UNKNOWN')+',user '
+ISNULL(@UserName,'UNKNOWN')+' run the command '+ isnull(@eventType, 'Unknown Database Operation')
+ ' on database '+ISNULL(@object,'UNKNOWN')+' at '+ISNULL(@PostTime,'UNKNOWN') + '. '
SET @message=@message + 'SQL Command is:'
SET @message=@message + isnull(@tsql,'')
WAITFOR DELAY '00:00:02'
EXEC msdb..sp_start_job 'backupmaster'
raiserror (@message, 0, 0) with log;
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ENABLE TRIGGER [SystemTrigger] ON ALL SERVER
================================
The solution is not perfect, because:
- The DDL trigger calls a SQL Agent job to back up master; if the job is already running, the call fails and you get an error message.
- If SQL Agent is not running, the backup does not happen and the DDL trigger reports an error.
- Be careful when you use SSMS. For instance, if you create a database from SSMS, the CREATE_DATABASE and ALTER_DATABASE events are raised many times, so the DDL trigger runs many times as well. You'd better replace DDL_SERVER_LEVEL_EVENTS in the trigger with a smaller scope, for example a trigger only for the CREATE_DATABASE event, as in the sketch after this list.
- When calling a SQL Agent job to back up master, it is possible that the job runs before the trigger (and its transaction) has finished, so the backup might not contain the new change... although this did not happen during my testing.
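For example, here is a hedged sketch of narrowing the trigger scope; the event list is only illustrative, and you would keep the logging body from the version above if you still want the errorlog entries:
================================
-- Narrower scope: fire only for the events you care about
-- instead of the whole DDL_SERVER_LEVEL_EVENTS group
ALTER TRIGGER [SystemTrigger] ON ALL SERVER
FOR CREATE_DATABASE, DROP_DATABASE, CREATE_LOGIN, ALTER_LOGIN, DROP_LOGIN
AS
    EXEC msdb.dbo.sp_start_job N'backupmaster';
GO
================================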
b) Back up the master database with a SQL Agent job running periodically
With this approach you don't even need the DDL trigger for the backup itself: just create a SQL Agent job that backs up the master database periodically, maybe every 1 or 5 minutes. The problem is that you will end up with many backup files for master, and backing it up that frequently might have a small performance impact. So instead we can set a flag in the DDL trigger and have the job check the flag: if it is set, back up the master database, as in the sketch below.
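A minimal sketch of the flag approach, assuming a helper table msdb.dbo.MasterBackupFlag (a name I made up) and an example backup path:
================================
USE msdb;
GO
-- Flag table: a single row, 1 = master changed and needs a backup
CREATE TABLE dbo.MasterBackupFlag (BackupNeeded bit NOT NULL);
INSERT INTO dbo.MasterBackupFlag VALUES (0);
GO
-- In the DDL trigger, instead of starting a job, just set the flag:
--     UPDATE msdb.dbo.MasterBackupFlag SET BackupNeeded = 1;
-- The job scheduled every 1 to 5 minutes then runs:
IF EXISTS (SELECT 1 FROM msdb.dbo.MasterBackupFlag WHERE BackupNeeded = 1)
BEGIN
    BACKUP DATABASE [master] TO DISK = N'C:\Backup\master_flag.bak' WITH INIT;
    UPDATE msdb.dbo.MasterBackupFlag SET BackupNeeded = 0;
END
GO
================================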
This solution is also not perfect:
- More code and more moving parts.
- Setting the flag can cause blocking.
- There is a larger time gap between the change and the master backup.
So we can see how difficult it is to implement disaster recovery without transaction log backups.