December 20, 2005 at 11:52 am
Hi,
Well, I'm administering a data warehousing database, and I need a trigger that fires each time a row in a table is updated, takes the current data of that row, and inserts it into an "archive table".
The purpose is to build an archive of the data: an update overwrites the old values, and we don't want to lose them. We want to keep a trace in a backup table so our application can work with historical data.
How can I build this trigger? Can it be dynamic? There are around 300 tables, and I would like to build my trigger one time and have it apply to all of them!
Thanks, people; any advice is welcome!
December 20, 2005 at 12:41 pm
Are you planning on having one archive table for each of your core tables, or one archive table that needs to house all inserts to all 300 tables?
December 20, 2005 at 12:46 pm
I'm planning to have an archive table for each table. Each archive table is identical to the original, but without the primary key (to permit duplicate inserts), only indexes.
December 20, 2005 at 12:55 pm
Because you will need to know each table's archive table in order to create the triggers, I don't see a way to build one trigger and have it work for all. You can, however, use SQL to generate your CREATE TRIGGER DDL code. This may need tweaking, but it may save you some time. Keep in mind that what I've done here in the trigger is insert everything from the inserted pseudo-table into the audit table, which means that your audit tables and core tables must have identical schemas. Again, this may need tweaking, but it should give you something to start with.
DECLARE @TBID int,
@TableName sysname,
@SQLstring nvarchar(4000)
-- Walk the user tables in ID order
SET @TBID = 0
SELECT @TBID = MIN(ID) FROM sysobjects WHERE xtype = 'U'
AND sysobjects.ID is not null and sysobjects.ID > @TBID
WHILE @TBID is not null
BEGIN
SELECT @TableName = sysobjects.name
FROM sysobjects
WHERE sysobjects.id = @TBID
-- Build the trigger DDL for this table
SET @SQLstring = 'CREATE TRIGGER [dbo].[tu_' + @TableName + '] ON [dbo].[' + @TableName + '] FOR INSERT' + CHAR(13)
SET @SQLstring = @SQLstring + 'AS' + CHAR(13)
SET @SQLstring = @SQLstring + 'BEGIN' + CHAR(13)
-- Note: T-SQL requires INSERT ... SELECT without parentheses around the SELECT
SET @SQLstring = @SQLstring + 'INSERT INTO [dbo].[Audit_' + @TableName + '] SELECT * FROM INSERTED' + CHAR(13)
SET @SQLstring = @SQLstring + 'END' + CHAR(13)
PRINT @SQLstring
-- Advance to the next user table
SELECT @TBID = MIN(ID) FROM sysobjects WHERE xtype = 'U'
AND sysobjects.ID is not null and sysobjects.ID > @TBID
END
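For reference, here is what the generated DDL would look like for a hypothetical table named Customer (the table name is illustrative, not from the thread):

```sql
-- Example output of the generator loop above, for a table named Customer.
-- The trigger copies every newly inserted row into the matching audit table,
-- which assumes Audit_Customer has the same column layout as Customer.
CREATE TRIGGER [dbo].[tu_Customer] ON [dbo].[Customer] FOR INSERT
AS
BEGIN
INSERT INTO [dbo].[Audit_Customer] SELECT * FROM INSERTED
END
```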
December 20, 2005 at 1:08 pm
Really, you've given me good material to start with. I'll run some tests and see what results I get. I know my question is an unusual one, but I'll get it working!
I'll keep you updated on the results. Thank you!
December 20, 2005 at 1:23 pm
I'm sure you noticed, but the script that I posted will generate trigger code for all of your user tables. If you need to exclude certain tables, you can add code to skip the generation section based on sysobjects.name. Using the PRINT statement like I did will let you view all of the DDL code in your Messages window. You can change PRINT to EXEC sp_executesql @SQLstring and all of your triggers will be added, or cut and paste from the Messages window back into QA. Also, keep in mind that I did not include any environment settings such as SET QUOTED_IDENTIFIER OFF, SET ANSI_NULLS, etc. You may want to add these based on how you structure your environment. Good luck.
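As a sketch of the "execute instead of print" route described above (the SET options shown are examples; choose the ones that match your environment):

```sql
-- Set the environment options the triggers should be created under,
-- then run the generated DDL. sp_executesql executes the string as its
-- own batch, which satisfies the rule that CREATE TRIGGER must be the
-- first statement in a batch.
SET QUOTED_IDENTIFIER ON
SET ANSI_NULLS ON
EXEC sp_executesql @SQLstring
```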
December 20, 2005 at 2:45 pm
Great.
Thanks a lot again.
December 21, 2005 at 9:44 am
To add to your project: I would have a primary key if I were you. There are scripts here that will build audit tables for you, including the triggers for INSERT / UPDATE / DELETE. You can use those, and they will build the triggers and audit tables at the push of a button for an entire database.
After this is done you should put a primary key in place, perhaps with the same field combination as in your 'production' table, and add an audit field (AuditID) as a bigint. Take care to consider the maximum size of a bigint and determine whether this will fit your needs for the maximum size of the audit table prior to archiving.
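A minimal sketch of an audit table along these lines, using a hypothetical Customer table (all names and column types here are illustrative, not from the thread):

```sql
-- Archive table for a hypothetical dbo.Customer table. A surrogate
-- bigint identity serves as the primary key, so the same CustomerID
-- can appear many times (once per audited change).
CREATE TABLE dbo.Audit_Customer (
    AuditID      bigint IDENTITY(1,1) NOT NULL,
    CustomerID   int NOT NULL,     -- original key; no longer unique here
    Name         varchar(100) NULL,
    ModifiedDate datetime NULL,
    CONSTRAINT PK_Audit_Customer PRIMARY KEY (AuditID)
)
```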
My impression is that these tables will be very large. Consider this when applying your indexes. You may not want as many indexes on these tables as you have in the production environment. Rather, limit the scope of how users can access the data, thereby limiting the size of your indexes. This will save space and increase performance, even on your production tables, because the larger your 'audit trail' tables become, the slower the triggers will insert data.
If you do not use the scripts that I speak of here and proceed on your own, then consider using AFTER INSERT triggers rather than ON INSERT. And be sure to have error handling so that if certain criteria are not met, your trigger RETURNs rather than rejecting the entire update for both the audit trail and production tables.
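A hedged sketch of an AFTER trigger with a guard clause along these lines, again using a hypothetical Customer table (names and columns are illustrative). Since the original poster wants to archive the pre-update values, this version fires on UPDATE and reads from the deleted pseudo-table:

```sql
-- AFTER UPDATE audit trigger with a guard clause: if there is nothing
-- to audit, RETURN instead of failing and rolling back the production
-- update along with the audit insert.
CREATE TRIGGER dbo.tu_Customer ON dbo.Customer
AFTER UPDATE
AS
BEGIN
    -- Bail out when the UPDATE affected zero rows.
    IF NOT EXISTS (SELECT * FROM deleted)
        RETURN
    -- Archive the pre-update values exposed by the deleted pseudo-table.
    INSERT INTO dbo.Audit_Customer (CustomerID, Name, ModifiedDate)
    SELECT CustomerID, Name, ModifiedDate FROM deleted
END
```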
** Triggers are tricky and CAN be dangerous in a database. They should be used only when no other options avail themselves. If you do not know your way around SQL, and triggers in particular, you should consider having these written for you.
Regards, Jeffery Williams
http://www.linkedin.com/in/jwilliamsoh