August 26, 2010 at 2:20 am
I have the following trigger that writes updated field data to an audit table. It all works, except that every field is written out instead of just the modified ones. I'm sure this can be fixed with the IF EXISTS line, but I'm having problems referencing the column name variable within the IF EXISTS statement. See below...
ALTER trigger ClaimUpdateTest2 on wis.dbo.claim FOR UPDATE as
Set NOCOUNT ON
IF replace(suser_sname(),'ukdngroup\','') <> 'Subhash Ludhera' RETURN
IF not exists (select * from inserted except select * from deleted) RETURN
Declare @fieldname varchar(128), @i int, @sql varchar(max), @sql2 varchar(300), @colname varchar(100)

create table #tmpColumnTable (id int, Column_Name varchar(100))
insert into #tmpColumnTable
select ordinal_position, column_name from information_schema.columns where table_name = 'claim'

select * into #ins from inserted
select * into #del from deleted

set @i = 1
while @i <= (select count(*) from #tmpColumnTable)
begin
    set @fieldname = (select column_name from #tmpColumnTable where id = @i)
    --set @sql2='select i.jobno from inserted i inner join deleted d on i.ukdnno=d.ukdnno where i.['+@fieldname+'] <> d.['+@fieldname+'])'
    -- This is the line I can't get right: outside dynamic SQL, ['+@fieldname+']
    -- is taken as a literal column name rather than the variable's value.
    if exists (select i.jobno from inserted i inner join deleted d on i.ukdnno = d.ukdnno
               where i.['+@fieldname+'] <> d.['+@fieldname+'])
    begin
        set @sql = 'insert into wis.dbo.tblaudit (date, username, ukdnno, jobnumber, [event], [field], [before], [after]) ' +
            'select getdate(), replace(suser_sname(), ''ukdngroup\'', ''''), d.ukdnno, d.micontjobno, ''U'', ''Claim'', ''' + @fieldname + ''', d.[' + @fieldname + '], i.[' + @fieldname + '] from #ins i inner join #del d on i.ukdnno = d.ukdnno'
        exec (@sql)
    end
    set @i = @i + 1
end
August 26, 2010 at 3:38 am
I have managed to get the correct data to write to the audit table by placing the IF EXISTS check and the INSERT statement within the same dynamic SQL. The only problem is that it takes 6-8 seconds to run (the table contains 328 fields). Is there any way I can get this to run faster?
CREATE trigger ClaimUpdateDelete on wis.dbo.claim FOR UPDATE as
Set NOCOUNT ON
--IF replace(suser_sname(),'ukdngroup\','') <> 'Subhash Ludhera' RETURN
IF not exists (select * from inserted except select * from deleted) RETURN
Declare @fieldname varchar(128), @i int, @sql varchar(max)

create table #tmpColumnTable (id int, Column_Name varchar(100))
insert into #tmpColumnTable
select ordinal_position, column_name from information_schema.columns where table_name = 'claim'

-- inserted/deleted aren't visible inside EXEC(), so copy them to temp tables first
select * into #ins from inserted
select * into #del from deleted

set @i = 1
while @i <= (select count(*) from #tmpColumnTable)
begin
    set @fieldname = (select column_name from #tmpColumnTable where id = @i)
    set @sql = '
        if exists (select i.jobno from #ins i inner join #del d on i.ukdnno = d.ukdnno
                   where i.[' + @fieldname + '] <> d.[' + @fieldname + '])
        insert into wis.dbo.tblaudit (date, username, ukdnno, jobnumber, [event], [field], [before], [after])
        select getdate(), replace(suser_sname(), ''ukdngroup\'', ''''), d.ukdnno, d.micontjobno, ''U'', ''Claim'', ''' + @fieldname + ''', d.[' + @fieldname + '], i.[' + @fieldname + '] from #ins i inner join #del d on i.ukdnno = d.ukdnno'
    exec (@sql)
    set @i = @i + 1
end
August 26, 2010 at 4:09 am
The easiest way to improve performance would be not to insert Inserted and Deleted into a temp table. There doesn't appear to be any value in doing that, so just use them in the query as they are.
Unfortunately, to get significant improvements, you're going to have to lose that loop, and the only way I can see of doing that is to change your design. I would recommend having one audit table for each database table, and just inserting the before and after straight into that.
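Roughly like this, as a sketch (the audit table and column names are illustrative, and I've only shown a couple of the claim columns):

-- One audit table per base table, mirroring the columns of wis.dbo.claim
-- plus a handful of audit columns.
create table wis.dbo.claim_audit (
    audit_date datetime not null default getdate(),
    audit_user varchar(128) not null default suser_sname(),
    audit_type char(1) not null,  -- 'O' = old (deleted), 'N' = new (inserted)
    ukdnno int,
    jobno varchar(50)
    -- ...and so on for the remaining claim columns, keeping their data types
)

create trigger ClaimAuditInsert on wis.dbo.claim for update as
set nocount on
if not exists (select * from inserted except select * from deleted) return
-- No loop and no dynamic SQL: two set-based inserts and you're done.
insert into wis.dbo.claim_audit (audit_type, ukdnno, jobno /*, ... */)
select 'O', ukdnno, jobno /*, ... */ from deleted
insert into wis.dbo.claim_audit (audit_type, ukdnno, jobno /*, ... */)
select 'N', ukdnno, jobno /*, ... */ from inserted

That keeps the trigger fast; working out which columns actually changed becomes an offline reporting problem rather than something the trigger does on every update.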
John
Edit: also, I can't see how your trigger isn't going to return any conversion errors, since you're inserting data from any of 328 columns into the same column in your audit table. Surely all those 328 columns don't have the same data type? And if they do, you're storing up some big problems in terms of performance (as you have already seen) and data integrity.
August 26, 2010 at 8:04 am
Thanks John...
I have to use temp tables because the trigger tables (inserted and deleted) are not visible when using dynamic SQL.
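For example, this fails inside the trigger, while the temp table version works:

-- Fails with "Invalid object name 'inserted'": the EXEC() batch runs in a
-- separate scope where the trigger's inserted/deleted pseudo-tables don't exist.
--exec ('select * from inserted')

-- Works: #temp tables created in the trigger are visible to the dynamic batch.
select * into #ins from inserted
exec ('select * from #ins')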
I did originally send all row data from deleted as 'old' and inserted as 'new', creating two rows each time. Performance-wise it was excellent, but it was hard to check which fields had changed.
The Before and After fields are varchars, and I have not had any issues converting flags or dates.
I have been using SQL Profiler, and the duration of the most complex line (the EXEC(@sql)) touches 300 milliseconds every now and then. I therefore believe the issue is down to the loop, but I cannot see an easy way round it, as it is what checks all 328 fields for changes.
August 26, 2010 at 8:15 am
Understood about temp tables - I should have spotted that myself.
It would indeed be hard to check which fields have changed, but you can write a separate routine to do that as and when you need to know, rather than impacting live performance by doing it for every column at runtime.
I'm surprised you don't get conversion errors when you try to put numbers, dates and so on into varchar columns. But even given that you don't, you're breaking the rules of normalisation by mixing different data types in the same column.
John
August 26, 2010 at 8:31 am
How would I write a routine to return previously modified field values, including user and date/time?
August 26, 2010 at 8:44 am
I can't think of an elegant way of doing it in T-SQL for a table with 328 columns. Maybe someone else can?
This may be one of those things that's better done using a different tool. For example, you could open the audit table in Excel and use conditional formatting, pivot tables or the like to highlight the changes.
John
August 26, 2010 at 8:56 am
I think it would work quicker if the trigger sent the inserted and deleted tables to a stored procedure. Then the trigger would be free and the stored procedure would carry out the task. The question is: can you send temp tables to stored procedures?
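For what it's worth, you don't need to pass them as parameters: a #temp table created in a trigger is visible to any stored procedure the trigger calls. A rough sketch (the procedure name is made up):

-- Hypothetical procedure: it can read #ins and #del directly because it
-- runs within the scope of the trigger that created them.
create procedure dbo.AuditClaimChanges as
set nocount on
select count(*) as changed_rows
from #ins i inner join #del d on i.ukdnno = d.ukdnno
go

-- ...and inside the trigger:
select * into #ins from inserted
select * into #del from deleted
exec dbo.AuditClaimChanges

Bear in mind, though, that the call is synchronous: the trigger still waits for the procedure to finish, so this on its own won't make the UPDATE return any sooner.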