March 8, 2012 at 9:13 am
Eugene Elutin (3/8/2012)
OTF (3/8/2012)
Eugene Elutin (3/7/2012)
It's not a good design to email from a trigger; you're killing the performance of the update. If I were MS, I would not allow xp_sendmail to be called from a trigger at all. I've had to deal with such an implementation a few times... You should consider other ways:
1. The trigger inserts into some kind of "changes to email" table (usually called an audit table).
2. A regularly scheduled job or service polls the table and sends the email out.
Actually, if I'm not mistaken, the process you've mentioned is pretty much what the OP's code will do... i.e. send the mail items asynchronously, as it appears he is using Database Mail, which I believe uses Service Broker.
You're right though, probably not the best design approach.
You are right that the email will be sent asynchronously; however, the enqueueing operation (to get the MessageId back) is synchronous, and it's slow enough that you should avoid doing it in a trigger.
Not if you actually use Service Broker to do this (and not just rely on sp_send_dbmail). As in: set up the full conversation to be handled asynchronously (sending the email itself AND getting the IDs back), and just use the trigger to "drop the message" into the broker queue.
You can even use a scheduled approach to read the queue whenever you feel like it.
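To make that concrete, here is a minimal, hypothetical sketch of the "drop the message" approach. It assumes the Service Broker objects (the message type, contract, and the two services named below) already exist, and that a separate activation procedure or scheduled job reads the target queue and calls sp_send_dbmail; the table and column names follow the example used elsewhere in this thread.
-- Hypothetical sketch: the trigger only enqueues; something else sends the mail.
-- [//Audit/EmailRequest], [//Audit/EmailContract] and the two services are assumed to exist.
CREATE TRIGGER CustomerUpdateEnqueue ON customer_info
FOR UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @handle UNIQUEIDENTIFIER,
            @payload XML;
    -- build one XML payload for all changed rows (set-based, no cursor)
    SET @payload = (SELECT i.CustomerId,
                           d.f_name AS old_name,
                           i.f_name AS new_name
                    FROM inserted i
                    JOIN deleted d ON d.CustomerId = i.CustomerId
                    WHERE ISNULL(d.f_name, '') != ISNULL(i.f_name, '')
                    FOR XML PATH('change'), ROOT('changes'), TYPE);
    IF @payload IS NOT NULL
    BEGIN
        BEGIN DIALOG CONVERSATION @handle
            FROM SERVICE [//Audit/EmailRequestService]
            TO SERVICE '//Audit/EmailProcessService'
            ON CONTRACT [//Audit/EmailContract]
            WITH ENCRYPTION = OFF;
        SEND ON CONVERSATION @handle
            MESSAGE TYPE [//Audit/EmailRequest] (@payload);
    END
END
The emails are then actually built and sent by whatever reads the target queue - an activation proc or a job - completely outside the user's transaction.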
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
March 8, 2012 at 9:15 am
Sean Lange (3/8/2012)
Beginner_2008 (3/8/2012)
Sean Lange (3/8/2012)
Beginner_2008 (3/8/2012)
Thanks for the suggestions. Can anybody please provide the best alternative with an example? I am new to Database Mail.
Here is a good place to start. http://msdn.microsoft.com/en-us/library/ms175887.aspx
Any example for the case above is appreciated. I am able to implement it, but I am not getting the dynamically passed values in the trigger.
I will try to implement CDC (2008) later; for now I want to use this trigger. I only posted that code for reference, but in my actual code I am handling a different column being updated. Please suggest.
This will explain it far better than I can. http://msdn.microsoft.com/en-us/library/ms190307.aspx
Before you use this trigger you HAVE to make sure you can handle multi-row updates. At some point you are going to have to call this proc for each row that is updated.
Sean, I am able to send the mail; my question is how I can pass the values dynamically. For example, if 'John' changes to 'James', the body of the mail should show 'John' as the old value and the first name updated to 'James'.
So for the code I gave above, I expect the body of the mail to read:
Customer with CustomerId = 1234 has been updated with the previous CustomerName as 'John' and the new CustomerName as 'James'.
Thanks.
March 8, 2012 at 9:31 am
Beginner_2008 (3/8/2012)
Sean Lange (3/8/2012)
Beginner_2008 (3/8/2012)
Sean Lange (3/8/2012)
Beginner_2008 (3/8/2012)
Thanks for the suggestions. Can anybody please provide the best alternative with an example? I am new to Database Mail.
Here is a good place to start. http://msdn.microsoft.com/en-us/library/ms175887.aspx
Any example for the case above is appreciated. I am able to implement it, but I am not getting the dynamically passed values in the trigger.
I will try to implement CDC (2008) later; for now I want to use this trigger. I only posted that code for reference, but in my actual code I am handling a different column being updated. Please suggest.
This will explain it far better than I can. http://msdn.microsoft.com/en-us/library/ms190307.aspx
Before you use this trigger you HAVE to make sure you can handle multi-row updates. At some point you are going to have to call this proc for each row that is updated.
Sean, I am able to send the mail; my question is how I can pass the values dynamically. For example, if 'John' changes to 'James', the body of the mail should show 'John' as the old value and the first name updated to 'James'.
So for the code I gave above, I expect the body of the mail to read:
Customer with CustomerId = 1234 has been updated with the previous CustomerName as 'John' and the new CustomerName as 'James'.
Thanks.
Not sure what you mean. In your first example you were already getting both the previous and the new values using the inserted and deleted tables.
You will need to change your logic to something more like this:
select *
from inserted i
join deleted d on i.ID = d.ID
--set variables and send email for each row
Or use your trigger to insert into an audit table so your trigger is not stuck with RBAR (row-by-agonizing-row) processing.
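For what it's worth, a minimal sketch of that audit-table approach (the audit table, trigger name and audit column names here are made up for illustration; the customer_info columns follow the example used elsewhere in this thread):
-- hypothetical audit table; a scheduled job or service later reads rows WHERE EmailSent = 0,
-- calls msdb.dbo.sp_send_dbmail for each, and flags them as sent
CREATE TABLE dbo.CustomerChangeAudit
(
    AuditId    INT IDENTITY(1,1) PRIMARY KEY,
    CustomerId INT          NOT NULL,
    OldName    VARCHAR(100) NULL,
    NewName    VARCHAR(100) NULL,
    ChangedAt  DATETIME     NOT NULL DEFAULT (GETDATE()),
    EmailSent  BIT          NOT NULL DEFAULT (0)
);
GO
CREATE TRIGGER CustomerUpdateAudit ON customer_info
FOR UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- one set-based insert covers any number of updated rows, no RBAR in the trigger
    INSERT INTO dbo.CustomerChangeAudit (CustomerId, OldName, NewName)
    SELECT i.CustomerId, d.f_name, i.f_name
    FROM inserted i
    JOIN deleted d ON d.CustomerId = i.CustomerId
    WHERE ISNULL(d.f_name, '') != ISNULL(i.f_name, '');
END
This keeps the trigger fast and pushes the slow sp_send_dbmail calls out to the polling job.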
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
March 8, 2012 at 9:43 am
Looks like the OP is not going to listen...
OK, you can take this and kill yourself :hehe::
CREATE TRIGGER CustomerUpdateMail
ON customer_info
FOR UPDATE
AS
SET NOCOUNT ON;
declare @message nvarchar(2000);
DECLARE crsUpd CURSOR FAST_FORWARD
FOR
SELECT 'Customer with ID= ' + CAST(d.CustomerID AS VARCHAR) + ' has been updated ' +
'with previous First Name is ' + d.f_name +
' and the new First Name is ' + i.f_name AS [message]
FROM deleted d
JOIN inserted i
ON i.CustomerId = d.CustomerId
WHERE ISNULL(d.f_Name, '') != ISNULL(i.f_Name, '')
OPEN crsUpd
FETCH NEXT FROM crsUpd INTO @message
WHILE @@FETCH_STATUS = 0
BEGIN
EXEC msdb..sp_send_dbmail @profile_name='test_mail',
@recipients ='test@exam.com',@subject = 'Customer Information Updated',
@body = @message;
FETCH NEXT FROM crsUpd INTO @message
END
CLOSE crsUpd
DEALLOCATE crsUpd
March 8, 2012 at 10:10 am
I am not going to use the cursor.
😛
March 8, 2012 at 10:20 am
Beginner_2008 (3/8/2012)
I am not going to use the cursor.😛
Then you may miss some updates!
There was a story someone here remembered, about a company that went bust because of that... 😛
Good luck!
March 8, 2012 at 10:46 am
Eugene Elutin (3/8/2012)
Beginner_2008 (3/8/2012)
I am not going to use the cursor.😛
Then you may miss some updates!
There was a story someone here remembered, about a company that went bust because of that... 😛
Good luck!
The assumption that inserted only contains one row is incredibly dangerous. You might, for example, update the domain name for all rows:
Update table set email = UserID + '@' + NewDomain where OldDomain = 'SomeOldValue'
The way you coded your trigger, it will only send one email. You are going to have to find a way to send this email for every updated row.
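To see why, here is a small hypothetical demo (dbo.Users and its columns are invented, loosely following the domain-change example above): one UPDATE statement touches several rows, the trigger fires once, and inserted holds every affected row, not just one.
CREATE TABLE dbo.Users
(
    UserID    VARCHAR(50),
    email     VARCHAR(200),
    OldDomain VARCHAR(100),
    NewDomain VARCHAR(100)
);
GO
CREATE TRIGGER Users_Update ON dbo.Users
FOR UPDATE
AS
BEGIN
    DECLARE @rows INT;
    SELECT @rows = COUNT(*) FROM inserted;   -- every updated row is in here
    PRINT CAST(@rows AS VARCHAR(10)) + ' row(s) in inserted for this single UPDATE';
END
GO
INSERT INTO dbo.Users (UserID, email, OldDomain, NewDomain)
VALUES ('john', 'john@old.com', 'SomeOldValue', 'newcorp.com'),
       ('jane', 'jane@old.com', 'SomeOldValue', 'newcorp.com');

UPDATE dbo.Users
SET email = UserID + '@' + NewDomain
WHERE OldDomain = 'SomeOldValue';   -- trigger fires once and reports 2 rows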
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
March 8, 2012 at 12:32 pm
Matt Miller (#4) (3/8/2012)
Eugene Elutin (3/8/2012)
OTF (3/8/2012)
Eugene Elutin (3/7/2012)
It's not a good design to email from a trigger; you're killing the performance of the update. If I were MS, I would not allow xp_sendmail to be called from a trigger at all. I've had to deal with such an implementation a few times... You should consider other ways:
1. The trigger inserts into some kind of "changes to email" table (usually called an audit table).
2. A regularly scheduled job or service polls the table and sends the email out.
Actually, if I'm not mistaken, the process you've mentioned is pretty much what the OP's code will do... i.e. send the mail items asynchronously, as it appears he is using Database Mail, which I believe uses Service Broker.
You're right though, probably not the best design approach.
You are right that the email will be sent asynchronously; however, the enqueueing operation (to get the MessageId back) is synchronous, and it's slow enough that you should avoid doing it in a trigger.
Not if you actually use Service Broker to do this (and not just rely on sp_send_dbmail). As in: set up the full conversation to be handled asynchronously (sending the email itself AND getting the IDs back), and just use the trigger to "drop the message" into the broker queue.
You can even use a scheduled approach to read the queue whenever you feel like it.
That would be re-inventing the wheel. Database Mail already leverages Service Broker.
A lot of the people on this thread are threatening to over-engineer this solution. Stop!
Calling msdb.dbo.sp_send_dbmail is analogous to inserting a row into a separate table of your own creation. The mail is actually sent asynchronously by a Service Broker worker thread... so there is no real hit or harm in doing this inside a trigger.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
March 8, 2012 at 9:51 pm
opc.three (3/8/2012)
Matt Miller (#4) (3/8/2012)
Eugene Elutin (3/8/2012)
OTF (3/8/2012)
Eugene Elutin (3/7/2012)
It's not a good design to email from a trigger; you're killing the performance of the update. If I were MS, I would not allow xp_sendmail to be called from a trigger at all. I've had to deal with such an implementation a few times... You should consider other ways:
1. The trigger inserts into some kind of "changes to email" table (usually called an audit table).
2. A regularly scheduled job or service polls the table and sends the email out.
Actually, if I'm not mistaken, the process you've mentioned is pretty much what the OP's code will do... i.e. send the mail items asynchronously, as it appears he is using Database Mail, which I believe uses Service Broker.
You're right though, probably not the best design approach.
You are right that the email will be sent asynchronously; however, the enqueueing operation (to get the MessageId back) is synchronous, and it's slow enough that you should avoid doing it in a trigger.
Not if you actually use Service Broker to do this (and not just rely on sp_send_dbmail). As in: set up the full conversation to be handled asynchronously (sending the email itself AND getting the IDs back), and just use the trigger to "drop the message" into the broker queue.
You can even use a scheduled approach to read the queue whenever you feel like it.
That would be re-inventing the wheel. Database Mail already leverages Service Broker.
A lot of the people on this thread are threatening to over-engineer this solution. Stop!
Calling msdb.dbo.sp_send_dbmail is analogous to inserting a row into a separate table of your own creation. The mail is actually sent asynchronously by a Service Broker worker thread... so there is no real hit or harm in doing this inside a trigger.
True to a certain degree. That said, sending individual emails is still a cursor process, even if an asynchronous one. That's one thing you can improve on with your own Service Broker queue (since you can insert ALL email requests at once, in set-based fashion, into a queue of your own making). This would of course depend on whether you can be confident that you won't have a mass update some day, sending a cursor into a tailspin.
So - your mileage may vary.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
March 8, 2012 at 10:39 pm
Matt Miller (#4) (3/8/2012)
<truncated>True to a certain degree. That said, sending individual emails is still a cursor process, even if an asynchronous one. That's one thing you can improve on with your own Service Broker queue (since you can insert ALL email requests at once, in set-based fashion, into a queue of your own making). This would of course depend on whether you can be confident that you won't have a mass update some day, sending a cursor into a tailspin.
So - your mileage may vary.
No cursor necessary. The code below may provoke a visceral reaction, as it is awkward, but it makes the most of the situation, is safe, is fast, and will scale just fine.
If batch updates are a concern and you know your tolerances, you can protect yourself within the trigger by denying batches that affect more than a chosen number of rows (see the example trigger below). Here is some code to demonstrate how to do what was requested without a cursor.
Demo code can be run on any instance:
USE tempdb ;
SET NOCOUNT ON ;
IF OBJECT_ID(N'tempdb..#inserted') IS NOT NULL
DROP TABLE #inserted ;
IF OBJECT_ID(N'tempdb..#deleted') IS NOT NULL
DROP TABLE #deleted ;
CREATE TABLE #inserted
(
person_id INT,
first_name VARCHAR(100),
middle_name VARCHAR(100),
last_name VARCHAR(100)
) ;
CREATE TABLE #deleted
(
person_id INT,
first_name VARCHAR(100),
middle_name VARCHAR(100),
last_name VARCHAR(100)
) ;
INSERT INTO #deleted
(person_id, first_name, middle_name, last_name)
VALUES (1, 'Dwight', 'David', 'Johnson'),
(2, 'LeBron', 'Raymone', 'James'),
(3, 'Dwyane', 'Tyrone', 'Wade') ;
INSERT INTO #inserted
(person_id, first_name, middle_name, last_name)
VALUES (1, 'Dwight', 'David', 'Howard'),
(2, 'Rick', 'Raymone', 'James'),
(3, 'John', 'Tyrone', 'Wade') ;
-- below this will resemble the body of the trigger
DECLARE @sql VARCHAR(MAX) = '',
@newline CHAR(2) = CHAR(13) + CHAR(10),
@tab CHAR(1) = CHAR(9) ;
-- use of QUOTENAME to surround values is critical to avoid SQL injection!! do not omit!!
SELECT @sql = @sql + '
EXEC msdb.dbo.sp_send_dbmail
@profile_name = ''test_mail'',
@recipients = ''test@exam.com'',
@subject = ''Customer Information Updated'',
@body = ''Customer with ID ' + CAST(i.person_id AS VARCHAR(10)) + ' has been updated.' + CASE WHEN i.first_name != d.first_name
THEN @newline + @tab + 'Previous First Name was '
+ QUOTENAME(d.first_name, '''')
+ ' and the new First Name is ' + QUOTENAME(i.first_name,
'''') + '.'
ELSE ''
END + CASE WHEN i.last_name != d.last_name
THEN @newline + @tab + 'Previous Last Name was '
+ QUOTENAME(d.last_name, '''')
+ ' and the new Last Name is '
+ QUOTENAME(i.last_name, '''') + '.'
ELSE ''
END + ''';'
FROM #inserted i
JOIN #deleted d ON i.person_id = d.person_id ;
-- WHERE clause omitted since all rows are expected to have at minimum one auditable change; add a WHERE clause if needed
PRINT @sql
Now the example trigger:
IF EXISTS ( SELECT *
FROM sys.objects
WHERE object_id = OBJECT_ID(N'test_table')
AND type IN (N'U') )
DROP TABLE test_table ;
CREATE TABLE test_table
(
person_id INT,
first_name VARCHAR(100),
middle_name VARCHAR(100),
last_name VARCHAR(100)
) ;
GO
CREATE TRIGGER CustomerUpdateMail ON test_table
FOR UPDATE
AS
BEGIN
---------------------------------------------------------------------------
-- protect against a batch update of more rows than is typically expected
-- during normal system activity from sending out too many emails
DECLARE @rc INT,
@max_rows INT = 20 -- change this number to the max number of batch updates you want to allow
;
WITH cte
AS (
SELECT TOP (@max_rows + 1)
person_id
FROM inserted
)
SELECT @rc = COUNT(*)
FROM cte ;
IF @rc = @max_rows + 1
RAISERROR('%d is the max allowed for updating test_table', 11, 1, @max_rows) ;
---------------------------------------------------------------------------
---------------------------------------------------------------------------
-- build a command to send emails with a listing of profile changes
DECLARE @sql VARCHAR(MAX) = '',
@newline CHAR(2) = CHAR(13) + CHAR(10),
@tab CHAR(1) = CHAR(9) ;
-- use of QUOTENAME to surround values is critical to avoid SQL injection!! do not omit!!
SELECT @sql = @sql + '
EXEC msdb.dbo.sp_send_dbmail
@profile_name = ''test_mail'',
@recipients = ''test@exam.com'',
@subject = ''Customer Information Updated'',
@body = ''Customer with ID ' + CAST(i.person_id AS VARCHAR(10)) + ' has been updated.' + CASE WHEN i.first_name != d.first_name
THEN @newline + @tab + 'Previous First Name was '
+ QUOTENAME(d.first_name, '''')
+ ' and the new First Name is ' + QUOTENAME(i.first_name,
'''') + '.'
ELSE ''
END + CASE WHEN i.last_name != d.last_name
THEN @newline + @tab + 'Previous Last Name was '
+ QUOTENAME(d.last_name, '''')
+ ' and the new Last Name is '
+ QUOTENAME(i.last_name, '''') + '.'
ELSE ''
END + ''';'
FROM inserted i
JOIN deleted d ON i.person_id = d.person_id ;
-- WHERE clause omitted since all rows are expected to have at minimum one auditable change; add a WHERE clause if needed
-- send emails
EXEC(@sql) ;
---------------------------------------------------------------------------
END
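For reference, a quick, hypothetical way to exercise the trigger above (it assumes the 'test_mail' Database Mail profile and the recipient address actually exist on your instance):
INSERT INTO test_table (person_id, first_name, middle_name, last_name)
VALUES (1, 'Dwight', 'David', 'Johnson');

UPDATE test_table
SET last_name = 'Howard'
WHERE person_id = 1;   -- fires CustomerUpdateMail, which builds @sql and EXECs it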
edit: corrected the example trigger code; it was still referencing the temp tables #inserted & #deleted instead of the virtual tables inserted and deleted (thanks, Eugene)
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
March 9, 2012 at 3:35 am
opc.three (3/8/2012)
...Now the example trigger:
IF EXISTS ( SELECT *
FROM sys.objects
WHERE object_id = OBJECT_ID(N'test_table')
AND type IN (N'U') )
DROP TABLE test_table ;
CREATE TABLE test_table
(
person_id INT,
first_name VARCHAR(100),
middle_name VARCHAR(100),
last_name VARCHAR(100)
) ;
GO
CREATE TRIGGER CustomerUpdateMail ON test_table
FOR UPDATE
AS
BEGIN
---------------------------------------------------------------------------
-- protect against a batch update of more rows than is typically expected
-- during normal system activity from sending out too many emails
DECLARE @rc INT,
@max_rows INT = 20 -- change this number to the max number of batch updates you want to allow
;
WITH cte
AS (
SELECT TOP (@max_rows + 1)
person_id
FROM inserted
)
SELECT @rc = COUNT(*)
FROM cte ;
IF @rc = @max_rows + 1
RAISERROR('%d is the max allowed for updating test_table', 11, 1, @max_rows) ;
---------------------------------------------------------------------------
---------------------------------------------------------------------------
-- build a command to send emails with a listing of profile changes
DECLARE @sql VARCHAR(MAX) = '',
@newline CHAR(2) = CHAR(13) + CHAR(10),
@tab CHAR(1) = CHAR(9) ;
-- use of QUOTENAME to surround values is critical to avoid SQL injection!! do not omit!!
SELECT @sql = @sql + '
EXEC msdb.dbo.sp_send_dbmail
@profile_name = ''test_mail'',
@recipients = ''test@exam.com'',
@subject = ''Customer Information Updated'',
@body = ''Customer with ID ' + CAST(i.person_id AS VARCHAR(10)) + ' has been updated.' + CASE WHEN i.first_name != d.first_name
THEN @newline + @tab + 'Previous First Name was '
+ QUOTENAME(d.first_name, '''')
+ ' and the new First Name is ' + QUOTENAME(i.first_name,
'''') + '.'
ELSE ''
END + CASE WHEN i.last_name != d.last_name
THEN @newline + @tab + 'Previous Last Name was '
+ QUOTENAME(d.last_name, '''')
+ ' and the new Last Name is '
+ QUOTENAME(i.last_name, '''') + '.'
ELSE ''
END + ''';'
FROM #inserted i
JOIN #deleted d ON i.person_id = d.person_id ;
-- WHERE clause omitted since all rows are expected to have at minimum one auditable change; add a WHERE clause if needed
-- send emails
EXEC(@sql) ;
---------------------------------------------------------------------------
END
:w00t:
1. To check the count you can just do this (you should try to make the trigger as lightweight as possible):
if (select count(*) from inserted) > 20 -- change this to limit number of batch updates allowed
RAISERROR('%d is the max allowed for updating test_table', 11, 1, 20) ;
2. You still refer to #tables in your trigger sample.
3. I like your note about SQL injection... I would suggest defending your database against SQL injection at the point where the data is modified, well before the trigger, as by then it could be too late...
And about the previous post on emailing from a trigger:
Yes, xp_ uses Service Broker to email, and sending the email itself is asynchronous. But it enlists the message into the queue synchronously in order to get the handle (MessageId) back.
Yes, you can use Service Broker to make it fully asynchronous - and that falls into the category of methods we advised the OP to use.
However, from a practical implementation standpoint, it would be much better to have a service running on an application server that sends the email to subscribers. Quite often, company policy does not allow emailing from SQL Servers...
March 9, 2012 at 9:22 am
Eugene Elutin (3/9/2012)
1. To check the count you can just do this (you should try to make the trigger as lightweight as possible):
if (select count(*) from inserted) > 20 -- change this to limit number of batch updates allowed
RAISERROR('%d is the max allowed for updating test_table', 11, 1, 20) ;
Re: "you should try to make trigger as lightweight as possible"
I could not agree more, Eugene. Unfortunately your example will force a full scan of the inserted table (a property of count() ), whereas mine will only read the top n rows and then stop. Imagine if the inserted table contained 50,000 rows. Your select count(*) must visit all 50,000 rows to return the count. Mine will stop at row 21 and return.
2. You still refer to #tables in your trigger sample.
Thank you. I have corrected the example trigger code above.
3. I like your note about SQL injection... I would suggest defending your database against SQL injection at the point where the data is modified, well before the trigger, as by then it could be too late...
I agree with you here as well in that the calling application should be attempting to prevent SQL injection... however, multiple layers of protection are strongly advisable, if not standard, for well-written applications. As a last line of defense the database code must protect the data. If someone told you that all calling applications would prevent SQL injection, would you believe them and omit the call to QUOTENAME? I most certainly would not.
And about the previous post on emailing from a trigger:
Yes, xp_ uses Service Broker to email, and sending the email itself is asynchronous. But it enlists the message into the queue synchronously in order to get the handle (MessageId) back.
Actually, xp_sendmail uses SQL Mail, a deprecated feature of SQL Server. I do not think it uses Service Broker at all, i.e. it implements a synchronous process to send mail, which is why it is advisable to use Database Mail instead.
Yes, you can use Service Broker to make it fully asynchronous - and that falls into the category of methods we advised the OP to use.
However, from a practical implementation standpoint, it would be much better to have a service running on an application server that sends the email to subscribers. Quite often, company policy does not allow emailing from SQL Servers...
I also agree with you on your point made about having the calling application handle email duties, where practical. That said, Database Mail can be quite convenient, scales well and has the added advantage of audit tables and logging.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
March 9, 2012 at 1:28 pm
Thanks a lot.
March 12, 2012 at 4:44 am
opc.three (3/9/2012)
Re: "you should try to make trigger as lightweight as possible"
I could not agree more, Eugene. Unfortunately your example will force a full scan of the inserted table (a property of count() ), whereas mine will only read the top n rows and then stop. Imagine if the inserted table contained 50,000 rows. Your select count(*) must visit all 50,000 rows to return the count. Mine will stop at row 21 and return.
create table #t (val varchar(10))
insert #t values ('asdasd'),('sdfsd'),('werw')
select COUNT(*) from #t
select top 1 * from #T
Execute both selects and check the execution plan. You may be surprised to see that the SELECT TOP 1 is also a TABLE SCAN...
"As lightweight as possible" mostly matters for a trigger when you perform many single-row transactions. I guess it will not make a huge difference when you update millions of records at once, though I should say that your trigger simply prevents such updates from happening...
I agree with you here as well in that the calling application should be attempting to prevent SQL injection... however, multiple layers of protection are strongly advisable, if not standard, for well-written applications. As a last line of defense the database code must protect the data. If someone told you that all calling applications would prevent SQL injection, would you believe them and omit the call to QUOTENAME? I most certainly would not.
I'm not against using QUOTENAME; I just mentioned that it looks a bit too late to defend against SQL injection in a trigger... But yes, given the way you are using dynamic SQL, it will help.
Actually, xp_sendmail uses SQL Mail, a deprecated feature of SQL Server. I do not think it uses Service Broker at all, i.e. it implements a synchronous process to send mail, which is why it is advisable to use Database Mail instead.
Actually, I just used the old name for SQL Server mail by mistake. I should really have said sp_send_dbmail (Database Mail), which does use Service Broker to send email.
March 12, 2012 at 9:10 am
Eugene Elutin (3/12/2012)
opc.three (3/9/2012)
Re: "you should try to make trigger as lightweight as possible"
I could not agree more, Eugene. Unfortunately your example will force a full scan of the inserted table (a property of count() ), whereas mine will only read the top n rows and then stop. Imagine if the inserted table contained 50,000 rows. Your select count(*) must visit all 50,000 rows to return the count. Mine will stop at row 21 and return.
create table #t (val varchar(10))
insert #t values ('asdasd'),('sdfsd'),('werw')
select COUNT(*) from #t
select top 1 * from #T
Execute both selects and check the execution plan. You may be surprised to see that the SELECT TOP 1 is also a TABLE SCAN...
"As lightweight as possible" mostly matters for a trigger when you perform many single-row transactions. I guess it will not make a huge difference when you update millions of records at once, though I should say that your trigger simply prevents such updates from happening...
It should come as no surprise that both produce a table scan, as neither contains a WHERE clause. That is beside the point. The point is the I/O wasted when using select count(*). Looking at the execution plan is only part of the story, and in this case there is not much difference.
If we create a table of non-trivial size, say 100,000 rows, and look at the I/O statistics, we can clearly see the difference between the two techniques:
-- build a test table with 100,000 non-essential rows
SELECT TOP 100000
o1.*
INTO #test_objects
FROM master.sys.objects o1
CROSS JOIN master.sys.objects o2
CROSS JOIN master.sys.objects o3
GO
-- disable rowcount output to reduce clutter
SET NOCOUNT ON
-- enable I/O stats output
SET STATISTICS IO ON
-- count all rows
SELECT COUNT(*)
FROM #test_objects;
-- count 1 row
WITH cte
AS (
SELECT TOP 1
*
FROM #test_objects
)
SELECT COUNT(*)
FROM cte;
Output (line breaks adjusted for readability):
Table '#test_objects'. Scan count 1, logical reads 1174, physical reads 0, read-ahead reads 0,
lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#test_objects'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0,
lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
We still see a scan (Scan count = 1), but take a look at the reads. With the SELECT COUNT(*) we see 1174 reads; with the CTE using TOP(n) we see a mere 2. That is a huge difference in a small and contrived example. With concurrency and potentially much larger row counts, you can see which technique gives us a better chance to scale.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato