July 26, 2021 at 1:38 pm
In the dev environment there is a table that gets loaded with verbose log data by a .NET app. The only entries I care to retain in that table are those designated as ERROR level. Nearly 2 million entries are written to this table every hour.
What do you suggest is the best way to handle this?
Triggers - I think the table is too busy for this.
Scheduled job - maybe, but with a table this large I would prefer to truncate rather than delete the non-error types, and it is a schedule rather than an on-condition action.
Alert - I'm just starting to learn about these and I'm not sure what type to use or how to identify the condition that should fire it.
The other part of this is how to manage those 10k error records. Once the history table reaches 10k records, remove the oldest 500, perhaps?
Any help would be greatly appreciated. Thanks.
July 26, 2021 at 2:00 pm
Changing the .NET app so that it only loads the data which is required is not possible, I presume?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
July 26, 2021 at 2:03 pm
If nested triggers are disabled, just create an INSTEAD OF INSERT trigger that filters the inserted pseudo-table on the conditions you want, e.g.:
CREATE TRIGGER TR_I_YourTable
ON YourTable
INSTEAD OF INSERT
AS
BEGIN
    INSERT INTO YourTable
    SELECT YourColumns
    FROM inserted
    WHERE YourColumn LIKE '%ERROR%';
END
GO
If nested triggers are enabled you will also need to play around with TRIGGER_NESTLEVEL().
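Something like this at the top of the trigger body, perhaps (just a sketch against the trigger above - whether you bail out or pass the rows straight through depends on your setup):
-- If this firing is re-entrant (nest level > 1), stop here
-- so the filtered insert is not attempted a second time.
IF TRIGGER_NESTLEVEL(OBJECT_ID('TR_I_YourTable'), 'INSTEAD OF', 'DML') > 1
    RETURN;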
As Phil has mentioned, the best option is to sort out the .NET code.
July 26, 2021 at 2:29 pm
Agree with Ken. Use a trigger to write only the data you want into the table to begin with.
The only thing Ken left out was a:
SET NOCOUNT ON
at the start of the trigger, for efficiency.
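That is, the top of the trigger would become (same sketch as Ken's, just with the SET added):
CREATE TRIGGER TR_I_YourTable
ON YourTable
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON; -- stop the extra "n rows affected" messages from the trigger's own insert
    INSERT INTO YourTable
    SELECT YourColumns
    FROM inserted
    WHERE YourColumn LIKE '%ERROR%';
END
GO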
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
July 26, 2021 at 2:58 pm
Phil - Correct. Devs want it all in dev.
Ken - AWESOME, "INSTEAD OF"! Why didn't I think of that? Is a trigger that deletes ## history table records when sys.partitions.rows for that table shows the record count exceeds 10k the way to go? Or is there a better way?
Scott - Thanks. Good reminder.
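Whether it ends up in a trigger or a scheduled job, I imagine the trim itself would look something like this (a sketch only - dbo.YourHistoryTable and its LogDate column are made-up names):
IF (SELECT SUM(p.rows)
    FROM sys.partitions p
    WHERE p.object_id = OBJECT_ID('dbo.YourHistoryTable')
      AND p.index_id IN (0, 1)) > 10000
BEGIN
    -- Trim the oldest 500 rows; assumes an indexed LogDate column.
    WITH Oldest AS
    (
        SELECT TOP (500) *
        FROM dbo.YourHistoryTable
        ORDER BY LogDate
    )
    DELETE FROM Oldest;
END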
July 26, 2021 at 3:04 pm
Budd wrote: Phil - Correct. Devs want it all in dev.
Then add a switch to the code … this sort of thing:
IF (DEV)
DEBUGMODE="Verbose"
ELSE
DEBUGMODE="Standard"
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
July 26, 2021 at 3:04 pm
Budd wrote: Phil - Correct. Devs want it all in dev.
Ken - AWESOME, "INSTEAD OF"! Why didn't I think of that? Is a trigger that deletes ## history table records when sys.partitions.rows for that table shows the record count exceeds 10k the way to go? Or is there a better way?
Scott - Thanks. Good reminder.
If you're on SQL 2016, check whether page compression will save you significant space instead (using sys.sp_estimate_data_compression_savings). Also, 10K rows is actually rather small nowadays for SQL Server; I wouldn't worry about it until at least 200K or more rows, unless they're (very) wide rows.
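For example (the table name here is assumed):
EXEC sys.sp_estimate_data_compression_savings
    @schema_name      = 'dbo',
    @object_name      = 'YourHistoryTable',
    @index_id         = NULL,   -- all indexes
    @partition_number = NULL,   -- all partitions
    @data_compression = 'PAGE';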
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".