December 4, 2012 at 12:29 pm
We have a Log table where we store the information we catch in the application. Initially the table held relatively few records, but it has since grown to almost 20 GB.
We now generate reports such as:
1) how many Connectivity errors have been logged so far
2) how many LogDescription values in a given year contain text like '%couldnot%'
The table has the following columns:
LogID  LogDescription    LogNumber  LogPageName      ErrorType     LoggedDate
-----  ----------------  ---------  ---------------  ------------  ----------
1      Failed xxx        12         login.aspx       Application   12/02/2006
2      Couldnot connect  16         Connection.aspx  Connectivity  19/08/2007
ErrorType has 5 distinct values
LogNumber has 10 distinct values
LogPageName has 60+ distinct values
LoggedDate is the date the error was logged (rows are logged daily)
The table has many millions of records. Is partitioning the only way forward, or are there other techniques we can use to generate these reports faster?
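For reference, the two reports described above would look something like this (a sketch only; dbo.ErrorLog is a placeholder name, since the actual table name isn't given in the post, and 2007 is an example year):

```sql
-- Report 1: how many Connectivity errors have been logged so far
SELECT COUNT(*)
FROM dbo.ErrorLog
WHERE ErrorType = 'Connectivity';

-- Report 2: rows in a given year whose description contains 'couldnot'
-- (the leading wildcard forces a scan of every qualifying row)
SELECT COUNT(*)
FROM dbo.ErrorLog
WHERE LoggedDate >= '20070101'
  AND LoggedDate <  '20080101'
  AND LogDescription LIKE '%couldnot%';
```

Report 1 can benefit from an ordinary index on ErrorType; Report 2 cannot use an index seek on LogDescription because of the leading wildcard.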
December 4, 2012 at 12:34 pm
Partitioning is mostly about maintenance and data loads, not performance.
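To illustrate the point: partitioning's main payoff is cheap maintenance, such as switching an old partition out instead of deleting millions of rows. A minimal sketch, assuming yearly partitions on LoggedDate (names and boundary dates are hypothetical):

```sql
-- Yearly range partitioning: helps with data loads and purges,
-- not with making the counting reports themselves faster
CREATE PARTITION FUNCTION pfLogYear (date)
    AS RANGE RIGHT FOR VALUES ('20060101', '20070101', '20080101');

CREATE PARTITION SCHEME psLogYear
    AS PARTITION pfLogYear ALL TO ([PRIMARY]);
```

A table created on psLogYear(LoggedDate) could then drop an entire year of old log rows with a fast partition switch rather than a long-running DELETE.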
Please post query, table definitions, index definitions and execution plan, as per http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
Searching on something like '%couldnot%' is going to be slow: because of the leading wildcard, the query can't use an index seek. Have you considered full-text indexing?
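A full-text index would let the word search use the index instead of scanning every row. A rough sketch, assuming a placeholder table dbo.ErrorLog with a unique key index PK_ErrorLog on LogID (both names are hypothetical):

```sql
-- One-time setup: catalog plus a full-text index on the description column
CREATE FULLTEXT CATALOG LogCatalog;

CREATE FULLTEXT INDEX ON dbo.ErrorLog (LogDescription)
    KEY INDEX PK_ErrorLog ON LogCatalog;

-- CONTAINS uses the full-text index, unlike LIKE '%couldnot%'
SELECT COUNT(*)
FROM dbo.ErrorLog
WHERE CONTAINS(LogDescription, '"couldnot"');
```

Note that full-text search matches whole words from the index, so results can differ from a substring LIKE; whether that is acceptable depends on how the descriptions are worded.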
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 4, 2012 at 5:28 pm
As a sidebar, having 20 GB of errors would be a serious temptation for me to have someone fix the problems in the front end that are causing all of the errors.
--Jeff Moden
Change is inevitable... Change for the better is not.