March 29, 2012 at 3:16 am
Hi all,
I'm using 2008 R2 Standard. I have a table containing around 2 million records (it will grow to 10x that). Apart from a single update every day, it is read-only, so I want to optimize it for reading as much as possible. The update can be large and I use SqlBulkCopy and/or MERGE to add/update/delete records.
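For illustration, the MERGE part of the daily load is roughly this shape (the table and column names below are just placeholders, not the real schema):

-- Staging table is filled via SqlBulkCopy, then merged into the main table
MERGE dbo.MainTable AS t
USING dbo.StagingTable AS s
    ON t.Id = s.Id
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Value) VALUES (s.Id, s.Value)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;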
I used SQL Server Profiler and Database Engine Tuning Advisor to recommend additional indexes and statistics, and after implementing the recommended changes on my local database, the read performance is massively improved. However, the update performance is now crawling and will take most of the day. So my question is about how to get the best of both worlds, if possible.
Is it just a question of experimenting with indexes to find the best compromise? Should I use Server Profiler and Tuning Advisor to analyse the update procedures (this sounds like it might complicate things)? Is it possible to 'disable' indexes and statistics before the update and re-apply them afterwards?
What's the best way to approach this?
Thanks in advance.
Paul
March 29, 2012 at 3:41 am
Drop or disable the indexes before the load, rebuild them after. First, though, check that they are necessary. DTA is very prone to over-recommending.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 29, 2012 at 4:32 am
Hi Gail,
Thanks for your reply.
Does it matter which - drop or disable? Is one better than the other in this scenario?
I've read 'Disabling an Index' in 'Stairway to SQL Server Indexes: Level 12, Create Alter Drop', and it seems to confuse the two.
I've read elsewhere that DTA can overdo things, but how do I check if the indexes/statistics are necessary? As someone who doesn't have a thorough grasp on indexing, it's very easy to just accept the recommendations!
Thanks again.
Paul
March 29, 2012 at 4:49 am
Disabling is easier because you can just rebuild the index afterwards and don't need to worry about the definition (ALTER INDEX <index name> REBUILD). If you drop it, you'll need to recreate it, so you'll need to hard-code the index keys into the job.
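Something along these lines, with placeholder table and index names:

-- Before the load
ALTER INDEX IX_BigTable_SomeColumn ON dbo.BigTable DISABLE;

-- ... bulk load / MERGE ...

-- After the load
ALTER INDEX IX_BigTable_SomeColumn ON dbo.BigTable REBUILD;

-- ALTER INDEX ALL ON dbo.BigTable REBUILD rebuilds every index on the table in one go,
-- but don't DISABLE ALL: that would disable the clustered index too and make the
-- table inaccessible until it's rebuilt.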
Stats don't matter (though I'm not personally fond of manually creating stats). For indexes, read up on the index usage stats DMV (sys.dm_db_index_usage_stats) and check this out:
http://sqlinthewild.co.za/index.php/2011/11/11/sql-university-advanced-indexing-indexing-strategies/
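As a starting point, a query along these lines shows reads versus writes per index (run it in the database in question; the counters reset when the instance restarts):

SELECT  OBJECT_NAME(us.object_id) AS table_name,
        i.name                    AS index_name,
        us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
FROM    sys.dm_db_index_usage_stats AS us
JOIN    sys.indexes AS i
        ON i.object_id = us.object_id
       AND i.index_id  = us.index_id
WHERE   us.database_id = DB_ID()
ORDER BY us.user_updates DESC;

Indexes with high user_updates and few or no seeks/scans/lookups are the first candidates to question.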
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 29, 2012 at 8:14 am
Some additional things to consider: take a snapshot of the database and log file sizes before and after a load to determine how much the database grows. Then set the files' growth increment to a fixed size (not a percentage) that is at least equal to that growth; I might consider double or even triple that myself. One of the biggest bottlenecks is the time it takes for the files to grow when they fill up.
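Roughly like this (the database and logical file names below are just examples):

-- Snapshot data/log file sizes (size is in 8 KB pages)
SELECT name, type_desc, size * 8 / 1024 AS size_mb, growth, is_percent_growth
FROM   sys.database_files;

-- Set a fixed growth increment rather than a percentage
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILEGROWTH = 2048MB);
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log,  FILEGROWTH = 1024MB);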
Also, Profiler is a great tool, but verify that your indexes are all needed and not just recommended. The more indexes you have, the longer adding data takes, and some of the recommended indexes may each suit a single query when several recommendations could have been rolled into one index.
Example indexes that can be collapsed into one.
Last Name, First Name, City
Last Name, First Name
Last Name
You can drop indexes 2 and 3, because index 1 covers all three scenarios (a seek on the leading column or columns can still use it).
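In index terms (table and column names are just examples):

CREATE INDEX IX_Person_LastName_FirstName_City ON dbo.Person (LastName, FirstName, City);
-- These two are redundant; the index above satisfies seeks on the same leading columns:
-- CREATE INDEX IX_Person_LastName_FirstName ON dbo.Person (LastName, FirstName);
-- CREATE INDEX IX_Person_LastName           ON dbo.Person (LastName);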
Also, look at the worst queries from your Profiler trace to see whether they can be rewritten to take advantage of the indexes that other well-performing queries already use, allowing you to drop the indexes that exist only for them.
As for stats, they factor into the optimizer's index choices, so I would refresh them after large loads.
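For example (the table name is a placeholder):

-- Refresh stats on the loaded table after the daily import
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

-- or, more broadly, refresh stats across the whole database:
EXEC sp_updatestats;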
March 29, 2012 at 8:19 am
Thanks again Gail. I've read your post and watched a video you linked to - very useful.
I'm a bit confused about stats now, as DTA recommended creating lots (19 groups?), and you say you're not fond of manually creating them.
I've been creating stats manually (not as many as 19 groups) and then rebuilding them at the end of each daily maintenance. Is this not good practice?
Paul
March 29, 2012 at 8:31 am
Updating stats at the end of imports is very good. I'm not that fond of manually created stats (unless auto-create stats is off).
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 30, 2012 at 8:26 am
bpw (3/29/2012)
Thanks again Gail. I've read your post and watched a video you linked to - very useful. I'm a bit confused about stats now, as DTA recommended creating lots (19 groups?), and you say you're not fond of manually creating them.
I've been creating stats manually (not as many as 19 groups) and then rebuilding them at the end of each daily maintenance. Is this not good practice?
Paul
1) Rampant DTA usage has led to some of my most involved client jobs cleaning up the mess. I spent over 200 man-hours at one client sorting out many hundreds of DTA-created indexes. Over two-thirds of them were deleted, with only about a 5% drop in aggregate read performance but a HUGE increase in concurrency and DML performance.
2) If you have lots of manually (DTA?) and automatically created stats and you are modifying/inserting lots of rows, you can get hammered by the system updating those stats each time the 20% modification threshold is hit during loads. It is often best to disable auto-update stats, do the massive modifications/inserts, then manually update stats (and possibly turn auto stats back on) - see the sketch after this list.
3) Consider getting a professional on board to help you get your indexing, stats and loading process optimized while also mentoring you on how to do the same in the future.
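For 2), the sequence would look something like this (database and table names are placeholders):

ALTER DATABASE MyDb SET AUTO_UPDATE_STATISTICS OFF;

-- ... SqlBulkCopy / MERGE load ...

UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

ALTER DATABASE MyDb SET AUTO_UPDATE_STATISTICS ON;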
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service