September 30, 2009 at 9:13 am
I have a report table that I need non-clustered indexes on; it already has a clustered index.
My problem is that this table has over 400 fields, and I really don't want to manually add a non-clustered index to every field.
How can I do this with T-SQL?
September 30, 2009 at 9:23 am
krypto69 (9/30/2009)
I have a report table that I need non-clustered indexes on; it already has a clustered index. My problem is that this table has over 400 fields, and I really don't want to manually add a non-clustered index to every field.
How can I do this with T-SQL?
OK, you can do this in a script, but I am really interested to know why you need to index every column. Every index you add will add overhead to the data manipulation you do, and will cause you a maintenance nightmare if you need to rebuild/reorganise the indexes on a regular basis.
Just out of curiosity, what sort of application do you have that has tables with over 400 columns?
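For what it's worth, such a script would just generate the DDL from the catalog views rather than hand-writing 400 statements. A minimal sketch (the table name dbo.ReportTable is a placeholder for your actual table; this prints the statements rather than executing them, and again, actually running them is not recommended):

```sql
-- Generate (but do not execute) a CREATE INDEX statement for every column
-- of a hypothetical table dbo.ReportTable. Review the output before running
-- anything: SQL Server 2005 allows at most 249 non-clustered indexes per
-- table, and that many indexes would cripple inserts and maintenance.
SELECT 'CREATE NONCLUSTERED INDEX IX_ReportTable_' + c.name
     + ' ON dbo.ReportTable (' + QUOTENAME(c.name) + ');'
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('dbo.ReportTable')
ORDER BY c.column_id;
```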
--------------------------------------------------------------------------------------
[highlight]Recommended Articles on How to help us help you and[/highlight]
[highlight]solve commonly asked questions[/highlight]
Forum Etiquette: How to post data/code on a forum to get the best help by Jeff Moden
Managing Transaction Logs by Gail Shaw
How to post Performance problems by Gail Shaw
Help, my database is corrupt. Now what? by Gail Shaw
September 30, 2009 at 9:23 am
SQL Server 2005 has a maximum of 249 non-clustered indexes per table, so you can't have 400 indexes anyway.
Even though the practical limit is much lower, I would not recommend having more than 4-5 indexes.
You should post your table structure, query, and query plan for some index help.
September 30, 2009 at 9:27 am
This is for ad-hoc reporting and it's one big 'flat table'.
So I never know what fields they could pick; they could pick any or all of the 400 columns.
Long story, but the data in this table is imported from a vendor via SSIS.
September 30, 2009 at 9:30 am
krypto69 (9/30/2009)
This is for ad-hoc reporting and it's one big 'flat table'. So I never know what fields they could pick; they could pick any or all of the 400 columns.
Long story, but the data in this table is imported from a vendor via SSIS.
To be honest, you cannot create indexes based purely on assumptions. It would be better to either ask which columns they tend to use most, or wait until they start running queries and troubleshoot when/if you get any performance issues. That way you will know what is actually being accessed, and you can add indexes accordingly if they are needed.
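Once the queries are running, you don't have to guess which indexes earn their keep. A sketch using the index-usage DMV available in SQL Server 2005 (note these counters reset whenever the instance restarts, so let a representative workload run first):

```sql
-- Show how often each index in the current database is actually read
-- (seeks/scans/lookups) versus maintained (updates). Indexes with high
-- user_updates and near-zero reads are candidates for removal.
SELECT OBJECT_NAME(s.object_id) AS table_name,
       i.name                   AS index_name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id
 AND i.index_id  = s.index_id
WHERE s.database_id = DB_ID()
ORDER BY s.user_seeks + s.user_scans DESC;
```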
September 30, 2009 at 9:31 am
Makes sense thanks!
September 30, 2009 at 9:32 am
You may want to look at putting your data into an OLAP structure for reporting, as a table with 400 columns and 249 indexes is never going to perform well.
September 30, 2009 at 9:33 am
The only way you'll get an index on every column is to vertically partition the table (move some of the columns to separate tables), or build a lot of indexed views.
However, there's generally no need to index every column in such a table. Run a trace on every query done on the table, see what they actually need, and index for those. It's pretty much guaranteed that there will be a small number of queries that are done over and over again and could really benefit from indexing, while there will be a larger number of one-off queries where indexes wouldn't really be helpful.
Also, if you have individual indexes on each column, they will mostly be ignored, because if the query includes columns not in the index, it'll require bookmark lookups anyway. And, beyond a certain point, the SQL engine will ignore indexes because the cost of working out the solution to use them is greater than the cost of simply doing a table scan.
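Alongside a trace, the missing-index DMVs (available in SQL Server 2005) will tell you which indexes the optimizer itself wished it had for the queries actually run. A sketch (treat the output as suggestions to evaluate, not commands to apply blindly; like the usage counters, it resets on instance restart):

```sql
-- Rank missing-index suggestions recorded by the optimizer, weighted by
-- how often the index would have been used and its estimated impact.
SELECT d.statement         AS table_name,
       d.equality_columns,
       d.inequality_columns,
       d.included_columns,
       s.user_seeks,
       s.avg_user_impact
FROM sys.dm_db_missing_index_details     AS d
JOIN sys.dm_db_missing_index_groups      AS g
  ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
  ON s.group_handle = g.index_group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;
```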
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
September 30, 2009 at 10:25 pm
krypto69 (9/30/2009)
This is for ad-hoc reporting and it's one big 'flat table'. So I never know what fields they could pick; they could pick any or all of the 400 columns.
Long story, but the data in this table is imported from a vendor via SSIS.
It would probably be very well worth your while to normalize the data in several tables instead of leaving it as one very wide table.
--Jeff Moden
Change is inevitable... Change for the better is not.
October 1, 2009 at 1:16 am
krypto69 (9/30/2009)
This is for ad-hoc reporting and it's one big 'flat table'. So I never know what fields they could pick; they could pick any or all of the 400 columns.
Long story, but the data in this table is imported from a vendor via SSIS.
Does the data in this table get updated regularly?
"Keep Trying"