September 4, 2008 at 2:05 pm
I have just started at a new company, and their primary database has 299 user tables in it. The manager wants to know which tables are no longer in use and can be done away with. What would be my best option for figuring this out?
Warm Regards,
Arthur Lorenzini
Sioux Falls, SD
September 4, 2008 at 2:41 pm
...deny select on all the tables and watch for errors in the application? 😀 Ok, bad advice.
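For the record, the "deny and see what breaks" approach would look something like this per table (dbo.SuspectTable is a made-up name):

-- Not actually recommended: deny access and watch for application errors
DENY SELECT, INSERT, UPDATE, DELETE ON dbo.SuspectTable TO public;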
Seriously, you might want to try tracing this, but if there is abundant activity that would be nuts. If it was 2005 I would say that you could look for unused indexes, but I don't know how to go about doing that in 2000.
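For reference, on 2005 that check is a query against the sys.dm_db_index_usage_stats DMV, something like this sketch (keep in mind the stats reset every time the service restarts, so it only reflects activity since then):

-- SQL Server 2005+ only: user tables with no recorded reads
-- since the last service restart
SELECT OBJECT_NAME(t.object_id) AS table_name
FROM sys.tables AS t
WHERE NOT EXISTS (
      SELECT 1
      FROM sys.dm_db_index_usage_stats AS s
      WHERE s.database_id = DB_ID()
        AND s.object_id = t.object_id
        AND s.user_seeks + s.user_scans + s.user_lookups > 0);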
Outside of running traces to try to catch the use of the tables I don't know of any way. Maybe others will.
David
@SQLTentmaker
“He is no fool who gives what he cannot keep to gain that which he cannot lose” - Jim Elliot
September 4, 2008 at 3:09 pm
I did this manually once.
Search all the code on the SQL Server and all the code in the client application for each table name. You also need to be careful with table names that give false results, like Order and OrderDetail: doing a search for Order, you can get a false match from OrderDetail.
For the app part, I had to restrict my search to strings between quotes (") to get more accurate results.
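On the server side, the search boiled down to something like this sketch against syscomments (Order is the example name; the [^a-z0-9_] brackets are what keep OrderDetail from matching):

-- SQL 2000: procs/views/triggers whose text mentions the table.
-- Caveat: syscomments stores definitions in 4000-char chunks, so a
-- name split across a chunk boundary can be missed.
SELECT DISTINCT OBJECT_NAME(c.id) AS referencing_object
FROM syscomments AS c
WHERE c.text LIKE '%[^a-z0-9_]Order[^a-z0-9_]%';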
Once you're sure that a table is not used, take the table out: move it into a temp database and back that up. I kept that DB around for a few months until we felt safe removing it from the server (I still have the backup to this day).
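The mechanics of the move were roughly this (Archive is a made-up holding database name; note that SELECT INTO copies only the data, not indexes, constraints, or triggers):

-- Park the suspect table in a holding database before dropping it
SELECT * INTO Archive.dbo.OrderDetail FROM dbo.OrderDetail;
DROP TABLE dbo.OrderDetail;
-- Keep a backup of the holding database for the waiting period
BACKUP DATABASE Archive TO DISK = 'E:\backups\Archive.bak';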
It's really a painful procedure, but it might be worth it... While we're at it, why does your boss want you to do this? I'd need a really good reason to spend a few hours, or even days, on this task.
September 4, 2008 at 3:38 pm
The reasoning behind it is that this database has been around for about 5 years; it has gone through several versions, and older tables were never cleaned out.
Warm Regards,
Arthur Lorenzini
Sioux Falls, SD
September 4, 2008 at 7:00 pm
Yeah, sorry about the "bad" news.
A little word of advice: if you are not foreseeing any problems (performance, db size, etc.), your time might be better invested elsewhere. Over 5 years, anything can end up being used by a ton of different applications. That makes this task very hard to do.
Also, I've been down the path of rebuilding an app from an existing DB. It's easier to just rebuild the design, import only the data you need, and leave out the rest. That's the best cleaning method I know of... the most expensive one too!
Best of luck!
September 5, 2008 at 6:17 am
I've unfortunately had to do this a number of times against a legacy db I inherited.
I usually start off by checking for occurrences in the syscomments table and then wade through the results to validate usage. After that I run the drops in our beta environment and let QA bang away on it for a month or so. Once they're happy, then it gets run against our prod db.
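That first pass is basically a reference count per table, along these lines (a sketch; zero server-side refs only makes a table a candidate, it still has to survive QA):

-- SQL 2000: count syscomments hits for every user table's name.
-- Tables with zero hits go on the candidate drop list.
SELECT t.name AS table_name,
       (SELECT COUNT(DISTINCT c.id)
        FROM syscomments AS c
        WHERE c.text LIKE '%' + t.name + '%') AS server_side_refs
FROM sysobjects AS t
WHERE t.xtype = 'U'
ORDER BY server_side_refs, t.name;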
It never fails that something will come up broken, so I script out both objects AND data and keep them in a safe place before running the drops.
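For the data part, a native-format bcp export does the job (a sketch via xp_cmdshell; db name, table name, server name, and path are all placeholders):

-- Export the data before the drop; pair it with the scripted DDL
EXEC master..xp_cmdshell
    'bcp MyDb.dbo.SomeTable out E:\drops\SomeTable.bcp -n -T -S MYSERVER';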
_____________________________________________________________________
- Nate
September 5, 2008 at 7:49 am
Thanks for all the advice. It looks like an ugly job awaits me and I might as well get after it.
Warm Regards,
Arthur Lorenzini
Sioux Falls, SD
September 5, 2008 at 8:11 am
It is. Scanning only syscomments assumes that 100% of the data access is done via stored procs or other SQL objects.
Since that wasn't the case for me, I also had to scan the application code. That basically tripled the amount of work needed... and even then we only had 1 application using that db.
Good luck!!!