July 22, 2012 at 5:24 pm
I recently had cause to change the definition of a user-defined data type, and I was intrigued both by the challenges the process presented and by how little information I could find online that fully covered making such a change in a controlled, reliable manner.
Our production databases contain a little over 1000 tables, some 8000 stored procedures, 500 functions and nearly 250 user-defined types, with about 30 production databases presently in existence. The general solution to the problem of re-defining a UDT was a lot of manual work: drop this, copy data to temporary columns, re-create everything, and so on. A manual process seemed very open to catastrophic failure and one I would definitely not use in a production environment; surely if one could do the job manually, one could also handle it in a more automated manner?
The particular UDT we needed to change from an nvarchar(20) to an nvarchar(40) was used in about 40 columns across 30 tables, with some of those tables being the most heavily populated and queried in the database. These tables are supplemented by no fewer than 300 indexes, constraints and foreign keys. Manually dropping all those objects, plus the 1000-odd stored procedures that referenced columns of the specified type, across 30 production databases (not to mention 60 or so test databases) was a task to which I was not prepared to dedicate my time. I don't get paid for the mundane and repetitive, after all...
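As a sketch of the discovery step, the affected columns can be enumerated from the catalog views. The type name `MyStringType` below is a placeholder, not the actual type from my environment:

```sql
-- Sketch: list every table column that uses a given user-defined type.
-- N'MyStringType' is a placeholder name; substitute your own UDT.
SELECT s.name       AS schema_name,
       t.name       AS table_name,
       c.name       AS column_name,
       c.max_length AS max_length_bytes   -- nvarchar(20) shows as 40 bytes
FROM sys.columns AS c
JOIN sys.tables  AS t  ON t.object_id = c.object_id
JOIN sys.schemas AS s  ON s.schema_id = t.schema_id
JOIN sys.types   AS ty ON ty.user_type_id = c.user_type_id
WHERE ty.is_user_defined = 1
  AND ty.name = N'MyStringType';
```

Note that `sys.columns.max_length` is reported in bytes, so an nvarchar(20) column appears as 40.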
Too hard, can't be done my colleagues said... challenge accepted.
Attached are three scripts I created to accomplish the task: one main processing procedure and two functions which build the SQL to drop and re-create an index, constraint or key.
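For anyone curious before opening the attachments, here is a minimal sketch of the kind of DDL generation the helper functions perform, limited to nonclustered indexes (the real scripts also handle constraints and keys, included columns, filegroups, etc.). The table name is a placeholder:

```sql
-- Sketch: generate DROP and CREATE statements for the nonclustered
-- indexes on one table. Column lists are built with STUFF/FOR XML PATH,
-- which works on SQL Server 2008 (STRING_AGG is 2017+).
DECLARE @table nvarchar(261) = N'dbo.SomeTable';  -- placeholder name

SELECT
    'DROP INDEX ' + QUOTENAME(i.name) + ' ON ' + @table AS drop_sql,
    'CREATE ' + CASE WHEN i.is_unique = 1 THEN 'UNIQUE ' ELSE '' END
      + 'NONCLUSTERED INDEX ' + QUOTENAME(i.name) + ' ON ' + @table + ' ('
      + STUFF((SELECT ', ' + QUOTENAME(c.name)
                 + CASE WHEN ic.is_descending_key = 1 THEN ' DESC' ELSE '' END
               FROM sys.index_columns AS ic
               JOIN sys.columns AS c
                 ON c.object_id = ic.object_id AND c.column_id = ic.column_id
               WHERE ic.object_id = i.object_id
                 AND ic.index_id  = i.index_id
                 AND ic.is_included_column = 0
               ORDER BY ic.key_ordinal
               FOR XML PATH('')), 1, 2, '') + ')' AS create_sql
FROM sys.indexes AS i
WHERE i.object_id = OBJECT_ID(@table)
  AND i.type = 2                    -- nonclustered only
  AND i.is_primary_key = 0
  AND i.is_unique_constraint = 0;
```

The DROP statements run before the type change and the CREATE statements after it, in that order.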
Bear in mind that these scripts were designed and tested in my environment; there may be differences or situations in other environments which I did not encounter and thus have not catered for. I accept that validation could be improved, primarily by checking compatibility between the current type definition and the new definition to ensure you're not changing an nvarchar(20) to a tinyint, for example. Given that it was created to solve an internal problem and I didn't actually intend to publish the code, it's not the cleanest thing I've written, but I'm happy with it nonetheless.
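The missing validation could be sketched along these lines: treat a change as safe only when it keeps the same base type and does not shrink the length. All names and values here are placeholders:

```sql
-- Sketch: conservative compatibility check between the current UDT and
-- the proposed new definition. Placeholder names and lengths.
DECLARE @type_name sysname  = N'MyStringType',
        @new_base  sysname  = N'nvarchar',
        @new_len   smallint = 80;   -- in bytes: nvarchar(40) = 80 bytes

IF EXISTS (SELECT 1
           FROM sys.types AS ty
           JOIN sys.types AS bt           -- resolve the underlying base type
             ON bt.user_type_id = ty.system_type_id
           WHERE ty.name = @type_name
             AND ty.is_user_defined = 1
             AND (bt.name <> @new_base OR ty.max_length > @new_len))
    RAISERROR('New definition is not a safe widening of %s.', 16, 1, @type_name);
```

A stricter check could also compare precision, scale and nullability, but the base-type-and-length test catches the nvarchar-to-tinyint class of mistake.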
These scripts have been used in SQL Server 2008, SQL Server 2008 R2 and SQL Server 2012 environments with success across the board; all development, test and production databases were successfully updated. They were designed primarily for 2008, so they don't cater for columnstore indexes and other features new in 2012, but we're only in the test phase of 2012 so we don't presently make use of any new features.
What I'm looking for at this stage is feedback from the experienced amongst us on the procedure, pros and cons, information about setups in which it would not be practical, or other suggestions for handling the process better.
I am presently adapting a copy of this process to solve the problem of changing database collation. Using the guts of the process to drop and re-create objects with the new collation has so far been successful in test environments, but I've yet to deploy it in a production setting. I hope to do so shortly, once my tests have been thoroughly evaluated, at which time I may also consider posting that code.
As a side note: use of this script exposed a significant number of invalid stored procedures sitting in our databases. They were no longer in use and did not recompile, due to changes in the database schema that were incompatible with the current procedure definitions. After getting over my disbelief at how little maintenance had been done on our databases in the past, I purged nearly 500 stored procedures and UDFs from each of our databases.
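One way to surface such broken modules, as a sketch, is to attempt a re-parse of every procedure, function and view with sp_refreshsqlmodule and record the failures (it cannot refresh schema-bound modules, which would need separate handling):

```sql
-- Sketch: find modules that no longer compile against the current schema.
-- sp_refreshsqlmodule re-parses a module; if the schema has changed
-- incompatibly, it raises an error, which we catch and report.
DECLARE @name nvarchar(600);
DECLARE mod_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name)
    FROM sys.objects AS o
    WHERE o.type IN ('P', 'FN', 'IF', 'TF', 'V');   -- procs, UDFs, views
OPEN mod_cur;
FETCH NEXT FROM mod_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_refreshsqlmodule @name;
    END TRY
    BEGIN CATCH
        PRINT @name + ' failed: ' + ERROR_MESSAGE();  -- candidate for review
    END CATCH;
    FETCH NEXT FROM mod_cur INTO @name;
END;
CLOSE mod_cur;
DEALLOCATE mod_cur;
```

Anything printed by this loop is a candidate for either fixing or purging.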
************************************************************************************************
DISCLAIMER:
I wrote this procedure specifically for my own development, test and production environments.
If anyone intends to make use of this in your own environment, I strongly suggest thorough testing in a test
environment prior to production usage.
I cannot and will not make any guarantees as to the suitability or reliability of the process in environments other than my
own, and strongly recommend that only experienced DBAs make use of the process, and only after gaining an understanding
of how it works.
************************************************************************************************
May 15, 2013 at 6:43 am
Great idea - but I found a flaw...
If you have a second-level view that references another (first-level) view, the stored procedure unfortunately does not drop the second-level view, which makes the DROP TYPE fail.
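One possible fix, sketched below: derive the drop order from the dependency catalog so that referencing (outer) views are dropped before the views they depend on. This uses sys.sql_expression_dependencies, available from SQL Server 2008:

```sql
-- Sketch: order DROP VIEW statements so nested views go first.
-- A view's level grows each time another view references it transitively;
-- dropping in descending level order removes outer views before inner ones.
;WITH view_deps AS (
    SELECT v.object_id, 0 AS lvl
    FROM sys.views AS v
    UNION ALL
    SELECT d.referencing_id, vd.lvl + 1
    FROM sys.sql_expression_dependencies AS d
    JOIN view_deps AS vd ON vd.object_id = d.referenced_id
    JOIN sys.views  AS rv ON rv.object_id = d.referencing_id
)
SELECT 'DROP VIEW ' + QUOTENAME(SCHEMA_NAME(v.schema_id))
       + '.' + QUOTENAME(v.name) AS drop_sql
FROM view_deps AS vd
JOIN sys.views AS v ON v.object_id = vd.object_id
GROUP BY v.schema_id, v.name, v.object_id
ORDER BY MAX(vd.lvl) DESC;   -- most-dependent (outermost) views first
```

The matching CREATE statements would be replayed in the reverse order after the type change.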
May 20, 2013 at 1:36 pm
Hello, I've enjoyed working with these scripts and adapting them to my environment. Thank you. A couple of changes and ideas:
1. I've modified the functions to take two parameters, table name and index name, since indexes on different tables can share the same name.
2. Generated an automated loading of the two functions across multiple databases: Change_UDT_Definition_LoadFunctions.
3. Generated a wrapper script to run the proc in multiple databases: Change_UDT_Definition_WrapperScript.
4. Created logging that scripts out what is going to run before it makes any changes. (This has come in handy with the second-level views failing the DROP TYPE.)
5. The second part of the logging performs the actual transformations, drops, etc., and loads into the same table for tracking.
6. In my environment we don't need to drop PKs, just the FKs, so I've modified the script to select only the foreign keys with a WhereIs_Foreign_Key = 1.
7. Added a parameter [ @PrintOnly = 1 or 0 ] so you can run it in an environment to see what is going to happen without modifying anything.
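The @PrintOnly idea in point 7 is a standard dry-run pattern for generated DDL; a minimal sketch, with a placeholder statement:

```sql
-- Sketch: dry-run switch for generated DDL. @PrintOnly = 1 prints the
-- statement for review; 0 actually executes it. Statement is a placeholder.
DECLARE @PrintOnly bit = 1,
        @sql nvarchar(max) =
            N'ALTER TABLE dbo.SomeTable ALTER COLUMN SomeCol nvarchar(40) NULL';

IF @PrintOnly = 1
    PRINT @sql;                    -- review only, change nothing
ELSE
    EXEC sys.sp_executesql @sql;   -- apply the change
```

Running the whole process with @PrintOnly = 1 first gives a reviewable script of everything it intends to do.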
Hope this helps... let me know if you have any questions.
Best,
Andrew
Andrew Brittain
Database Administrator
www.galaxysql.com