Updating schema references automatically in stored procedures

  • Requirement: I need to copy all objects (tables, functions, stored procedures) from the default dbo schema to a new schema. This new schema is created programmatically on a user action from the front end.

    Problem: Copying the objects is straightforward for tables, and making a copy of the stored procedures and functions is not too bad either.

    My issue is that references to tables inside the dbo schema's stored procedures should be updated to point at the new, programmatically created schema.

    So, for example, if my stored proc has statements like

    select a,b from foo

    then, in the new schema (named, say, dbo2) the stored procedure should have its reference updated to

    select a,b from [dbo2].foo

    Right now I am achieving this by:

    1. getting the definition of the stored procs from the source db schema
    2. using a lot of SUBSTRING and CHARINDEX calls to walk the text of each procedure and stuff the name of the new schema into it.

    Is there a better way to do this? My code works when the original stored procedure references tables like the one below, since that is a simple text replacement:

    select a,b from [dbo].foo

    but does not work when it encounters

    select a,b from foo
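
    For reference, a stripped-down sketch of what I am doing today, with dbo2 and uspGetFoo as placeholder names (the real code loops over everything in sys.sql_modules):

    declare @def nvarchar(max);

    -- pull the original definition of one dbo procedure
    select @def = m.definition
    from sys.sql_modules as m
    join sys.objects as o on o.object_id = m.object_id
    where o.name = 'uspGetFoo'
      and schema_name(o.schema_id) = 'dbo';

    -- naive text replacement: only catches references that already spell out the schema
    set @def = replace(@def, '[dbo].', '[dbo2].');
    set @def = replace(@def, 'dbo.', 'dbo2.');

    -- recreate the procedure; this only lands in dbo2 if the CREATE line itself was schema-qualified
    exec sp_executesql @def;

    Unqualified references like "from foo" sail straight through the REPLACE calls, which is exactly the problem.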

     


  • Modifying all the code to point to a new schema is a very hard task. Depending on the amount and complexity of your code, it might take you a few weeks of development to cover all the cases. Maybe instead of modifying the code in the procedures, you could change the default schema for this user to the new schema? That way, when the schema name is not specified, SQL Server will look for the object in the new schema first, and you won't have to modify the code to spell out the new schema's name. I have to admit that I don't like this solution, because I don't like counting on defaults (and I do hope this user is not a member of the sysadmin server role, because in that case it will keep resolving to dbo as its default schema), but in this situation it might be the least bad option.
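
    If you go that route, the change itself is a one-liner (WhatIfUser is just a placeholder for whatever user the front end connects as):

    -- unqualified names in this user's ad-hoc statements now resolve to dbo2 first, then dbo
    alter user [WhatIfUser] with default_schema = [dbo2];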

    Adi

  • Out of interest, could you explain the background to this requirement?

    Have you thought about tracking down all missing schema references in the master (I assume 'dbo') schema and adding them? There's probably a way of doing this using one of the SQL tools out there.
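
    One way to get a rough list of the offenders is sys.sql_expression_dependencies, where referenced_schema_name comes back NULL for references that were written without a schema prefix. A sketch along those lines (it won't see anything hidden inside dynamic SQL):

    -- dbo modules that reference objects without schema-qualifying them
    select
        object_schema_name(d.referencing_id) as referencing_schema,
        object_name(d.referencing_id)        as referencing_object,
        d.referenced_entity_name
    from sys.sql_expression_dependencies as d
    where d.referenced_schema_name is null
      and object_schema_name(d.referencing_id) = 'dbo';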

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Source control here will be your buddy. First, like Phil says, make darned sure every query is using the schema as a part of the query (honestly, this should be everyone's standard regardless). With that in place, if all objects are available in source control, it should be pretty simple text manipulation through a search & replace to get them all updated. Then, deploying from source control is achievable through a number of different means. That's how I'd go about it.

    And yeah, this is an odd request. Are you creating duplicate schemas per client or something? Scaling could be a serious issue if so.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Thanks for the responses, guys. For background, we have an application that does some complex calculations on data imported via Excel.

    Our users also want to safely try "what if" scenarios with different input data or different logic in the stored procedures. So far we have been doing this by simply creating a new database, but as the number of databases goes up, so does the maintenance. Also, getting approvals on production systems is a little time consuming. So we are exploring whether creating these new schemas dynamically on user demand could help us out.

    Options on this so far seem to be:

    1. Continue down the path of SUBSTRING, CHARINDEX, pattern matching and replace.
    2. Rely on the default schema, which, as the poster mentioned, I too would prefer not to rely on. As added security I could deny SELECT permissions on the original dbo schema for the new schema's user, so there is no cross-contamination of data (see the sketch after this list).

    3. Go to the master source code and update the references in all stored procedures. I think we will do this too, as a best practice, moving forward. But with close to 3,000 stored procedures and about a hundred functions, it is going to take some time.
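
    For option 2, the deny itself would be something like this (WhatIfUser stands in for whatever user the new schema gets):

    -- keep the what-if user away from the original data entirely
    deny select on schema::dbo to [WhatIfUser];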

    Thanks for your suggestions - I think this will take a multi-pronged approach. I had also toyed with using lex or yacc to build a grammar tree and do the rewrite safely, but my parsing skills are not quite there. Maybe I will make it an open source project some day 🙂


  • I have to admit that I don't think this is simple text manipulation, because it could get very messy. Depending on how many "conventions" were used in the code, this could be very tricky. For example: some queries specify the schema name (dbo) and others don't; some code lists the tables in the FROM clause separated by commas and puts the join conditions in the WHERE clause instead of using JOIN syntax; some code builds dynamic SQL, so you only get the object's name at runtime. I know that none of these are best practice, but unfortunately I've seen all of them used in places.

    Adi

  • Agreed. Thankfully this is in the exploratory phase only. For the immediate needs, I am planning to grant individual SELECT/EXECUTE permissions only on the original dbo tables and stored procedures that I want the new schema's user to have access to. That way, I am not dependent on the "default" schema.
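
    In practice that boils down to explicit per-object grants along these lines (the object and user names here are just placeholders):

    -- grant only what the what-if user actually needs from the original schema
    grant select on dbo.foo to [WhatIfUser];
    grant execute on dbo.uspGetFoo to [WhatIfUser];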

