How to deploy many stored procs to multiple DBs faster?

  • Hello all,

    I've inherited an environment where 1000+ databases are running on a single SQL instance (granted, these are not VLDBs 🙂 ). I also need to deploy 200+ stored procs to each of those DBs; there is a SQL script file with all the needed drop/create statements. Here is the issue - it takes 30+ seconds to execute the script against one DB. That means 30,000+ seconds, or 500+ minutes, to run it against all DBs sequentially. Even if I use RedGate's SQL MultiScript and run 25 parallel threads, I'm still looking at ~20hrs, and that is a prohibitively long time...

    What are my options to speed up such a process? I think it takes time because SQL runs syntax checks when creating new procs, keeps track of dependencies (sys.sql_dependencies, etc.), and so on... Is there any way to "avoid" such checks?

  • Another option is to use CMS.

    You can deploy a script to an entire set of servers at once that way.
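
    If all the databases live on one instance, a rough single-instance equivalent is the undocumented (and unsupported) sp_MSforeachdb, which runs a command once per database, substituting "?" with each database name. A sketch, assuming the app databases follow an AppDb% naming pattern (my assumption) and using a hypothetical dbo.SomeReleaseCheck proc - note that since CREATE PROC must be the first statement in its batch, you can't simply paste the whole drop/create release script into the command string:

    ```sql
    -- Run one command in every matching database on the instance.
    -- sp_MSforeachdb is undocumented/unsupported; "?" becomes each db name.
    EXEC sys.sp_MSforeachdb N'
    IF ''?'' LIKE ''AppDb%''
    BEGIN
        EXEC (''USE [?]; EXEC dbo.SomeReleaseCheck;'');
    END';
    ```

    It's best suited to short per-database commands, or to driving the deployment rather than containing it.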

    I have seen some places develop their own deployment tool in C# to automate the process as well.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

    Thank you. Yes, we have our own tool, and we can deploy to multiple servers at once. But the real issue is the time it takes to run the drop/create script against a single DB. Even with all the tools/multi-threading available, it is hard to manage more than 5-10 apps open. If this were a one-time deal that would be fine, but this is a pretty regular activity - once a month.
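
    One way past the 5-10-open-apps limit is to script the fan-out instead of driving it from a GUI: generate a sqlcmd command line per target database and hand the list to a batch runner that executes N of them at a time. A sketch - the server name, script path, and the AppDb% filter are all assumptions:

    ```sql
    -- Generate one sqlcmd invocation per target database; feed the output
    -- to a batch file or scheduler that runs a fixed number in parallel.
    -- -E = trusted connection, -b = abort the batch on error.
    SELECT 'sqlcmd -S MyServer -E -b -d ' + QUOTENAME(name)
         + ' -i "C:\release\procs.sql"'
    FROM sys.databases
    WHERE name LIKE 'AppDb%'
    ORDER BY name;
    ```

    That keeps the per-database runtime the same, but makes the degree of parallelism a parameter rather than a pile of open windows.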

  • Why do you think that doing this all at once is going to decrease the amount of time; i.e. it may take 30 seconds/db now, but when you run it on all dbs it "might" take a heck of a lot longer. You may want to test your theory somehow before assuming anything.

    Jared
    CE - Microsoft

    Yes, it might... That's why I said running it with 25 threads using MultiScript does make it faster, but not fast enough; and having more than one instance of MultiScript open does not really help. Again, I'm less concerned with finding ways of running it in parallel, and more interested in possibilities for speeding up a single script execution. And this is not a data update/schema change script (where some indexing, etc. could help); it is a straight drop/create proc script, like this:

    drop proc proc1
    go
    create proc proc1
    ..
    go

    drop proc proc2
    go
    create proc proc2
    ..
    go

    ...

    drop proc proc200
    go
    create proc proc200
    ..

  • Why drop instead of alter?

    Jared
    CE - Microsoft
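
    For what it's worth, ALTER also keeps existing permissions on the proc and avoids a window where the proc doesn't exist at all, which DROP/CREATE doesn't. A sketch of one proc from the script above rewritten that way - proc1 and its body are placeholders, and on newer versions (SQL Server 2016 SP1+) CREATE OR ALTER collapses the whole pattern into one statement:

    ```sql
    -- Create a stub only if the proc is missing, so the ALTER always succeeds
    -- and any existing GRANTs on dbo.proc1 survive the release.
    IF OBJECT_ID('dbo.proc1', 'P') IS NULL
        EXEC ('CREATE PROC dbo.proc1 AS RETURN;');
    GO
    ALTER PROC dbo.proc1
    AS
    BEGIN
        -- original proc body goes here
        RETURN;
    END
    GO
    ```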

    30 seconds for dropping and creating over 200 procs does not sound unreasonable. It starts to get crazy when you have to run this on 1,000 DBs. Suddenly you are dropping and creating over 200,000 procs. That is going to take a lot of time. The amount of I/O alone is quite a bit with that much work. Once you add in parsing and compiling, I just don't think this is ever going to be a really fast process.

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

    Well, that's how the script was generated. I could see ALTER being faster than DROP/CREATE... not sure by how much, though; I will have to test.

  • Unless this information is sensitive.....

    Why are you doing this task?

    What's the purpose of these stored procedures?

    These procs are used by our web app; this is the standard release procedure and has been used for quite some time, but its slowness is becoming a problem because I can't limit access to the web site for many hours while updates are running.

  • Ok.... Old release process that doesn't scale.... doh!

    Hmmm

    Unless you can buy some time and try what I'm going to suggest going forward.... you're gonna have to grit your teeth and bear it, I'd say. I'm amazed you have 1000 databases in one instance... the DBs must be tiny or the db cache must be massive.... Have you ever looked at the parse rates of queries running on these DBs?

    As an aside... to resolve this permanently: what about getting some time with the development team, looking at how you can centralise these stored procs into another DB (or dare I say it, even in master somewhere... security nerds, attack me now!), and then calling them from that DB...
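
    Something like this is what I mean - one real copy of each proc, deployed once per release into a shared database (CentralDb here is a made-up name), with each app database keeping only a thin wrapper that never changes. Cross-database calls do change your security and ownership-chaining picture, so that side needs checking first:

    ```sql
    -- Deployed once per release, in CentralDb only:
    --   CREATE PROC dbo.proc1 AS ... (the real body)
    --
    -- Deployed once ever, in each of the 1000 app databases:
    CREATE PROC dbo.proc1
    AS
        EXEC CentralDb.dbo.proc1;
    GO
    ```

    A release then touches one database instead of 1000.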

    This process needs to change... unless they like having the following issues:

    a) Non-scalable releases that increase outage time as new web app DBs are deployed

    b) Risk of large scale issues if a logic error appears in any of those stored procs which affect the application functioning correctly

    c) 1000 different databases which could contain any previously released version if one of them has been missed from the release process by accident (and let's face it, accidents happen all the time).

    d) A narked-off DBA every time they want to 'tweak' something in those SPs....

    And, as I always say: developers aren't DBAs... and the number of times I run into client configurations running something that hasn't been eyeballed by a competent DBA is shocking... Swiss cheese has fewer holes....

    Good luck!

    Michael
