Automating the Database Patch

  • Hi

    We develop applications, and every time we release the product there is a large number of database patches that need to be run.

    Right now these .sql files are created by individual developers; the release manager discusses the ordering with the development team and then finally executes them manually.

    This has become a very cumbersome procedure, and we are trying to automate it.

    Is it possible to write a stored procedure that accepts a block of SQL statements as a parameter and then runs that block of SQL statements?

    Generally the .sql files contain ALTER scripts, INSERT statements, UPDATE statements, etc.

    One example is given below:

    if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[RULES]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
    begin
        drop table [dbo].[RULES]
        IF @@ERROR = 0
            print 'Table RULES dropped'
        ELSE
            print 'Drop of table RULES failed'
    end

    INSERT INTO [dbo].[MY_CONFIG_WORK]
    SELECT [CONFIG_SECTION], [CONFIG_KEY], [CONFIG_VALUE_TYPE], [CONFIG_VALUE] FROM [dbo].[MY_CONFIG]
    GO
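    Roughly what I have in mind is a wrapper like the sketch below (untested; the procedure name is made up). One problem I can already see is that GO is a client-side batch separator, not a T-SQL statement, so it cannot appear inside the string passed to sp_executesql.

    ```sql
    -- Sketch only: run one batch of patch SQL passed in as a string.
    -- Note: the string must not contain GO (a client-side separator).
    CREATE PROCEDURE dbo.usp_RunPatch
        @sql NVARCHAR(MAX)
    AS
    BEGIN
        BEGIN TRY
            EXEC sp_executesql @sql;
            PRINT 'Patch applied';
        END TRY
        BEGIN CATCH
            PRINT 'Patch failed: ' + ERROR_MESSAGE();
        END CATCH
    END
    ```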

    Regards

    NN

  • I am not completely aware of the approach you take during release night; there is no single specific procedure for deploying these kinds of scripts, as it depends entirely on your release process.

    But there are some steps that can make your life easier:

    1) Scripts should be syntactically correct; verify and test them in a lower environment such as testing or staging.

    2) All object references in the scripts, such as columns or tables, must exist in the production environment.

    3) Try to consolidate similar scripts to reduce the number of individual runs; this helps use resources (IO, memory, etc.) optimally. For example, if two columns are updated in two different SQL statements in two scripts, club them together if possible. Also try to do some of the migration (DML) as pre-release work.

    4) Try to drop columns/tables as the LAST step of the whole SQL process (not in the middle or at the start).

    5) Break large DML scripts into batches so that they don't badly affect the log size (you often find these kinds of scripts in your release SQL pool).

    6) Always update statistics and rebuild indexes as the last step.

    Now, LAST but not LEAST: create a testing environment and TEST, TEST these scripts there to make sure everything works fine. Run a SQL Profiler trace, check CPU, IO, and memory usage, and work on the highlighted queries. Also keep a close eye on the log file size.
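    For point 5, a batched DELETE can look like the sketch below (the table and column names are made up, and the right chunk size depends on your system):

    ```sql
    -- Delete in chunks of 5000 rows so each transaction stays small
    -- and the log file does not grow badly in one shot.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) FROM dbo.BIG_TABLE WHERE ARCHIVED = 1;
        IF @@ROWCOUNT = 0 BREAK;   -- nothing left to delete
    END
    ```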

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • Hi Bhuvnesh

    Thanks for the response; I think you have not understood my question.

    Regards

    Narayanan

  • n_narayanan (12/4/2012)


    Hi Bhuvnesh

    Thanks for the response; I think you have not understood my question.

    Regards

    Narayanan

    🙂 Hello Narayanan,

    If I am understanding your question correctly, you can keep the SQL statements in a few files and then use the osql command-line utility with these files as input files. The osql utility allows you to enter Transact-SQL statements, system procedures, and script files; it uses ODBC to communicate with the server.

    Check below link:

    http://msdn.microsoft.com/en-us/library/ms162806.aspx

  • Hi Arun

    Thanks a lot. Yes, this link has helped a lot, and I have succeeded with my first step using the sqlcmd utility.

    Microsoft is planning to remove the osql utility that you recommended; here is the link for the alternative, sqlcmd:

    http://msdn.microsoft.com/en-us/library/ms162773.aspx
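    A wrapper along these lines is roughly what that first step looks like (a sketch only: MYSERVER and MYDB are placeholder names, the sample patch files are created just for illustration, and with DRY_RUN=1, the default here, it only prints the sqlcmd commands instead of executing them):

    ```shell
    # Hypothetical patch runner: apply every .sql file in ./patches in
    # filename order via sqlcmd. MYSERVER/MYDB are placeholder names.
    mkdir -p patches
    printf 'PRINT 1\n' > patches/001_rules.sql    # sample patch files,
    printf 'PRINT 2\n' > patches/002_config.sql   # created for illustration

    DRY_RUN=${DRY_RUN:-1}   # default: only print the commands
    ran=""
    for f in patches/*.sql; do   # the shell glob expands in sorted name order
      cmd="sqlcmd -S MYSERVER -d MYDB -b -i $f"   # -b: abort on SQL error
      if [ "$DRY_RUN" = "1" ]; then
        echo "$cmd"
      else
        $cmd || { echo "patch $f failed, stopping" >&2; exit 1; }
      fi
      ran="$ran $f"
    done
    ```

    Naming the files with a numeric prefix (001_, 002_, ...) fixes the run order, which replaces the manual ordering discussion we used to have.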

    Many thanks for the pointer

    Regards

    Narayanan
