Problem: The QA and Dev teams update their development databases to the current code via an update\upgrade script. We fall under the “State or Model Based Approach,” meaning “DB Developers only concern themselves with defining the desired end state, not how the transition occurs,” per Redgate’s documentation on DevOps approaches. The current process relies on SQL Compare to compare a previous version of the database against the current source control. Once SQL Compare finds the changes and creates the script, a very time-consuming process begins of copying and pasting changes from other static scripts in TFS into the script SQL Compare created. It is laborious and often error prone, and it can take upwards of 4 hours a week to generate these scripts. With some basic math, you can extrapolate that this process costs the company over $10K a year.
There has to be a way to automate this process and schedule the creation of the update\upgrade script.
Solution: Use the SQL Compare command line and PowerShell to automate this process.
For this process to work, you must have a database on your local machine that uses Redgate SQL Source Control and is synced to TFS, Git, or another source control platform.
The following code example includes comments to help you understand the process.
- I have created several Write-Debug messages to help you understand what is going on while the script executes. By changing the PowerShell preference variable $DebugPreference, we can turn off the debug output in one place. I always like seeing output printed to the screen.
- The $outfile_for_script variable is where the script will be written to.
- This process checks whether any files already exist in the directory specified by the $outfile_for_script variable. If any files are found, it deletes them.
- To run Redgate SQL Compare from the command line, you must set the location to the directory that contains SQLCompare.exe.
```powershell
## Change to 'SilentlyContinue' to remove Write-Debug output
$DebugPreference = 'Continue'

## This is the file path to a local directory; change it to a directory on your box.
## Better to write the files locally and then copy to the network share.
## This process only writes once to the network share.
$outfile_for_script = '<path to where script will be written>'

## Clean up any old files before the process starts
Write-Debug 'checking if script needs to be deleted'
if ((Get-ChildItem $outfile_for_script).Count -gt 0)
{
    Remove-Item $outfile_for_script\*.sql
}

## For this process to work you must set the directory to the location of SQLCompare.exe
Write-Debug 'set location of SQL Compare'
Set-Location "C:\Program Files (x86)\Red Gate\SQL Compare 13"
```
Switches Used: In the code example below, I put each switch on its own line for readability. When executing this script, you must have all switches on one line.
- /Sourcecontrol1 defines the source you are using for the comparison
- /revision1:HEAD tells SQL Compare to look at the latest version of your source-controlled database
- /sfx if you are using SQL Source Control, you must supply the SQL Compare XML fragment in a text file. Please see the link below to find this XML fragment.
- /server2 is the target server
- /u2 user for the target server
- /p2 password for the target server user account
- /db2 database used on the target server
- /include:Identical is issued because it suppresses the error raised when the objects are identical
- /scriptFile tells SQL Compare where to write the script
- /force tells SQL Compare to overwrite the file if one is already in place
- /Quiet suppresses output to the console
- /filter:"<path to sql Compare >\SQL Compare\Filters\Tables_filter.scpf"
- The filter was created in the SQL Compare GUI and saved to the file system.
- Each execution of SQL Compare with an associated filter generates a file containing only the objects the filter specifies.
- The links below provide all the documentation for the /filter switch.
```powershell
# Filter: tables
Write-Debug 'Compare tables...'
.\SQLCompare.exe /Sourcecontrol1 /revision1:HEAD
    /sfx:"<path to ..ScriptsFolderXML.txt>"
    /server2:"<target server>"
    /u2:<user with permissions to target DB>
    /p2:<user password>
    /db2:<Target Database name>
    /filter:"C:\<path to sql compare>\SQL Compare\Filters\Tables_filter.scpf"
    /include:Identical
    /Options:ie,ip,f,oec,iup,iu,nc,ndl,dacia
    /scriptFile:"$($outfile_for_script)\script.sql"
    /Force /Quiet
```
I also use several options to change the default behavior of SQL Compare; these are explained below.
/Options:ie,ip,f,oec,iup,iu,nc,ndl,dacia
Options Used:
There are lots of options you can use with SQL Compare. Here are the ones I used for our process.
- ie IgnoreExtendedProperties
- Ignores extended properties on objects and databases when comparing and deploying databases.
- ip IgnorePermissions
- Ignores permissions on objects when comparing and deploying databases.
- oec ObjectExistenceChecks
- Checks for the existence of objects affected by the deployment by adding IF EXISTS statements in the deployment script.
- f ForceColumnOrder
- If additional columns are inserted into the middle of a table, this option forces a rebuild of the table so the column order is correct after deployment. Data will be preserved.
- iup IgnoreUserProperties
- If you specify this option, users' properties are ignored, and only the user name is compared and deployed.
- iu IgnoreUsersPermissionsAndRoleMemberships
- Ignores users' permissions and role memberships.
- nc DoNotOutputCommentHeader
- When this option is specified, comments and comment headers aren't included in the output deployment script.
- incd IncludeDependencies
- Includes dependent objects when comparing and deploying databases
- ndl NoDeploymentLogging (SQL Compare 13 only)
- Disables deployment logging to SQL Monitor, which we don’t use.
- dacia DropAndCreateInsteadOfAlter
- Used when recreating functions, views, and stored procedures. Does not work with types, per the SQL Compare documentation.
The above example executes SQL Compare with the specified switches and filters on only the tables. In my process, I use five different filters and several static files. I won’t bore you with all five filters, because it’s more of the same, but I do want to show you how I get the static files into the final script. Using the Get-Content cmdlet, I pull the contents of each static file and append it to the script file.
```powershell
Write-Debug "create drop of dependent sps and then drop types"

## Print a delineation between each script creation
'-------------------' | Out-File $outfile_for_script\script.sql -Append -Encoding utf8
'-------------------' | Out-File $outfile_for_script\script.sql -Append -Encoding utf8
'-------------------' | Out-File $outfile_for_script\script.sql -Append -Encoding utf8
'-------------------' | Out-File $outfile_for_script\script.sql -Append -Encoding utf8

## The Upgrade_Drop_Types_And_Dependencies.sql script finds the dependent SPs, drops them, then drops the types
Get-Content <path to static file>\Upgrade_Drop_Types_And_Dependencies.sql | Out-File $outfile_for_script\script.sql -Append -Encoding utf8
```
Now that the comparison steps are complete, I place the contents of the script into a new file with a different naming convention, MM_DD_YYYY_Upgrade_Version.sql, and copy it to the network share. You can add additional logic to check for a file’s existence, as well as run this process multiple times a day if needed.
```powershell
[string]$date = (Get-Date -Format MM_dd_yyyy)

## Create a new file name with the date and version designation
$path = "$($outfile_for_script)\"
$version = '<version of software release>'
$filename = "_Upgrade_$($version).SQL"
$scriptpath = $path + $date + $filename

Get-Content "$($outfile_for_script)\script.sql" | Out-File $scriptpath -Encoding utf8
Write-Host "move new upgrade file from $($scriptpath) to <network share>"
Copy-Item -Path $scriptpath -Destination \\network_share
```
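If you want to guard against overwriting a script that already exists on the share, a simple existence check can be added before the copy. This is a minimal sketch, assuming the $scriptpath variable and \\network_share destination used above:

```powershell
## Sketch: only copy if a script with today's name is not already on the share
$destination = Join-Path '\\network_share' (Split-Path $scriptpath -Leaf)
if (Test-Path $destination)
{
    Write-Host "Upgrade script already exists at $destination; skipping copy"
}
else
{
    Copy-Item -Path $scriptpath -Destination $destination
}
```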
Schedule this process in a SQL Agent job using the job type Operating System (CmdExec), or call it directly from PowerShell.
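A CmdExec job step can call the script directly. This is a sketch; the script path is a placeholder for wherever you save this process:

```powershell
## Command for the SQL Agent CmdExec job step (path is hypothetical)
powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\Create_Upgrade_Script.ps1"
```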
Conclusion:
This process creates a .sql script that brings the development database in sync with the current development code.
I hope this helps you automate an update\upgrade script for your development database.
https://thesurfingdba.weebly.com/redgate-sql-compare.html
thesurfingdba@gmail.com