November 25, 2010 at 9:39 am
Duh ..
I am able to port databases of over 100 GB from one MSSQL server to another this way, so this DOES work; I have also used it in production ; )
The reason it did not work for some people is different regional settings, which produce invalid log file paths and make it fail ...
This is why you should have the Get_Nice_Time script. (Yes, I know there are batch tricks to get the date in a nice format, but they ALWAYS have bugs; blame Microsoft for not implementing a simple `date +%Y.%m.%d-%H:%M:%S` in cmd ..)
And the script configurator, which is probably going to be different for all your environments ...
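For anyone unfamiliar with the Unix command mentioned above, here is a minimal POSIX shell sketch of what a Get_Nice_Time equivalent needs to produce (one caveat: colons from `%H:%M:%S` are illegal in Windows file names, so a variant without them is safer for log paths):

```shell
# Sortable, filename-safe timestamp (what a Get_Nice_Time script would
# have to emulate in cmd). The H:M:S colons are dropped on purpose,
# since colons are not allowed in Windows file names.
STAMP=$(date +%Y.%m.%d-%H%M%S)

# A log file path built from it sorts chronologically by name.
LOGFILE="sql_deploy_${STAMP}.log"
echo "$LOGFILE"
```

The same format string (minus the colons) is what the batch-side date parsing keeps getting wrong across regional settings.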
mkdir:
D:\cas\sfw\sql\sql.0.5.9.20101023_232611\Install\scripts
Download:
http://dl.dropbox.com/u/9726039/cas/sfw/sql.0.5.9.20101023_232611_20101125_182852.zip
Unpack the zip to:
D:\cas\sfw\sql
Go to:
D:\cas\sfw\sql\sql.0.5.9.20101023_232611\Install\scripts
I have commented EACH CALL in the batch scripts, so ... if all this does not help, I cannot help you ... You could downvote me ; ) ... I am used to it ... our outsourced DBAs do it all the time ; ) ...
During TechEd 2010 in Berlin several weeks ago, Microsoft demoed their DACPAC (or whatever it is called) solution ... It might be more convenient for some, so it might be worth checking out ...
However, I have set the following requirements:
- The db should be scriptable via the command line: both schema and data.
- The db should be portable between different MSSQL servers via a simple command line.
- The db should be easy to deploy via the command line; a single double click should be enough. (This is because the so-called outsourced DBAs often do not understand what they are doing and burn money by asking pointless questions and changing the steps provided to them in the instructions ...)
November 25, 2010 at 9:58 am
In my world it is the job of the developer to provide the DBA with a single script file. The developer prefixes their script files with 01_, 02_, etc. to ensure the correct sequence, and uses a simple VB program with the FileSystemObject to assemble the scripts, with comments for filename and last-modified date, into a single file. (Each individual script ends with a GO to prevent any conflicts between the scripts.) The prior-night production backup and tlogs are restored to the dev database, and the dev applies the change script to the dev copy to verify that everything works as expected. Then that script is sent to the DBA to apply to production.
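The assembly step described above can be sketched in shell as well (the original uses a VB program with the FileSystemObject; this is a hypothetical equivalent with made-up file names, assuming GNU `date -r` for the file-modified timestamp):

```shell
# Concatenate 01_*.sql, 02_*.sql, ... in name order into one release
# script, tagging each with its filename and last-modified date.
# Demo files stand in for real change scripts.
mkdir -p /tmp/release_demo && cd /tmp/release_demo
printf 'CREATE TABLE t1 (id INT);\nGO\n'          > 01_create_tables.sql
printf 'CREATE PROCEDURE p1 AS SELECT 1;\nGO\n'   > 02_create_procs.sql

OUT=release.sql
: > "$OUT"                                # start with an empty file
for f in [0-9][0-9]_*.sql; do             # numeric prefix fixes the order
    printf -- '-- file: %s (modified: %s)\n' \
        "$f" "$(date -r "$f" '+%Y-%m-%d %H:%M')" >> "$OUT"
    cat "$f" >> "$OUT"                    # each script already ends in GO
done
```

Because every script already ends with GO, the concatenated file keeps each batch isolated, exactly as the post requires.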
November 25, 2010 at 10:07 am
Another quirk I found with SQLCMD, which affected a rollout of about 90 stored procedures, was the need for the -I option (SET QUOTED_IDENTIFIER ON), as not all SQL scripts have that setting inside. E.g.:
@echo off
REM %1 is a file listing one script path per line.
REM find /v /n "" prefixes each line with [n]; splitting on ] puts
REM the counter in %%a and the script path in %%b.
for /f "delims=] tokens=1*" %%a in ('find /v /n "" ^<%1') do (
    REM -I turns on QUOTED_IDENTIFIER for scripts that omit it.
    sqlcmd -S DBSERVER\InstanceName -E -d AdventureWorks -i %%b -I
)
Trust me on this one; it saves the developer scratching his head and you going crazy trying to figure it out.
EDIT: stole this script from somewhere, so not original, except maybe the -I 😀
Gaby
________________________________________________________________
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
November 26, 2010 at 9:42 am
An excellent presentation of how to automate these scripts! I know I will use it in my work.
I wonder if you or anyone else in this forum could take this SQL Server model and modify it to do the same things using PL/SQL and an Oracle database? Most of my work is with Oracle databases and having the ability to automate my PL/SQL scripts in this fashion would be tremendously valuable.
Thanks for your consideration!
November 26, 2010 at 12:53 pm
For Oracle you can use SQL*Plus on the command line, and the START option (i.e. @filename) will allow you to run a file. Alternatively, you can use the START command from within SQL*Plus. Documentation for SQL*Plus can be found at http://www.oracle.com/technology/docs/tech/sql_plus/index.html. The rest is basically the same.
Martin
November 29, 2010 at 9:20 am
Why, oh why, oh why, oh why do people still insist on using scripts? They do it here too and it is frustrating.
Anyone ever hear about a new object in SQL Server called stored procedures?
Not only that, but SQL Server also came out with something called the SQL Server Agent a while back (maybe not everyone has read up on it yet), and Jobs. The jobs have steps and you can control the order in which you execute stored procedures. If one fails, you can take specific action, including sending out email, paging someone, etc. It is really cool stuff. Besides, you do not need to learn hieroglyphics (@for /r %%s in (.) do (echo ..............%%s /b >>).
Not only is it more efficient; using stored procedures allows SQL Server to compile the code and results in overall faster execution. And if you need to insert something between *_A and *_B, just insert a job step, and all references in both upstream and downstream steps are automatically adjusted.
November 29, 2010 at 9:56 am
Doug Bishop (11/29/2010)
Why, oh why, oh why, oh why do people still insist on using scripts? They do it here too and it is frustrating.
Anyone ever hear about a new object in SQL Server called stored procedures?
Not only that, but SQL Server also came out with something called the SQL Server Agent a while back (maybe not everyone has read up on it yet), and Jobs. The jobs have steps and you can control the order in which you execute stored procedures. If one fails, you can take specific action, including sending out email, paging someone, etc. It is really cool stuff. Besides, you do not need to learn hieroglyphics (@for /r %%s in (.) do (echo ..............%%s /b >>).
Not only is it more efficient; using stored procedures allows SQL Server to compile the code and results in overall faster execution. And if you need to insert something between *_A and *_B, just insert a job step, and all references in both upstream and downstream steps are automatically adjusted.
Some people use a thing called version control for those stored procedure scripts. Although there are some flashy features in Visual Studio and third-party tools to help maintain and deploy code, some companies are either not in a position to use them or just don't want to, especially when it can be done on the command line at no extra licence cost.
Martin
November 29, 2010 at 10:12 am
Martin Bell-290511 (11/29/2010)
Some people use a thing called version control for those stored procedure scripts. Although there are some flashy features in Visual Studio and third-party tools to help maintain and deploy code, some companies are either not in a position to use them or just don't want to, especially when it can be done on the command line at no extra licence cost.
Martin
Okay, I am confused; license cost for what? A stored procedure is a script that has been saved to the database using CREATE PROCEDURE procname. There are no licensing costs, and the procedure gets backed up every time the database is backed up. If you have version control software, it can be used with SQL Server. If you don't want to integrate it into SQL Server, then script the procedure and save that manually under version control.
The only reason they have in our shop for using scripts is, "that is the way it used to be done so that is how they are still doing it." Whenever I have to go in and maintain code that is in a script that is run through OSQL, I change it to a stored procedure and delete the old script code from the server.
November 29, 2010 at 10:35 am
Doug Bishop (11/29/2010)
Okay, I am confused; license cost for what? A stored procedure is a script that has been saved to the database using CREATE PROCEDURE procname. There are no licensing costs, the procedure gets backed up every time the database is backed up. If you have version control software, it can be used with SQL Server. If you don't want to integrate it into SQL Server, then script the procedure and save that manually under version control.
The only reason they have in our shop for using scripts is, "that is the way it used to be done so that is how they are still doing it." Whenever I have to go in and maintain code that is in a script that is run through OSQL, I change it to a stored procedure and delete the old script code from the server.
If you are using version control then you would use the script held in version control as the definitive copy. Relying on backups to keep a history of versions of a procedure is not workable when more than one developer is working on a project, or if you need to keep an accurate history of changes. Making changes on the fly is also not acceptable on systems that require code changes to be accountable, and when you start to implement releases rather than ad-hoc changes you will usually need to run one or more scripts (procedure or otherwise) in a given order and in a reliable way... hence the reason for this thread.
Martin
November 29, 2010 at 10:54 am
Martin Bell-290511 (11/29/2010)
If you are using version control then you would use the script held in version control as the definitive copy. Relying on backups to keep a history of versions of a procedure is not workable when more than one developer is working on a project, or if you need to keep an accurate history of changes. Making changes on the fly is also not acceptable on systems that require code changes to be accountable, and when you start to implement releases rather than ad-hoc changes you will usually need to run one or more scripts (procedure or otherwise) in a given order and in a reliable way... hence the reason for this thread.
Martin
I am still confused as to how using scripts is better than using stored procedures with a job of multiple steps to control the flow. Furthermore, in Sadequl's article, he specifically mentions that you cannot check for errors that occur between scripts, something that can be done with a job, where you control the flow of execution of the steps. You could add that, but then your code to execute code starts becoming complex, using pipes and FIND and branching (mind you, extremely limited branching). With a job, each step can branch to a completely different step if the executing step fails, thus customizing error handling.
I am also somewhat opposed to saying, "execute anything in the folder and always assume the last character of the filename (prior to the period) is the sort order". The code presented would attempt to execute a temp file, a text file, a backup copy of a current script, etc. At the very least, if I was going to implement this methodology, I would have a control table in SQL Server that specified the script name and order of execution. I could then pull the script name from that table in execution order. Then I could give my scripts meaningful names without worrying about them having the "correct" prefix or suffix.
I did not state, or even imply that backups are a method of version control. I stated that you can either integrate your version control software into SQL Server, or script the procedure and manually add updates to your version control (check out existing, check in changes). I was implying that backing up the database backs up the procedures.
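The control-table idea above can be sketched on the command line too. This is a hypothetical shell version using a flat manifest file in place of a SQL Server table (no database is available here, and all file names are made up): only scripts listed in the manifest run, in manifest order, so stray backup or temp files in the folder are never executed.

```shell
# Demo folder with two real scripts and one stray file that must be
# skipped, plus a manifest giving the execution order.
mkdir -p /tmp/manifest_demo && cd /tmp/manifest_demo
printf 'SELECT 2;\n' > second.sql
printf 'SELECT 1;\n' > first.sql
printf 'junk\n'      > first.sql.bak          # must never be executed
printf 'first.sql\nsecond.sql\n' > run_order.txt

: > executed.log
while IFS= read -r script; do
    # Real use would run something like:
    #   sqlcmd -S DBSERVER -E -d MyDb -i "$script" -I
    echo "$script" >> executed.log            # record the order instead
done < run_order.txt
```

The manifest replaces the "last character of the filename is the sort order" convention, so scripts can have meaningful names, which is the point Doug is making.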
November 29, 2010 at 11:54 am
Doug Bishop (11/29/2010)
I am still confused as to how using scripts is better than using stored procedures and using a job with multiple steps to control the flow. Furthermore, in Sadequl's article, he specifically mentions that you cannot check for errors that occur between scripts, something that can be done with a job and you are controlling the flow of execution of the steps. You could add that, but then your code to execute code starts becoming complex, using pipes and FIND and branching (mind you extremely limited branching). Using a job, each step can branch to a completely different step if the executing step fails, thus customizing error handling.
For software releases, especially if they are to multiple servers or third-party systems, you need a robust and reliable method to deploy your scripts. A job is just not going to do this, and there is always the question of creating the job in the first place. There are means to check that a script has worked, so I am not sure where in Sadequl's article it says otherwise, or if it's been taken in the correct context.
Doug Bishop
I am also somewhat opposed to saying, "execute anything in the folder and always assume the last character of the filename (prior to the period) is the sort order". The code presented would attempt to execute a temp file, a text file, a backup copy of a current script, etc. At the very least, if I was going to implement this methodology, I would have a control table in SQL Server that specified the script name and order of execution. I could then pull the script name from that table in execution order. Then I could give my scripts meaningful names without worrying about them having the "correct" prefix or suffix.
If you control what is in the given folder and the names of the files, then it is not a problem. How do you get the data into the control tables that you mention?
Doug Bishop
I did not state, or even imply that backups are a method of version control. I stated that you can either integrate your version control software into SQL Server, or script the procedure and manually add updates to your version control (check out existing, check in changes). I was implying that backing up the database backs up the procedures.
You can integrate version control with SSMS, that is a tool not SQL Server itself and Visual Studio or some third party tools can also interface with version control systems. Version control is only the starting point for a release process and a small part of implementing change control. Third party tools can package up releases for you, but not everyone wants to use them or has the budgets to buy them.
Martin
November 29, 2010 at 12:10 pm
Martin Bell-290511 (11/29/2010)
Doug Bishop (11/29/2010)
I am still confused as to how using scripts is better than using stored procedures and using a job with multiple steps to control the flow. Furthermore, in Sadequl's article, he specifically mentions that you cannot check for errors that occur between scripts, something that can be done with a job and you are controlling the flow of execution of the steps. You could add that, but then your code to execute code starts becoming complex, using pipes and FIND and branching (mind you extremely limited branching). Using a job, each step can branch to a completely different step if the executing step fails, thus customizing error handling.
For software releases, especially if they are to multiple servers or third-party systems, you need a robust and reliable method to deploy your scripts. A job is just not going to do this, and there is always the question of creating the job in the first place. There are means to check that a script has worked, so I am not sure where in Sadequl's article it says otherwise, or if it's been taken in the correct context.
Doug Bishop
I am also somewhat opposed to saying, "execute anything in the folder and always assume the last character of the filename (prior to the period) is the sort order". The code presented would attempt to execute a temp file, a text file, a backup copy of a current script, etc. At the very least, if I was going to implement this methodology, I would have a control table in SQL Server that specified the script name and order of execution. I could then pull the script name from that table in execution order. Then I could give my scripts meaningful names without worrying about them having the "correct" prefix or suffix.
If you control what is in the given folder and the names of the files, then it is not a problem. How do you get the data into the control tables that you mention?
Doug Bishop
I did not state, or even imply, that backups are a method of version control. I stated that you can either integrate your version control software into SQL Server, or script the procedure and manually add updates to your version control (check out existing, check in changes). I was implying that backing up the database backs up the procedures.
You can integrate version control with SSMS, but that is a tool, not SQL Server itself; Visual Studio and some third-party tools can also interface with version control systems. Version control is only the starting point for a release process and a small part of implementing change control. Third-party tools can package up releases for you, but not everyone wants to use them or has the budget to buy them.
Martin
Dilemma: you have some form of version control software that you want to integrate for change management, but it is not integrated into SQL Server.
Solution: script the stored procedure and save the script under version control. Check that in and out of your version control. A scripted stored procedure is a script.
Chances are, if you are maintaining the scripts, you are checking them out, modifying and executing them for testing in SSMS and saving the completed script for checkin. If it is a procedure, right-click on it in the SSMS object explorer, select Script Stored Procedure As > Create to > New Query Editor Window (or File).
As far as packaging up releases, the job does that for you and control of execution is maintained in the order of the job steps.
November 29, 2010 at 12:22 pm
This may work for you, but it is not going to work, or be an option, for everyone.
November 29, 2010 at 12:53 pm
I still think my old method of using scripts is a good one. It handles dependencies, and it integrates VCS with a release process.
Building a stored procedure for a release can work, but I'm not sure it's that efficient, especially as the release grows.
http://www.sqlservercentral.com/articles/System+Development+Life+Cycle/vcspart3/525/
November 30, 2010 at 2:35 am
Hi Sadequl Hussain,
Thanks. Very useful script. I have one question: how can we capture the output of the result file into a variable?
Regards,
Vijay Singh.C