August 25, 2017 at 7:26 am
wim.buyens - Monday, September 14, 2015 7:24 AM: You can also use CLR and run all the code from T-SQL
I love it when people say things like that.... and then fail to post any code or even a link to a reference to demonstrate how.
And, seriously... write compiled code to do what is frequently a one-off?
--Jeff Moden
Change is inevitable... Change for the better is not.
August 25, 2017 at 7:31 am
wmyers01+SSC - Monday, September 14, 2015 7:59 AM: You know you can skip all of that and just run the following from the command line:
for %f in (*.sql) do echo osql -S servername -E -i %f
+1 for that. Just in case you don't know about it, look up the FORFILES command in DOS for a possible alternative. And, yeah... OSQL was deprecated a long time ago, even if this was posted 2 years ago. As already suggested, have a look at SQLCMD.
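Off the top of my head, a FORFILES version might look something like the following (untested; substitute your own server name and script folder):
forfiles /p C:\SQLScripts /m *.sql /c "sqlcmd -S servername -E -i @path"
FORFILES expands @path to the quoted full path of each matching file, so each script gets handed to SQLCMD in turn.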
--Jeff Moden
Change is inevitable... Change for the better is not.
August 25, 2017 at 7:52 am
I do something exactly like this in PowerShell; with that, I can capture the output of the script, whether it was an info message, a data set, or an error message.
I think that is a much better practice, with more control.
$DirectoryToProcess = "F:\StandardTakeOver\"
foreach ($CurrentFileName in Get-ChildItem -path $DirectoryToProcess -Filter SQL_*.sql | sort-object )
{
#add an Output sub-directory, and then the file name
$out = [System.IO.Path]::Combine([System.IO.Path]::Combine($DirectoryToProcess,"Output") , $CurrentFileName.name.split(".")[0] + ".txt") ;
#invoke-sqlcmd -InputFile $CurrentFileName.fullname | format-table | out-file -filePath $out
#Flags:
# -S server\instancename
# -E = trusted connection
# -i input file
# -I Enable Quoted Identifier: critical for stored proc creations
$CurrentFileName.name #debug/echo out the current filename so I can see what is being processed
sqlcmd -S "(local)" -E -I -i $CurrentFileName.fullname | format-table | out-file -filePath $out
}
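One caveat if you adapt this: Out-File won't create the Output sub-directory for you, so a guard like this before the loop helps (a minimal sketch, assuming the same $DirectoryToProcess):
#make sure the Output sub-directory exists before the loop tries to write to it
$OutputDirectory = Join-Path $DirectoryToProcess "Output"
if (-not (Test-Path $OutputDirectory)) { New-Item -ItemType Directory -Path $OutputDirectory | Out-Null }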
Lowell
August 25, 2017 at 9:55 am
BTW, Jason... cool article. I know what your goal was and you nailed that. The article is also spawning some pretty good side-bars. Thanks for taking the time!
--Jeff Moden
Change is inevitable... Change for the better is not.
August 25, 2017 at 9:58 am
Lowell - Friday, August 25, 2017 7:52 AM: I do something exactly like this in PowerShell; with that, I can capture the output of the script, whether it was an info message, a data set, or an error message. I think that is a much better practice, with more control. [PowerShell script quoted in full above]
Cool. And it's commented, as well.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 25, 2017 at 10:17 am
I once wrote a PowerShell script to release multiple files into several environments and log messages and errors. It requires creating some text files with run-time arguments (so you don't have to modify the script itself). We used this script for all releases at my old job. To make an entry invisible to the script, precede it with a colon (:).
For folder run order, use entries like this in DBFolders.txt:
1 - DatabaseOne
2 - DatabaseTwo
3 - DatabaseThree
@JeffModen, you could probably clean it up and streamline it.
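Since the script itself isn't posted here, here's a minimal sketch of how that DBFolders.txt convention might be parsed; the release root path and folder layout are hypothetical:
$releaseRoot = "C:\Releases"   #hypothetical release root
$folders = Get-Content (Join-Path $releaseRoot "DBFolders.txt") |
    Where-Object { $_ -match '\S' -and -not $_.TrimStart().StartsWith(':') } |   #skip blanks and ":"-hidden entries
    ForEach-Object {
        $order, $name = $_ -split '\s*-\s*', 2   #split "1 - DatabaseOne" into run order and folder name
        [pscustomobject]@{ Order = [int]$order; Folder = $name.Trim() }
    } |
    Sort-Object Order
foreach ($f in $folders) {
    #run every .sql file in that database's folder, in name order
    Get-ChildItem -Path (Join-Path $releaseRoot $f.Folder) -Filter *.sql | Sort-Object Name |
        ForEach-Object { sqlcmd -S "(local)" -E -i $_.FullName }
}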
August 25, 2017 at 4:56 pm
Jeff Moden - Friday, August 25, 2017 9:55 AM: BTW, Jason... cool article. I know what your goal was and you nailed that. The article is also spawning some pretty good side-bars. Thanks for taking the time!
Thanks Jeff.
The larger picture and real power come in when you integrate this into SSDT's Pre-Deploy phase. By using a specific folder structure in your project and prefixing the scripts with TFS task numbers (which ensures they get run in the proper order), the final deploy script that gets generated will inject your custom code. You can extend this into the Post-Deploy phase as well. All in all, this resolves the one major drawback of SSDT: data movement isn't handled out of the box, but with this approach it's pretty sweet. I just haven't gotten around to writing it up.
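To make that concrete: an SSDT Pre-Deployment.sql is a SQLCMD-mode script, so it can pull in the task-numbered files with :r includes. A hypothetical layout (the file names here are invented for illustration):
/* Pre-Deployment.sql -- SSDT injects these, in order, into the generated deploy script */
:r .\PreDeploy\TFS-1041_MoveLegacyData.sql
:r .\PreDeploy\TFS-1057_BackfillNewColumn.sql
Because the includes are listed (and named) in task-number order, the custom data-movement code lands in the generated deploy script in the right sequence.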
______________________________________________________________________
Personal Motto: Why push the envelope when you can just open it?
If you follow the direction given HERE you'll likely increase the number and quality of responses you get to your question.
Jason L. Selburg
August 25, 2017 at 9:38 pm
Also, my apologies, Jason! The first line of my previous post was supposed to resemble, "Great article, I gave you 5 orbs!" Again, sorry.
August 28, 2017 at 1:09 am
John Hick-456673 - Friday, August 25, 2017 10:17 AM: I once wrote a PowerShell script to release multiple files into several environments and log messages and errors. [full post quoted above]
Thanks to John Hick for this script.