August 18, 2016 at 8:45 pm
Hi there
I have written a PS script to export a batch of tables listed in a .txt file to CSV:
$server = "XXX"
$database = "XXX"
$TableFile = 'XXX.txt'
#Declare connection variables
$connectionString = "Data Source=$server;Integrated Security=SSPI;Initial Catalog=$database;"
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
# Loop through all tables and export a CSV of the Table Data
Get-Content $TableFile | ForEach-Object {
$queryData = "SELECT * FROM XXX.[$_] WITH (NOLOCK)"
#Specify the output location of your dump file
$extractFile = "XXX\$_.csv"
$command.CommandText = $queryData
$command.Connection = $connection
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
[void]$SqlAdapter.Fill($DataSet)   # [void] suppresses the row count Fill emits to the pipeline
$DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation -Delimiter "|"
#write-output $queryData
}
$connection.Close()
This works great until the table is more than a few GB - a 10GB table can take a day to export.
Is there anything I can change to significantly reduce this time, such as exporting in batches, using a data stream, etc.?
Thanks in advance
Steve
August 19, 2016 at 12:43 am
I think that you would be better off using the SqlDataReader so that you can process and discard rows as you go. The way you are working currently loads all the data into memory (and I suspect the swap file).
There are plenty of examples of using SqlDataReader. I bet a quick search would provide a PowerShell specific example too.
You might have to change your file-generation strategy as part of this, i.e. appending lines to the CSV file as you go, and adding a post-processing task if necessary.
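A minimal sketch of that reader-based approach, reusing the $connectionString, $queryData and $extractFile variables from the original script (this writes raw values without CSV quoting, so fields containing the delimiter would still need escaping in a post-processing step):

```powershell
# Sketch: stream rows with SqlDataReader instead of filling a DataSet.
# Assumes $connectionString, $queryData and $extractFile are set as in the
# original script above.
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand $queryData, $connection
$connection.Open()
$reader = $command.ExecuteReader()

$writer = New-Object System.IO.StreamWriter $extractFile

# Write the header row once, from the reader's schema.
$columns = 0..($reader.FieldCount - 1) | ForEach-Object { $reader.GetName($_) }
$writer.WriteLine(($columns -join '|'))

# Stream each row straight to disk; nothing accumulates in memory.
while ($reader.Read()) {
    $values = 0..($reader.FieldCount - 1) | ForEach-Object { $reader.GetValue($_) }
    $writer.WriteLine(($values -join '|'))
}

$writer.Close()
$reader.Close()
$connection.Close()
```

Because each row is written and discarded as it is read, memory use stays flat regardless of table size, which is the main difference from the DataSet approach.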
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
August 19, 2016 at 11:48 am
Have you tried using bcp to extract the data?
You could still set up a PowerShell wrapper to parse your input file, generate the bcp commands, and run them.
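A rough sketch of such a wrapper, assuming the same $server, $database and $TableFile variables as the original script; the dbo schema and output folder are assumptions (queryout, -T for integrated security, -c for character mode and -t for the field terminator are standard bcp switches):

```powershell
# Sketch: generate and run one bcp command per table in the list.
# $server, $database and $TableFile are assumed to match the original script;
# the dbo schema and output path are placeholders.
Get-Content $TableFile | ForEach-Object {
    $table = $_
    $outFile = "XXX\$table.csv"
    # queryout lets us keep the NOLOCK hint; -c = character mode, -t = delimiter
    bcp "SELECT * FROM [$database].dbo.[$table] WITH (NOLOCK)" queryout $outFile -S $server -T -c -t "|"
}
```

Note that bcp's queryout mode writes data only, with no header row, which is the limitation discussed below.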
August 21, 2016 at 6:32 pm
Thanks for the responses.
Yes, I tried bcp, but I have to include headers. I used the UNION ALL approach to add the headers, but this failed due to data type differences.
In the meantime I am trying the approach in the first response.
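An alternative to UNION ALL for the header problem is to write the header line first and then append the bcp output as a post-processing step, which sidesteps the data-type mismatch entirely. A sketch, assuming a headerless bcp extract already exists (file paths and column names are hypothetical):

```powershell
# Sketch: prepend a header row to a headerless bcp extract.
# $table and the paths are placeholders; the column list could also be
# pulled from INFORMATION_SCHEMA.COLUMNS instead of being hard-coded.
$dataFile  = "XXX\$table.csv"        # headerless bcp output
$finalFile = "XXX\$table-final.csv"
$header    = "Col1|Col2|Col3"        # hypothetical column names

Set-Content -Path $finalFile -Value $header
Get-Content -Path $dataFile | Add-Content -Path $finalFile
```

Get-Content | Add-Content streams line by line, so it copes with multi-GB files without loading them into memory, though a .NET StreamReader/StreamWriter pair would be considerably faster for files that size.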