March 5, 2021 at 11:05 pm
We have a JSON file with 10M records to bulk-copy load into a SQL Server table. The PowerShell code is below. The issue is that the first line seems to hang: it takes a long time to execute and freezes the machine. Is there a more efficient way to convert a large JSON file into a data table?
$results = Get-Content $tempfile | ConvertFrom-Json
$dt2 = $results | Select-Object $FieldAttribute
$dataTable = ConvertTo-DataTable -InputObject $dt2
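One likely cause of the hang: without `-Raw`, `Get-Content` emits the file one line at a time through the pipeline, and `ConvertFrom-Json` must then reassemble millions of pipeline strings before it can parse anything. Reading the whole file as a single string first is usually far faster. A minimal sketch of that change (same variables as above):

```powershell
# -Raw reads the entire file as one string instead of line-by-line,
# so ConvertFrom-Json parses in a single call.
$results = Get-Content $tempfile -Raw | ConvertFrom-Json
$dt2 = $results | Select-Object $FieldAttribute
$dataTable = ConvertTo-DataTable -InputObject $dt2
```

Note this still materializes all 10M parsed records in memory at once; if memory is the real bottleneck, streaming the file in batches is the next step.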
March 5, 2021 at 11:49 pm
Try this function: https://github.com/RamblingCookieMonster/PowerShell/blob/master/ConvertTo-FlatObject.ps1
Most likely you can do something like:
$file = Get-Content $tempfile -Raw | ConvertFrom-Json
$datatable = ConvertTo-DataTable -InputObject (ConvertTo-FlatObject -InputObject $file)
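Even with flattening, this approach holds all 10M records in memory at once, which can exhaust RAM and freeze the machine. An alternative is to stream the JSON record-by-record and bulk copy in batches, so only one batch of rows ever exists at a time. A minimal sketch, with several assumptions: Newtonsoft.Json is available (the DLL path is hypothetical), the file is one large JSON array of flat records whose property names match the destination columns, and `$connectionString`/`$tableName` are placeholders you would define:

```powershell
# Sketch: stream the JSON array and bulk copy in fixed-size batches.
# Assumptions (adjust for your environment): Newtonsoft.Json.dll path,
# $connectionString, $tableName, and a flat array-of-objects JSON layout.
Add-Type -Path 'C:\libs\Newtonsoft.Json.dll'   # hypothetical path

$reader     = [System.IO.StreamReader]::new($tempfile)
$json       = [Newtonsoft.Json.JsonTextReader]::new($reader)
$serializer = [Newtonsoft.Json.JsonSerializer]::new()

$bulk = [System.Data.SqlClient.SqlBulkCopy]::new($connectionString)
$bulk.DestinationTableName = $tableName
$bulk.BulkCopyTimeout = 0

$dt = $null
$batchSize = 50000

while ($json.Read()) {
    # Each StartObject token in the array is one record.
    if ($json.TokenType -ne [Newtonsoft.Json.JsonToken]::StartObject) { continue }
    $obj = $serializer.Deserialize($json, [hashtable])

    if ($null -eq $dt) {
        # Build the DataTable schema from the first record; column names
        # are assumed to match the destination table's columns.
        $dt = [System.Data.DataTable]::new()
        foreach ($k in $obj.Keys) {
            [void]$dt.Columns.Add($k)
            [void]$bulk.ColumnMappings.Add($k, $k)
        }
    }

    $row = $dt.NewRow()
    foreach ($k in $obj.Keys) {
        $row[$k] = if ($null -eq $obj[$k]) { [DBNull]::Value } else { $obj[$k] }
    }
    $dt.Rows.Add($row)

    if ($dt.Rows.Count -ge $batchSize) {
        $bulk.WriteToServer($dt)   # flush this batch, then reuse the table
        $dt.Clear()
    }
}

if ($dt -and $dt.Rows.Count -gt 0) { $bulk.WriteToServer($dt) }
$bulk.Close(); $json.Close(); $reader.Close()
```

The batch size of 50,000 is an arbitrary starting point; tune it against your row width and available memory.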
March 5, 2021 at 11:51 pm
I will try it, thank you.
March 8, 2021 at 7:25 pm
Unfortunately, it still appears to freeze the machine, and PowerShell exits without an error after a couple of hours.
March 9, 2021 at 8:06 am
Hmm, it should not be that slow, though it's possible if the records are big. What's the file size?
Is it possible to share the file (even privately) for me to try it out, along with the .ps1 script you are using?