Loading a Large JSON File into a DataTable to Bulk Copy into a SQL Server Table

  • We have a JSON file with 10M records that we need to bulk copy into a SQL Server table. The PowerShell code is below. The issue we are having is that the first line takes a very long time to execute and freezes the machine. Is there a more efficient way to convert a large JSON file into a data table?

    # This first line is the one that hangs: Get-Content pushes the file
    # contents through the pipeline, and ConvertFrom-Json then builds all
    # 10M records in memory at once
    $results = Get-Content $tempfile | ConvertFrom-Json

    # Project just the fields we need
    $dt2 = $results | Select-Object $FieldAttribute

    # Convert the projected objects into a System.Data.DataTable for the bulk copy
    $dataTable = ConvertTo-DataTable -InputObject $dt2
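
    One likely culprit is the first line: without -Raw, Get-Content emits the file one string per line through the pipeline, which adds substantial overhead on large files. A minimal first fix to try:

    # Read the whole file as a single string instead of line by line;
    # this is usually the biggest single win for large JSON files
    $results = Get-Content $tempfile -Raw | ConvertFrom-Json

    Even with -Raw, though, ConvertFrom-Json still materializes all 10M records in memory at once, so a file this size may need a streaming parser instead.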

  • Try this function: https://github.com/RamblingCookieMonster/PowerShell/blob/master/ConvertTo-FlatObject.ps1


    Most likely you will be able to do something like:

    # Parse the JSON (-Raw reads the file as one string), flatten any
    # nested properties, then build the DataTable
    $file = Get-Content $tempfile -Raw | ConvertFrom-Json

    $dataTable = ConvertTo-DataTable -InputObject (ConvertTo-FlatObject -InputObject $file)
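
    If that works, the bulk copy itself can go straight off the DataTable. A sketch only, assuming the destination table already exists and that $connectionString and 'dbo.MyTable' are placeholders for your environment:

    # Hypothetical end-to-end load; ConvertTo-DataTable is the function
    # you are already using, and the table/connection names are placeholders
    $file = Get-Content $tempfile -Raw | ConvertFrom-Json
    $flat = ConvertTo-FlatObject -InputObject $file
    $dataTable = ConvertTo-DataTable -InputObject $flat

    $bulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
    $bulkCopy.DestinationTableName = 'dbo.MyTable'
    $bulkCopy.BulkCopyTimeout = 0    # a 10M-row load can outlive the default timeout
    $bulkCopy.BatchSize = 50000      # commit in chunks instead of one huge transaction
    $bulkCopy.WriteToServer($dataTable)
    $bulkCopy.Close()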


  • I will try it, thank you.

  • Unfortunately, it still appears to freeze the machine, and PowerShell exits without an error after a couple of hours.

  • Hmm, it should not be that slow, but it is possible if the records are big. What's the file size?


    Is it possible to share the file (even privately) for me to try it out, as well as the .ps1 script you are using?
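
    If the file simply does not fit in memory, one option is to stream it record by record and bulk copy in batches, so memory use stays flat no matter how big the file is. This is only a rough sketch: it assumes the file is a single JSON array of flat records, that Newtonsoft.Json is loadable (it is bundled with PowerShell 7; on Windows PowerShell you would need to Add-Type the DLL), that System.Data.SqlClient is available as it is in Windows PowerShell, and that $connectionString and 'dbo.MyTable' are placeholders:

    # Streaming sketch: parse one record at a time with Newtonsoft's
    # JsonTextReader and flush to SQL Server every $batchSize rows.
    # Assumptions: flat records, destination table already exists.
    $batchSize  = 50000
    $dataTable  = New-Object System.Data.DataTable
    $bulkCopy   = New-Object System.Data.SqlClient.SqlBulkCopy($connectionString)
    $bulkCopy.DestinationTableName = 'dbo.MyTable'
    $bulkCopy.BulkCopyTimeout = 0

    $stream     = [System.IO.StreamReader]::new($tempfile)
    $json       = [Newtonsoft.Json.JsonTextReader]::new($stream)
    $serializer = [Newtonsoft.Json.JsonSerializer]::new()

    while ($json.Read()) {
        if ($json.TokenType -ne [Newtonsoft.Json.JsonToken]::StartObject) { continue }

        # Deserialize just this record, not the whole 10M-row array
        $record = $serializer.Deserialize($json, [System.Collections.Generic.Dictionary[string,object]])

        # Build the columns from the first record (string-typed here;
        # type them explicitly for a production load)
        if ($dataTable.Columns.Count -eq 0) {
            foreach ($key in $record.Keys) { [void]$dataTable.Columns.Add($key) }
        }

        $row = $dataTable.NewRow()
        foreach ($key in $record.Keys) {
            $value = $record[$key]
            if ($null -eq $value) { $value = [DBNull]::Value }
            $row[$key] = $value
        }
        $dataTable.Rows.Add($row)

        # Flush to SQL Server and release the memory every $batchSize rows
        if ($dataTable.Rows.Count -ge $batchSize) {
            $bulkCopy.WriteToServer($dataTable)
            $dataTable.Clear()
        }
    }

    # Write whatever is left in the final partial batch, then clean up
    if ($dataTable.Rows.Count -gt 0) { $bulkCopy.WriteToServer($dataTable) }
    $json.Close()
    $stream.Close()
    $bulkCopy.Close()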
