
I am trying to import CSV files stored in a folder. The files vary in size, from 1 KB to 15 KB or more. When I try to import the files with the following script, I get an out of memory exception.

$DataImport=Import-Csv -Path (Get-ChildItem $CsvFilePath)

$DataTable=Out-DataTable -InputObject $DataImport

Write-DataTable -ServerInstance $server -Database $Database -TableName $Table -Username $Username -Password $Password -Data $DataTable

What is the maximum number of rows a CSV file should have to avoid the OOM exception? Is there any other, better method to handle this situation? I saw a post suggesting fgetcsv to read line by line, but that would take a long time, right?

Please give me your suggestions. Thanks in advance, jerin

1 Answer


Looking at the Out-DataTable syntax, it's most likely pipeline friendly. So instead of "sucking in" the whole CSV into a variable, you can stream the process easily:

Import-Csv -Path (Get-ChildItem $CsvFilePath) | Out-DataTable

Limits: none that I'm aware of, but that's what the pipeline usually helps with: it won't store the whole thing in memory, so even if you don't have enough, chances are it will work just fine.
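If the combined files are still too large to hold at once, you can also process them one file at a time. This is just a sketch built on the same Out-DataTable / Write-DataTable functions used in the question; the per-file loop and the Dispose call are my additions, not part of those functions:

# Process one CSV at a time so only a single file's rows are in memory.
Get-ChildItem -Path $CsvFilePath | ForEach-Object {
    # Import-Csv emits rows as they are read; piping straight into
    # Out-DataTable avoids collecting everything into one big variable first.
    $dt = Import-Csv -Path $_.FullName | Out-DataTable
    Write-DataTable -ServerInstance $server -Database $Database -TableName $Table `
        -Username $Username -Password $Password -Data $dt
    # Release the table before moving on to the next file.
    $dt.Dispose()
}

Dropping (or disposing) the table between files lets the garbage collector reclaim each batch before the next file is read.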


1 Comment

In addition, I highly recommend using a smaller batch size for your Write-DataTable command. The default we had was 50,000, which is a massive burden for the server if it's doing anything else in a low-memory environment. We took it down to 500 and that worked great for us: no real speed reduction, but a massive reduction in memory, in our case from 660 MB down to 280 MB for the operation we were doing, and that was important. Cheers.
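For example, assuming your copy of Write-DataTable exposes the batch size as a -BatchSize parameter (the parameter name is an assumption; check your version of the function), the call from the question would become:

# Smaller batches mean the bulk copy flushes to the server more often,
# keeping far fewer rows buffered in memory at once.
# -BatchSize is an assumed parameter name; verify it in your Write-DataTable.
Write-DataTable -ServerInstance $server -Database $Database -TableName $Table `
    -Username $Username -Password $Password -Data $DataTable -BatchSize 500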
