I am trying to import CSV files stored in a folder. The files vary in size, from 1 KB to 15 KB or more. When I try to import them with the following script, I get an out-of-memory exception.
# Import all CSV files at once, convert the result to a DataTable, and bulk-load it
$DataImport = Import-Csv -Path (Get-ChildItem $CsvFilePath)
$DataTable = Out-DataTable -InputObject $DataImport
Write-DataTable -ServerInstance $server -Database $Database -TableName $Table `
    -Username $Username -Password $Password -Data $DataTable
What is the maximum number of rows a CSV file can have before I run into an OOM exception? Is there a better way to handle this situation? I saw a post suggesting fgetcsv to read the file line by line, but wouldn't that take a long time?
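One alternative I am considering is processing the files one at a time instead of importing everything into a single object, so only one file is in memory at once. A rough sketch of what I have in mind, reusing the same Out-DataTable and Write-DataTable functions from the script above (this assumes every file has the same columns as the target table):

# Process each CSV file separately so only one file is held in memory at a time
foreach ($file in Get-ChildItem $CsvFilePath -Filter *.csv) {
    # Import a single file and convert it to a DataTable
    $DataImport = Import-Csv -Path $file.FullName
    $DataTable = Out-DataTable -InputObject $DataImport
    # Bulk-load this file's rows before moving on to the next file
    Write-DataTable -ServerInstance $server -Database $Database -TableName $Table `
        -Username $Username -Password $Password -Data $DataTable
}

Would this approach avoid the OOM problem, or is there a better pattern for this?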
Please share your suggestions. Thanks in advance, Jerin.