Channel: SQL Server Integration Services forum

Can anyone help with: How to load a large CSV with inconsistent double quotes into SQL Server?


I would like to find a fast way to import a large CSV file (over 1 GB, more than 1 million rows) into SQL Server.

Ideally the file's column structure would be detected dynamically (but let's say it's a known structure for now).
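For reference, dynamic structure detection can be sketched with Python's standard `csv.Sniffer`; the sample data below is hypothetical, and in practice you would read the first few KB of the real file:

```python
import csv
import io

# Hypothetical sample: in practice, read the first few KB of the CSV file.
sample = 'id,name,city\n1,"Smith, John",Boston\n2,Jane Doe,"New York"\n'

sniffer = csv.Sniffer()
dialect = sniffer.sniff(sample)   # detect delimiter and quote character
has_header = sniffer.has_header(sample)  # heuristic header check

# Read the header row to get the column names.
reader = csv.reader(io.StringIO(sample), dialect)
columns = next(reader)
```

This only recovers the column names and dialect; column data types would still have to be inferred or declared separately.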

I have tried:

1) PowerShell Import-Csv with DataTables => too slow: it takes several hours for a 400K-row file.

2) PowerShell dbatools > Import-DbaCsvToSql => I found the correct regex, but for larger files (with wide columns) it is even slower than Import-Csv.

3) BCP with both an XML and a non-XML format file. In both cases it fails because I don't know whether a given column value will be enclosed in "" or not. Some values are in "", some are not, but missing values are always without "". Is it even possible to use a format file in that case?
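One workaround I am considering (a sketch, not a tested solution): Python's `csv.reader` tolerates fields that are only sometimes quoted, so a single streaming pass could rewrite the file with every field quoted. The function name and paths below are hypothetical:

```python
import csv

def normalize_csv(src_path: str, dst_path: str) -> int:
    """Rewrite a CSV so that every field is consistently double-quoted.

    Python's csv.reader accepts per-field optional quoting, so one
    streaming pass removes the inconsistency before handing the file
    to a bulk loader. Returns the number of rows written.
    """
    rows = 0
    with open(src_path, newline="", encoding="utf-8") as fin, \
         open(dst_path, "w", newline="", encoding="utf-8") as fout:
        reader = csv.reader(fin)                          # mixed quoting OK
        writer = csv.writer(fout, quoting=csv.QUOTE_ALL)  # quote everything
        for row in reader:
            writer.writerow(row)
            rows += 1
    return rows
```

After normalization every field is quoted, so a BCP format file with a `\",\"` field terminator becomes feasible; whether the extra pass over a 1 GB file is fast enough is exactly what I am unsure about.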




