I just can't seem to figure out how to get SSIS to specify a non-standard column delimiter.
The following BULK INSERT query processes my data file without issue:
BULK INSERT NZ_POC.[dbo].[v_STG_DCLK_DSM_PROPOSAL_LINE_ITEM]
FROM 'W:\NZ_POC\ETL_Testing\WORK\dsmapi_ProposalLineItem_20161011_56476.txt'
WITH ( CODEPAGE = 'RAW'           -- { 'ACP' | 'OEM' | 'RAW' | 'code_page' }
     , DATAFILETYPE = 'char'      -- { 'char' | 'native' | 'widechar' | 'widenative' }
     , FIELDTERMINATOR = '0xFE'   -- Latin small letter thorn (þ), decimal 254
     , FIRSTROW = 2               -- Skip the header row
     , TABLOCK
     --, ERRORFILE = 'W:\NZ_POC\etl_error.txt'
     );
After working that out, I assumed it would be easy to set the column delimiter / field terminator in SSIS as an expression.
Does anyone know how to specify a high-value (extended) ASCII character as the delimiter (in this case, decimal 254 = 0xFE)?
I've tried casting the value using DT_STR with several different code pages, but the attempts usually end in a truncation error.
Just about every example I've seen online uses either comma or pipe delimiters.
Once I have this nugget figured out I should be all set.
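For reference, here's a quick T-SQL sanity check (run outside the package, and assuming a 1252-based database collation for the CHAR() call) confirming that decimal 254 / 0xFE is the Latin small letter thorn:
SELECT NCHAR(0xFE)   AS unicode_char -- þ (U+00FE, Latin small letter thorn)
     , CHAR(254)     AS ansi_char    -- þ when the database code page is 1252
     , UNICODE(N'þ') AS code_point;  -- 254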
Thanks,
PS: I know I could implement this in SSIS as an Execute SQL Task, but it would have to use dynamic SQL, since the package handles multiple input data files and target tables. It just seems much cleaner to use the built-in Bulk Insert Task.
Here's a working prototype of the TSQL dynamically called code:
DECLARE @INPUT_FILE NVARCHAR(4000)
      , @FQ_TABLENAME NVARCHAR(4000)
      , @TSQL NVARCHAR(4000);

-- Quote the file path so it can be embedded in the statement as a string literal
SELECT @INPUT_FILE = QUOTENAME(N'W:\NZ_POC\ETL_Testing\WORK\dsmapi_ProposalLineItem_20161011_56476.txt', '''')
     , @FQ_TABLENAME = N'NZ_POC.[dbo].[v_STG_DCLK_DSM_PROPOSAL_LINE_ITEM]';

-- BULK INSERT won't take variables for the table or file name, so the statement is built as a string
SELECT @TSQL = N'BULK INSERT ' + @FQ_TABLENAME + N'
FROM ' + @INPUT_FILE + N'
WITH ( CODEPAGE = ''RAW''
     , DATAFILETYPE = ''char''
     , FIELDTERMINATOR = ''0xFE''
     , FIRSTROW = 2
     , TABLOCK
     );';

EXECUTE sp_executesql @TSQL;
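In case it helps frame the question, here's a rough, untested sketch of how I'd wrap that prototype in a stored procedure (the procedure and parameter names are made up) so the Execute SQL Task could simply call it once per file/table pair:
CREATE PROCEDURE dbo.usp_BulkLoadThornDelimited  -- hypothetical name
      @InputFile   NVARCHAR(4000)
    , @FqTableName NVARCHAR(4000)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @TSQL NVARCHAR(4000);

    -- BULK INSERT still needs literal table/file names, so the statement is built dynamically.
    -- QUOTENAME returns NULL for input longer than 128 characters, so the path is quoted
    -- manually here by doubling any embedded single quotes.
    SELECT @TSQL = N'BULK INSERT ' + @FqTableName + N'
    FROM ''' + REPLACE(@InputFile, N'''', N'''''') + N'''
    WITH ( CODEPAGE = ''RAW''
         , DATAFILETYPE = ''char''
         , FIELDTERMINATOR = ''0xFE''
         , FIRSTROW = 2
         , TABLOCK
         );';

    EXECUTE sp_executesql @TSQL;
END;
GO

-- Example call, one per input file / target table:
EXECUTE dbo.usp_BulkLoadThornDelimited
      @InputFile   = N'W:\NZ_POC\ETL_Testing\WORK\dsmapi_ProposalLineItem_20161011_56476.txt'
    , @FqTableName = N'NZ_POC.[dbo].[v_STG_DCLK_DSM_PROPOSAL_LINE_ITEM]';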