I have an SSIS package.
To simplify, it reads data from over a hundred Oracle tables, making a copy of each one in a SQL Server table.
The problem area appears to be one big data flow task, which contains approx. 120 Oracle sources and SQL Server destinations (one pair for each table).
It has been running fine for several years in SSIS 2008 (as a SQL Agent job).
I need to move it to SSIS 2012.
It works OK in SQL Server Data Tools, but when I deploy the project and run the job from SQL Agent it fails.
The Windows server running the SSIS/SQL instance now has 8 GB of RAM (the same as the development machine). The SQL instance on this server (i.e. the one running SSIS and SQL Agent) has max server memory set to 7168 MB, i.e. 7 GB, leaving 1 GB for the server OS.
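For reference, the memory cap was applied with sp_configure, roughly like this (7168 is the value mentioned above):

```sql
-- Roughly how max server memory was capped at 7168 MB on the SSIS/SQL instance
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 7168;
RECONFIGURE;
```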
In the SQL Server Agent job step properties (Advanced tab) I have ticked "Include step output in history". The logging level is set to Verbose.
The Agent job step is set to use the 32-bit runtime.
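In case it's relevant, this is roughly what that Agent job step does when it starts the package from the SSIS catalog (folder, project and package names below are placeholders for mine):

```sql
-- Sketch of the catalog execution the Agent job step kicks off.
-- Folder/project/package names are placeholders for the real ones.
DECLARE @execution_id BIGINT;
DECLARE @logging_level SMALLINT = 3;   -- 3 = Verbose (0 = None, 1 = Basic, 2 = Performance)

EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',        -- placeholder
     @project_name    = N'MyProject',       -- placeholder
     @package_name    = N'MyPackage.dtsx',  -- placeholder
     @use32bitruntime = 1,                  -- matches the "Use 32 bit runtime" tick box
     @execution_id    = @execution_id OUTPUT;

EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,                 -- system parameter
     @parameter_name  = N'LOGGING_LEVEL',
     @parameter_value = @logging_level;

EXEC SSISDB.catalog.start_execution @execution_id;
```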
After the step has failed, if I run the All Executions report there are two distinct error messages, each repeated approx. 120 times (it looks like one per source/destination combination):
- Load standard Tables to staging tables:Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Connect to table1 returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
- Load standard Tables to staging tables:Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
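To read the full text of these messages without scrolling through the report, I'm pulling them straight out of SSISDB with something like this (the operation ID is the execution ID shown on the All Executions report):

```sql
-- Pull the error messages for one catalog execution (message_type 120 = Error).
DECLARE @operation_id BIGINT = 12345;   -- placeholder: execution ID from the All Executions report

SELECT  message_time,
        message_source_name,
        message
FROM    SSISDB.catalog.event_messages
WHERE   operation_id = @operation_id
  AND   message_type = 120              -- errors only
ORDER BY message_time;
```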
Looking at the "Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020" error, some have advised changing the DefaultBufferMaxRows and DefaultBufferSize properties of the data flow task.
I have changed (separately) DefaultBufferMaxRows from 10000 to 1000, and then DefaultBufferSize from 10485760 to 50485760 (approx. 5 times the default, leaving DefaultBufferMaxRows at 1000), redeployed the solution and re-run the job, but I get the same error messages.
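Both changes were made in the package itself and then redeployed. As I understand it, the same values could also be tried as property overrides on a single catalog execution without redeploying; a sketch is below, where the property path is my assumption based on the task name in the error messages:

```sql
-- Sketch: overriding the data flow buffer properties for one catalog execution
-- instead of editing and redeploying the package.
-- The property path is an assumption based on the task name in the error messages.
DECLARE @execution_id BIGINT = 12345;   -- placeholder: ID returned by catalog.create_execution

EXEC SSISDB.catalog.set_execution_property_override_value
     @execution_id   = @execution_id,
     @property_path  = N'\Package\Load standard Tables to staging tables.Properties[DefaultBufferMaxRows]',
     @property_value = N'1000',
     @sensitive      = 0;

EXEC SSISDB.catalog.set_execution_property_override_value
     @execution_id   = @execution_id,
     @property_path  = N'\Package\Load standard Tables to staging tables.Properties[DefaultBufferSize]',
     @property_value = N'50485760',
     @sensitive      = 0;
```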
Why is the package failing?