hi folks,
I have created a very small dtsx package that dumps data from Oracle (11g) to SQL Server (2014 Std). The package runs fine at design time within the SQL Server Data Tools 2015 environment (it consumes little memory and finishes normally). However, whenever I deploy the package to SSISDB and run it through SQL Agent, it starts consuming all of the RAM on the machine (currently 32 GB), then hits the swap file heavily, and eventually fails with "There is insufficient system memory in resource pool 'default' to run this query".
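For completeness, this is roughly the T-SQL equivalent of what the Agent job step runs against the catalog (folder, project, and package names below are placeholders for my actual ones):

DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name     = N'MyFolder',       -- placeholder
    @project_name    = N'MyProject',      -- placeholder
    @package_name    = N'MyPackage.dtsx', -- placeholder
    @use32bitruntime = 0,
    @execution_id    = @execution_id OUTPUT;

-- Run synchronously so the job step waits for the package to finish
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 50,
    @parameter_name  = N'SYNCHRONIZED',
    @parameter_value = 1;

EXEC SSISDB.catalog.start_execution @execution_id;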
I have tried changing the FastLoadMaxInsertCommitSize of the destination data flow component, redeploying, and rerunning, but at runtime the package still takes the entire RAM and eventually runs out of memory. At design time, as mentioned, it runs beautifully within the 32-bit IDE environment.
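For reference, a query along these lines should show whether the default pool really is exhausted while the package runs (just a sketch; adjust columns as needed):

SELECT name,
       used_memory_kb,
       target_memory_kb,
       max_memory_kb
FROM sys.dm_resource_governor_resource_pools
WHERE name = N'default';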
Are there additional memory-limiting parameters that can force the SSIS execution to cap the RAM it uses? (There are no transforms, sorts, or other complex operations, just a simple source-to-destination dump.)
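The only server-level cap I can think of is the Engine's max server memory setting, sketched below (the 24576 MB value is purely illustrative), but I doubt it limits the SSIS runtime (ISServerExec) itself:

-- Cap the Database Engine's memory so other processes on the box have headroom
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 24576;
RECONFIGURE;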
thanks bunches,
Cos