Hi
I have a package that I was running last week without any problems.
The package has one source component (Attunity Oracle connector) that loads into a table in SQL Server (OLE DB destination). I have the SQL Server (in a virtualised environment) all to myself; the box has 12 GB of RAM, of which, from what I can tell in Performance Monitor, 9 GB is allocated to SQL Server and 3 GB to Integration Services. The data flow task is set up as follows -
DefaultBufferMaxRows set to - 50,000
DefaultBufferSize set to - 104,857,600 (the maximum SSIS will allow)
EngineThreads set to - 2
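For context, here is my rough understanding of how those two settings interact, sketched as arithmetic (the row width figures are made-up examples, not measured from my table):

```python
# Hypothetical buffer-sizing arithmetic for the settings above.
# Assumptions: DefaultBufferMaxRows = 50,000 and DefaultBufferSize is at
# its 100 MB (104,857,600-byte) cap; row widths below are illustrative.
MAX_ROWS = 50_000
MAX_BUFFER_BYTES = 104_857_600  # SSIS upper limit for DefaultBufferSize

def rows_per_buffer(row_width_bytes: int) -> int:
    """A buffer holds at most MAX_ROWS rows, but fewer if that many
    rows would exceed the byte cap."""
    return min(MAX_ROWS, MAX_BUFFER_BYTES // row_width_bytes)

print(rows_per_buffer(500))    # narrow row: the 50,000-row cap applies
print(rows_per_buffer(5_000))  # wide row: the byte cap limits the count
```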
When I executed the package today it ran a bit slower, but more concerning, I received an informational message containing -
Information: The buffer manager failed a memory allocation call for 104857160 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.
Looking at perfmon while the package executed, I can see the following -
SSIS pipeline counter values:
Buffers in use - 76
Buffer memory - 7,549,710,024
Buffers spooled - 0
Process counters:
DTExecUI Private Bytes - 69,312,512
DTExecUI Working Set - 3,608,576
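Doing the division on those pipeline counters myself (just back-of-envelope, not anything SSIS reports directly), each buffer looks to be sitting near the 100 MB cap:

```python
# Rough check on the perfmon figures above: average bytes per buffer.
buffer_memory = 7_549_710_024   # "Buffer memory" counter
buffers_in_use = 76             # "Buffers in use" counter

avg_bytes = buffer_memory // buffers_in_use
print(avg_bytes)                        # ~99.3 million bytes per buffer
print(avg_bytes / 104_857_600)          # fraction of the 100 MB cap
```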
It appears as though every time I run the package the number of buffers increases. Any ideas why I seem to be having memory issues? Is there some type of memory housekeeping that needs to be done for SSIS?
For the package I have designed, I would imagine only two buffers being created from the source component: one for the good records (green arrow) and one for the bad records (red arrow for error handling). Please correct me if my understanding is wrong here.
The source table contains approx. 300 million records.
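At that volume, my expectation (assuming the 50,000-row setting is what limits each buffer) is that the pipeline would simply reuse a small number of buffers and cycle them thousands of times over the run:

```python
# Expected number of buffer fills over the whole load, assuming
# DefaultBufferMaxRows (50,000) is the limiting factor per buffer.
total_rows = 300_000_000
rows_per_buffer = 50_000

buffer_loads = total_rows // rows_per_buffer
print(buffer_loads)  # 6000 buffer fills across the run
```

That's why the "Buffers in use - 76" figure surprises me; I'd have expected the same couple of buffers recycled rather than the count growing.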
Thanks