LS,
I have to upload a CSV into a table. The CSV will mostly contain new data, although it will also contain updated data (matched on a 4-column key).
At the moment I have a very straightforward package that performs a Lookup on the table based on the 4-column key. The match output routes to an OLE DB Command with an update statement, and the error output (i.e. the unmatched rows) routes to an OLE DB Destination that performs a fast table load.
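For reference, the update in the OLE DB Command is shaped roughly like the sketch below; the table and column names here are placeholders rather than the real ones, and the ? parameters are mapped from the lookup's match output.

```sql
-- Rough shape of the per-row update run by the OLE DB Command;
-- table/column names are placeholders, parameters map to the match output.
UPDATE dbo.TargetTable
SET    SomeValue1 = ?,
       SomeValue2 = ?
WHERE  KeyCol1 = ?
  AND  KeyCol2 = ?
  AND  KeyCol3 = ?
  AND  KeyCol4 = ?;
```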
The CSV holds approx. 15k records, and I've got two SQL environments: a Q&A environment where the table contains 1.2M rows and a PRD environment where the table holds 5.4M rows.
At the moment I am seeing suspended processes in the Activity Monitor on the Q&A environment when the package executes: the UPDATE is blocked by the bulk insert, and both processes show as suspended.
I have already tried unchecking Table lock on the OLE DB Destination, but without success.
All the settings on the control flow and data flow elements are at their defaults.
My questions are:
- Should I play with the FastLoadMaxInsertCommitSize setting on the OLE DB Destination (currently 0)?
- Should I play with the DefaultBufferMaxRows and DefaultBufferSize settings on the Data Flow (currently at their defaults of 10000 and 10485760 respectively)?
- How do I get to the root cause of the suspended processes or the blocking? (See my attempt below.)
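For the last question, is a query along these lines (run in a separate window while the package executes) the right way to see which session is blocking which, or is there a better approach? A blocking_session_id of 0 means the request is not blocked.

```sql
-- Shows active requests that are blocked, together with their blocker and
-- the statement they are running; run against the Q&A server during the load.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.command,
       t.text AS sql_text
FROM sys.dm_exec_requests AS r
OUTER APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0
   OR r.session_id IN (SELECT blocking_session_id
                       FROM sys.dm_exec_requests
                       WHERE blocking_session_id <> 0);
```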
Regards,
René