I have a package with a data flow task like so:

oledb src -> many transformations -> oledb dest

I want to do something like:

oledb src -> many transformations -> [wait until src read is complete] -> oledb dest
Basically, since I'm reading from and writing to the same table, if my rows pass through the transformations too quickly I sometimes run into lock throttling issues.
Usually it's fine, but occasionally a task that should take a few minutes ends up taking an hour. On top of that, I want to avoid any data quality issues where I read back a row that I inserted in the same data flow. I haven't seen that happen yet and don't think I will, but never say never.
I've tried playing around with a Script Component's PreExecute and PostExecute methods, and with adding waits in ProcessInputRow, but no combination seems to work.
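For reference, this is roughly the kind of thing I tried in the Script Component (a sketch from memory; `ScriptMain`, `UserComponent`, and `Input0Buffer` are the SSIS-generated defaults, so this won't compile outside a package):

```csharp
using System.Threading;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

public class ScriptMain : UserComponent
{
    public override void PreExecute()
    {
        base.PreExecute();
        // Tried sleeping here, hoping the source would finish reading
        // before any rows reached the destination.
        Thread.Sleep(5000);
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Also tried throttling each row -- this slows the flow down,
        // but as far as I can tell it just delays the rows; it doesn't
        // actually make anything wait for the source read to complete.
        Thread.Sleep(10);
    }

    public override void PostExecute()
    {
        base.PostExecute();
    }
}
```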
If possible, I'm looking for something a bit more elegant than splitting this into two data flows.