I have a few tables with 200+ million rows each, and I would like to perform an incremental load into them. The current approach is flush and fill, which, as you know, is not desirable.
Each table is loaded in its own data flow task, and in each data flow I would like to put a Lookup transformation between the source and the destination. I tried using partial cache, but apparently I'm capped at 4096 MB (~4 GB) on 64-bit, which is not enough for a table with 200+ million rows. I would like to use full cache instead, but I have a couple of questions about it:
1. Does full cache use the SQL Server buffer cache, or memory allocated outside of SQL Server?
2. Does full cache release its memory as soon as the data flow task finishes, or does it hold onto it until the entire package is done?
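For context, the kind of reference query I'm planning to feed the Lookup looks roughly like this (table and column names are just placeholders; the point is to cache only the business key columns rather than the full row width):

    -- Hypothetical Lookup reference query for one of the destination tables.
    -- Returning only the business key keeps the full-cache footprint as small
    -- as possible for a 200M+ row table.
    SELECT  SalesOrderID,
            LineNumber
    FROM    dbo.FactSales;

The idea is to route the Lookup's no-match rows to the insert path and handle matched rows separately, instead of truncating and reloading everything.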
Thanks