Channel: SQL Server Integration Services forum

Connection times out before drop second table executes

I have an SSIS package that I generated from the Export Data wizard. It contains a repeating series of drop table, create table, and data flow tasks, each working on 5 tables: it drops 5 tables, creates those 5 tables, then loads data into them. Then it drops 5 different tables, creates them, and loads data into those, and so on, 29 times in all, until every table in my database has been transferred.

The first drop works fine, the first create works fine, and the first data flow works fine. Somewhere along the way, though, my connection seems to get lost, so the second drop fails. That doesn't fail the package by itself, but when the create table task runs, it fails because the tables already exist, and then the package dies.

I just found out that the "partner" I am exporting to discovered that their edge firewalls are terminating stale connections, and they think that may be the issue. I'm not sure how my connection can be stale when I finished transferring data two seconds earlier, but I need to find a way around it. The workaround suggested for my client connection is to add tcpKeepAlive=true, but I'm not sure where I could set this on my ADO.NET connection. I set "Default Command Timeout" to 0, hoping that would keep the connection open, but that didn't work.

Is there another setting I can tweak to keep the connection alive while my lengthy data flow runs? Or can/should I add another task of some sort before each drop table task that re-establishes the connection? Advice appreciated!
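For what it's worth, my understanding is that the suggested tcpKeepAlive=true setting corresponds to the socket-level SO_KEEPALIVE option, which makes the OS send periodic probes on an idle connection so stateful firewalls don't drop it as stale. A minimal sketch of that option (illustrative only; this is plain socket code, not how the SSIS ADO.NET connection manager exposes it):

```python
import socket

# SO_KEEPALIVE is the socket-level option that "tcpKeepAlive=true"-style
# settings turn on. With it enabled, the OS sends periodic keep-alive
# probes on an otherwise idle connection, so intermediate firewalls
# continue to see traffic and keep the connection mapping alive.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)

# Confirm the option is enabled (non-zero when set).
print(sock.getsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE))
sock.close()
```

The probe interval itself is governed by OS-level TCP settings, so even with the option on, the firewall's idle timeout would need to be longer than the keep-alive interval.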

Thanks in advance!

mpleaf


