Hi,
I have a package that loads data from one SQL Server database to another. It is scheduled to run every night, and lately it has been failing once or twice every 15 days. The cause is bad data.

The package contains eight containers, and each container has a data flow task. I have set up a flat file destination for each data flow task to collect the bad data. What I want is for the job not to fail even when a data flow task encounters bad data.

I came across an article that says I can configure the error output (selecting "Redirect Row" instead of "Fail Component"). This sounds very good, but the caveat is that I would need to change the destination's access mode from fast load to row-by-row. In my case I'm dealing with large sets of data, so changing the access mode is not an option.

Can anyone please guide me to another way to collect the bad data while not letting the job fail?
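To make the trade-off I'm worried about concrete, here is a rough, runnable sketch (Python with sqlite3 standing in for the real SQL Server tables; all names are made up for illustration). The `executemany` call plays the role of the fast/batch load, and the row-by-row retry only kicks in when a batch actually fails, so good batches keep the fast path:

```python
import sqlite3

# Toy stand-in for the real destination table (sqlite3 just to make the
# sketch runnable; the actual package uses an OLE DB fast load).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER NOT NULL, name TEXT NOT NULL)")

rows = [(1, "a"), (2, "b"), (3, None), (4, "d")]  # (3, None) is the "bad data"
bad_rows = []  # rows that would be redirected to the flat file destination

try:
    # Fast path: one set-based insert for the whole batch (like fast load).
    with conn:
        conn.executemany("INSERT INTO dest VALUES (?, ?)", rows)
except sqlite3.IntegrityError:
    # Slow path, only for a batch that failed: retry row by row and
    # divert the offending rows instead of failing the whole load.
    for row in rows:
        try:
            with conn:
                conn.execute("INSERT INTO dest VALUES (?, ?)", row)
        except sqlite3.IntegrityError:
            bad_rows.append(row)

print(bad_rows)  # prints [(3, None)]
print(conn.execute("SELECT COUNT(*) FROM dest").fetchone()[0])  # prints 3
```

Is there a way to get this kind of behavior (batch insert normally, fall back to row-level redirection only on failure) in my setup?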
Thanks