I have inherited a bunch of SSIS support due to a colleague leaving. Needless to say, we are having performance issues and are trying to figure out where in the package executions the time is being spent. These packages are stored in an SSIS catalog
and executed via SQL Agent jobs. No logging is enabled other than the standard SSIS reports in SSMS.
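For overall run times I've at least found that the views behind the standard reports can be queried directly. Something like this seems to give execution-level duration (the package name here is just a placeholder for mine):

```sql
-- Overall duration per execution of the package, newest first.
-- Package name is a placeholder; adjust to the real one.
SELECT e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       e.start_time,
       e.end_time,
       DATEDIFF(SECOND, e.start_time, e.end_time) AS duration_seconds,
       e.status          -- 7 = succeeded, 4 = failed
FROM   SSISDB.catalog.executions AS e
WHERE  e.package_name = N'MyS3LoadPackage.dtsx'
ORDER BY e.start_time DESC;
```

That only tells me the whole package is getting slower, though, not which step.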
In a nutshell, this package downloads a bunch of files from AWS S3, parses them, does a small bit of transformation, and writes them to the DB. It also writes the file names it processed to a log table. Nothing earth-shattering, but I'm trying to find out:
1. How many files did it process each time it ran?
2. How long did each loop task take in total (how much time downloading all the files, how much time parsing all the files, how much time writing all the files to the DB)? I want a total for each task/step for each execution, not one per file processed. (See the query sketch after this list.)
3. Total number of rows processed across all files.
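From poking around the catalog views, my best guess at items 1–3 is something like the queries below. I'm assuming catalog.executable_statistics writes one row per task per loop iteration (so summing gives the per-task total and the count roughly equals the number of files), and that catalog.execution_data_statistics holds the row counts, though I believe that view is only populated when the execution runs at the Verbose logging level. The execution id is just an example value:

```sql
-- Total elapsed time per task for one execution, summed across all ForEach
-- iterations. execution_duration is reported in milliseconds.
DECLARE @execution_id BIGINT = 12345;   -- pick a run from catalog.executions

SELECT es.execution_id,
       ex.executable_name,
       COUNT(*)                            AS times_executed,   -- roughly one per file for tasks inside the loop
       SUM(es.execution_duration) / 1000.0 AS total_seconds
FROM   SSISDB.catalog.executable_statistics AS es
JOIN   SSISDB.catalog.executables           AS ex
       ON  ex.executable_id = es.executable_id
       AND ex.execution_id  = es.execution_id
WHERE  es.execution_id = @execution_id
GROUP BY es.execution_id, ex.executable_name;

-- Rows sent per data-flow component for the same run; as far as I can tell
-- this view is only populated at the Verbose logging level.
SELECT eds.task_name,
       eds.destination_component_name,
       SUM(eds.rows_sent) AS total_rows
FROM   SSISDB.catalog.execution_data_statistics AS eds
WHERE  eds.execution_id = @execution_id
GROUP BY eds.task_name, eds.destination_component_name;
```

If that assumption about the logging level is right, I'd also like to know whether running everything at Verbose is a sane default or too heavy for production.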
I would like to expose this data via an SSRS report or a stored proc to use as a monitoring tool for trend analysis, as we expect the data volume to continue to grow.
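On the reporting side, all I'm really picturing is a thin stored proc over those catalog views that SSRS can call and chart. A rough sketch (the proc name, parameters, and the 30-day default are all mine):

```sql
-- Hypothetical proc for an SSRS trend report: per-task total duration for each
-- execution of a package over the last @days days.
CREATE PROCEDURE dbo.usp_PackageTaskDurationTrend
    @package_name NVARCHAR(260),
    @days         INT = 30
AS
BEGIN
    SET NOCOUNT ON;

    SELECT e.execution_id,
           CAST(e.start_time AS DATE)          AS run_date,
           ex.executable_name                  AS task_name,
           SUM(es.execution_duration) / 1000.0 AS total_seconds
    FROM   SSISDB.catalog.executions            AS e
    JOIN   SSISDB.catalog.executables           AS ex ON ex.execution_id = e.execution_id
    JOIN   SSISDB.catalog.executable_statistics AS es ON es.executable_id = ex.executable_id
                                                     AND es.execution_id  = ex.execution_id
    WHERE  e.package_name = @package_name
      AND  e.start_time  >= DATEADD(DAY, -@days, SYSDATETIMEOFFSET())
    GROUP BY e.execution_id, CAST(e.start_time AS DATE), ex.executable_name
    ORDER BY run_date, task_name;
END;
```

The report would then just group by task_name and plot total_seconds against run_date, which is about all the trend analysis I need for now.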
Here is a screenshot of my control flow:
and of the data flow:
Full disclosure: I enabled the standard logging on everything in the package, writing to a table, and it produces so many rows that I don't know which event(s) to look for to capture this, assuming the information is even in there somewhere.
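Based on the column names in dbo.sysssislog, my best guess is that the OnPreExecute / OnPostExecute pairs are the events to key on for task timings (and possibly the OnInformation "wrote N rows" messages for row counts), but I haven't confirmed it. This is the kind of query I've been sketching against that table; the pairing logic is my own guess and assumes a task's Pre/Post events alternate cleanly within each package execution:

```sql
-- Rough pairing of OnPreExecute / OnPostExecute rows in the standard
-- SQL Server log provider table (dbo.sysssislog) to get per-task elapsed time.
WITH paired AS (
    SELECT executionid,
           source,
           event,
           starttime,
           LEAD(event)     OVER (PARTITION BY executionid, sourceid ORDER BY id) AS next_event,
           LEAD(starttime) OVER (PARTITION BY executionid, sourceid ORDER BY id) AS next_time
    FROM   dbo.sysssislog
    WHERE  event IN ('OnPreExecute', 'OnPostExecute')
)
SELECT executionid,
       source                                      AS task_name,
       COUNT(*)                                    AS iterations,   -- roughly files processed for tasks in the loop
       SUM(DATEDIFF(SECOND, starttime, next_time)) AS total_seconds
FROM   paired
WHERE  event = 'OnPreExecute'
  AND  next_event = 'OnPostExecute'
GROUP BY executionid, source
ORDER BY executionid, total_seconds DESC;
```

If the catalog views above already cover this, I'd happily turn the table logging back off, since it's generating a lot of noise.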