
Setting "Set values" when defining a SQL Server Agent Job to Execute your SSIS Package

This got a little more complicated than I thought. I kind of thought the SSIS Package Variables specified in the SQL Server Agent job step's Set values would be satisfied at run time. It seems as though they need to be satisfied correctly prior to run time.

I have a Variable GlobalFilePath which is a fully qualified path name with an embedded space.

\\Location.net\Folder\Sub Folder\

...we supplied the Value...

/Set \Package.Variables[User::GlobalFilePath].Properties[Value];"\\Location.net\Folder\Sub Folder\\"

Is that correct?

Our Flat File Connection Manager's ConnectionString property uses @[User::GlobalFileFullyQualified], which is a formula unto itself...

@[User::GlobalFilePath] +  @[User::GlobalFileName] + @[User::GlobalFileNameEndDateQualifier] + @[User::GlobalFileType]

So we provided the Set values for \Package.Variables[User::GlobalFileName].Properties[Value] to be...

/Set \Package.Variables[User::GlobalFileName].Properties[Value];"WD_ClientName_846_Inventory_"

Now the @[User::GlobalFileNameEndDateQualifier] is satisfied within the SSIS Package with the following Expression...

(DT_WSTR, 4) YEAR( @[User::GlobalDate]  ) +  RIGHT( "0" + (DT_WSTR, 2)  MONTH( @[User::GlobalDate]),2 ) + RIGHT( "0" + (DT_WSTR, 2)  DAY( @[User::GlobalDate]),2 )

And the GlobalDate Variable is being satisfied within the SSIS Package by an Execute SQL Task with Direct Input T-SQL...

SELECT CONVERT(DATE, GETUTCDATE() AT TIME ZONE 'UTC' AT TIME ZONE 'Eastern Standard Time') AS [DateIn]

And storing [DateIn] to the Result Set variable User::GlobalDate.

When attempting to run the SQL Server Agent Job, we get the error message...

Source: IRIS Datapass 846File Connection manager "Flat File Connection Manager - Eagle 846 Inventory File"  Description: The file name "/Set \Package.Variables[User::GlobalFilePath].Properties[Value];"\\Location.net\Folder\Sub Folder\"/Set \Package.Variables[User::GlobalFileName].Properties[Value];"WD_ClientName_846_Inventory_"20190517/Set \Package.Variables[User::GlobalFileType].Properties[Value];".csv"" specified in the connection was not valid.  End Error

Error: 2019-12-24 12:10:01.13  Code: 0xC001401D  Source: IRIS Datapass 846File  Description: Connection "Flat File Connection Manager - Eagle 846 Inventory File" failed validation.  End Error

Error: 2019-12-24 12:10:01.13  Code: 0xC001401E  Source: IRIS Datapass 846File Connection manager "Flat File Connection Manager - Eagle 846 Inventory File"  Description: The file name "/Set \Package.Variables[User::GlobalFilePath].Properties[Value];"\\Location.net\Folder\Sub Folder\"/Set \Package.Variables[User::GlobalFileName].Properties[Value];"WD_ClientName_846_Inventory_"20190517/Set \Package.Variables[User::GlobalFileType].Properties[Value];".csv"" specified in the connection was not valid.  End Error

Error: 2019-12-24 12:10:01.13  Code: 0xC0202070  Source: IRIS Datapass 846File Connection manager "Flat File Connection Manager - Eagle 846 Inventory File"  Description: The file name property is not valid. The file name is a device or contains invalid characters.  End Error

DTExec: The package execution returned DTSER_FAILURE (1).  Started: 12:10:01 PM  Finished: 12:10:01 PM  Elapsed: 0.219 seconds.  The package execution failed.  The step failed.

Now, that value "20190517" is what is hard-coded in the SSIS Package for the Variable GlobalFileNameEndDateQualifier, even though the Expression for that variable is...

(DT_WSTR, 4) YEAR( @[User::GlobalDate]  ) +  RIGHT( "0" + (DT_WSTR, 2)  MONTH( @[User::GlobalDate]),2 ) + RIGHT( "0" + (DT_WSTR, 2)  DAY( @[User::GlobalDate]),2 )

How can I fix this? Do I need to satisfy the Set values for GlobalDate even though it will be satisfied in the SSIS Package?
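Looking at the error above, the file name ends up containing the literal "/Set" text, so my working assumption is that I have been pasting the full /Set syntax into the Set Values grid, when the tab only wants the property path and the raw value (no /SET prefix, no surrounding quotes). A sketch of the rows as I now believe they should be entered:

    Property Path: \Package.Variables[User::GlobalFilePath].Properties[Value]
    Value:         \\Location.net\Folder\Sub Folder\

    Property Path: \Package.Variables[User::GlobalFileName].Properties[Value]
    Value:         WD_ClientName_846_Inventory_

    Property Path: \Package.Variables[User::GlobalFileType].Properties[Value]
    Value:         .csv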

Thanks for your review; I am hopeful for a reply.


SSIS - Batch Size

Hello everyone, I have a question about batch size.

When I set a batch size of 500 at my destination, does this mean the data from my source will be broken into blocks of 500 records, with each block opening and committing its own transaction at the destination?

Thank you and a Merry Christmas =)

How to use parameter or variable to specify table name or query in SSIS OLE DB Source

MY GOAL:
I want to use a data flow in a For Loop Container, where each iteration of the loop sets the name of a database table in a string variable using an Expression Task, and the data flow's OLE DB Source then queries the table specified in the variable, followed by other components that process the data.

WHAT I HAVE DONE:
I have created a data flow in SSIS, and the first component is an OLE DB Source. I have successfully used the component to retrieve data from a database table using a query such as the following:

SELECT col1, col2, col3

FROM table1

WHERE col4 = ?

In the parameter section of the OLE DB Source, I set the '?' to be a variable I set earlier in the SSIS package, and everything works great.

I have also placed a 'For Loop Container' in the control flow, and in the for loop I placed an 'Expression Task', followed by the data flow mentioned above. I created an integer variable named 'ForLoopCounter' and set the init, eval, and assign expressions as follows:

@ForLoopCounter = 0
@ForLoopCounter < 4
@ForLoopCounter  = @ForLoopCounter + 1

MY PROBLEM:
I have a string variable named QueryVal, and I set its value in the Expression Task to be "tableN", where N is the value of ForLoopCounter. I then try to use the following query in the OLE DB Source in the data flow:

SELECT col1, col2, col3
FROM ?

When I try to click on parameters I get the following error:

Parameters cannot be extracted from the SQL command. The provider might not help to parse parameter information from the command. In that case, use the "SQL command from variable" access mode, in which the entire SQL command is stored in a variable.
------------------------------
ADDITIONAL INFORMATION:
The batch could not be analyzed because of compile errors. (Microsoft SQL Server Native Client 11.0)

This error is confusing to me because I am able to specify a parameter in the where clause without any problems, but I am unable to use a parameter to specify the table name in the query. 

I have also tried modifying the expression task to contain the entire SQL query, so I have the following:

QueryVal = "SELECT col1, col2, col3 FROM tableN" - once again, 'N' is replaced by a numerical value.

However, when I try to use the 'SQL command from variable' data access mode in the OLE DB Source, and provide QueryVal in the 'Variable Name' field, I get the following error when I click 'OK'.

Exception from HRESULT: 0xC0202009
Error at Data Flow - MyDataFlow [GetRawExtractData[19]]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E0C.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E0C Description: "Command text was not set for the command object.".

Can someone give me some pointers on what I am doing wrong? I just want to run my data flow several times, where each time the OLE DB Source component runs the same query against a different table. Thanks in advance for any help.
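In case it is useful: the expression I would expect to put on QueryVal is roughly the following (a sketch -- it assumes EvaluateAsExpression is True on the variable and the tables really are named table1 through table4):

    "SELECT col1, col2, col3 FROM table" + (DT_WSTR, 4)@[User::ForLoopCounter]

My understanding of the second error is that the source validates the variable as soon as I click OK, so if QueryVal is empty at design time there is no command text yet; giving the variable a valid default query, or setting DelayValidation to True on the data flow, would be my first thing to try.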


SSIS Wizard cannot import text columns longer than 255 characters using Excel source

(Applies to SQLServer 2005 SP1)

We have found that when using the SSIS "Import and Export Wizard" with the "Microsoft Excel" data source, there appears to be a maximum column length of 255 characters for any row.

Even when defining the destination table columns as nvarchar(4000), the wizard fails with the errors shown below.

We have found no workaround except manually changing the input data. There doesn't appear to be any "Advanced" options for the Excel importer as there are for the flat-text importer. So, no question here, just posting the bug so that *next* time someone searches the web for an answer, this post comes up.



Messages
Error 0xc020901c: Data Flow Task: There was an error with output column "English String" (18) on output "Excel Source Output" (9). The column status returned was: "Text was truncated or one or more characters had no match in the target code page.".
(SQL Server Import and Export Wizard)

Error 0xc020902a: Data Flow Task: The "output column "English String" (18)" failed because truncation occurred, and the truncation row disposition on "output column "English String" (18)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
(SQL Server Import and Export Wizard)

Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - Sheet1$" (1) returned error code 0xC020902A. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038.
(SQL Server Import and Export Wizard)

Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
(SQL Server Import and Export Wizard)

Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039.
(SQL Server Import and Export Wizard)




edit: After searching further, this is documented under "Excel Source" in BOL, which provides a registry-based workaround. I guess the issue is that the wizard considers truncation to be a 'fail' case, and there's no easy way to override this behaviour, specify the column types, or determine which line is in error.

Truncated text. When the driver determines that an Excel column contains text data, the driver selects the data type (string or memo) based on the longest value that it samples. If the driver does not discover any values longer than 255 characters in the rows that it samples, it treats the column as a 255-character string column instead of a memo column. Therefore, values longer than 255 characters may be truncated. To import data from a memo column without truncation, you must make sure that the memo column in at least one of the sampled rows contains a value longer than 255 characters, or you must increase the number of rows sampled by the driver to include such a row. You can increase the number of rows sampled by increasing the value of TypeGuessRows under the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel registry key.
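For anyone applying that workaround, a sketch of the registry change (this is the Jet 4.0 path from the quote above; newer ACE drivers use a different key, and the usual caveat about editing the registry applies):

    REM Setting TypeGuessRows to 0 makes the driver sample far more rows
    REM (up to its internal cap) instead of the default 8.
    reg add "HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel" /v TypeGuessRows /t REG_DWORD /d 0 /f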

parse json response from rest api

Hi, I am using an SSIS Script Task to parse a JSON response from a REST API. When I try to deserialize the JSON payload using Newtonsoft Json, the assembly reference does not stick if I close and reopen the edit-script window. Could you please help me rewrite the code below for JSON.

JSON payload:

{"Id": "1","Code": "Man","Description": "Manchester"
}


C# code that parses XML (I am trying to rewrite the data insert to use the JSON payload):

XmlDocument xdoc = new XmlDocument();
string records;

records = getRecords();
xdoc.LoadXml(records);

string connectionString = @"Data Source=server\SQLEXPRESS;Initial Catalog=TestDB;Integrated Security=SSPI;";
string sqlTable = "test";

using (SqlConnection sc = new SqlConnection(connectionString))
{
    sc.Open();
    XmlReader xr = XmlReader.Create(new StringReader(records));
    DataSet ds = new DataSet();
    ds.ReadXml(xr);

    DataTable dt = ds.Tables[0];
    using (SqlBulkCopy sbc = new SqlBulkCopy(sc))
    {
        sbc.DestinationTableName = sqlTable;
        sbc.BatchSize = dt.Rows.Count;
        sbc.BulkCopyTimeout = 0;
        sbc.ColumnMappings.Add("Id", "Id");
        sbc.ColumnMappings.Add("Code", "Code");
        sbc.ColumnMappings.Add("Description", "Description");

        sbc.WriteToServer(dt);
    }
}

JSON class object

public class RootObject
{
    public string Id { get; set; }
    public string Code { get; set; }
    public string Description { get; set; }
}
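What I have so far for the JSON version -- a sketch only, assuming the service returns a single object as in the payload above (an array would deserialize into List<RootObject> instead):

    using System.Data;
    using System.Data.SqlClient;
    using Newtonsoft.Json;

    string json = getRecords();  // same helper as in the XML version

    // Deserialize the payload into the class above.
    RootObject record = JsonConvert.DeserializeObject<RootObject>(json);

    // Build a one-row DataTable so the existing SqlBulkCopy code can be reused.
    DataTable dt = new DataTable();
    dt.Columns.Add("Id", typeof(string));
    dt.Columns.Add("Code", typeof(string));
    dt.Columns.Add("Description", typeof(string));
    dt.Rows.Add(record.Id, record.Code, record.Description);

    using (SqlConnection sc = new SqlConnection(connectionString))
    {
        sc.Open();
        using (SqlBulkCopy sbc = new SqlBulkCopy(sc))
        {
            sbc.DestinationTableName = sqlTable;
            sbc.ColumnMappings.Add("Id", "Id");
            sbc.ColumnMappings.Add("Code", "Code");
            sbc.ColumnMappings.Add("Description", "Description");
            sbc.WriteToServer(dt);
        }
    }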

Thank you.


SQLEnthusiast


How to add drop downs to excel exports

I am creating dynamic Excel files, each consisting of 2 worksheets. In both worksheets I am looking to create a drop down with a few values for some of my columns.

I have not been able to do this using the SSIS Excel destination.
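One route I am considering is a Script Task that post-processes the exported file with a spreadsheet library such as EPPlus (an assumption on my part, since the Excel destination only writes data; the path, sheet, range and list values below are all placeholders):

    using OfficeOpenXml;
    using OfficeOpenXml.DataValidation;

    using (var package = new ExcelPackage(new System.IO.FileInfo(@"C:\exports\MyFile.xlsx")))
    {
        ExcelWorksheet ws = package.Workbook.Worksheets["Sheet1"];

        // Add a list validation (drop down) covering the target column.
        var validation = ws.DataValidations.AddListValidation("C2:C1000");
        validation.Formula.Values.Add("Open");
        validation.Formula.Values.Add("Closed");

        package.Save();
    }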

Thanks

Slowly changing dimension - ODS and DWH

I have the following dimension EmployeeStatus in my datawarehouse :

SELECT des.EmployeeStatusSK, des.EmployeeStatusId, des.Label AS Status
FROM DWH.DimEmployeeStatus des

The output is like below :

EmployeeStatusSK  EmployeeStatusId  Status
1                 1                 Recruited
2                 2                 In
3                 3                 In (Inexit confirmed)
4                 4                 Out
0                 0                 Undefined

In my ODS part, I am getting data from the STG table EmployeeStatus into the ODS table DimEmployeeStatus and adding a CreationDate column via a Derived Column with GETUTCDATE() for date/time tracking:

[screenshot: data flow loading STG EmployeeStatus into ODS DimEmployeeStatus]

I want to implement historical tracking using a Slowly Changing Dimension in my data warehouse part, knowing that I am loading data from ODS to DWH like below:

[screenshot: data flow loading ODS DimEmployeeStatus into the DWH dimension]

The question : How should I implement the slowly changing dimension in the DWH part ?
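To make the question concrete, the kind of Type 2 pattern I have in mind for the DWH load is sketched below -- the ValidFrom/ValidTo/IsCurrent columns are hypothetical additions to DimEmployeeStatus, and Label is treated as the tracked attribute:

    -- Expire the current row when the tracked attribute has changed.
    UPDATE dim
    SET    dim.ValidTo   = GETUTCDATE(),
           dim.IsCurrent = 0
    FROM   DWH.DimEmployeeStatus dim
    JOIN   ODS.DimEmployeeStatus ods
           ON ods.EmployeeStatusId = dim.EmployeeStatusId
    WHERE  dim.IsCurrent = 1
      AND  dim.Label <> ods.Label;

    -- Insert a new current row for changed or brand-new statuses.
    INSERT INTO DWH.DimEmployeeStatus (EmployeeStatusId, Label, ValidFrom, ValidTo, IsCurrent)
    SELECT ods.EmployeeStatusId, ods.Label, GETUTCDATE(), NULL, 1
    FROM   ODS.DimEmployeeStatus ods
    WHERE  NOT EXISTS (SELECT 1
                       FROM   DWH.DimEmployeeStatus dim
                       WHERE  dim.EmployeeStatusId = ods.EmployeeStatusId
                         AND  dim.IsCurrent = 1);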


SSIS package fails for Fast Load when run from C# code

Hi,

I am using the Fast Load option to load a table in a data flow. This package runs fine if I execute it from Visual Studio, but the same package fails if I run it from C# code.

I have set DelayValidation to True. The table has an identity column, so I am using the Fast Load option to explicitly set values for the identity column. Why does the package behave differently when run from C# code? Please suggest a fix for this issue.
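For reference, the C# side is essentially the standard DTS runtime pattern (a sketch; the path is a placeholder for however the package is actually loaded):

    using Microsoft.SqlServer.Dts.Runtime;

    Application app = new Application();
    Package pkg = app.LoadPackage(@"C:\packages\LoadAccountCode.dtsx", null);
    DTSExecResult result = pkg.Execute();

    // Surface the pipeline errors instead of just the failure result.
    if (result == DTSExecResult.Failure)
    {
        foreach (DtsError err in pkg.Errors)
        {
            Console.WriteLine(err.Description);
        }
    }

One difference worth checking (my assumption, not a confirmed cause): run from C#, the package executes under whatever account hosts the C# process, so database permissions -- including permission to read the table's metadata -- can differ from the Visual Studio run.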

Below is the error message I am getting.

Error Code - -1071636471. Source - dftDbLoadAccountCode. Description - SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft OLE DB Provider for SQL Server"  Hresult: 0x80004005  Description: "Unable to resolve column level collations.  Bulk-copy cannot continue.".

Error Code - -1071636416. Source - dftDbLoadAccountCode. Description - Failed to open a fastload rowset for "xxxx". Check that the object exists in the database.

Error Code - -1073450982. Source - dftDbLoadAccountCode. Description - dstDbAccountCodeInsert failed the pre-execute phase and returned error code 0xC0202040.

Thank you.

Regards

Vikram


Flat File Connection Manager "ConnectionString" Expression has mis-matched quotation marks

My Flat File Connection Manager's ConnectionString property is set by an expression within my SSIS Package:

@[User::GlobalFilePath] +  @[User::GlobalFileName] + @[User::GlobalFileNameEndDateQualifier] + @[User::GlobalFileType]

When I try to schedule this through SQL Server Agent, we get mis-matched quotation marks (").

How can I prevent that from happening?
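Based on the error below, my assumption is that the Set values entries contain the literal /Set prefix and quotes; a sketch of the entries as I believe the Set Values tab expects them (property path and raw value only):

    Property Path: \Package.Variables[User::GlobalFilePath].Properties[Value]
    Value:         \\Local.net\apps\Production\ClientPortal\Clients\O10368\Inventory Reports\

    Property Path: \Package.Variables[User::GlobalFileName].Properties[Value]
    Value:         WD_846_Inventory_

    Property Path: \Package.Variables[User::GlobalFileType].Properties[Value]
    Value:         .csv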

Error: 2019-12-26 10:15:00.87     Code: 0xC001401E     Source: File Connection manager "Flat File Connection Manager"     Description: The file name "/Set \Package.Variables[User::GlobalFilePath].Properties[Value];"\\Local.net\apps\Production\ClientPortal\Clients\O10368\Inventory Reports\"/Set \Package.Variables[User::GlobalFileName].Properties[Value];"WD_846_Inventory_"20191226/Set \Package.Variables[User::GlobalFileType].Properties[Value];".csv"" specified in the connection was not valid.  End Error 

Code: 0xC0202070     Source: File Connection manager "Flat File Connection Manager"     Description: The file name property is not valid. The file name is a device or contains invalid characters.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  10:15:00 AM  Finished: 10:15:00 AM  Elapsed:  0.218 seconds.  The package execution failed.  The step failed.

SSIS to replace 15 batch jobs (written in C#) which load txt, csv, xml files into SQL Server database

Hello,
We are a small company which receives ~50 files per day from different data sources, which we load into a SQL Server database. We currently have this broken down into ~15 batch jobs written in C#. The files are generally .txt, .csv and a few XML; one or two reference APIs. We generally loop through the files and load the data, with limited transformations. Occasionally we'll look up an ID - i.e. a text file has "ABC" and we look this up in SQL Server - the account table - to find out "ABC" = 23 - and then load 23 into the table.

A few questions:

(1) This appears to be fairly easy to do in SSIS. Thoughts on why SSIS would not work or be recommended? Beyond "if it ain't broke, don't fix it" - our batch jobs work reasonably well, running cleanly about 95% of the time - but the other 5% still requires support and knowledge of C#. We are looking for a more no/low-code solution.

(2) Can APIs be called from SSIS? Or would we have to add scripts/code and call that from SSIS? (See the sketch after these questions for the kind of call we mean.)

(3) Thoughts on SSIS being supported 5-10 years from now? We have looked at Azure Data Factory and some other Azure products but they seem like overkill.

(4) If we do go the SSIS route, we want to make sure we set up the first few efficiently, so we can use those as a training tool/basis to set up the other 12-13. Recommendations of the best way to learn/optimize this process? Best training classes and/or consulting firms that would potentially work with us on 1-2, after which we'd build the rest?
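For context on (2), the sort of call we make today in C# -- and would presumably move into a Script Task -- looks roughly like this (a sketch; the URL is a placeholder):

    using System.Net.Http;

    using (HttpClient client = new HttpClient())
    {
        // Hypothetical endpoint; a Script Task would do this inside Main()
        // and hand the result to a package variable for downstream components.
        string body = client.GetStringAsync("https://api.example.com/accounts").Result;
    }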
Appreciate any thoughts.
Thanks,
Dan

Retain connection, avoid re-loading data in ForEach Loop Container while processing in-memory

I have been scratching my head over this design for quite some time now and I think it is time to ask for help.

I have designed an SSIS package which uses a FELC (Foreach Loop Container) to enumerate over an ADO recordset. Inside the FELC, I have a flat file source, a conditional split, an aggregate transform and a destination. For each enumeration, the source loads the same flat file and uses the enumerated values to split the table before aggregation.

For performance purposes, I would like to (1) retain the connection in the FELC to avoid creating it at each enumeration, (2) avoid fetching the same input table each time and (3) keep the table in memory for processing.

To retain the connection, I understand that I could use other sources such as OLE DB, ADO.NET or Excel with the option "RetainSameConnection". However, these will not satisfy my constraints of processing in-memory and not re-loading the data at each enumeration.

To keep data in-memory I have found two potential solutions: send the flat file table to a recordset or to an in-memory table. Concerning the in-memory table with the OLEDB source, the connection would be retained. But I have not found a solution to avoid fetching the table at each enumeration.

Concerning the recordset, I could send the flat file table to a recordset before the FELC and use the recordset as source with a script component within the FELC. It does look like the best solution.

My questions:

(1) Am I missing any potential solutions to satisfy my 3 constraints? I am flexible on the type of FELC, or even to another approach than a FELC as long as performance is high.

(2) Is there a way to avoid a source to be loaded at each enumeration in a FELC?

(3) If I use a script component and a recordset, can it satisfy my constraints of (a) retaining the connection and (b) avoid loading the data at each enumeration? If yes, how to achieve that in the script?
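For what it's worth, the pattern I have in mind for (3) looks roughly like this in the script component source -- a sketch only; MyRecordset is assumed to be the Object variable filled by a Recordset Destination before the FELC, and the output column is a placeholder:

    using System.Data;
    using System.Data.OleDb;

    public override void CreateNewOutputRows()
    {
        // Copy the ADO recordset (loaded once, before the loop) into a DataTable.
        DataTable table = new DataTable();
        using (OleDbDataAdapter adapter = new OleDbDataAdapter())
        {
            adapter.Fill(table, Variables.MyRecordset);
        }

        foreach (DataRow row in table.Rows)
        {
            Output0Buffer.AddRow();
            Output0Buffer.SomeColumn = row["SomeColumn"].ToString();
        }
    }

One caveat I am aware of: an ADO recordset is forward-only, so filling it into a DataTable consumes it; if the same data is needed on every enumeration, it may be safer to materialize the DataTable once rather than re-reading the recordset variable each time.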

Many thanks in advance,

Failed to start project (Microsoft Visual Studio)

After going through an uninstallation and re-installation of SQL Server 2016 and Visual Studio 2017, as a newbie I am now getting a message in Visual Studio when I run an SSIS package, with the following details. Any guidance would be of great support in getting through this steep learning curve.

Failed to start project (Microsoft Visual Studio)

===================================

The directory name is invalid (Microsoft.DataTransformationServices.VsIntegration)

------------------------------
Program Location:

   at Microsoft.DataTransformationServices.Project.DataTransformationsPackageDebugger.LaunchVsDebugger(IVsDebugger iVsDebugger, DataTransformationsProjectConfigurationOptions options)
   at Microsoft.DataTransformationServices.Project.DataTransformationsPackageDebugger.ValidateAndRunDebugger(Int32 flags, IOutputWindow outputWindow, DataTransformationsProjectConfigurationOptions options)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchDtsPackage(Int32 launchOptions, ProjectItem startupProjItem, DataTransformationsProjectConfigurationOptions options)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchActivePackage(Int32 launchOptions)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.LaunchDtsPackage(Int32 launchOptions, DataTransformationsProjectConfigurationOptions options)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectDebugger.Launch(Int32 launchOptions, DataTransformationsProjectConfigurationOptions options)

Problem with SSIS and Azure DevOps

An SSIS package (created in Visual Studio 2019) has been checked in using Azure DevOps.  I see it in source control but for some reason it is not in the solution.  I think it is a matter of how the packages were created and checked in, but I don't know what the problem is.  

Clean Up SSISDB Execution Data Manually

Hi All,

I want to clean up the SSISDB database (execution data) manually. I checked the properties and found that we currently have a 340-day Retention Period, and the Maximum Number of Versions per Project is 10. There is a SQL Agent job, SSIS Server Maintenance Job, created as part of the Integration Services Catalog setup/configuration.

Default: 

Retention Period (days)=365

Maximum Number of Versions per Project=10

I want to Change Retention Period (days)=90 days and Maximum Number of Versions per Project=5

What I noticed: if I go to Integration Services Catalogs --> SSISDB --> right-click --> Properties and change Retention Period (days) to 90 and Maximum Number of Versions per Project to 5, it tries to delete a lot of data from SSISDB and the log gets full.

One way around that is to change Retention Period (days) from 340 to 335, and so on, so that each pass deletes a smaller dataset and the log does not grow so much.

The other option I was looking at is to delete the data manually, in a loop.

Has anyone done a similar thing?

https://www.mssqltips.com/sqlservertip/4811/sql-server-integration-services-catalog-best-practices/

https://www.timmitchell.net/post/2018/12/30/a-better-way-to-clean-up-the-ssis-catalog-database/
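The gradual approach from the links above could also be scripted so it runs unattended; a sketch of what I have in mind (assumption: stepping the retention window down in small increments and letting the built-in cleanup procedure delete each slice keeps the log manageable):

    DECLARE @days int = 340;

    WHILE @days > 90
    BEGIN
        SET @days = @days - 5;

        -- Lower the retention window a little at a time...
        EXEC SSISDB.catalog.configure_catalog
             @property_name  = N'RETENTION_WINDOW',
             @property_value = @days;

        -- ...and let the standard cleanup procedure delete that slice.
        EXEC SSISDB.internal.cleanup_server_retention_window;
    END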


Thanks, Shiven :)




Converting float to int in SSIS derived column

In my data flow, I have several columns that I modify using a derived column. One column is a float, and the decimal portion is always 0. The portion to the left of the decimal has a variable number of digits. I would like to convert this to an 8-byte int using the derived column. Below are some example values for the column and an expression I have tried.

FLOAT: 0.000000                  INT: 0

FLOAT: 1234567.000000       INT: 1234567

FLOAT: 123456789.000000   INT: 123456789

(DT_I8) FLOOR(col1)

Apparently, just casting a float as an int won't convert it to an int in the derived column, but it also doesn't cause an error. It just leaves it as a float. Can anyone help me figure out how to do this conversion? Thanks in advance!
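If it helps, my current understanding (an assumption worth verifying) is that choosing "Replace 'col1'" in the Derived Column editor keeps the original column's data type, so the cast result silently stays a float; adding the cast as a new column should behave differently. Rounding first also guards against float representation noise:

    Derived Column Name: col1_int   (added as a new column, not a replace)
    Expression:          (DT_I8)ROUND(col1, 0)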

SSIS MongoDb connector

Hello everyone,

Would anyone know a free SSIS connector to MongoDb?

Thank you.

Vladimir.

deleted data from excel

I accidentally deleted data from an Excel document, and it was saved automatically; I need the old data back.

Will adding a new column at the end of an existing UTF-8 file break the mapping in an SSIS package? SSIS 2014

Will adding a new column at the end of an existing UTF-8 file break the mapping in an SSIS package?

Existing file

"Employee","Email","FirstName","LastName"
"7777777","user@tests.com","Joe","Smith" 

New file

"Employee","Email","FirstName","LastName","CITY"
"7777777","user@tests.com","Joe","Smith" ,"Chicago"

Right now we are getting the below error

The column delimiter for column "LastName" was not found.

Env: VS 2013 and SSIS 2014. SQL Server 2008.




How to Pass a boolean variable to a SQL query

I have an SSIS variable called MyBoolean, defined as type Boolean. I want to pass it to this toy query:

SELECT 42 where ? <> 0

so that the value 42 is returned iff ? = 1 -- that is, if the value of MyBoolean is set to True. I see that I can't specify bit (the SQL Server equivalent of a boolean type) in the Parameter mapping tab of the Execute Sql Task editor. What type should I use? I've tried VARIANT_BOOL, SHORT, SIGNED_CHAR and NUMBER, but these don't seem to work. That is, I'm not getting any rows returned when I execute the task with this sort of setup.
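One workaround I am considering (an assumption on my part, not a confirmed fix): set an intermediate Int32 variable from the Boolean with an expression, and map that as LONG instead:

    Variable:   MyBooleanAsInt (Int32), EvaluateAsExpression = True
    Expression: @[User::MyBoolean] ? 1 : 0
    Parameter Mapping: MyBooleanAsInt -> Data Type LONG, Parameter Name 0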

