Channel: SQL Server Integration Services forum

SSIS environment variables for SQL instances – SQL Server 2008


Hello everyone,

I have a small challenge that I'm facing.

CONTEXT: I am doing a project for a client where they are still using SQL Server 2008 R2. There are three environments, Dev, Test, and Prod, as is usually the case. However, these environments exist as instances of a single SQL Server installation sitting on one Windows Server. I am working on SSIS packages that use SQL Server package configuration, and I am trying to set up an indirect configuration by using environment variables.

QUESTION: How do I set environment variables for different environments when the instances sit on only one Windows Server system? If there were three separate standalone servers, I would create an environment variable with the same name on each of the three servers and just change the connection property. Does anyone have a better approach to configuring SSIS packages for environments such as the ones I have described above?

Please note that at the moment the client is using XML configuration, which I'm finding cumbersome and clunky, with so many files scattered everywhere. I want to bring everything into one SQL table, keyed by environment.
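For illustration, this is the sort of single table I have in mind; a rough sketch based on the layout the SQL Server package configuration wizard generates, with one configuration filter per environment (the server names, database names and package path are only placeholders):

CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,   -- e.g. one filter value per environment
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- Placeholder rows: one connection string per environment/instance.
INSERT INTO [dbo].[SSIS Configurations] VALUES
(N'DEV',  N'Data Source=MYSERVER\DEV;Initial Catalog=MyDb;Integrated Security=SSPI;',
 N'\Package.Connections[Target].Properties[ConnectionString]', N'String'),
(N'TEST', N'Data Source=MYSERVER\TEST;Initial Catalog=MyDb;Integrated Security=SSPI;',
 N'\Package.Connections[Target].Properties[ConnectionString]', N'String'),
(N'PROD', N'Data Source=MYSERVER\PROD;Initial Catalog=MyDb;Integrated Security=SSPI;',
 N'\Package.Connections[Target].Properties[ConnectionString]', N'String');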

Many thanks,  


Mpumelelo


How to handle deletes and inserts based on a flag table in SSIS


Hi,
I have two packages: 

Package ONE pulls data from source (cloud) to destination (on-prem):
a) Unencrypted tables go to DB1
b) Encrypted tables go to DBstg

Package TWO decrypts data from DBstg to DB1 (within the same instance)

Each package has a sequence container. Inside each sequence container there are about 16 Data Flow Tasks, and inside each Data Flow Task there are 8-10 tables. The way the packages work is: first get the max date from the destination, then get the max date from the source. After this, they compare the source and destination tables and replicate the difference ... SELECT * FROM TABLENAME WHERE (Date > ? and Date <= ?). Once replication is done, the data is decrypted from DBstg to DB1.

The problem is that sometimes a package fails after half of the tables have been populated from source to destination, or after half of the tables have been decrypted from DBstg to DB1, so only half of the load is done.

The following are not an option:

  • Checkpoints are not an option because when the packages are re-run, duplicate values are inserted. This is due to having multiple tables per Data Flow Task.
  • I have multiple Connection Managers (CMs) and I cannot use the BEGIN/COMMIT TRAN option since it is only supported within one CM.
  • The option to enable MSDTC is not on the table!

My question is, how do I go about implementing a method to delete records when a package fails halfway, after records have been inserted into half of the tables? The option to use a Script Task inside the Data Flow Task is not available. I see the OLE DB Command transform, but it does not look like I can use it to update a table the way I could with a Script Task.
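For what it's worth, the kind of cleanup I have in mind would run from an Execute SQL Task at the start of a re-run and delete the same date window the load is about to re-insert; a rough sketch with placeholder table and variable names:

-- Placeholder names; in the package @FromDate/@ToDate would come from the same
-- variables used in the extract query (Date > ? AND Date <= ?).
DECLARE @FromDate DATETIME = '20181101',
        @ToDate   DATETIME = '20181102';

DELETE FROM dbo.TargetTable
WHERE [Date] > @FromDate
  AND [Date] <= @ToDate;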

Any help really appreciated!

Thanks


IN~

Will you be announced as the next SSIS Guru? Read about the TechNet November 2018 competition!!


What is TechNet Guru Competition?

Each month the TechNet Wiki council organizes a contest of the best articles posted that month. This is your chance to be announced as MICROSOFT TECHNOLOGY GURU OF THE MONTH!

One winner in each category will be selected each month for glory and adoration by the MSDN/TechNet Ninjas and the community as a whole. Winners will be announced in a dedicated blog post published on the Microsoft Wiki Ninjas blog and in a tweet from the Wiki Ninjas Twitter account; links will be published in the Microsoft TNWiki group on Facebook, and other acknowledgement from the community will follow.

Some of our biggest community voices and many MVPs have passed through these halls on their way to fame and fortune.

If you have already made a contribution in the forums or gallery, or you published a nice blog, then you can simply convert it into a shared wiki article, reference the original post, and register the article for the TechNet Guru Competition. The articles must be written in November 2018 and must be in English. However, the original blog or forum content can be from before November 2018.

Come and see who is making waves in all your favorite technologies. Maybe it will be you!


Who can join the Competition?

Anyone who has basic knowledge and the desire to share it is welcome. Articles can appeal to beginners or discuss advanced topics. All you have to do is add your article to the TechNet Wiki in your own specialty category.


How can you win?

  1. Please copy or write up your Microsoft technical solutions and revelations on TechNet Wiki.
  2. Add a link to your new article on THIS WIKI COMPETITION PAGE (so we know you've contributed)
  3. (Optional but recommended) Add a link to your article in the TechNetWiki group on Facebook. The group is very active and people love to help; you can get feedback and even direct improvements to the article before the contest starts.

Do you have any questions or want more information?

Feel free to ask any questions below, or join us at the official Microsoft TechNet Wiki groups on Facebook. Read more about the TechNet Guru Awards.

If you win, people will sing your praises online and your name will be raised as Guru of the Month.


PS: The banner above came from Syed Shanu.

signature  Ronen Ariely
 [Personal Site]    [Blog]    [Facebook]   [Linkedin]

Is the Derived Column Task Dynamic or Static?


Lemme 'splain…

I receive a single member file that has member data from several suppliers. Unfortunately, they are not consistent in their naming conventions. Some use First Last; some use Last, First; some toss in the middle initial; for some inexplicable reason, one of them puts the last name in quotes; one of them even tosses in a comma after the name for who knows what reason.

I thought I'd use a single Derived Column task to parse the names. But, each name will need several passes to cover all the eventualities and I'd like the ability to easily put in additional parsing needs, if necessary.
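To give a flavor of the passes I mean, here is the sort of chained cleanup, sketched in T-SQL purely for readability; in the package each pass would be its own derived-column expression, and the sample value is made up:

-- Illustrative only: two cleanup passes plus a trim, chained together.
SELECT LTRIM(RTRIM(
           REPLACE(                        -- pass 2: drop the stray quotes
               REPLACE(RawName, ',', ''),  -- pass 1: drop the trailing comma
               '"', '')
       )) AS CleanName
FROM (VALUES (N'"Smith" John,')) AS v(RawName);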

So, that being the case, I didn't know if the successive rows in a Derived Column task could use the results of a prior row (dynamic) or if each row could only use the data as provided by the prior task - in this case, an Excel source editor (static).

If it's static, I can add successive Derived Column tasks to cover all my parsing needs in individual tasks; but if it's dynamic, then I can do all of it in one task.

So, here I am at the altar of the Cloud Minds to beg their wisdom.

Thanx in advance for any assistance!

Excessive transaction log size in SSIS catalog database (SSISDB)


I am having a recurring issue with the size of the transaction log for the SSISDB database. The database is using the simple recovery model (it's only a development instance), so the issue is not related to log backups. If I look at the utilisation of the log file it is mostly empty, but when I try to truncate the log file I get an error stating that the log cannot be truncated due to ACTIVE_TRANSACTION.
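For reference, this is roughly how I have been checking for the blocking transaction; standard DBCC/DMV queries, nothing SSIS-specific:

USE SSISDB;

DBCC OPENTRAN;    -- reports the oldest active transaction in the current database, if any

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'SSISDB';    -- shows ACTIVE_TRANSACTION while the log cannot be reused

SELECT session_id, transaction_id, elapsed_time_seconds
FROM sys.dm_tran_active_snapshot_database_transactions
ORDER BY elapsed_time_seconds DESC;    -- long-running snapshot transactions, if any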

I'm also getting a similar error in the "SSIS Server Maintenance Job".

Is there any reason why SSIS would be holding open transactions on the SSIS catalog database and thereby preventing the log file from being truncated?

SSIS Loading Data From Excel With Multiple Data Types in a Column Is Not Working Properly


Hello Everyone,

We are trying to load data from several Excel spreadsheets into SQL Server tables.

For this purpose, we are using SSIS to read and load the data. However, we have found that the Excel data is not getting loaded properly into the SQL Server table. Basically, we noticed that if an Excel column has all integer values in the first 8 rows and then holds character or alphanumeric values in the rows below, the driver by default identifies that column as an integer column and sets all the cells with character or alphanumeric values to NULL.

Neither the SSIS Excel connection manager nor the Excel data source gives me an option to manually alter the expected data types.

This issue has been very heavily discussed in several forums, but Microsoft has not provided a solution to date.

Here is a sample Excel file:

When I try to import this into a SQL Server table using SSIS, this is what the data preview shows:

Many people have suggested a Windows registry edit to force Excel to read all records before deciding on the data type. This is not possible in my case, as the clients will not allow us to change the registry settings of the system.

Some have suggested adding the "IMEX=1" extended property to the Excel connection string. This does not work in my case.

Some have suggested adding "TypeGuessRows=99999" to the extended properties of the Excel connection string. This also does not work in my case.

I have no control over the way the Excel file is generated or formatted, so adding hidden text rows before the actual data starts, or setting the cell format to TEXT for all the columns of the Excel file, is also not an option for me.

Below is the standard connection string from the Excel Connection Manager that I currently have:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Users\suvrat\Desktop\RawExcel.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";

Is there any way we can force the Excel connection manager to use data types of our choice?

This is such a simple requirement, but unfortunately I could not find any useful solution to this problem.

Any thoughts on how this type of data can be loaded into the system?

Thanks in advance!

Suvrat



Save JSON to file from HTTP API response via SSIS


I have an API returning JSON and I'd like to save this response to a file or process it via standard SSIS tools.

Where should I start with this problem?

SSIS Custom Destination Component not showing in dtsx package


Hi All, I am trying to create a custom XML destination component, using the XML destination sample as a reference. Below are the steps I used for deploying:

1) Build the class library (target framework 4.0)

2) Sign the assembly with a strong name:

 [DtsPipelineComponentAttribute(DisplayName = "XML Destination",
            ComponentType = ComponentType.DestinationAdapter,
            UITypeName = "Microsoft.SqlServer.Dts.XmlDestSample.XmlDestinationSampleUI, xmldest, Version=1.0.0.0, Culture=neutral, PublicKeyToken=here I add the public token key")]

3) Install in the GAC using gacutil /i "dll full path"

4) Copy xmldest.dll to the following paths: C:\Program Files (x86)\Microsoft SQL Server\110\DTS\PipelineComponents\xmldest.dll and C:\Program Files\Microsoft SQL Server\110\DTS\PipelineComponents\xmldest.dll

After this I open the SSIS package and refresh the toolbox, but the custom XML destination component is not showing in the toolbox.

Can someone please help me identify why it's not showing in the toolbox?

Environment:

VS2010

.NET Framework 4.0

MS Windows 10 pro


Calling a web service dynamically by using the Web Service Task in SSIS


Hi All,

Please help me implement a setup for fetching data through web services in SSIS using the Web Service Task, where the web service is called dynamically.

Thanks in advance

regards

Ganesh

SSIS - OLE DB Connection is truncating numeric value when the destination is money


Hi, 

When I transfer a numeric column directly to a money column, the OLE DB component truncates the value, but the T-SQL and ADO.NET components round as expected.

Source value : 2.081250

VAT1: mapped directly

VAT2: mapped after converting to the money data type

Result: 

VAT1      VAT2      CONN
2.0813    2.0813    TSQL
2.0813    2.0813    ADO
2.0812    2.0813    OLEDB


Do you have any idea? Is this the default behavior of the OLE DB component?

Regards, 

Mustafa 

Test Case: 

-- Source
CREATE TABLE TEST_NUMERIC(VAT1 NUMERIC(38,6), VAT2 NUMERIC(38,6))
INSERT INTO TEST_NUMERIC VALUES( 2.081250, 2.081250)

-- Destination 
CREATE TABLE TEST_MONEY(VAT1 MONEY, VAT2 MONEY, CONN NVARCHAR(20))

-- Test 1 - With TSQL

INSERT INTO TEST_MONEY
SELECT VAT1, CAST(VAT2 AS MONEY), 'TSQL' FROM TEST_NUMERIC 

-- Test2 - With SSIS, ADO NET and OLE DB components:

ADO.Net Source: 

SELECT VAT1, CAST(VAT2 AS MONEY) VAT2, N'ADO' AS CONN
FROM TEST_NUMERIC

ADO.NET Destination: TEST_MONEY

OLE DB Source: 

SELECT VAT1, CAST(VAT2 AS MONEY) VAT2, N'OLEDB' AS CONN
FROM TEST_NUMERIC

OLEDB Destination: TEST_MONEY



After FTP Task, trying to copy the file to directory - error that file is in use


I used the FTP Task to receive a file, which it did successfully. I then have a ForEach Loop container in which I use a File System Task to copy the file to another directory. When it gets to that task (File System Task), I get an error stating that the process cannot access the file because it is being used by another process.

I've read everything I could find on this error message and tried just about everything suggested but cannot get past this error.  I've looked in Task Manager and don't see it; I've rebooted. 

As a side note, I have the same issue with another file that was put in this directory manually (not via FTP). It worked once, and then I got the same error every subsequent time.

Any help would be appreciated!

SSIS 2017 - Programmatically create script component as transformation


I am trying to create a script component as a transform programmatically (not manually going inside the script component), but I am unable to update the main.cs file; I am getting the error below.

Am I missing something? Any help would be appreciated.

FYI: if I set the second parameter of CreateNewProject to true, i.e. host.CreateNewProject(Guid.NewGuid().ToString("N"), true, true), I am able to generate the script component correctly, but my changes made using PutSourceCode do not take effect.

Code:

IDTSComponentMetaData100 _ScriptComponent = pipeline.ComponentMetaDataCollection.New();
                _ScriptComponent.ComponentClassID = typeof(Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost).AssemblyQualifiedName;

                // Get the design-time instance of the script component
                var DesignScriptComponent = _ScriptComponent.Instantiate();
                DesignScriptComponent.ProvideComponentProperties();        //design time

                // Attach a path from the previous transform's output to the script component's input
                var path = pipeline.PathCollection.New();
                path.AttachPathAndPropagateNotifications(pipeline.ComponentMetaDataCollection[Prev_Transform.Transformation_Name].OutputCollection[Prev_Transform.Output_Number],
                        _ScriptComponent.InputCollection[0]);

                // Get the list of available columns
                var ScriptComponentInput = _ScriptComponent.InputCollection[0];
                var ScriptComponentvInput = ScriptComponentInput.GetVirtualInput();
                var ScriptComponentVirtualInputColumns = ScriptComponentvInput.VirtualInputColumnCollection;

                string scriptValue = @"using System;
                                        using System.Data;
                                        using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
                                        using Microsoft.SqlServer.Dts.Runtime.Wrapper;
                                        [Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
                                        public class ScriptMain : UserComponent
                                        {
                                          public override void PreExecute()
                                            {
                                                base.PreExecute();
                                            }
                                         public override void PostExecute()
                                            {
                                                base.PostExecute();
                                            }
                                         public override void Input0_ProcessInputRow(Input0Buffer Row)
                                            {
                                                //sam generated this
                                            }
                                        }";
                //
                //DesignScriptComponent.SetComponentProperty("ScriptLanguage","Microsoft Visual C# 2017" );
                //DesignScriptComponent.SetComponentProperty("VSTAProjectName","CustomTransform1.csproj");

                ScriptComponentHost host = (DesignScriptComponent as IDTSManagedComponent100).InnerObject as ScriptComponentHost;
                host.CreateNewProject(Guid.NewGuid().ToString("N"), false, true);
                host.PutSourceCode("main.cs", "UTF8", scriptValue);
                host.SaveScriptProject();
                //string[] sourcecode_array= _ScriptComponent.CustomPropertyCollection["SourceCode"].Value;
                //// when counted manually array 9 had main.cs
                //sourcecode_array[9] = scriptValue;
                //DesignScriptComponent.SetComponentProperty("SourceCode", sourcecode_array);


Error

Validation error. Data Flow Task: Data Flow Task: Microsoft.SqlServer.Dts.Pipeline.CannotCreateUserComponentException: Cannot create user component class. Make sure there is one class marked with SSISScriptComponentEntryPointAttribute in your script.
   at Microsoft.SqlServer.Dts.Pipeline.VSTAComponentScriptingEngine.CreateScriptObjectHelper(Assembly scriptAssembly)
   at Microsoft.SqlServer.Dts.Pipeline.VSTAComponentScriptingEngine.GetScriptEntryPoint(String versionGuid)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.CreateUserScriptInstance()


Sam

BI developer


Creating an ETL job:

How to obtain the desired result?

I have a column A in the source table that has values 0 and 1. I have created a derived table for that column with an associated column B, and I have a destination table that already has an associated column Z, which holds the value "XYZ" where source column A has the value 0.
So if the value in source column A changes from 0 to 1, destination column Z should have the value "ABCD", and if the value changes back from 1 to 0, destination column Z should have the value "EFGH".
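To make the intended mapping concrete, here it is expressed as T-SQL with placeholder table, key and column names; in the package this would be a Derived Column expression or an update against the destination:

-- Placeholder names; only the CASE logic matters.
UPDATE dst
SET dst.Z = CASE
                WHEN src.A = 1 AND dst.Z = 'XYZ'  THEN 'ABCD'  -- A went from 0 to 1
                WHEN src.A = 0 AND dst.Z = 'ABCD' THEN 'EFGH'  -- A went back from 1 to 0
                ELSE dst.Z
            END
FROM dbo.DestinationTable AS dst
JOIN dbo.SourceTable AS src
    ON src.BusinessKey = dst.BusinessKey;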


Data warehouse developer

Can a table be used as both the source and the destination in an SSIS data flow?

SQL Server Agent job failing due to handshake error


Hello,

Having some intermittent issues with an SSDT job run by SQL Server Agent on a daily schedule. The error message is as follows:

"Started:  12:59:07 AM  Failed to execute IS server package because of error 0x80131904. Description: A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: SSL Provider, error: 0 - The wait operation timed out.)  Source: .Net SqlClient Data Provider  Started:  12:59:07 AM  Finished: 1:00:13 AM  Elapsed:  65.13 seconds.  The package execution failed.  The step failed."


It doesn't fail every night, just at random, and this is the only job out of ~15 that fails like this. Has anyone else experienced this before?


Retaining a connection between master and sub-package calls

Hi, we run 2016 Enterprise. We have a situation where a master package calls 2 or more sub-packages, one after the other. The 1st sub-package creates a global temp table which we'd really like the 2nd through the nth sub-package to see. Is there a way to retain that 1st connection so all sub-packages can see the global temp table?

Which version of Visual Studio to use for SSIS, SSAS and SSRS development


Hi,

We have a SQL Server 2014 DW/BI solution that was developed using Visual Studio 2013 Shell (Integrated) with SSIS, SSAS and SSRS. This has been absolutely fine until I was referred to the articles below, which talk about SSIS project properties that I cannot see for my project. For example, in my project I do not have/see a Project.params file.

Question: what is the recommended version of Visual Studio for developing SSIS, SSAS and SSRS code that targets SQL Server 2014?

Articles that I am referring to are:

https://www.hansmichiels.com/2016/11/19/using-sensitive-parameters-ssis-series/

https://blogs.msdn.microsoft.com/biblog/2013/05/07/step-by-step-of-executing-ssis-2012-package-through-stored-procedure/

Can't open Analysis Services Processing Task with latest SSDT release


I have an existing SSIS package with an "Analysis Services Processing" task. When I try to edit the task I receive the error message below. When I confirm the error message, nothing is visible in the "Processing Settings" section of the editor that opens. What am I missing here?

System components:

  • Visual Studio 2017 (15.8.9)
  • SQL Server Data Tools standalone installer (15.8.2)
  • Integration Services Project Extension  (2.1)
  • Windows 10

TITLE: Analysis Services Processing Task
------------------------------

Could not load type 'Microsoft.DataWarehouse.Design.IUserPromptService' from assembly 'Microsoft.DataWarehouse, Version=15.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'.

------------------------------
BUTTONS:

OK
------------------------------

Permission to deploy SSIS packages and create SQL Server Agent jobs in SQL 2012

Hi,
I have been given access to a SQL Server 2012 instance, on which I can see all the databases. But I don't see the SSIS packages deployed in the Integration Services catalog, and I also cannot see SQL Server Agent. Now, what kind of permission (the technical name of the permission) do I need
--> to view the IS catalog folders,
--> to deploy packages,
--> to view SQL Server Agent, and
--> to create SQL Server Agent jobs?
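From what I have read so far, something along these lines is the usual starting point, but I would like the exact permission names confirmed (the login name is a placeholder, and these roles may be broader than strictly necessary):

-- Placeholder login; ssis_admin (in SSISDB) and SQLAgentUserRole (in msdb) are
-- the built-in roles I keep seeing mentioned.
USE SSISDB;
CREATE USER [DOMAIN\MyLogin] FOR LOGIN [DOMAIN\MyLogin];
ALTER ROLE ssis_admin ADD MEMBER [DOMAIN\MyLogin];        -- view catalog folders, deploy projects

USE msdb;
CREATE USER [DOMAIN\MyLogin] FOR LOGIN [DOMAIN\MyLogin];
ALTER ROLE SQLAgentUserRole ADD MEMBER [DOMAIN\MyLogin];  -- view and create own Agent jobs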

SQL job using Attunity Teradata connector failing...


I have a 2008 SSIS package pulling data from Teradata using the Attunity connector. The packages are developed on my laptop and called through a job from the SQL 2008 box.

1) Shared ICU Libraries For Teradata nt-x8664
2) Teradata GSS Client nt-x8664
3) Teradata cliv2 nt-x8664
4) ODBC Driver for Teradata (64 bit)
6) TPT Operator support library (poslc000_x8664)
7) TPT Load Operator (plodc000_x8664)
9) TPT Update Operator (pupdc000_x8664)
10) TPT Stream Operator (pstmc000_x8664)
11) TPT API (papic000_x8664)

That is the stuff that's installed on the Win 2003 Server SP2, 64-bit SQL 2008 box. It works fine in dev, but when we moved to UAT I get an error saying:
Executed as user: NA\svc-sqlinst09-clst81. Microsoft (R) SQL Server Execute Package Utility  Version 10.0.2520.0 for 64-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    Started:  4:35:26 PM  Error: 2009-11-09 16:35:28.85     Code: 0xC0014005     Source:       Description: The connection type "MSTERA" specified for connection manager "Teradata Connector 1" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.  End Error  Error: 2009-11-09 16:35:28.85     Code: 0xC0010018     Source:       Description: Error loading value "<DTS:ConnectionManager xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="DelayValidation">0</DTS:Property><DTS:Property DTS:Name="ObjectName">Teradata Connector 1</DTS:Property><DTS:Property DTS:Name="DTSID">{0C558FD4-924C-4D23-841B-3C77F" from node "DTS:ConnectionManager".
