Channel: SQL Server Integration Services forum

SSIS Data Flow Item tab missing when I go to "Choose Toolbox Items" in SSDT for Visual Studio 2013


I had MS Visual Studio Ultimate 2013 and MS SQL Server 2012 on the dev machine (Windows 7), and I installed SQL Server Data Tools - Business Intelligence for Visual Studio 2013. When I launch SSDT for Visual Studio 2013 and go to "Choose Items..." in the toolbox, I cannot find the SSIS Data Flow Items tab. I went through a related suggestion on the forum that says "REM Unregister and re-register the assembly in the GAC", but I could not find an assembly named ChangeCase.dll on my machine, so the following URLs do not help. Looking forward to your support.

Thanks.

Reference: Choose Toolbox Items (MSDN)


Data Load of a HUGE National Database File and any hints...pointers


We have been tasked with loading the Centers for Medicare & Medicaid Services (CMS) NPI file, otherwise known as the "National Plan and Provider Enumeration System (NPPES)" file. The file is created and posted as a monthly full file cut and then weekly incrementals. We are attempting to load the full monthly file, which is 5.7 GB. So that leads me to my questions... plural:

  1. How much space do I need to allocate to my SQL Server database and table? 5.7 GB, or is there overhead?
  2. For efficiency purposes, what do I need to do in my SSIS package so that it runs efficiently and in the most timely manner?
  3. Should I load this with SSIS or by some other means, because of the huge volume?
  4. Do I need to explicitly define my data fields in my Flat File Connection Manager? I think I saw somewhere that you should explicitly define your data columns and lengths to avoid any overhead.

If there are any additional suggestions, please let me know.

Thanks for your review and am hopeful for a reply.

Is the error output from a flat file source and an OLE DB source (Excel file) to a flat file destination always different?


When I log errors to a flat file destination (text file) from a Flat File Source, using the redirect row option on the source, I get the following columns:

Flat File Source Error Output Column, ErrorCode, ErrorColumn

When I log errors to a file destination (text file) from an OLE DB source, using the redirect row option on the OLE DB source, I get the following columns: the table columns plus ErrorCode and ErrorColumn.

In my case the OLE DB source is an Excel file, so this puzzles me.

Why is the redirection output so different?

For the OLE DB redirect, can't I get the same layout - an OLE DB Source Error Output column (which is long text), ErrorCode, ErrorColumn?

Is it mandatory to map all the table columns along with the error columns in the OLE DB redirect case? I think it is.
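Whichever source type produces the error rows, one thing that helps when writing either error output to a text file is translating the numeric ErrorCode into readable text before it reaches the destination. Below is a minimal sketch of a C# Script Component (transformation) placed on the error path; the output column name ErrorDescription is an assumption added on the Inputs and Outputs page, and the rest of the generated script template is omitted:

// Script Component on the error path (C#). Assumes an added DT_WSTR output
// column named "ErrorDescription"; ErrorCode comes from the upstream error output.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // GetErrorDescription turns the numeric error code into readable text.
    Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode);
}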


SSIS access to Azure Key Vault


I have an Azure SQL database with a table that has one column encrypted using Always Encrypted. The column master key is stored in Azure Key Vault. Now I would like my on-premises SSIS application to be able to decrypt that column when retrieving data from the Azure SQL database. Can SSIS be configured to talk to Azure Key Vault for the master key? If not, is there a way my SSIS application can decrypt the value?
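Always Encrypted needs a client driver that understands it, which points toward doing the retrieval from a Script Task or Script Component over System.Data.SqlClient rather than the regular OLE DB source. A minimal sketch, under the assumption that the Microsoft.SqlServer.Management.AlwaysEncrypted.AzureKeyVaultProvider and ADAL packages are available to the script and that an AAD application (placeholder id/secret below) has access to the vault; the connection string must also include Column Encryption Setting=Enabled:

// Sketch only; ClientId/ClientSecret are placeholders for an AAD app with vault access.
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Microsoft.SqlServer.Management.AlwaysEncrypted.AzureKeyVaultProvider;

static class AlwaysEncryptedSetup
{
    private const string ClientId = "<aad-application-id>";     // placeholder
    private const string ClientSecret = "<aad-client-secret>";  // placeholder

    // Token callback the key vault provider uses to authenticate to Key Vault.
    private static async Task<string> GetToken(string authority, string resource, string scope)
    {
        var context = new AuthenticationContext(authority);
        var credential = new ClientCredential(ClientId, ClientSecret);
        AuthenticationResult result = await context.AcquireTokenAsync(resource, credential);
        return result.AccessToken;
    }

    // Call once per process, before opening connections that use
    // "Column Encryption Setting=Enabled" in the connection string.
    public static void RegisterAkvProvider()
    {
        var akvProvider = new SqlColumnEncryptionAzureKeyVaultProvider(GetToken);
        SqlConnection.RegisterColumnEncryptionKeyStoreProviders(
            new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>
            {
                { SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, akvProvider }
            });
    }
}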

Informix 2000 to SQL 2008R2 SSIS error! error message: Ambiguous column

I am involved in a project to build a reporting database in SQL 2008 R2 from an Avaya telephony system (running on an Informix 2000 database).

Specs: for the Informix 2000 Avaya side, I have OLOD4032.DLL (4.20.04.05), OpenLink Generic 32-bit v4.0. I am using SQL 2008 R2, build 10.50.1777, Enterprise Edition on a 64-bit platform (Windows 2008 R2 Enterprise Edition SP1). Using the 32-bit ODBC Administrator tool, I was able to configure a DSN and connect successfully.

When I use the "Import Data" wizard (SSIS under the covers) in SSMS, I can get it to recognize the older 32-bit driver. I can see my database, tables, etc. from the source, and I have created my target tables.

When I try to PREVIEW data, or run the SSIS / Wizard package, I get this error:

Error 0xc0202009: Data Flow Task 1: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "OpenLink ODBC Provider" Hresult: 0x80004005 Description: "[OpenLink][ODBC][Informix Server]Ambiguous column(item_type). (-324)". (SQL Server Import and Export Wizard)

Below is the query:

SELECT S.acd_no,
S.item_type,
S.item_name,
S.value,
S.descr
FROM root.synonyms s

Any HELP would be appreciated.

Visual Studio 2015 with SSIS. Where are the variables?


I have found a lot of blogs showing how to work with variables in an Execute SQL Task, but very few sites that deal with variables outside of that scope, and none of the pages I have found match the environment I am working with - the directions don't work.

I have Visual Studio 2015 and want to create a dynamic variable based on System::StartTime (the time the SSIS package started) and other dynamic values.

Where can I see a list of system variables? How am I to reference them?

Thanks.
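For reference, the full list of system variables appears in the SSIS Variables window once "Show system variables" is enabled in its grid options, and in expressions they are referenced with the @[System::StartTime] syntax. Below is a minimal C# Script Task sketch that reads System::StartTime and writes a derived value into a hypothetical User::FileSuffix variable (both names would have to be listed in the task's ReadOnlyVariables / ReadWriteVariables; template usings omitted):

public void Main()
{
    // System::StartTime is the package start time (listed in ReadOnlyVariables).
    DateTime started = (DateTime)Dts.Variables["System::StartTime"].Value;

    // User::FileSuffix is a hypothetical user variable (listed in ReadWriteVariables).
    Dts.Variables["User::FileSuffix"].Value = started.ToString("yyyyMMdd_HHmmss");

    Dts.TaskResult = (int)ScriptResults.Success;
}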

Error capturing at various levels in an SSIS data flow and sending emails | SQL Server 2008 R2



I have an SSIS package which loads data from CSV files in a two-step process.

Step 1. Load data into a staging table; any bad records at the flat file source level are redirected to a text file.

Step 2. Load cleansed data from the staging table to the destination and complete the data flow task. I don't need a Send Mail task here, as errors won't occur here.

After the two steps, I send an email with the bad records in a text file, based on the row count between the flat file source's error redirect output and the error text file destination.

However, I have noticed that in step 1, when data flows correctly through the source but then fails in a subsequent processing step - say a Data Conversion transformation between source and destination (not covered by the error redirect) - the package fails and no error email is triggered.

I therefore need a solution that sends mail during step 1 in this scenario as well (i.e. if the package fails for any other error in any other processing step). I just need to capture the errors and send them in an email using the Send Mail task.

How do I do this effectively?

I was able to log bad records to a flat file and send an email with the log file attached.
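A common pattern for the "any other failure" case is a package-level OnError event handler containing either a Send Mail Task (whose MessageSource can be driven by an expression using @[System::ErrorDescription]) or a small Script Task. A minimal C# sketch of the Script Task variant, with placeholder SMTP host and addresses, and System::SourceName / System::ErrorDescription listed in ReadOnlyVariables:

using System.Net.Mail;

public void Main()
{
    string failedTask = Dts.Variables["System::SourceName"].Value.ToString();
    string details    = Dts.Variables["System::ErrorDescription"].Value.ToString();

    var mail = new MailMessage("etl@example.local", "team@example.local")  // placeholder addresses
    {
        Subject = "SSIS failure in: " + failedTask,
        Body = details
    };

    var smtp = new SmtpClient("smtp.example.local");  // placeholder SMTP host
    smtp.Send(mail);

    Dts.TaskResult = (int)ScriptResults.Success;
}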




Using Attunity Drivers 3.0 in Visual Studio 2015 Preview


Hi,

We currently use the Attunity SSIS Connectors v2.0 for Oracle with SQL Server 2012 Integration Services inside the Visual Studio 2012 IDE.

We are moving to SQL 2014 and want to leverage the Visual Studio 2015 SSDT, as this now includes all the BI project types (SSIS, SSAS, SSRS) plus SSDT and integrates with Visual Studio Team Services. We are testing the dev tools prior to the upgrade.

I have installed the version 3.0 Attunity SSIS Connectors for Oracle (both 32-bit and 64-bit), which state that they work with VS 2013+, but the connection type is not available in my Visual Studio 2015 SSIS project.

So my question is: are the Attunity SSIS Connectors v3.0 for Oracle supposed to be compatible with VS 2015, and if not, is an update expected in the future?

Kind Regards,

James


SSIS derived column that will point multiple source columns to one destination column


I currently have 6 columns:

Tomatoes, Broccoli, Cauliflower, Potatoes, Spinach, Carrots - and if one of these vegetables = 1, then a 1 should be shown in the corresponding vegetable column.

Here's what I have so far:

[Carrots] == 1 ? [Vegetables] : NULL(DT_WSTR,50)
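If the goal is a single destination column that records which vegetable flag is set (one reading of the question), a Script Component can express the mapping more readably than a long nested derived-column conditional. A C# sketch, assuming integer flag columns with the names above and an added output column named Vegetable (DT_WSTR); the generated script template is omitted:

// Script Component (transformation, C#): map six 0/1 flag columns to one name column.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    if (Row.Tomatoes == 1)          Row.Vegetable = "Tomatoes";
    else if (Row.Broccoli == 1)     Row.Vegetable = "Broccoli";
    else if (Row.Cauliflower == 1)  Row.Vegetable = "Cauliflower";
    else if (Row.Potatoes == 1)     Row.Vegetable = "Potatoes";
    else if (Row.Spinach == 1)      Row.Vegetable = "Spinach";
    else if (Row.Carrots == 1)      Row.Vegetable = "Carrots";
}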

Limitations on CSV & SSIS?


I have a large CSV file of 12 MB where the data, among the 30 or so columns, is essentially split up into two columns. Furthermore, to avoid conversion problems, I set the connection manager's column properties to DT_TEXT. Oddly enough, I noticed that my packages would not run past the pre-execute phase because they would just hang! After troubleshooting a number of issues - from SSIS coding, to potentially damaged files, to SQL Server transaction blocking - nothing would stop it from hanging. After several days of this, I was at my wits' end and angrily set the connection manager's column properties to DT_STR(50) across all my columns just to see if the package was broken.

Wouldn't you know it, that worked. After a few tests, sure enough, SSIS will hang if I have too many columns set to DT_TEXT in the source connection manager! What the exact number is, I have not tested, since I'm tired and hungry and upset with Excel... which seems to me to be extremely finicky with SSIS 2012.

Does anyone know why DT_TEXT is so unstable and what the limitations on using DT_TEXT are? I'm still shocked that SSIS makes it so hard to use Microsoft's own Excel files...

Any insight would be much appreciated.


SSIS XML source not pulling in Root level attributes


I am using SSIS in Visual Studio 2015, and the XML source still doesn't retrieve the root element's attributes. I have seen older posts on this, but is this bug going to be fixed any time soon?

[EzAPI] How to work with EzDerivedColumn?


hi!

I am trying to use the EzDerivedColumn class from the EzAPI project. I want to create a new column and concatenate two input columns, like: Col1 + "//" + Col2. I can create the new column, but when I try to set the "FriendlyExpression" I always get an error. A little example for me, please?

Thanks!

Andreas

SqlServer 2014 - SSIS Package Execution - version 14.0 script not supported


I developed an SSIS project in SQL Server Data Tools (SSDT) 2015 that uses a Script Task. The package runs successfully in SSDT, but when it was deployed to SQL Server 2014 and executed, it failed with the following error message. Please help.

Error:

There was an exception while loading Script Task from XML: System.Exception: The Script Task "ST_88e6571b80144dd3b19beae856cbbab1" uses version 14.0 script that is not supported in this release of Integration Services. To run the package, use the Script Task to create a new VSTA script. In most cases, scripts are converted automatically to use a supported version, when you open a SQL Server Integration Services package in %SQL_PRODUCT_SHORT_NAME% Integration Services.

   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask.LoadFromXML(XmlElement elemProj, IDTSInfoEvents events)

All of a sudden DTS runtime variables can't be indexed in script tasks


Hi. I have 2014 Enterprise and 2008 R2 Standard installed side by side on a desktop. I was under the impression that this is a safe thing to do.

All of a sudden, after working with both for months and executing script tasks often under each, I get an error in new and existing packages complaining about Dts runtime indexing issues in the scripts.

Why is this? What can I do about it? Can I deploy a failing one to my catalog when the binary for such scripts won't build locally, and get it to run there on the server anyway?

The only thing I've done a bit out of the ordinary recently with this PC was to VPN into our parent company's domain and run (run as) SSMS using the credentials I use there.
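For reference, this is the kind of Dts runtime indexing a C# Script Task normally supports out of the box; User::SomeVariable is a placeholder name that would have to appear in the task's ReadOnlyVariables or ReadWriteVariables list:

public void Main()
{
    // Index the Variables collection by its qualified name.
    string value = Dts.Variables["User::SomeVariable"].Value.ToString();

    // Or test for existence first.
    if (Dts.Variables.Contains("User::SomeVariable"))
    {
        // ... use the variable ...
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}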


Interop issues only seen when running through SQL Agent


I have an SSIS package that generates a 9 MB workbook and then runs a script task that deletes a single row. When running in BIDS it works no matter how big the file is. When I run it in SQL Agent with a small sample set of the data, it works. When I run the full report through SQL Agent, I get the message "Document not Saved".

I have tried changing the Save to a SaveAs, and I still get the same error.

It appears that the script can't save a large file, but it works fine with the smaller file.

Win 2008 R2

SQL 2012

Interop - Microsoft Excel 15.0 Object Library

C# script
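For comparison, a minimal sketch of the delete-one-row-and-save pattern through the Excel 15.0 interop library from a C# Script Task; the path, sheet index and row number are placeholders, and DisplayAlerts is turned off so Excel cannot sit on a hidden prompt when run unattended under the Agent:

using System.Runtime.InteropServices;
using Excel = Microsoft.Office.Interop.Excel;

public void Main()
{
    string path = @"\\server\share\report.xlsx";   // placeholder path

    Excel.Application excel = new Excel.Application();
    excel.DisplayAlerts = false;                   // suppress interactive prompts

    Excel.Workbook book = null;
    try
    {
        book = excel.Workbooks.Open(path);
        Excel.Worksheet sheet = (Excel.Worksheet)book.Worksheets[1];   // placeholder sheet index
        ((Excel.Range)sheet.Rows[1]).Delete();                         // placeholder row to delete
        book.Save();                                                   // or book.SaveAs(path);
    }
    finally
    {
        if (book != null) book.Close(false);
        excel.Quit();
        if (book != null) Marshal.ReleaseComObject(book);
        Marshal.ReleaseComObject(excel);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}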

Export data into new text file on daily basis


Hi,

I want to export data from SQL into a new text file on a daily basis. Every day a new file has to be generated and the data exported into that text file.

Could you please suggest how I can create a package that creates a new text file and exports the data into it?

Thanks in advance.

--

With Regards,

KrishnaReddy.
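One common approach (a sketch under assumptions, with placeholder names): use a data flow with a Flat File Destination, put an expression such as @[User::OutputFile] on the Flat File Connection Manager's ConnectionString property, and have a small C# Script Task compute the dated file name at the start of each run:

public void Main()
{
    // Placeholder folder; User::OutputFile is a hypothetical string variable
    // listed in the task's ReadWriteVariables.
    string folder = @"D:\Exports";
    string fileName = "DailyExport_" + DateTime.Now.ToString("yyyyMMdd") + ".txt";

    Dts.Variables["User::OutputFile"].Value = System.IO.Path.Combine(folder, fileName);
    Dts.TaskResult = (int)ScriptResults.Success;
}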

how to do data cleansing ssis2008

How do I do data cleansing in SSIS 2008?

Error. Data Flow Task 1: Data Flow Task 1: Column "gender_code" cannot convert between unicode and non-unicode string


I am a newbie with SSIS 2008. I developed a DTSX file to copy data from one table in one database to another table in another database. In this DTSX file I have 3 control flow items:

Execute SQL Task: Drops my destination table

Preparation Task: Creates my destination table

Data Flow task:

    Source: connects to SourceConnection OLEDB via SQL Command access and SQL command text

    Destination: connects to DestinationConnection OLEDB via Table or view - fast load.  Destination table = same as mentioned above.

When I run this package I get the error described in the title. But I compared the data types between the Preparation SQL Task and the Data Flow source query, and they match for all fields. So why does it still give me this error? It gives me the same error for many fields.

Also, when I run it, the first two control flow items succeed. It is just the Source Query in the Data Flow that is failing. The other problem I noticed is that the password for the DestinationConnection OLEDB keeps getting cleared even though I checked the "Save password" box.

 

SSIS package "Echo Information Migration.dtsx" starting.
Information: 0x4004300A at Data Flow Task 1, SSIS.Pipeline: Validation phase is beginning.
Error: 0xC02020F6 at Data Flow Task 1, Source - Query [1]: Column "gender_code" cannot convert between unicode and non-unicode string data types.
Error: 0xC02020F6 at Data Flow Task 1, Source - Query [1]: Column "religion_code" cannot convert between unicode and non-unicode string data types.
Error: 0xC02020F6 at Data Flow Task 1, Source - Query [1]: Column "ethnicity_code" cannot convert between unicode and non-unicode string data types.
Error: 0xC02020F6 at Data Flow Task 1, Source - Query [1]: Column "race_1_code" cannot convert between unicode and non-unicode string data types.
Error: 0xC004706B at Data Flow Task 1, SSIS.Pipeline: "component "Source - Query" (1)" failed validation and returned validation status "VS_ISBROKEN".
Error: 0xC004700C at Data Flow Task 1, SSIS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Data Flow Task 1: There were errors during task validation.
SSIS package "Echo Information Migration.dtsx" finished: Failure.

 


My Execute SQL Task SQL statement:

drop table dbo.rd_information3_echo

My Preparation SQL Task SQL Statement:

 

CREATE TABLE [rd_information3_echo] (
    [client_id] nvarchar(20),
    [original_table_name] varchar(300),
    [last_name] nvarchar(50),
    [first_name] nvarchar(50),
    [middle_name] nvarchar(15),
    [agency_id_no] nvarchar(20),
    [gender] nvarchar(50),
    [gender_code] varchar(20),
    [ss_number] nvarchar(9),
    [date_of_birth] datetime,
    [street_address_1] nvarchar(50),
    [street_address_2] nvarchar(50),
    [City] nvarchar(50),
    [state] nvarchar(50),
    [state_code] nvarchar(2),
    [zip_code] nvarchar(9),
    [religion] nvarchar(50),
    [religion_code] varchar(20),
    [ethnicity] nvarchar(50),
    [ethnicity_code] varchar(20),
    [race_1] nvarchar(50),
    [race_1_code] varchar(20)
)

 


My source query SQL command text:

 

SELECT DISTINCT
    --en.location_c AS Location, --This only exists to identify data errors by location and does not run when loading data.
    CONVERT(nvarchar(20), c.clientcode_c) AS client_id, --Legacy CDT# (must be the same legacy ID for all conversion tables)
    CONVERT(nvarchar(50), c.lastname_vc) AS last_name,
    CONVERT(nvarchar(50), c.firstname_vc) AS first_name,
    CONVERT(nvarchar(15), c.middlename_vc) AS middle_name,
    CONVERT(nvarchar(20), c.altclientcode_vc) AS agency_id_no, --TNKIDS Number
    CONVERT(nvarchar(50),
        CASE WHEN c.gender_c = 'M' THEN 'Male'
             WHEN c.gender_c = 'F' THEN 'Female'
             WHEN c.gender_c = 'U' THEN 'Unknown'
             ELSE 'Unknown' END) AS gender,
    CONVERT(varchar(20),
        CASE WHEN c.gender_c = 'M' THEN 'M'
             WHEN c.gender_c = 'F' THEN 'F'
             WHEN c.gender_c = 'U' THEN 'U'
             ELSE 'U' END) AS gender_code,
    CONVERT(nvarchar(9),
        (SUBSTRING(c.socialsecnum_c, 1, 3) + SUBSTRING(c.socialsecnum_c, 5, 2) + SUBSTRING(c.socialsecnum_c, 8, 4))) AS ss_number,
    c.birthdate_d AS date_of_birth,
    CONVERT(nvarchar(50), address1_vc) AS street_address_1,
    CONVERT(nvarchar(50), address2_vc) AS street_address_2,
    CONVERT(nvarchar(50), city_vc) AS City,
    CONVERT(nvarchar(50), state_c) AS state,
    CONVERT(nvarchar(2), state_c) AS state_code,
    CONVERT(nvarchar(9), zip_c) AS zip_code,
    CONVERT(nvarchar(50),
        CASE WHEN c.religion_c = 'A' THEN '7th Day Adventist'
             WHEN c.religion_c = 'AG' THEN 'Agnostic'
             WHEN c.religion_c = 'AT' THEN 'Atheist'
             WHEN c.religion_c = 'B' THEN ' Buddhist'
             WHEN c.religion_c = 'BA' THEN 'Baptist'
             WHEN c.religion_c = 'C' THEN 'Catholic'
             WHEN c.religion_c = 'E' THEN 'Episopalian'
             WHEN c.religion_c = 'EC' THEN 'Ecumencial'
             WHEN c.religion_c = 'H' THEN 'Hindu'
             WHEN c.religion_c = 'HG' THEN 'Huguenot'
             WHEN c.religion_c = 'J' THEN 'Jewish'
             WHEN c.religion_c = 'JW' THEN 'Jehovahs Witness'
             WHEN c.religion_c = 'L' THEN 'Lutheran'
             WHEN c.religion_c = 'MU' THEN 'Muslim'
             WHEN c.religion_c = 'ME' THEN 'Methodist'
             WHEN c.religion_c = 'MEN' THEN 'Mennonite'
             WHEN c.religion_c = 'MO' THEN 'Mormon'
             WHEN c.religion_c = 'M' THEN 'Moslem'
             WHEN c.religion_c = 'N' THEN 'None'
             WHEN c.religion_c = 'NO' THEN 'Nondenominational'
             WHEN c.religion_c = 'O' THEN 'Other'
             WHEN c.religion_c = 'P' THEN 'Protestant'
             WHEN c.religion_c = 'PC' THEN 'Pentecostal'
             WHEN c.religion_c = 'PS' THEN 'Presbyterian'
             WHEN c.religion_c = 'Q' THEN 'Quaker'
             WHEN c.religion_c = 'UN' THEN 'Unknown'
             WHEN c.religion_c = 'UT' THEN 'Unitarian'
             ELSE 'Unknown' END) AS religion,
    UPPER(CONVERT(varchar(20),
        CASE -- Cased out religion codes KMH 06/17/10
             WHEN c.religion_c = 'A' THEN 'A'
             WHEN c.religion_c = 'AG' THEN 'AG'
             WHEN c.religion_c = 'AT' THEN 'AT'
             WHEN c.religion_c = 'B' THEN ' B'
             WHEN c.religion_c = 'BA' THEN 'BA'
             WHEN c.religion_c = 'C' THEN 'C'
             WHEN c.religion_c = 'E' THEN 'E'
             WHEN c.religion_c = 'EC' THEN 'E'
             WHEN c.religion_c = 'H' THEN 'H'
             WHEN c.religion_c = 'HG' THEN 'HG'
             WHEN c.religion_c = 'J' THEN 'J'
             WHEN c.religion_c = 'JW' THEN 'JW'
             WHEN c.religion_c = 'L' THEN 'L'
             WHEN c.religion_c = 'MU' THEN 'MU'
             WHEN c.religion_c = 'ME' THEN 'ME'
             WHEN c.religion_c = 'MEN' THEN 'MEN'
             WHEN c.religion_c = 'MO' THEN 'MO'
             WHEN c.religion_c = 'M' THEN 'M'
             WHEN c.religion_c = 'N' THEN 'N'
             WHEN c.religion_c = 'NO' THEN 'NO'
             WHEN c.religion_c = 'O' THEN 'O'
             WHEN c.religion_c = 'P' THEN 'P'
             WHEN c.religion_c = 'PC' THEN 'PC'
             WHEN c.religion_c = 'PS' THEN 'PS'
             WHEN c.religion_c = 'Q' THEN 'Q'
             WHEN c.religion_c = 'UN' THEN 'UN'
             WHEN c.religion_c = 'UT' THEN 'UT'
             ELSE 'UN' END)) AS religion_code,
    CONVERT(nvarchar(50), --Added ethnicity case statement KMH 5/26/10
        CASE WHEN c.race_c = 'H' THEN 'Hispanic' ELSE 'Non Hispanic' END) AS ethnicity,
    CONVERT(varchar(20), --Added ethnicity case statement KMH 5/26/10
        CASE WHEN c.race_c = 'H' THEN '01' ELSE '02' END) AS ethnicity_code,
    CONVERT(nvarchar(50),
        CASE WHEN c.race_c = 'A' THEN 'Asian'
             WHEN c.race_c = 'AI' THEN 'American Indian'
             WHEN c.race_c = 'B' THEN 'African American'
             WHEN c.race_c = 'BR' THEN 'Bi-racial'
             WHEN c.race_c = 'C' THEN 'Caucasian'
             WHEN c.race_c = 'H' THEN 'Hispanic'
             WHEN c.race_c = 'ME' THEN 'Middle Eastern'
             WHEN c.race_c = 'N' THEN 'Native Hawaiian/Other Pacific Islander'
             WHEN c.race_c = 'O' THEN 'Other'
             ELSE 'Other' END) AS race_1,
    UPPER(CONVERT(varchar(20),
        CASE WHEN c.race_c IS NULL THEN 'U'
             WHEN c.race_c = 'A' THEN 'A'
             WHEN c.race_c = 'AI' THEN 'AI'
             WHEN c.race_c = 'B' THEN 'B'
             WHEN c.race_c = 'BR' THEN 'BR'
             WHEN c.race_c = 'C' THEN 'C'
             WHEN c.race_c = 'H' THEN 'H'
             WHEN c.race_c = 'ME' THEN 'ME'
             WHEN c.race_c = 'N' THEN 'N'
             WHEN c.race_c = 'O' THEN 'O'
             ELSE 'U' END)) AS race_1_code,
    'ar.client' AS original_table_name
FROM
    ar.client c
    INNER JOIN cd.enrollments en ON (c.uniqueid_c = en.clientid_c)
    INNER JOIN cd.episode ep ON (ep.uniqueid_c = en.episodeid_c AND ep.clientid_c = c.uniqueid_c)
    LEFT JOIN dbo.GC_clientaddress ad ON (ad.clientcode_c = c.clientcode_c)
WHERE
    (ep.enddate_d IS NULL OR ep.enddate_d >= GETDATE() - 729) AND
    en.location_c IN (SELECT code FROM dbo.yv_LKUP_OfficeLocation WHERE state IN ('GA', 'AL')) --change data pull with states here (check location codes on dbo.yv_LKUP_OfficeLocation)

Also, I noticed in the metadata shown between the Source and Destination on the Data Flow tab that these 4 fields come through as DT_WSTR of length 5, even though I specified length 20 in the SQL CREATE script, and I also see this same table already listed with these fields at length 20.

Ryan D

Automatically configuring an SSIS solution deployed to the SSIS catalog with environment variables - SSIS 2012


Hi,

I have an SSIS 2012 solution deployed to the SSIS catalog of the dev environment. Now I need to create the scripts to deploy the solution to the prod environment.

In particular, after creating the environment and its variables, I need the script that references the project to the environment and the script that associates each project parameter with the corresponding environment variable.

Any help for me, please? Many thanks.
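One way to script this (a minimal sketch: the environment and its variables are assumed to exist already, and the folder/project/environment/parameter names are placeholders) is to call the SSISDB catalog stored procedures catalog.create_environment_reference and catalog.set_object_parameter_value, here wrapped in C#/ADO.NET so the same code can run from a deployment utility:

using System.Data;
using System.Data.SqlClient;

static class CatalogConfig
{
    public static void BindProjectToEnvironment(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Reference the project to the environment (relative reference, same folder).
            using (var cmd = new SqlCommand("[SSISDB].[catalog].[create_environment_reference]", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@folder_name", "MyFolder");        // placeholder
                cmd.Parameters.AddWithValue("@project_name", "MyProject");      // placeholder
                cmd.Parameters.AddWithValue("@environment_name", "Prod");       // placeholder
                cmd.Parameters.AddWithValue("@reference_type", "R");            // 'R' = relative
                var refId = cmd.Parameters.Add("@reference_id", SqlDbType.BigInt);
                refId.Direction = ParameterDirection.Output;
                cmd.ExecuteNonQuery();
            }

            // 2. Map a project parameter to an environment variable of the same name.
            using (var cmd = new SqlCommand("[SSISDB].[catalog].[set_object_parameter_value]", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@object_type", (short)20);          // 20 = project-level parameter
                cmd.Parameters.AddWithValue("@folder_name", "MyFolder");         // placeholder
                cmd.Parameters.AddWithValue("@project_name", "MyProject");       // placeholder
                cmd.Parameters.AddWithValue("@parameter_name", "ConnString");    // placeholder parameter
                cmd.Parameters.AddWithValue("@parameter_value", "ConnString");   // environment variable name
                cmd.Parameters.AddWithValue("@value_type", "R");                 // 'R' = referenced value
                cmd.ExecuteNonQuery();
            }
        }
    }
}

The second call is repeated once per project parameter that should be bound to an environment variable.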

SSIS Automation


Experts,

We have an Oracle database in Domain1 (accessed as Domain1\USER1).
We want to copy some of the tables to SQL Server in Domain2 via an SSIS process.

We opened the SSIS package as:
runas /netonly /user:DOMAIN2\USER1_ACCOUNT "path\devenv.exe"
We have Windows authentication for Domain1\USER1.

We created the package, and the data transfer in the package is happening correctly.

Now we want to automate this process.

Step 1: We deployed the package, and when we right-click the package under Integration Services Catalogs in SSMS (SSMS opened with runas /netonly) and execute it, the package works fine.

Step 2: When I create an Agent job and run the same package, it fails.

We tested the same with SQL authentication (I'm in Domain1) in the SSIS package for Domain2, and it works fine.

We want the same to run with Windows authentication. How can I achieve this?

Thanks

Praveen
