Channel: SQL Server Integration Services forum

SSIS package - do a manual checkpoint with sql task?


Can I do a manual checkpoint via an Execute SQL Task from within an SSIS package, assuming the account executing the package has permissions?
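For reference, a minimal sketch of the statement an Execute SQL Task could issue, written here as a standalone SqlClient call; the connection string is a placeholder, and CHECKPOINT acts on whatever database the connection is scoped to:

// Minimal sketch: the same T-SQL an Execute SQL Task would run against its connection.
using System.Data.SqlClient;

class CheckpointSketch
{
    static void Main()
    {
        // Placeholder connection string; point it at the database you want to checkpoint.
        var connectionString = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("CHECKPOINT;", conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery(); // issues a manual checkpoint in the connection's current database
        }
    }
}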


Track failed filenames in a table SSIS


I have 30 files in a folder, and I am using a Foreach Loop container to load the data from all of them. If any one of the files has an issue, we do not want to fail the whole package; instead, the package should load all the successful files into the destination table, and we want to track the failed filenames. We already load all the successful files into the destination by using the Propagate property.

How can I know/track in a table which file had an issue and was not loaded during execution of the package?
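One pattern that fits this setup, sketched below under assumptions: the Foreach Loop maps the current file into a variable (here User::FileName), error propagation is already suppressed as described, and a Script Task in the data flow task's OnError event handler writes the file name into a log table. The "LogDb" ADO.NET connection manager and the dbo.FailedFiles table are hypothetical names.

// Script Task sketch for the data flow task's OnError event handler.
// ReadOnlyVariables: User::FileName. "LogDb" is assumed to be an ADO.NET (SqlClient)
// connection manager, and dbo.FailedFiles(FileName, ErrorTime) is a hypothetical table.
public void Main()
{
    string fileName = Dts.Variables["User::FileName"].Value.ToString();

    var conn = (System.Data.SqlClient.SqlConnection)
        Dts.Connections["LogDb"].AcquireConnection(Dts.Transaction);
    try
    {
        using (var cmd = new System.Data.SqlClient.SqlCommand(
            "INSERT INTO dbo.FailedFiles (FileName, ErrorTime) VALUES (@f, SYSDATETIME());", conn))
        {
            cmd.Parameters.AddWithValue("@f", fileName);
            cmd.ExecuteNonQuery();
        }
    }
    finally
    {
        Dts.Connections["LogDb"].ReleaseConnection(conn);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}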

Thanks In Advance 

Siddanth



Examples to create SSIS packages dynamically


Hi, I have been tasked with coming up with a C# prototype to create SSIS 2017 packages dynamically.

Very simple case - a single Data Flow Task between a SQL Server source and destination.

This has probably been done many times before already.

I'm aware of the incomplete bits in the Microsoft docs.

Is there a blog or an example of a complete solution that I can use?

Please point me in the right direction.

I'm using VS 2017 with SQL Server 2017.
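For what it's worth, here is a minimal sketch against the SSIS runtime and pipeline object models (references to Microsoft.SqlServer.ManagedDTS, Microsoft.SqlServer.DTSRuntimeWrap and Microsoft.SqlServer.DTSPipelineWrap). The "Microsoft.OLEDBSource"/"Microsoft.OLEDBDestination" monikers are the SQL Server 2016+ friendly names; the connection string, table names and output path are placeholders, and the column mapping simply assumes matching column names:

using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

class PackageBuilder
{
    static void Main()
    {
        var package = new Package { Name = "GeneratedPackage" };

        // OLE DB connection manager (placeholder connection string).
        ConnectionManager cm = package.Connections.Add("OLEDB");
        cm.Name = "SourceAndDest";
        cm.ConnectionString =
            "Data Source=.;Initial Catalog=MyDb;Provider=SQLNCLI11.1;Integrated Security=SSPI;";

        // Data Flow Task.
        Executable exec = package.Executables.Add("STOCK:PipelineTask");
        var taskHost = (TaskHost)exec;
        taskHost.Name = "DFT Copy Table";
        var pipe = (MainPipe)taskHost.InnerObject;

        // OLE DB Source.
        IDTSComponentMetaData100 src = pipe.ComponentMetaDataCollection.New();
        src.ComponentClassID = "Microsoft.OLEDBSource";
        CManagedComponentWrapper srcInst = src.Instantiate();
        srcInst.ProvideComponentProperties();
        src.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(cm);
        src.RuntimeConnectionCollection[0].ConnectionManagerID = cm.ID;
        srcInst.SetComponentProperty("AccessMode", 0);                  // table or view
        srcInst.SetComponentProperty("OpenRowset", "[dbo].[SourceTable]");
        srcInst.AcquireConnections(null);
        srcInst.ReinitializeMetaData();
        srcInst.ReleaseConnections();

        // OLE DB Destination.
        IDTSComponentMetaData100 dest = pipe.ComponentMetaDataCollection.New();
        dest.ComponentClassID = "Microsoft.OLEDBDestination";
        CManagedComponentWrapper destInst = dest.Instantiate();
        destInst.ProvideComponentProperties();
        dest.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(cm);
        dest.RuntimeConnectionCollection[0].ConnectionManagerID = cm.ID;
        destInst.SetComponentProperty("AccessMode", 3);                 // OpenRowset using FastLoad
        destInst.SetComponentProperty("OpenRowset", "[dbo].[DestTable]");

        // Connect source output to destination input, then refresh destination metadata.
        IDTSPath100 path = pipe.PathCollection.New();
        path.AttachPathAndPropagateNotifications(src.OutputCollection[0], dest.InputCollection[0]);
        destInst.AcquireConnections(null);
        destInst.ReinitializeMetaData();
        destInst.ReleaseConnections();

        // Map columns by name (assumes source and destination column names match).
        IDTSInput100 input = dest.InputCollection[0];
        IDTSVirtualInput100 vInput = input.GetVirtualInput();
        foreach (IDTSVirtualInputColumn100 vCol in vInput.VirtualInputColumnCollection)
        {
            IDTSInputColumn100 inCol =
                destInst.SetUsageType(input.ID, vInput, vCol.LineageID, DTSUsageType.UT_READONLY);
            IDTSExternalMetadataColumn100 extCol = input.ExternalMetadataColumnCollection[inCol.Name];
            destInst.MapInputColumn(input.ID, inCol.ID, extCol.ID);
        }

        new Application().SaveToXml(@"C:\Temp\GeneratedPackage.dtsx", package, null);
    }
}

Error handling and mapping for mismatched column names are left out to keep the sketch short.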

thank you!

Setting a relative path to child packages

Greetings,

My SSIS solution has a Master Package which calls multiple child packages. In my development environment I had pointed the package connection managers directly at the absolute path of each dtsx file. This package is for a commercial application that will be run in many different environments (different paths).

I have an XML config file (its path set in an environment variable) for settings such as my database connections, mail server/email addresses, and other variables.

I have over 20 child packages in my solution - I am hoping I do not have to include 20 settings in my XML file just to point to each child package! I would also like to deploy via the file system rather than SQL Server. I have tried using .\MyPackage.dtsx in my connection manager, but I receive an error:

"Error: Error 0x80070002 while loading package file ".\MasterFiles1.dtsx". The system cannot find the file specified. . "

Is there a solution to this?
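A hedged workaround sketch: keep a single folder setting in the existing XML configuration (say a hypothetical User::PackageFolder variable) and build each child package path from it, either with a property expression on each connection manager's ConnectionString (for example @[User::PackageFolder] + "\\MasterFiles1.dtsx") or with a Script Task that fills path variables before the Execute Package Tasks run. All variable names below are assumptions.

// Script Task sketch: derive full child-package paths from one configured folder.
// User::PackageFolder comes from the XML configuration; the path variables
// (listed under ReadWriteVariables) feed ConnectionString expressions on the
// child-package connection managers.
public void Main()
{
    string folder = Dts.Variables["User::PackageFolder"].Value.ToString();

    Dts.Variables["User::MasterFiles1Path"].Value =
        System.IO.Path.Combine(folder, "MasterFiles1.dtsx");
    // ...repeat (or loop over a list of package names) for the remaining child packages.

    Dts.TaskResult = (int)ScriptResults.Success;
}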

Thanks in advance for any assistance.

Regards, Clay

SQL Server 2016 splits rows in flat file destination


An SSIS job running with an OLE DB connection to a SQL Server 2008 produces a flat file destination output like this:

905574|012470,097566,298605,334988,426155,481002,...
910219|029929,149841,326495,461291
933447|067516
943449|053096,092330,249222,300504,329626,386116,...
944688|017507,071064,289347,319367,356738,474931,...
954567|033549,083460,294846,313324,446125,476221,...
961179|001630,043896,184759,326832,411758,445063,...
971664|048997,098448,
974104|085329,136985,204911,300600,356153
997138|032756,166485,214313,412516,481333,495417,...
...

The same SSIS job running with an OLE DB connection to a SQL Server 2016 produces a flat file destination output like this:

905574|012470,097566,298605,334988,426155,481002,...
910219|029929,149841,326495,461291
933447|067516
943449|053096,092330,249222,300504
943449|329626,386116,...
944688|017507,071064,289347,319367,356738,474931,...
954567|033549,083460,294846,313324,446125,476221,...
961179|001630,043896,184759,326832,411758,445063,...
971664|048997,098448,
974104|085329,136985,204911,300600,356153
997138|032756,166485,214313
997138|412516,481333,495417,...
...

The "..." above just means there is more data in the real files. The longest rows can have 3,000 or more six digit entries after the |. The shortest rows can have one six digit entry after the |.

Running with a connection to SQL Server 2016 produces output where, every few lines, a row is split into multiple rows. For example, the rows beginning with 943449 and 997138 in the SQL Server 2008 example have everything after the | split across two rows in the SQL Server 2016 example.

The source data is exactly the same on the two databases. This is part of a conversion project from SQL Server 2008 to SQL Server 2016.

The output columns for the destination file have the same settings. What else can I check?

Thanks.


Redirecting error rows from a file and storing the error for every column in a table


Hi Everyone,

I have an employee file with the columns id, name, age, and dob; the header row is not included in the file. Below is the file.

The file contains wrong data: as you can see, column age has 'd' in it and column dob has 'a'.

I want the good row to go into my main table, the "EMP" table, and the bad row's errors to go column-wise into my "Error_Log" table.

I have tried using error row redirection in SSIS, but in that case I can capture only the first failing column, not the second. Any suggestions or links? (A validation sketch follows the examples below.)

1. EMP file:

1 | Alex | 38 | 3/30/1983
2 | henry | d | a

2. Good record to be inserted into EMP:

ID | Name | Age | DOB
1  | Alex | 38  | 3/30/1983
3. Bad record to be inserted column-wise into the Error_Log table, like below:

ID | Error_Column | Error_Desc
2  | Age          | Conversion failed
2  | DOB          | Conversion failed
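As a sketch of one approach (an assumption-laden outline, not a finished solution): read every file column as a string, and in a Script Component validate each column per row - good rows continue to EMP, and for every failing column one row is added to an asynchronous error output bound to Error_Log. The helper below shows only the per-column validation logic; the buffer wiring depends on the output names you choose.

// Per-column validation helper for use inside a Script Component.
// For each entry returned, the component would call AddRow() on an
// asynchronous error output mapped to Error_Log(ID, Error_Column, Error_Desc).
using System;
using System.Collections.Generic;

class ColumnError
{
    public string Id;            // business key of the failing row
    public string Column;
    public string Description;
}

static class RowValidator
{
    // Returns one entry per failing column; an empty list means the row is good.
    public static List<ColumnError> Validate(string id, string name, string age, string dob)
    {
        var errors = new List<ColumnError>();

        int ageValue;
        if (!int.TryParse(age, out ageValue))
            errors.Add(new ColumnError { Id = id, Column = "Age", Description = "Conversion failed" });

        DateTime dobValue;
        if (!DateTime.TryParse(dob, out dobValue))
            errors.Add(new ColumnError { Id = id, Column = "DOB", Description = "Conversion failed" });

        return errors;
    }
}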

 

How to avoid package validation at the package level?

Hi,
I have an SSIS package with more than 20 tasks in it, including 5 DFTs, Foreach Loop containers, Execute SQL Tasks, and so on.
When I open the package, it takes a very long time to reach a ready state because all of the tasks are validated.

Is there any option at the package level that I can enable so that no validation happens and the package opens quickly?
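Two knobs that may help, offered as a hedged suggestion rather than a definitive answer: in SSDT the SSIS > Work Offline menu option skips design-time validation while you edit, and the DelayValidation property defers validation until a task actually executes. The sketch below flips DelayValidation on every top-level executable through the object model; the package path is a placeholder.

// Sketch: set DelayValidation = true on all top-level executables of a package.
using Microsoft.SqlServer.Dts.Runtime;

class DelayValidationSetter
{
    static void Main()
    {
        var app = new Application();
        Package package = app.LoadPackage(@"C:\Packages\BigPackage.dtsx", null); // placeholder path

        // DelayValidation lives on DtsContainer, the base class of TaskHost
        // and of the Sequence/For Loop/Foreach Loop containers.
        foreach (Executable executable in package.Executables)
        {
            var container = executable as DtsContainer;
            if (container != null)
                container.DelayValidation = true;
        }

        app.SaveToXml(@"C:\Packages\BigPackage.dtsx", package, null);
    }
}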

How to discover all the objects used in an SSIS project?


hi there,

I am just wondering about an automatic way to discover all the staging tables and other objects (UDFs, stored procedures) that an Integration Services project is consuming...

I am including even objects that are not shared, i.e. created only within the SSIS project itself.

In this concrete case I am talking about 5 .DTSX packages using a large number of tables and objects in a single database, and we don't have documentation.

As essentially every single .DTSX is an XML file, let me know your thoughts.
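Since a .dtsx is XML, one rough starting point (a sketch, not a lineage tool) is to scan each package for the properties where object names usually live: SqlStatementSource on Execute SQL Tasks, and OpenRowset / SqlCommand on OLE DB sources and destinations. The element and attribute names below reflect my reading of the SSIS 2012+ package format, so verify them against your own files; the folder path is a placeholder.

// Rough scan of .dtsx files for SQL statements and table references.
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class DtsxScanner
{
    static void Main()
    {
        foreach (string file in Directory.GetFiles(@"C:\MySsisProject", "*.dtsx")) // placeholder folder
        {
            XDocument doc = XDocument.Load(file);
            Console.WriteLine("== " + Path.GetFileName(file));

            // Execute SQL Task statements.
            var sqlStatements = doc.Descendants()
                .Where(e => e.Name.LocalName == "SqlTaskData")
                .Select(e => e.Attributes().FirstOrDefault(a => a.Name.LocalName == "SqlStatementSource"))
                .Where(a => a != null)
                .Select(a => a.Value);

            // Pipeline component properties that usually carry object names or queries.
            var interesting = new[] { "OpenRowset", "SqlCommand", "TableOrViewName" };
            var componentRefs = doc.Descendants()
                .Where(e => e.Name.LocalName == "property"
                            && interesting.Contains((string)e.Attribute("name"))
                            && !string.IsNullOrWhiteSpace(e.Value))
                .Select(e => (string)e.Attribute("name") + ": " + e.Value.Trim());

            foreach (string entry in sqlStatements.Concat(componentRefs))
                Console.WriteLine("  " + entry);
        }
    }
}

The SQL statements themselves would still need parsing (or a cross-check against sys.sql_expression_dependencies) to get at the tables and procedures they reference.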

How to generate an SSIS package with VS to load data from DB2 and output a CSV file?


I have a new assignment to build an SSIS package in Visual Studio that gets data from DB2 and produces a CSV file, which I am not familiar with. Currently, in SQL Server, we have a linked server set up to DB2 and can query the DB2 data.

Can someone give me some ideas or a sample of how to start this SSIS package?
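The usual shape is a Data Flow Task with an ADO.NET or OLE DB source (using a DB2 provider, or a query through your existing linked server) and a Flat File Destination for the CSV. Before building the package, it can help to prove the query and the CSV layout outside SSIS with a small console sketch like the one below; the provider name, connection string, query, and output path are all placeholders.

// Probe sketch: run the DB2 query and dump it to CSV outside SSIS first.
using System;
using System.Data.OleDb;
using System.IO;

class Db2ToCsvProbe
{
    static void Main()
    {
        // Placeholder OLE DB connection string for your DB2 provider; alternatively,
        // connect to SQL Server and query DB2 through the existing linked server.
        string connStr = "Provider=IBMDADB2;Data Source=MYDB;User Id=user;Password=pass;";
        string query = "SELECT COL1, COL2 FROM SCHEMA1.TABLE1"; // placeholder query

        using (var conn = new OleDbConnection(connStr))
        using (var cmd = new OleDbCommand(query, conn))
        using (var writer = new StreamWriter(@"C:\Temp\output.csv"))
        {
            conn.Open();
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var fields = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = Convert.ToString(reader[i]);
                    writer.WriteLine(string.Join(",", fields)); // no quoting/escaping; sketch only
                }
            }
        }
    }
}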

Nested "For Loop Container" causes the outer loop to end


I have an abstracted project consisting of a "For Loop Container" within another "For Loop Container". The definition of the outer loop is:

and the inner loop is:

The default value for @QuitLoop is False.

Within the inner loop I have a Script Task which reports the values of the @OuterLoop and @InnerLoop variables, and I get 100 messages:

However, if I add a step in the script task which sets the @QuitLoop variable to True at a certain point:

if (iOuterLoop == 2 && iInnerLoop == 4)
{
    Variables vars = null;
    Dts.VariableDispenser.LockForWrite("User::QuitLoop");
    Dts.VariableDispenser.GetVariables(ref vars);
    vars["User::QuitLoop"].Value = true;
    vars.Unlock();
}

both loops stop running at this point:

However, the @QuitLoop variable should only affect the inner loop. Why would the inner loop cause the outer loop to stop early?

How to configure TFS to automatically build an ISPAC file on check-in


Hello Everybody,

We are checking our SSIS 2016 packages into TFS.  We always make sure we can successfully build the project, too, which creates the *.ispac file for us locally.  When it comes time for deployment, we manually hand that ISPAC over to the Operations group.  The Operations person then logs onto the production SSIS server and double-clicks the ISPAC file to bring up the SSIS deployment wizard.

Sometimes this process doesn't work for us due to human error.  Typically, that's a developer forgetting to check in the project they built into an ISPAC.  That ISPAC gets deployed, but when the next developer makes changes to the same project, we reintroduce a bug that was fixed in the previous deployment.

Is there a way to force TFS to create the ISPAC for us?  In this way, we could ensure that what gets checked in is what gets deployed (and likewise what doesn't get checked in doesn't get deployed).

Thanks,
Eric B.

[BUG] ADONETDestination::PreExecute generates wrong parameter names


I tried to use the "ADO NET Destination" [VS2017 + SSDT] with my ADO.NET provider and found a problem.

// C:\WINDOWS\Microsoft.NET\assembly\GAC_MSIL\Microsoft.SqlServer.ADONETDest\v4.0_14.0.0.0__89845dcd8080cc91\Microsoft.SqlServer.ADONETDest.dll
// Microsoft.SqlServer.ADONETDest, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91
// Global type: <Module>
// Architecture: AnyCPU (64-bit preferred)
// Runtime: v4.0.30319

See this code of ADONETDestination::PreExecute:

stringBuilder.Append(") VALUES (");
string parmameterMarkerFormat = getParmameterMarkerFormat();
PostDiagnosticMessage(Microsoft.SqlServer.Dts.Pipeline.Localized.DiagnosticPre("DbProviderFactory.CreateParameter"));
string empty = string.Empty;
for (int k = 0; k <= num; k++)
{
    empty = string.Format(CultureInfo.InvariantCulture, "{0}{1:d}", new object[2]
    {
        "p",
        k + 1
    });
    empty = string.Format(CultureInfo.InvariantCulture, parmameterMarkerFormat, new object[1]
    {
        empty
    });
    DbParameter dbParameter = m_DbFactory.CreateParameter();
    dbParameter.ParameterName = empty;

Source code of getParmameterMarkerFormat:

private string getParmameterMarkerFormat()
{
    if (m_DbConnection.GetType().Equals(typeof(SqlConnection)))
    {
        return "@{0}";
    }
    DataTable schema = m_DbConnection.GetSchema(DbMetaDataCollectionNames.DataSourceInformation);
    return (string)schema.Rows[0]["ParameterMarkerFormat"];
}

Short description of the problem

You use the parameter name to generate the parameter marker name, and then replace the parameter name with this parameter marker name.

---

See the documentation for the "ParameterMarkerFormat" column:

https://docs.microsoft.com/en-us/dotnet/framework/data/adonet/common-schema-collections#datasourceinformation

ParameterMarkerFormat string

A format string that represents how to format a parameter.

If named parameters are supported by the data source, the first placeholder in this string should be where the parameter name should be formatted.

(1) For example, if the data source expects parameters to be named and prefixed with an ‘:’ this would be ":{0}". When formatting this with a parameter name of "p1" the resulting string is ":p1".

(2) If the data source expects parameters to be prefixed with the ‘@’, but the names already include them, this would be ‘{0}’, and the result of formatting a parameter named "@p1" would simply be "@p1".

First (1) case

When my provider is configured for "parameter name does NOT include the prefix", it returns ":{0}".

PreExecute generates the SQL "INSERT INTO ... VALUES(:p1)" and creates a parameter with the name ":p1".

As a result, my provider generates the error "parameter with unknown name [:p1]".

Second (2) case

When my provider is configured for "parameter name already has (includes) the prefix", it returns "{0}".

PreExecute generates the SQL "INSERT INTO ... VALUES(p1)" and creates a parameter with the name "p1".

As a result, the DBMS generates an error at the prepare stage - "unknown column [p1]".
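For contrast, a sketch of the split I would expect from the ParameterMarkerFormat documentation (my reading, not the actual ADONETDest source): the formatted marker goes only into the generated INSERT text, while DbParameter.ParameterName keeps the plain name.

// Sketch of the expected behavior; names and structure are illustrative only.
using System.Data.Common;
using System.Globalization;

static class ParameterNaming
{
    public static DbParameter BuildParameter(DbProviderFactory factory, string markerFormat,
                                             int ordinal, out string markerForSqlText)
    {
        string baseName = "p" + (ordinal + 1).ToString(CultureInfo.InvariantCulture);

        // The marker (e.g. ":p1" or "@p1") belongs in the generated "INSERT ... VALUES (...)" text.
        markerForSqlText = string.Format(CultureInfo.InvariantCulture, markerFormat, baseName);

        // The parameter object keeps the plain name, instead of reusing the formatted
        // marker as ADONETDestination::PreExecute currently does.
        DbParameter parameter = factory.CreateParameter();
        parameter.ParameterName = baseName;
        return parameter;
    }
}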

----

Dmitry Kovalenko

www.ibprovider.com

https://www.nuget.org/packages/lcpi.data.oledb/




SSIS ODBC Source gets char strings truncated in the middle when converting to Unicode DT_WSTR



In a simplified test, the SSIS package reads from an ODBC Source of MySQL and writes to a Flat File Destination (a text file).

The source contains columns of char(xx) and varchar(xx), and these columns get truncated at half their length by the time they reach the destination. I noticed in the Advanced Editor that these columns are DT_WSTR in both the External Columns and Output Columns sections, and the length equals the source's.

As a workaround, I changed the ODBC Source to an ADO NET Source, and the problem disappears.

I need some pointers to find out what was wrong when using the ODBC Source.

It is on Windows with the driver mysql-connector-odbc-8.0.12-winx64.msi; please let me know if you need more details. Thank you for your help.

Any general guidelines to allocate table space quota to different layers in ETL?


Hello Experts,

I am looking for general guidelines for allocating table space quota to the different layers/schemas in the ETL flow of a data warehouse (% of total space for each layer).

 

I have these 4 layers:

  1. Staging -truncate and load data from source files
  2. ODS- Type 1 persistent tables
  3. Transformation layer- similar to final DWH layer but truncate before loading newly arrived data
  4. DWH layer- Final dimensional model layer

 

I understand the space requirement may vary based on project requirements; however, any general guideline (if there is such a thing in the data warehousing and ETL space) for estimating the space would be helpful.

 

Thanks,

Rajneesh


How to reduce Lookup Transformation performance cost?


Hi,

I have to insert thousands of records into the table each day. I use a Lookup Transformation with "Redirect rows to no match output" to insert the data into the table. The table itself contains (at least) hundreds of thousands of rows. I am using "Use results of an SQL query" and Full cache mode in the lookup. My question is: is there any way to reduce the performance cost besides using a customized query? I am new to SSIS and haven't explored it much, so I am afraid I am missing some configuration.

Thank you in advance

Does the SQL Server Agent account need full control on a file share to modify an Excel file?


I have an SSIS package that formats data using C# code in an SSIS Script Task. I can run it using the SQL Server Agent account, which has full control permissions on the file share folder, in DEV; but in QA, when I try to run it, I get an invocation error at the C# Script Task.

The SQL Server Agent account has Modify permissions set on the file share folder in QA.

The flow of the SSIS package is as follows:

Prep the data and dump it into a SQL table, then use that table to write the data to Excel over an OLE DB connection. Once the data write is done, format it using the C# Script Task.

SSIS 2014 and SQL server 2014.

Excel format- .xls





Custom Log Message for Execute SQL task


I have set up an SSIS package that executes multiple Execute SQL Tasks. I am aware of the logging options that SSIS provides.

But what I want to do is store the output of each Execute SQL Task in a log table - something like: "Query has executed successfully with n rows."

How do I achieve this?
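One common pattern, sketched here under assumptions: have each Execute SQL Task return its row count (for example by appending SELECT @@ROWCOUNT and capturing it into a User::RowCount variable via the Single row result set), then log a message from a small Script Task on the success constraint. The variable names are assumptions; with the "SSIS log provider for SQL Server" attached and OnInformation events selected, the message lands in the SSIS log table, or the Script Task could INSERT into your own table instead.

// Script Task sketch; ReadOnlyVariables: User::StepName, User::RowCount (hypothetical names).
public void Main()
{
    string step = Dts.Variables["User::StepName"].Value.ToString();
    object rows = Dts.Variables["User::RowCount"].Value;

    bool fireAgain = true;
    // Picked up by whichever log providers are configured for OnInformation events.
    Dts.Events.FireInformation(0, "QueryLogger",
        string.Format("{0} has executed successfully with {1} rows.", step, rows),
        string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}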

Error when running this dtsx package under a SQL Server Agent job, but it executes fine when run from the Integration Services catalog


It works fine when executed from the SQL Server Integration Services catalog.

The dtsx uses an Execute SQL Task to run a stored procedure on a remote SQL Server (database.stored procedure). I can connect to the remote SQL Server database with SQL Server Management Studio. That stored procedure calls another remote Oracle database from the remote SQL Server.

When I make it a single step in a SQL Server Agent job and run the job, I get the following error message in the standard report (All Executions):

error: executing the query "stored procedure" failed with the following error: "Cannot create an instance of OLE DB provider "OraOLEDB.Oracle" for linked server "remote sql server"". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

Any help? Thanks. The stored procedure is on remote SQL Server A, and remote SQL Server A has the Oracle provider set up correctly. Do I still need to install the Oracle provider on the server where the Integration Services catalog is?

How to roll back Data Flow plain file creation?


Hi there,

My primary platform is Integration Services 15.0.2000.94.

I've got a big Sequence Container where I have 8 data flows intended to create 8 plain files on a daily basis.

Every night my package creates - overwriting - all 8 existing files.


My goal is apparently easy: if any of them fails (in any order - the second, the fourth...) because the staging data is wrong or for any other reason, my SSIS package must be able to roll back the creation of the rest of the files and (the key point) leave the original files from the last execution in place.

*The TransactionOption = Supported (sequence container) and TransactionOption = Required (package) approach works, but not when you flush something to disk - in this case, plain files.



Yes, I have the option of defining a File System Task to remove the recently created files and copy yesterday's back from another folder, but that looks tedious.


In summary, I need to always keep a good set of files in that folder.
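One way to get that guarantee without transactions, sketched with placeholder names: point each Flat File Destination at a temporary name (File1.txt.tmp, ...), and only after all eight data flows have succeeded run a Script Task that swaps the temporary files over the previous ones. On any failure, a task on the failure path just deletes the *.tmp files, so the last good files are never touched.

// Script Task sketch: promote *.tmp outputs only when the whole sequence succeeded.
// Folder and file names are placeholders.
public void Main()
{
    string folder = @"\\server\exports";                        // placeholder
    string[] names = { "File1.txt", "File2.txt", "File3.txt" }; // ...up to the 8 real names

    foreach (string name in names)
    {
        string tmp = System.IO.Path.Combine(folder, name + ".tmp");
        string target = System.IO.Path.Combine(folder, name);

        if (System.IO.File.Exists(target))
            System.IO.File.Delete(target);  // last night's file is replaced only at this point
        System.IO.File.Move(tmp, target);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}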


Any input would be greatly appreciated,


Enric



