Channel: SQL Server Integration Services forum

SSIS ERROR


[SSIS.Pipeline] Error: Lookup  failed the pre-execute phase and returned error code 0xC020824E.

[Lookup 4 [152]] Error: OLE DB error occurred while populating internal cache. Check SQLCommand and SqlCommandParam properties.

Error: A buffer failed while allocating 2728 bytes.

Error: The system reports 93 percent memory load. There are 8379478016 bytes of physical memory with 522973184 bytes free. There are 4294836224 bytes of virtual memory with 23629824 bytes free. The paging file has 16757055488 bytes with 5860470784 bytes free.

I tried searching for similar errors and tried almost all of the suggested options, but none of them solved my problem. I'm loading a fact table with 780 million records, and the package has about 8 Lookup components to get the surrogate keys; the dimension tables used by the lookups are also huge, with around 40 to 50 million records in each. The server is 64-bit and has 32 GB of RAM.

Can someone suggest how to go about this? I think lookups over such huge data are what is exhausting the cache memory. Is there a better way to get surrogate keys without a Lookup?

Any help is appreciated 

Thanks


sree


Flat file (.csv file) to insert in table - using SSIS


Hi - I have an SSIS package with other steps I'm working on, and in the middle I have a requirement where a flat file sits in a folder; the package should read the file from that folder and insert its contents into a SQL table.

What do I need to do here? How can I achieve this - taking the file and inserting it into a SQL table?
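The usual approach is a Data Flow Task with a Flat File Source pointing at the file and an OLE DB Destination pointing at the table - no code required. If the load has to happen inside a Script Task instead, here is a minimal sketch using SqlBulkCopy; the path, delimiter, column list, connection string, and table name are all assumptions, not details from the post.

    // Minimal Script Task sketch (hypothetical names throughout).
    // Requires at the top of ScriptMain: using System.Data; using System.Data.SqlClient; using System.IO;

    public void Main()
    {
        string csvPath = @"C:\Drop\incoming.csv";                                             // assumed path
        string connStr = "Data Source=.;Initial Catalog=StagingDb;Integrated Security=SSPI;"; // assumed connection

        // Shape the DataTable to match the target table (two string columns assumed).
        var table = new DataTable();
        table.Columns.Add("Col1", typeof(string));
        table.Columns.Add("Col2", typeof(string));

        // Load the comma-delimited file (no header row assumed).
        foreach (string line in File.ReadLines(csvPath))
        {
            string[] fields = line.Split(',');
            table.Rows.Add(fields[0], fields[1]);
        }

        // Bulk insert into the target table.
        using (var bulk = new SqlBulkCopy(connStr))
        {
            bulk.DestinationTableName = "dbo.MyStagingTable";                                 // assumed table
            bulk.WriteToServer(table);
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }

For anything beyond a simple comma-delimited file, the Flat File Source handles delimiters, text qualifiers, code pages, and data types far more robustly than hand-rolled splitting.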


MBH

SSIS 2012 was running perfectly in design mode(BIDS). But after deploying , Script Task is not working


Hi All,

I have an SSIS package created using VS 2012. It has a Script Task (C# code) to validate some file loads. After deploying, the package errors out; but after removing/disabling the Script Task, it works perfectly.

The error is: "The Script Task is corrupted., Source:<Script task name>, SubComponent:, IDOfInterfaceWithError:{8266A638-4A4B-4A82-8648-F54B5B879197}, HelpContext:0"

The SSIS package is scheduled by a Windows service application and executed programmatically.
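For reference, a minimal sketch of what programmatic execution through the SSIS object model typically looks like (the actual hosting code in the service will differ, and the package path is an assumption). The "Script Task is corrupted" error is frequently reported when the hosting process loads the package with a different version of the SSIS runtime than the one the package targets, so it is worth checking which version of Microsoft.SqlServer.ManagedDTS the Windows service references.

    // Minimal sketch, not the poster's actual service code. Assumes a file-system package path.
    using Microsoft.SqlServer.Dts.Runtime;   // should be the SSIS 2012 (11.0) assembly for a 2012 package

    class PackageRunner
    {
        static void Main()
        {
            var app = new Application();
            Package pkg = app.LoadPackage(@"C:\Packages\ValidateFiles.dtsx", null);   // assumed path

            DTSExecResult result = pkg.Execute();
            if (result != DTSExecResult.Success)
            {
                foreach (DtsError err in pkg.Errors)
                {
                    System.Console.WriteLine("{0}: {1}", err.ErrorCode, err.Description);
                }
            }
        }
    }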

Thanks

Sujo

SSIS CDC Control Task in SSDT for VS2015


I just installed the SSDT August Preview for Visual Studio 2015 and noticed that there is no CDC Control Task.

Will there be one? We're thinking of investing in a solution based on CDC and are looking at creating an SSIS control flow using that component in VS2013; it would be good to know whether it will still be there in 2015.

Thanks!


Ajden Towfeek

OLE DB Destination vs SQL Server destination - SSIS 2012


Hi,

To write a large data volume into a SQL Server table, could a SQL Server Destination be a better choice than an OLE DB Destination?

Moreover, which of the two destinations is less prone to a "login failed" issue for a SQL Server connection that uses a SQL Server login?

Many thanks

Login failed error for a SSIS 2012 solution deployed in the SSIS catalog with environment variables


Hi,

I'm changing an existing SSIS 2012 solution that contains several packages.

First of all, the protection level of the solution is EncryptSensitiveWithPassword. The connection managers to the SQL Server databases must use a SQL Server login, not Windows authentication.

The solution worked until a few days ago, before I introduced a set of project parameters and a set of environment variables in the SSIS catalog in order to configure the deployed packages separately from the project inside SSDT.

In particular, I've created project parameters to hold the passwords for the SQL Server logins mapped to the SQL Server databases being accessed. These password parameters are required and sensitive, and the related environment variables in the SSIS catalog are sensitive.

When I run the deployed SSIS solution from a SQL Server Agent job, I get an error saying "Login failed for user ...". The odd thing is that each package has several data flow tasks loading data into SQL Server tables, and a package doesn't fail on all of them: some tasks execute successfully while others fail with the login error.

I've tried running a single package directly from the SSIS catalog, and it executed successfully.

However, the deployed SSIS solution worked before I added the environment variables associated with the project parameters.

I'm working with SQL Server 2012 SP2 (build 11.0.5058).

Any suggestions, please? Could it be a SQL Server bug, or something else?

Many thanks


How to remove/use line break in SSIS


I have a text file created through a Script Task.
When I display the text file content in an email body, it all appears on one line. If I use \n\r to replace the commas in the text file, then I get two line breaks. I need the email body to look just like the text file.

Sample Text file
PRODUCT_TOTALS.XLSX - 386,
PROMATION_TOTAL.CSV - 256,
WARNINGS_INFO.CSV - 1250

When I render the text file into the email body without any expression:
PRODUCT_TOTALS.XLSX - 386,PROMATION_TOTAL.CSV - 256,WARNINGS_INFO.CSV - 1250

When I use REPLACE(@MailBody, ",", "\n\r") - the text file contents are assigned to @MailBody through a Script Task - I get:

PRODUCT_TOTALS.XLSX - 386

PROMATION_TOTAL.CSV - 256

WARNINGS_INFO.CSV - 1250

Desired output in the email body

PRODUCT_TOTALS.XLSX - 386
PROMATION_TOTAL.CSV - 256
WARNINGS_INFO.CSV - 1250
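Two notes. In an SSIS expression the Windows line break is "\r\n" (carriage return then line feed), so the usual form is REPLACE(@[User::MailBody], ",", "\r\n"); and if the source file already contains a line break after each comma, replacing the comma with another break is exactly what produces the doubled blank lines. Since the mail body is already being built in a Script Task, here is a minimal sketch that normalizes the line endings there instead (User::SummaryFile and User::MailBody are assumed variable names):

    // Requires at the top of ScriptMain: using System; using System.IO; using System.Linq;

    string path = Dts.Variables["User::SummaryFile"].Value.ToString();
    string raw = File.ReadAllText(path);

    // Split on commas and any existing line breaks, trim each entry,
    // then rejoin with a single CRLF so the mail body shows one entry per line.
    string[] entries = raw
        .Split(new[] { ',', '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries)
        .Select(e => e.Trim())
        .Where(e => e.Length > 0)
        .ToArray();

    Dts.Variables["User::MailBody"].Value = string.Join("\r\n", entries);

If the email is ultimately sent as HTML (for example from a Script Task using SmtpClient with IsBodyHtml = true), join the entries with "<br/>" instead, since CRLF does not render as a line break in HTML.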

Any help is highly appreciated.

Thanks

Using checkpoints in SQL Server Integration Services (SSIS) Packages


Hi,

I need expert help to resolve a checkpoint issue in an SSIS package. Let me first explain what I am trying to do.

I have a package with the following tasks:

1). A ForEach Loop container for reading multiple files

2). Inside the container, an Execute SQL Task for truncating the staging table

3). Next, a Data Flow Task with a Flat File Source and an OLE DB Destination

4). Next, an Execute SQL Task to execute a stored procedure

5). Last, a File System Task to move the file to a success folder

I have configured checkpoints as follows:

Package Properties:

CheckPointFileName: D:\CheckPointFile.xml

CheckPointUsage: IfExists

SaveCheckPoints: True

ForEach Loop container properties: FailPackageOnFailure = TRUE

Task 4 (Execute SQL Task) properties: FailPackageOnFailure = TRUE. (Whenever I set FailParentOnFailure = TRUE instead, the package deletes CheckPointFile.xml after the error.)

To verify the checkpoint implementation, I renamed the stored procedure in the database so that the package would fail on task 4. After running the package, task 4 failed and CheckPointFile.xml was created.

Issue: whenever I restart the package, it executes steps 1, 2 and 3 again, which I believe is wrong. The package should start from task 4, because task 4 is the point of failure.

Please provide me your inputs .

 


Data Flow Task hangs while loading data from an Oracle source to a SQL Server destination


Hi,

We designed a staging SSIS solution in which 50 child packages are run through a master package.

The problem is that in some child packages, while loading data from an Oracle source table to a SQL Server destination table in a Data Flow Task, the remaining data is not inserted. For example, suppose the source Oracle table has 3 lakh (300,000) rows: after around 1 lakh (100,000) rows have been inserted into the SQL Server destination table, the row count shown on the data pipeline from source to destination stops increasing (meaning data is no longer being inserted), the task stays in executing mode, and no error is shown in the progress. We did not have this problem earlier; it now occurs randomly (it may appear in any package and we can't predict which).

Note 1: after the Data Flow Task hangs, if we stop execution manually and start the master package again, data is inserted fine in the respective child package for some time (the hanging package starts again because of our package design). Most of the time I observe that data is initially inserted fine and only stops after some time of execution (the problem appears in the middle of execution, not at the start).

Note 2: Task Manager shows that RAM is not full (physical memory at 56% and CPU usage at 17%). We have also configured some properties as suggested online (DefaultBufferMaxRows = 40,000 and DefaultBufferSize = 8 GB; the server has 16 GB of RAM), set Rows per batch to 10,000 in some packages loading large volumes of data, and unchecked Table lock in the OLE DB Destination in all packages.

What is the solution for this?


tsrkreddy


OData \ SharePoint List Collection Source


I want to use the collection of SharePoint list fields as a source in my data flow.

You can output the fields collection through OData like this:

https://portal/sites/web/_api/Web/Lists(guid'70592db3-8842-4b84-a5ba-b12a0fb5245a')/fields

I can see it perfectly in IE or import it into Excel as a table, but the OData Source in SSIS cannot identify it as a collection. Why? How can I fix this?

Furthermore, I found that an items query also does not work with the OData Source in SSIS:

https://portal/sites/web/_api/Web/Lists(guid'70592db3-8842-4b84-a5ba-b12a0fb5245a')/items?$select=Id,Title

Only _vti_bin/listdata.svc/list without keys (like ?$select=field) can be used as source...


Issue with Microsoft OLE DB Provider for Oracle


One of my SSIS packages connects to an Oracle database using a connection based on the Microsoft OLE DB Provider for Oracle. This was working fine until a recent round of Microsoft patching (KB articles: kb3135445, kb3121255, kb3134214, kb3126587, kb3126593, kb3133043, kb3122648, kb3127220).

Could this be caused by any of the above patches, or is there another reason?

Now the connection does not work, and the error is:

Error: 2016-02-21 07:13:25.19

   Code: 0xC0202009

   Source: pkg_XXXXXXXXXX Connection manager "XXXXXX"

   Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.

An OLE DB record is available.  Source: "Microsoft OLE DB Provider for Oracle"  Hresult: 0x80004005  Description: "Oracle client and networking components were not found. These components are supplied by Oracle Corporation and are part of the Oracle Version 7.3.3 or later client software installation. Provider is unable to function until these components are installed.".

End Error


How to join - without unique rows? I'm showing an example of the problem with tables.


Hi,

I have a problem: I have 3 flat files from two different customers, and there are a lot of duplicates in the files.

I can't change the files, but I can add columns.

I'm using ID, Start_date and End_date as my key, but as you can see, some rows are not unique.

The tables below are what I'm looking for.

The Invoice is what company X gives to company Y.

Company Y then sends a "Handback" back to company X.

Then company Y sends "Billing" to company X, to confirm the billing rows.

(Example tables: Invoice, Handback, Billing, Join - contents not shown.)

When I do this today, I get a billing_date in all rows of the join table. I want to see NULL on the rows that are not Accept.

How can I solve this problem?

Best regards

BadenSql

SSIS 2012 - Error decimal format dot (.) - Comma (,)


Hi,

I have two Windows Server environments with the following configurations:

- Staging Environment: SQL Server 2012/ SQL Server Data Tools 2010

- Development Environment: SQL Server 2012/ SQL Server Data Tools 2013

Both machines have the same regional settings (Control Panel), with comma as the decimal sign.

Both environments have the same language settings.

I have an SSIS package which inserts data into a SQL Server table, but it doesn't behave the same way on the two servers: in the staging environment it runs fine, while in the development environment I have a problem with the decimal format. The package's data viewer shows the following in the two environments:

In Staging, the decimal sign is a dot (.).

In Development, the decimal sign is a comma (,).

To be sure that the problem is at the SSIS level and not in the SQL Server table, I ran the package:

1- From the Staging environment, inserting into a SQL Server table in the same environment => Success

2- From the Staging environment, inserting into a SQL Server table in the Dev environment => Success

3- From the Development environment, inserting into a SQL Server table in the same environment => Failure

It is as if the package changes behaviour between Data Tools 2010 and Data Tools 2013: the first takes (.) as the decimal sign and the second takes (,).

Do you have any explanation for this, or have you had the same problem?
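If the troublesome values arrive as strings, one way to make the conversion independent of the machine's or designer's regional settings is to parse them with an explicit culture in a Script Component. A minimal sketch follows, with assumed column names AmountText (input string) and Amount (output decimal); it may also be worth comparing the LocaleID properties of the package, the data flow and the connection managers between the two projects, since SSIS generally uses those for locale-sensitive conversions.

    // Script Component (transformation) sketch with assumed columns:
    // AmountText (DT_WSTR input) and Amount (DT_NUMERIC output).
    // Requires at the top of the script: using System.Globalization;

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        if (!Row.AmountText_IsNull)
        {
            // InvariantCulture always treats '.' as the decimal separator,
            // regardless of the server's regional settings.
            Row.Amount = decimal.Parse(Row.AmountText, NumberStyles.Number, CultureInfo.InvariantCulture);
        }
    }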

Regards,





Use of ForceExecutionResult to Success


Hello, all -

In a prior question, I asked how to detect whether or not a sheet exists in an Excel file and how to test for it. I received some good advice, but later on I saw a reference to the ForceExecutionResult property.

The answers I received used Sequence Containers and tested on precedence constraints. But the aforementioned property set me thinking:

If I put only those tasks that I want executed when the relevant sheet exists into a Sequence Container, and one of those tasks fails because the sheet doesn't exist in the Excel file being processed, it would cause the container to fail and then the package to fail.

Right?

But if I set the container's ForceExecutionResult property to Success, then the failure of a task in the container will still cause the container to fail, but it will tell the package that the container succeeded, so the steps AFTER the container will still execute - which is what I want to happen.

Is that correct? Am I understanding this properly?

How can I ensure that the OS releases a file for processing?


I have a process where I convert some Excel files to CSV before I load them into a table. My process keeps blowing up with the following message:

"The process cannot access the file 'C:\fileName.xls' because it is being used by another process."

I'm assuming the file is still being converted or held open somewhere, which is odd because I've put the file conversion in its own loop/task; I thought the package wouldn't move on until the processing was totally complete. The file conversion code is below.

    public void Main()
    {
        // Requires at the top of ScriptMain: using System.Data; using System.Data.OleDb; using System.IO;

        // Build the connection string for the Excel source file. Sheet1 is assumed; change it if your sheet name differs.
        string connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source="
            + Dts.Variables["User::SourceFile"].Value.ToString()
            + ";Extended Properties=\"Excel 12.0;HDR=No;IMEX=1\";";

        var datatable = new DataTable();

        // Dispose the connection, command and adapter so the ACE provider releases its
        // handle on the workbook before the next task touches the file. The original
        // code never closed the connection, which is what keeps the file locked.
        using (var conn = new OleDbConnection(connString))
        using (var command = new OleDbCommand("SELECT * FROM [Sheet1$]", conn))
        using (var adap = new OleDbDataAdapter(command))
        {
            conn.Open();
            adap.Fill(datatable);
        }

        // OLE DB connection pooling can keep the underlying connection (and the file
        // handle) alive even after Dispose; releasing the object pool is a common
        // extra step when the file must be free immediately.
        OleDbConnection.ReleaseObjectPool();

        // Create the CSV file, starting at row index 5 as in the original code.
        using (var sw = new StreamWriter(Dts.Variables["User::DestinationFile"].Value.ToString()))
        {
            for (int row = 5; row < datatable.Rows.Count; row++)
            {
                var strRow = "";

                for (int col = 0; col < datatable.Columns.Count; col++)
                {
                    strRow += "\"" + datatable.Rows[row][col].ToString() + "\",";
                }
                // Remove the trailing comma and write the row to the file.
                strRow = strRow.Remove(strRow.Length - 1);
                sw.WriteLine(strRow);
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }


Salesforce to sql server 2016


Hi Experts,

 

I need to extract information from Salesforce (around 10 GB daily) and put it into SQL Server. I have been recommended many options like Pentaho, etc., but I prefer to use SSIS.

 

I know I can export Salesforce data to CSV and then import the CSV into SQL Server 2016...

 

My questions are…

 

1-   Is there any possible direct connection from salesforce to sql server 2016 using SSIS?

2-   For anyone with similar experience, what's the best practice here? I want to avoid exporting to a CSV and then to SQL; I want this data synchronized in near real time. (I am thinking about replication and being able to use SQL Server replication with a Salesforce source, but I know I am dreaming...)

 

Thanks in advance.

SSIS '12 - Union multiple (unique) derived columns together


Hi,

I have 3 Derived Column transformations.

Each one derives a single column, and there is no relationship between any of the derived columns.

I want to union these three columns into one "table" with 3 columns,

i.e.

1st DC transform: field name = "First"; value = "1"

2nd DC transform: field name = "Second"; value = "2"

3rd DC transform: field name = "Third"; value = "3"

Desired "single" output:

First   Second   Third
1       2        3

I tried using Union All, but I'm getting the following output:

First   Second   Third
1       NULL     NULL
NULL    2        NULL
NULL    NULL     3

I need to do this in SSIS (in memory) and not in SQL Server (in a table).

Thank you

SSIS PACKAGE ERROR


Can someone help me with this error?

I am using SSIS 2012.

[Flat File Source [12]] Error: Data conversion failed. The data conversion for column "LAST" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."

I am using a Data Conversion transformation as well. Thanks in advance.

Dynamic Conditional Split


Hi Experts,

We have developed an SSIS package with the flow below.

1. Truncate the destination table --> For each file loop: pick a txt file from a folder --> load it into the DB --> execute a stored procedure that performs some changes on the loaded data.

Now what I need to do is load the final output stored in the table for that particular txt file into 2 separate folders, which I am able to achieve using a Conditional Split and an expression in it. The problem I'm facing is that for every file looped, the output is loaded into the same destination txt files configured for the Conditional Split outputs.

Is it possible to generate a new file for each Conditional Split output, with a name based on the file being processed?

I have 2 destination folders: a FailedFile folder and a Processed folder.

Every time, data gets loaded into these folders via the Conditional Split.

Example:

The names should be like xyzfilefailed.txt and xyzProcessed.txt, and likewise abcfailed.txt and abcProcessed.txt for the next input file.

Please advise.
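One common pattern is to drive each Flat File Connection Manager's ConnectionString property from an expression that references a variable recomputed on every loop iteration (typically with DelayValidation set to True on those connection managers). Below is a minimal sketch of a Script Task placed inside the loop, before the data flow, that derives the two file names from the file currently being processed; all variable names and folder paths are assumptions.

    // Minimal sketch (hypothetical variable names). User::CurrentFile is the ForEach loop
    // variable; User::FailedFile and User::ProcessedFile feed the ConnectionString
    // expressions of the two Flat File Connection Managers.
    // Requires at the top of ScriptMain: using System.IO;

    public void Main()
    {
        string currentFile = Dts.Variables["User::CurrentFile"].Value.ToString();
        string baseName = Path.GetFileNameWithoutExtension(currentFile);

        Dts.Variables["User::FailedFile"].Value =
            Path.Combine(@"D:\Output\FailedFile", baseName + "failed.txt");      // assumed folder
        Dts.Variables["User::ProcessedFile"].Value =
            Path.Combine(@"D:\Output\Processed", baseName + "Processed.txt");    // assumed folder

        Dts.TaskResult = (int)ScriptResults.Success;
    }

The same two variables can also be built purely with SSIS expressions; the Script Task is just one way to keep the string handling readable.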

Thanks

Priya

How can I copy a .dtsx Package


So I can see my .dtsx Package in SQL Server Management Studio...

ServerName00

Integration Services Catalog ==> SSISDB ==> Bayer ==> Projects ==> ProjectName ==> Packages ==> PackageName.dtsx

How can I copy the .dtsx package so that I can add it to a project in Microsoft Visual Studio and see what it does? When I right-click the package in SSMS, the only options I get are Configure, Execute, Validate, Start PowerShell, Reports, Refresh, and Properties.

I Googled this and saw that I can use the dtutil command. Is that what I need to use here? And what is the syntax to...

"Copy a .dtsx SSIS Package from a SQL Server to a flat file folder so I can then "Add Existing" within Microsoft Visual Studio"?

Any help is GREATLY appreciated. And Thanks in advance.
