Channel: SQL Server Integration Services forum

Getting an error when loading NULL values in SSIS


Hi, I have a question about SSIS: how do I load NULL values into an integer datatype field?
I want to load CSV file data into a SQL Server table using an SSIS package.

Source CSV file data (emp):
id | Name |  deptno
1  |ab    |NULL
2  |      |NULL
3  |dhd   |NULL

Target SQL Server table emp:

id int, name varchar(50), deptno int

I tried setting the data types with the Show Advanced Editor, making the external columns and output columns match the target table data types.

After that I executed the package. I am still unable to load NULL values into the int deptno field,

and I get errors like the ones below:



[OLE DB Destination [27]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".

[OLE DB Destination [27]] Error: There was an error with OLE DB Destination.Inputs[OLE DB Destination Input].Columns[depno] on OLE DB Destination.Inputs[OLE DB Destination Input]. The column status returned was: "The value could not be converted because of a potential loss of data.".

[OLE DB Destination [27]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "OLE DB Destination.Inputs[OLE DB Destination Input]" failed because error code 0xC0209077 occurred, and the error row disposition on "OLE DB Destination.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (27) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (40). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
I tried another way using a Derived Column: UPPER([deptno]) == "NULL" ? NULL(DT_I4) : [deptno], but this also gives an error.


Can you please tell me how to load NULL values into integer columns in SQL Server tables using SSIS packages?
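One way to handle this, sketched below and not from the original thread, is a synchronous Script Component transformation that converts the literal text "NULL" (or an empty field) into a database null before the OLE DB Destination. The Derived Column attempt above likely fails because both result branches of the ? : operator must have compatible data types (NULL(DT_I4) versus a string column). The names deptno and DeptNoInt below are illustrative; DeptNoInt would be a DT_I4 output column added in the Script Component and mapped to the destination instead of the raw string column.

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // deptno is the string column read from the CSV; DeptNoInt is a DT_I4
        // output column added in the Script Component (illustrative names).
        string raw = Row.deptno_IsNull ? null : Row.deptno.Trim();

        if (string.IsNullOrEmpty(raw) ||
            raw.Equals("NULL", StringComparison.OrdinalIgnoreCase))
        {
            // Write a real database NULL instead of the text "NULL"
            Row.DeptNoInt_IsNull = true;
        }
        else
        {
            Row.DeptNoInt = int.Parse(raw);
        }
    }

In the OLE DB Destination, map DeptNoInt (not the raw deptno string) to the int column.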


Script task - Replacing a 'hard coded' value with a variable in the existing C# code

Hi,
There is a folder that contains many files with a date at the end of the name, e.g. Customer_data_2012_04_19.txt.
I have a Script Task that orders the files in the folder, because in the DFT we want to process the files in date order.

I am a novice in C#. Here is the existing code. All I have to do is replace the hard-coded part with a variable.

I have created a variable named 'User::FileFormat' and its value is 'Customer_data_*.txt'.

Can you please tell me how I should replace the file name hard-coded here with the variable I created?

public void Main()
            {
                // TODO: Add your code here
                string strFilesPath = Dts.Variables["User::strFilesPath"].Value.ToString();
                Dictionary<DateTime, string> dictionary = new Dictionary<DateTime, string>();
                List<string> orderedFiles = new List<string>();
                string[] files = { };

                // get the all files metadata from the directory
                if (Directory.Exists(strFilesPath))
                    files = Directory.GetFiles(strFilesPath,"Customer_Data_*.txt", SearchOption.AllDirectories);
                string strFilePath;
                string strFileName;
                DateTime dt;

                // get all the file names and datetime present in it as a Key Value pair in the Dictionary
                foreach (string file in files)
                {
                    strFilePath = file;
                    strFileName = Path.GetFileNameWithoutExtension(file);
                    dt = DateTime.Parse(strFileName.Replace("Customer_Data_", "").Replace("_", "-"));

                    dictionary.Add(dt, strFilePath);
                }
                // get keys into list
                var list = dictionary.Keys.ToList();
                // sort the list
                list.Sort();

                foreach (var val in list)
                {
                    orderedFiles.Add(dictionary[val]);
                }
                // ... rest of Main (use of orderedFiles) omitted in the original post
            }
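A minimal change, assuming User::FileFormat is added to the Script Task's ReadOnlyVariables list, is to read the pattern from the variable and pass it to Directory.GetFiles instead of the hard-coded string:

                string strFileFormat = Dts.Variables["User::FileFormat"].Value.ToString();

                // get all files matching the configurable pattern (e.g. "Customer_data_*.txt")
                if (Directory.Exists(strFilesPath))
                    files = Directory.GetFiles(strFilesPath, strFileFormat, SearchOption.AllDirectories);

Note that the later Replace("Customer_Data_", "") call still assumes that fixed prefix when parsing the date, so if the prefix can vary it should also come from a variable (or be derived from the pattern).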

SSIS Dynamic Oracle Connection - Script to Test Connection


Good day,

I have a config table containing 20 rows of Oracle connections with credentials.

I have set the For Each Loop and Import Task event handlers' Propagate property to false for both OnError and OnFailure (they update the table status).

It works until the last connection failure, when the entire package fails. The For Each parent's Propagate is also set to false, and on the control flow I have a failure precedence constraint.

I'm running out of time. I thought that, before failing, I could capture the connection result, but I can't find a simple script to test the connection and route execution to the failure constraint.

Please help,

Thanks
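A minimal Script Task sketch for testing a connection before using it, assuming the current Oracle connection string is supplied in a variable such as User::OracleConnStr and the result is written to a Boolean variable such as User::ConnOk (both names are illustrative, not from the post). Precedence constraints after the Script Task can then branch on User::ConnOk instead of relying on a task failure:

    // Requires "using System.Data.OleDb;" at the top of ScriptMain.
    // User::OracleConnStr goes in ReadOnlyVariables, User::ConnOk in ReadWriteVariables.
    public void Main()
    {
        string connStr = Dts.Variables["User::OracleConnStr"].Value.ToString();
        bool connOk = false;

        try
        {
            using (var conn = new OleDbConnection(connStr))
            {
                conn.Open();        // throws if the connection cannot be opened
                connOk = true;
            }
        }
        catch (Exception ex)
        {
            // Record the reason without failing the task
            Dts.Events.FireWarning(0, "Connection test", ex.Message, string.Empty, 0);
        }

        Dts.Variables["User::ConnOk"].Value = connOk;
        Dts.TaskResult = (int)ScriptResults.Success;
    }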

Script task - Help please!!


Hi,

In a Script Task, I have code that sorts the files in a folder by file name. We have used *.txt to find the files in the folder (see the Select line near the end of the code). But this folder is going to contain other departments' .txt files too, so I need to change the '*.txt' to a variable in which I will specify the file format. Can you please help me with this?

The variable is  User::FileFormat

Here is the code. I would like to:

1. Add the variable to the code, and

2. Change the '*.txt' to this variable.

public void Main()
        {
            // Create a dataset. I named it unsorted, because it's not yet sorted 
            DataSet dsUnsorted = new DataSet();
            // Create a new table in the dataset 
            DataTable filelistTable = dsUnsorted.Tables.Add();
            filelistTable.Columns.Add("FilePath", typeof(string)); // Filepath needed for connectionstring. 
            filelistTable.Columns.Add("FileName", typeof(string)); // Filename used for sorting [optional]. 
            filelistTable.Columns.Add("FileDate", typeof(DateTime));// Filedate used for sorting [optional]. 
            // Get all files within the folder 
            string[] allFiles = Directory.GetFiles(Dts.Variables["User::SourceFilePath"].Value.ToString());
            // Variable for storing file properties 
            FileInfo fileInfo;
            // Loop through the files in the folder 
            foreach (string currentFile in allFiles)
            {
                // Fill fileInfo variable with file information 
                fileInfo = new FileInfo(currentFile);
                // Choose which the file properties you will use 
                // Columns: FilePath FileName FileDate 
                filelistTable.Rows.Add(fileInfo.FullName, fileInfo.Name, fileInfo.CreationTime);
            }
            // Filtering on *.txt extension. Note: like uses * instead of % 
            // Sorting the files on filename (or filedate: FileName DESC) 
            DataRow[] rows = dsUnsorted.Tables[0].Select("FileName like '*.txt'", "FileName ASC");
            // Create a new sorted dataset that the SSIS foreach loop uses. 
            DataSet dsSorted = new DataSet();
            DataTable filelistTableSorted = dsSorted.Tables.Add();
            // Only interested in the filepath which is needed for the connectionstring 
            filelistTableSorted.Columns.Add("FilePath", typeof(string));
            // Fill the new dataset with the sorted rows. 
            foreach (DataRow row in rows)
            {
                filelistTableSorted.Rows.Add(row["FilePath"].ToString());
            }
            // Store the dataset in the SSIS variable 
            Dts.Variables["dataset"].Value = dsSorted;
            Dts.TaskResult = (int)ScriptResults.Success;

        }
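One small sketch of the change, assuming User::FileFormat (for example "Customer_data_*.txt") is added to the Script Task's ReadOnlyVariables. The filtering moves to Directory.GetFiles, because DataTable.Select only accepts a LIKE wildcard at the start or end of the pattern, not in the middle:

            // Read the filter pattern from the SSIS variable instead of hard-coding "*.txt"
            string fileFormat = Dts.Variables["User::FileFormat"].Value.ToString();

            // Let Directory.GetFiles do the filtering; it supports patterns such as
            // "Customer_data_*.txt"
            string[] allFiles = Directory.GetFiles(
                Dts.Variables["User::SourceFilePath"].Value.ToString(), fileFormat);

            // ... later, the Select only needs to sort, not filter:
            DataRow[] rows = dsUnsorted.Tables[0].Select("", "FileName ASC");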

Do not see PipelineComponents folder on SQL 2016 machine


Hi everyone,

I am trying to deploy existing SSIS packages to a SQL Server 2016 machine. I deployed packages to SQL Server 2012 earlier, and on the 2012 machine I see the path C:\Program Files\Microsoft SQL Server\110\DTS\PipelineComponents, where I copied the DLLs. I have to deploy the packages, which use custom components, to SQL 2016, but I do not see the folder C:\Program Files\Microsoft SQL Server\110\DTS\PipelineComponents on the SQL 2016 machine. The SQL 2016 machine runs Windows Server 2016. Can someone tell me what I am missing?

I would really appreciate that.

Thanks,

YoBroro

[Data Flow Task][OnError][Move File to Error Folder][File looks still in use by package]


Hi,

I want to move a file to an error folder when an error occurs while loading, so I added a File System Task with the Data Flow Task as its executable and OnError as the event handler, but I am getting the following error:

[File System Task] Error: An error occurred with the following error message: "The process cannot access the file because it is being used by another process.".

It means the file is still in use by the Flat File Source. Can someone guide me on how to release the file when an error occurs, so that it can be moved to the error folder?


Many Thanks, Muhammad Yasir
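One hedged alternative: do the move after the Data Flow Task has finished, on its failure precedence constraint, rather than inside the OnError event handler; by then the Flat File Source has normally released the file handle. A minimal Script Task sketch, assuming variables User::SourceFile and User::ErrorFolder (illustrative names):

    // Requires "using System.IO;" at the top of ScriptMain.
    public void Main()
    {
        string sourceFile  = Dts.Variables["User::SourceFile"].Value.ToString();
        string errorFolder = Dts.Variables["User::ErrorFolder"].Value.ToString();
        string target      = Path.Combine(errorFolder, Path.GetFileName(sourceFile));

        // Retry a few times in case the file handle is released a moment
        // after the Data Flow Task reports failure.
        for (int attempt = 0; attempt < 5; attempt++)
        {
            try
            {
                if (File.Exists(target))
                    File.Delete(target);
                File.Move(sourceFile, target);
                break;
            }
            catch (IOException)
            {
                System.Threading.Thread.Sleep(2000);
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }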

How to pass a parameter value as user input to a stored procedure in Execute SQL Task


Hi.

I have a stored procedure that needs to be executed from SSIS. I have added an Execute SQL Task and entered the SP name.

The SP takes one parameter.

I have created a variable named CurrYearMonth, with scope Package1 and data type Int32.

In the Execute SQL Task, I have put the SP name in SQLStatement.

Example: sp_spname

In the parameter mapping, I have defined:

Variable Name: User::CurrYearMonth

Direction: Input

DataType: Int32

ParameterName: @YrMonth

ParameterSize: 0

How should the user be able to pass the data? How do I pass the parameter?

Thanks
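In the Execute SQL Task itself the value comes from the variable mapped under Parameter Mapping (an OLE DB connection references the parameter by position, ADO.NET by name), and the variable can in turn be exposed as a package parameter so the user supplies the value at execution time. Purely for illustration, the same call written as a Script Task shows how the variable value ends up in the @YrMonth parameter; the connection-string variable User::DbConnectionString below is an assumption, not something from the post:

    // Requires "using System.Data;" and "using System.Data.SqlClient;" at the top of ScriptMain.
    // User::DbConnectionString and User::CurrYearMonth go in ReadOnlyVariables.
    public void Main()
    {
        string connStr = Dts.Variables["User::DbConnectionString"].Value.ToString();
        int currYearMonth = (int)Dts.Variables["User::CurrYearMonth"].Value;

        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("dbo.sp_spname", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@YrMonth", SqlDbType.Int).Value = currYearMonth;

            conn.Open();
            cmd.ExecuteNonQuery();
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }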

Executing multiple stored procedures in Execute SQL Task


Hi.

I have 8 stored procedures that need to run one after another. I added an Execute SQL Task in SSIS and set the following:

SQLSourceType: Direct Input

IsQueryStoredProcedure : True

SQLStatement box -  I gave as below:

sp_FirstSPName
GO
sp_SecondSPName
GO

I am getting an error when executing the task. Is this the right way to execute multiple SPs?

Thanks
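One note: GO is a batch separator recognized by tools such as SSMS and sqlcmd, not a T-SQL statement, so the Execute SQL Task rejects it; the usual options are one Execute SQL Task per procedure (chained with precedence constraints) or a single statement such as EXEC sp_FirstSPName; EXEC sp_SecondSPName; with IsQueryStoredProcedure set to False. For illustration only, a Script Task sketch that runs the procedures strictly in sequence, assuming a connection-string variable User::DbConnectionString (an illustrative name):

    // Requires "using System.Data;" and "using System.Data.SqlClient;" at the top of ScriptMain.
    public void Main()
    {
        string connStr = Dts.Variables["User::DbConnectionString"].Value.ToString();
        string[] procedures = { "dbo.sp_FirstSPName", "dbo.sp_SecondSPName" /* ... up to 8 */ };

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            foreach (string proc in procedures)
            {
                using (var cmd = new SqlCommand(proc, conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.CommandTimeout = 0;     // let each procedure run to completion
                    cmd.ExecuteNonQuery();      // runs strictly one after another
                }
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }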


Create date on msdb package looks suspicious

Hi, we run 2014 Enterprise. I'm looking at the create date on a package deployed to one of our QA environments today, and it is back in 2007. Is the create date on packages sitting in [msdb].[dbo].[sysssispackages] tied more to the attributes of the .dtsx package itself, or to the date it was added to [msdb].[dbo].[sysssispackages]? I suppose the package may just be a clone of something originally created in 2007; I think SSIS was on the scene by around 2005 at least.

Load text file into a SQL table

Hi,
There is a 25 GB text file that I need to load into a SQL table. What is the best way to load it fast?

If I go with bcp, what are the steps and what permissions are required?

Thanks in advance!
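The usual fast options are a data flow with the OLE DB Destination in fast-load mode, BULK INSERT, or bcp; bcp and BULK INSERT generally need INSERT permission on the target table, and BULK INSERT additionally needs ADMINISTER BULK OPERATIONS (or membership in the bulkadmin role). Purely as an illustration of a streaming load that keeps memory flat, here is a minimal SqlBulkCopy sketch; the path, delimiter, column list, table name and connection string are placeholders, not details from the post:

    // Console-style sketch; compile against System.Data.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BulkLoader
    {
        static void Main()
        {
            const string connStr   = "Server=.;Database=Staging;Integrated Security=SSPI;";  // placeholder
            const string filePath  = @"C:\data\bigfile.txt";                                 // placeholder
            const char   delimiter = '\t';
            const int    batchSize = 50000;

            var table = new DataTable();
            table.Columns.Add("Col1", typeof(string));
            table.Columns.Add("Col2", typeof(string));   // add columns to match the target table

            using (var bulk = new SqlBulkCopy(connStr))
            using (var reader = new StreamReader(filePath))
            {
                bulk.DestinationTableName = "dbo.TargetTable";  // placeholder
                bulk.BulkCopyTimeout = 0;                       // no timeout for a 25 GB load

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    table.Rows.Add(line.Split(delimiter));

                    if (table.Rows.Count >= batchSize)          // flush a batch, keep memory flat
                    {
                        bulk.WriteToServer(table);
                        table.Clear();
                    }
                }

                if (table.Rows.Count > 0)
                    bulk.WriteToServer(table);                  // flush the final partial batch
            }
        }
    }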

What are recommended indexing rules for ETL in general?

Hello Experts,

What kinds of indexes are recommended for the staging area in an ETL flow? For example, a unique index on the natural key columns? The data flowing in from the source has a surrogate key as well.

Also, any suggestions or recommendations for indexing in the other layers, such as the dimensional data model layer and the type 1 persistent data layer that holds full snapshot data.

Thanks, Rajneesh

What is the benefit of a surrogate key in the data warehouse layer?

Hello Experts,

The OLTP source tables have surrogate keys (numeric values) and natural keys (alphanumeric values), so can I skip creating surrogate keys for the dimension tables in the target OLAP DB (dimensional model)?
I know that I will need surrogate keys for the fact tables, since the unique key of a fact table would span a large set of columns and I need a single numeric column as the primary key there.
I am joining multiple source tables to populate one dimension target, so I am wondering whether to use the unique numeric id of the driving table (inherited from the OLTP source) as the primary key, provided that the data granularity of the resulting record is at the driving table's id level (the resulting record is what remains after the main driving table is joined with the other source tables).

What is the benefit of a surrogate key in the data warehouse layer?

Thanks,
Rajneesh

Copy files to a different folder and delete them from the current folder


Hi All

I have 10 .csv files in an Input folder, and I need to copy them to the Archive folder and then delete them from the Input folder. I added a File System Task to the control flow, but when I try to configure the task (adding a destination, create file), it only lets me handle one file at a time. Do I need 10 File System Tasks for this? :-(

Please help

Thanks
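Two common approaches: a Foreach Loop (Foreach File enumerator over *.csv) containing a single File System Task whose source is the enumerator variable, or a small Script Task that does the whole copy-and-delete in one go. A sketch of the Script Task option, assuming variables User::InputFolder and User::ArchiveFolder (illustrative names):

    // Requires "using System.IO;" at the top of ScriptMain.
    public void Main()
    {
        string inputFolder   = Dts.Variables["User::InputFolder"].Value.ToString();
        string archiveFolder = Dts.Variables["User::ArchiveFolder"].Value.ToString();

        foreach (string file in Directory.GetFiles(inputFolder, "*.csv"))
        {
            string target = Path.Combine(archiveFolder, Path.GetFileName(file));

            File.Copy(file, target, true);   // overwrite if the archive copy already exists
            File.Delete(file);               // then remove it from the Input folder
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }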

How to import the string NULL as a null value from CSV files using an SSIS package


How do I import the string "NULL" as a database null when loading CSV files with an SSIS package?
The CSV file has these values:
id   |name |deptno |sal
1    |a    |NULL   |NULL
2    |b    |NULL   |NULL
3    |c    |NULL   |NULL
NULL |d    |10     |NULL
File type: CSV (comma-separated values)


Target table emp:
id   |name |deptno |sal
1    |a    |NULL   |NULL
2    |b    |NULL   |NULL
3    |c    |NULL   |NULL
NULL |d    |10     |NULL


Here I am not getting database null values; instead I am getting the string 'NULL'.

Expected output: the same rows, but with database NULLs in place of the literal string 'NULL'.

How do I resolve this null-values issue on the SSIS package side?
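The Flat File Source can surface empty fields as database nulls when "Retain null values from the source as null values in the data flow" is checked, but the literal text NULL still arrives as a string. One hedged option is to pre-process the file in a Script Task before the data flow, blanking out fields that contain only the text NULL; the variable name User::SourceFile and the simple comma split below are assumptions (the split does not handle quoted fields that contain commas):

    // Requires "using System.IO;" at the top of ScriptMain.
    public void Main()
    {
        string inFile  = Dts.Variables["User::SourceFile"].Value.ToString();
        string outFile = inFile + ".clean";   // point the Flat File connection at this file

        using (var reader = new StreamReader(inFile))
        using (var writer = new StreamWriter(outFile))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                // Blank out fields that contain only the text NULL so the
                // Flat File Source can treat them as real nulls.
                string[] fields = line.Split(',');
                for (int i = 0; i < fields.Length; i++)
                    if (fields[i].Trim().Equals("NULL", StringComparison.OrdinalIgnoreCase))
                        fields[i] = string.Empty;

                writer.WriteLine(string.Join(",", fields));
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }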

Create sub grid records in CRM automatically from SSIS.


Hi Team-

The GUID mapping between parent and child was already done through SSIS; however, the sub-grid records do not appear automatically in the entity. Is it possible through SSIS to make the records show in the sub-grid automatically, the same way they do after a manual lookup done in CRM with the mouse? Or is some customization required?

Thanks and Regards,

Deepak.


Happy to help! Thanks. Regards and good Wishes, Deepak. http://deepaksqlmsbusinessintelligence.blogspot.com/


Too many spaces before the closing quote


Hi

The package had no issues until today. One row in the Comments column has around 2000 empty spaces before the closing quote (").

for example "to many spaces before ending                                                                                                                  "

I used the TRIM function, but it does not solve the problem. I can increase the column width in the package and the table, but that is not a good solution. Please help.

 

Error when trying to copy database with Transfer Database Task


I'm getting this error:

Method 'SaveAndUpdateVersionToXML' in type
'Microsoft.DataTransformationServices.Project.DebugEngine.InterfaceWrappers.Sql2014ApplicationClassWrapper'
from assembly 'Microsoft.DataTransformationServices.VsIntegration,
Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'
does not have an implementation.
(Microsoft.DataTransformationServices.VsIntegration)

I think I need to install SSDT, but I'm running SQL Server 2016.  I don't want to upgrade to 2019.  Is there a way to install SSDT without upgrading?

SSIS 2012 Scheduled SSIS package fails to start – Execution timed out


Hi

I am currently running into an issue that seems to have a live Connect issue raised for it. The following is the link to the Connect issue: https://connect.microsoft.com/SQLServer/feedback/details/783291/ssis-package-fails-to-start-application-lock-timeout-in-ssisdb-catalog-create-execution

There is also a blog post that explains this issue in more detail; however, the solution it provides does not work all the time, and I am very reluctant to create custom stored procedures in SSISDB: http://speaksql.wordpress.com/2013/06/27/ssis-2012-fail-to-start-execution-timed-out/

Just to give some more background: when scheduling 10 or more SQL Server Agent jobs (in my case 14), all executing a package in the SSIS Catalog and all kicking off at exactly the same time, about 10% to 20% of these jobs fail with one of the following error messages:

  1. The operation failed because the execution timed out.  Source: .Net SqlClient Data Provider  Started:  12:20:01  Finished: 12:20:07  Elapsed:  5.797 seconds.  The package execution failed.  The step failed.
  2. Description: Transaction (Process ID 66) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.  Source: .Net SqlClient Data Provider  Started:  14:06:48  Finished: 14:06:53 Elapsed:  4.719 seconds.  The package execution failed.  The step failed.

This timeout takes place both when using TSQL to start the package in the Job or when the step in the Job is set to start the package as an Integration Services Package.

Steps to recreate this:

1. Create one SSIS package that simply executes for around a minute. For example you can use an Execute SQL Task and simply execute the following TSQL "WAITFOR DELAY '00:01:00'"

2. Create 14 SQL Server Agent Jobs each executing the same package and schedule all of them to run at the same time.

3. Watch the execution of the jobs take place and note the failures. (This does not always happen and you might get one run in which everything executes without a problem. The next run might have 6 or 7 of the jobs fail.)

You can also create 14 different SSIS packages, one for each job, and the result is the same.

I am running on SQL 2012 SP1 CU4, and Windows Server 2012 R2 patched to the latest patch level.

This issue does not happen when the packages are deployed to the SSIS Package Store the way that it was done in SQL2008 and before. It only occurs when the SSIS packages are deployed to the SSIS Catalog.

Any help or feedback on this would be greatly appreciated.

Kind Regards,

Christo Olivier


Getting errors - not sure about them


Hi everybody,

I want to run a package that has a For Loop container and a data flow from a flat file into a database table. I need to skip the second row in the file (the first row is the column names, the second is some garbage, the rest is data). In the Flat File Connection Manager I have set "Data rows to skip" to 1. The preview shows everything correctly; however, when I try to run it I get this error:

Warning: 0x80049304 at Data Flow Task, SSIS.Pipeline: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Flat File Source [11]: The processing of file "file name here" has started.
Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
Error: 0xC0202091 at Data Flow Task, Flat File Source [11]: An error occurred while skipping data rows.
Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC0202091.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Flat File Source [11]: The processing of file "file name here" has ended.
Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.

----------------

What does this error mean and how can I resolve it?

Also, I'm running this through RDP and I don't have the admin password; I can only run programs as a regular user.


For every expert, there is an equal and opposite expert. - Becker's Law


My blog


My TechNet articles


Is there a way to continue where we left off?


Hi everybody,

I created a package to load data from 3 txt files (two of them are huge and cannot be opened by any editor). While the package was running it loaded 18 million rows out of 65 million and then failed. My package is designed to remove all existing rows for the file first and then attempt the load.

Do you know if there is a way to adjust it so that, after a failed load, it does not delete the rows but instead continues where it stopped?

Thanks in advance.


For every expert, there is an equal and opposite expert. - Becker's Law


My blog


My TechNet articles


