Channel: SQL Server Integration Services forum
Viewing all 24688 articles
Browse latest View live

Issues running SSIS package remotely


Hi all,

I have an SSIS package that runs fine when logged in directly to the Windows database server.

I am running Windows 2016 Server and SQL Server 2016.

However, if I try to run the same SSIS package remotely from my Windows PC with SSMS, it fails with "cannot write data file" errors.

OLE error: cannot open the data file "\\myshare\ssis\staging"

This is weird because my user account has full privileges on that share. What can I check to solve this issue so that I can run the SSIS package remotely from SSMS on my PC?
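One common cause, offered as a sketch rather than a definitive diagnosis: when you launch the package from SSMS, it executes on the server, and with NTLM your credentials cannot make the second "hop" from the server to the file share (the double-hop problem), so the share looks unreachable even though your account has full privileges. A quick check is to probe the path from the context where the package actually runs; this small Python sketch (using the UNC path from the question) shows the kind of probe to run on the server under the service or proxy account:

```python
import os

def check_share_access(path: str) -> str:
    """Classify how the current security context sees a path."""
    if not os.path.exists(path):
        return "unreachable"   # share not visible (e.g. the double-hop problem)
    if not os.access(path, os.W_OK):
        return "read-only"     # visible but no write permission
    return "writable"

# Run on the SERVER under the account the package actually uses, e.g.:
# print(check_share_access(r"\\myshare\ssis\staging"))
```

If the path comes back "unreachable" only under the server-side account, the usual fixes are Kerberos delegation or running the job under a proxy account that the share trusts.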


How to copy dtsx packages

I have the dtsx file in F:\SSIS\Integration Services Project2\Integration Services Project2\LoopSingleSheet (1).dtsx. How do I copy that file and give it to another user so that they can use it on another server? This is not a server-to-server copy. I am new to dtsx packages and do not want to cause any issues while saving the file. I am using SQL Server 2008 R2.
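A .dtsx file is plain XML, so an ordinary file copy is all that is needed; the one thing worth checking first is the package ProtectionLevel, because the default (EncryptSensitiveWithUserKey) means any passwords saved in the package will not decrypt for another user on another server. A minimal Python sketch (the destination path is hypothetical):

```python
import shutil
import xml.etree.ElementTree as ET

DTS = "www.microsoft.com/SqlServer/Dts"   # namespace used inside .dtsx files

def protection_level(dtsx_path: str) -> str:
    """Return the package ProtectionLevel value as stored in the XML
    ('1' = EncryptSensitiveWithUserKey, the default, which another
    user on another machine cannot decrypt)."""
    root = ET.parse(dtsx_path).getroot()
    for prop in root.iter("{%s}Property" % DTS):
        if prop.get("{%s}Name" % DTS) == "ProtectionLevel":
            return prop.text or ""
    return ""  # property not found in this package

def hand_off(src: str, dst: str) -> None:
    # A .dtsx is plain XML, so a byte-for-byte file copy is safe.
    shutil.copyfile(src, dst)
```

If the level is 1 (EncryptSensitiveWithUserKey), consider resaving the package with DontSaveSensitive or EncryptSensitiveWithPassword before handing it over.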

How to implement sending an error mail when columns are missing or extra in client data, and when the date format is wrong


Hi all,

Please guide me on how to implement sending an error mail when a column is missing or an extra column appears in the client data, and when the date format is wrong.

If possible, please attach some code as well.
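One workable pattern, sketched here rather than a full solution: before the data flow, read just the header row of the client file, compare it against the expected column list, and fire a Send Mail Task when they differ; a date value can be probed the same way. In the Python sketch below, the expected column names and the date format are illustrative assumptions:

```python
import csv
from datetime import datetime

EXPECTED = ["CustomerID", "Name", "OrderDate"]   # hypothetical layout
DATE_FORMAT = "%Y-%m-%d"                          # hypothetical agreed format

def header_problems(path: str) -> list[str]:
    """Return human-readable problems suitable for an error-mail body."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    problems = []
    missing = [c for c in EXPECTED if c not in header]
    extra = [c for c in header if c not in EXPECTED]
    if missing:
        problems.append("Missing columns: %s" % ", ".join(missing))
    if extra:
        problems.append("Extra columns: %s" % ", ".join(extra))
    return problems

def bad_date(value: str) -> bool:
    """True when a value does not match the agreed date format."""
    try:
        datetime.strptime(value, DATE_FORMAT)
        return False
    except ValueError:
        return True
```

In SSIS the same comparison can live in a Script Task before the data flow, with a precedence constraint routing to a Send Mail Task when the returned problem list is non-empty.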

Thanks 

DTExec error when running an SSIS package


Hi,

I am trying to use dtexec to run an SSIS package on my local machine. The SSIS package was created in VS2017 Enterprise;

Target release server: SQL 2017;

OS is Windows 10, 64-bit;

SSDT 15.9 standalone was installed side by side with the copy embedded in VS2017.

Here is the error message I got:

C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\SSIS\150>dtexec.exe  /FILE "\\TestProjects\ssis2017\ssis2017\OnlineProject.dtsx" /MAXCONCURRENT " -1 "  /x86
Microsoft (R) SQL Server Execute Package Utility
Version 10.50.1600.1 for 64-bit
Copyright (C) Microsoft Corporation 2010. All rights reserved.

Started:  2:28:48 PM
Error: 2019-04-16 14:28:48.95
   Code: 0xC001700A
   Source:
   Description: The version number in the package is not valid. The version number cannot be greater than current version number.
End Error
Error: 2019-04-16 14:28:48.95
   Code: 0xC0016020
   Source:
   Description: Package migration from version 8 to version 3 failed with error 0xC001700A "The version number in the package is not valid. The version number cannot be greater than current version number.".
End Error
Error: 2019-04-16 14:28:48.95
   Code: 0xC0010018
   Source:
   Description: Error loading value "<DTS:Property xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:Name="PackageFormatVersion">8</DTS:Property>" from node "DTS:Property".
End Error
Could not load package "\\TestProjects\ssis2017\ssis2017\OnlineProject.dtsx" because of error 0xC0010014.
Description: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.
Source:
Started:  2:28:48 PM
Finished: 2:28:48 PM
Elapsed:  0.063 seconds

C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\CommonExtensions\Microsoft\SSIS\150>dtexec.exe  /FILE "\\care1\is\carrie\TestProjects\ssis2017\ssis2017\OnlineProject.dtsx" /MAXCONCURRENT " -1 "  /x86
Microsoft (R) SQL Server Execute Package Utility
Version 10.50.1600.1 for 64-bit
Copyright (C) Microsoft Corporation 2010. All rights reserved.

Started:  12:47:57 PM
Error: 2019-04-17 12:47:57.83
   Code: 0xC001700A
   Source:
   Description: The version number in the package is not valid. The version number cannot be greater than current version number.
End Error
Error: 2019-04-17 12:47:57.83
   Code: 0xC0016020
   Source:
   Description: Package migration from version 8 to version 3 failed with error 0xC001700A "The version number in the package is not valid. The version number cannot be greater than current version number.".
End Error
Error: 2019-04-17 12:47:57.83
   Code: 0xC0010018
   Source:
   Description: Error loading value "<DTS:Property xmlns:DTS="www.microsoft.com/SqlServer/Dts" DTS:Name="PackageFormatVersion">8</DTS:Property>" from node "DTS:Property".
End Error
Could not load package "\\TestProjects\ssis2017\ssis2017\OnlineProject.dtsx" because of error 0xC0010014.
Description: The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.
Source:
Started:  12:47:57 PM
Finished: 12:47:57 PM
Elapsed:  0.062 seconds

I got the same error when using the dtexec from the /sql... folder.
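The banner in the log gives the fix away: the dtexec that actually ran reports "Version 10.50.1600.1", which is SQL Server 2008 R2, and that generation only understands PackageFormatVersion 3, while a package built for SQL Server 2017 is saved as PackageFormatVersion 8 (exactly what the "migration from version 8 to version 3" error says). Even though the command was typed from the VS SSIS folder, `dtexec.exe` was resolved via the PATH to the 2008 R2 copy; invoke the SQL Server 2017 dtexec (version 14.x, typically under ...\140\DTS\Binn) by its full path instead. The mismatch can be confirmed by reading the version straight out of the .dtsx, as in this small Python sketch:

```python
import xml.etree.ElementTree as ET

DTS = "www.microsoft.com/SqlServer/Dts"   # namespace used inside .dtsx files

# Highest package format each dtexec generation can load
# (the 10.50.x dtexec in the log is SQL Server 2008 R2).
DTEXEC_MAX = {"2008 R2": 3, "2012": 6, "2014": 8, "2016": 8, "2017": 8}

def package_format_version(dtsx_path: str) -> int:
    """Read PackageFormatVersion out of the package XML."""
    root = ET.parse(dtsx_path).getroot()
    for prop in root.iter("{%s}Property" % DTS):
        if prop.get("{%s}Name" % DTS) == "PackageFormatVersion":
            return int(prop.text)
    return 3  # very old packages may omit the property

def can_load(dtexec_generation: str, dtsx_path: str) -> bool:
    return package_format_version(dtsx_path) <= DTEXEC_MAX[dtexec_generation]
```

A package reporting 8 with a 2008 R2 dtexec reproduces exactly the 0xC001700A failure in the log above.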



Data Quality


Hello Everybody,

We have been struggling with data quality issues in our group. Our group is primarily a database development group; we are responsible for processing large amounts of data, for reporting requirements, and for supplying necessary data to the OLTP systems. We process data from several internal and external sources:

1) Flat file data processing through ETL

2) Connection to internal database system via OLE DB connections

3) Live Streaming/APIs from upstream systems

Our database design is compatible with the source data, but as we all know, sometimes source systems' data types change without our knowledge, or bad data comes in, which causes batch/streaming data processing to fail. We have implemented logic not to fail the entire job if a few rows in a batch/API call/stream are bad.

I am looking to implement a robust framework/best practices/methodology to handle data processing, and then build some kind of dashboard for data quality reporting.

Please let us know if you have any framework implemented, and any design ideas.

Thanks in advance.
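For the "don't fail the whole batch" requirement, one common building block is a per-row validation step that routes bad rows to a quarantine table and accumulates counts per source for the dashboard. A minimal Python sketch of that pattern (the rules and source names are illustrative assumptions, not a framework recommendation):

```python
from datetime import datetime

def _is_date(v) -> bool:
    try:
        datetime.strptime(str(v), "%Y-%m-%d")
        return True
    except ValueError:
        return False

# Illustrative rules: column name -> validator returning True when the value is OK.
RULES = {
    "customer_id": lambda v: str(v).isdigit(),
    "order_date":  lambda v: _is_date(v),
}

def process_batch(source: str, rows: list):
    """Split a batch into good rows and quarantined rows, plus per-source
    metrics suitable for a data-quality dashboard."""
    good, quarantined = [], []
    for row in rows:
        failures = [col for col, ok in RULES.items()
                    if col in row and not ok(row[col])]
        (quarantined if failures else good).append((row, failures))
    metrics = {"source": source, "total": len(rows),
               "bad": len(quarantined), "good": len(good)}
    return [r for r, _ in good], quarantined, metrics
```

In SSIS terms this maps to error-output redirection on each component, with the quarantine and metrics tables feeding the reporting dashboard.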



Change format of destination CSV file


I have a simple SSIS package that extracts from SQL and puts the contents into an existing CSV file.

All works, which is great; however, I need each column to be in a separate cell. Currently each row of data lands in a single cell: for example, B1 might contain field1, field2, field3, etc.
I'd like those fields to span across B1, C1, D1, just as if I had pasted the results grid into Excel.
I'm not sure how to achieve this.
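This usually means the Flat File Connection Manager is not set to the Delimited format with a comma as the column delimiter, so the whole row is emitted as one field and Excel shows it in a single cell. The target shape is just a standard comma-separated file; a small Python illustration of what the destination should contain (hypothetical data):

```python
import csv

# Rows as they might come out of the SQL query (hypothetical data).
rows = [("field1", "field2", "field3"),
        ("a", "b", "c")]

with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)   # comma is the default column delimiter
    writer.writerows(rows)

# Each value is now a separate comma-delimited column, so Excel
# places them in separate cells (A1, B1, C1, ...).
```

Once the destination's connection manager produces this shape (comma column delimiter, CRLF row delimiter), Excel splits the fields across cells automatically.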

Manipulating a data flow from a package programmatically


Hello, 

I have a template package that I load from a Script Task. I want to load the data flow and the components existing in the package and modify them, and also the project connections (I have only found a way to retrieve package connections).

thanks for your time.

string mySample = @"Template Load Stage.dtsx";
// Create an application and load the sample.
RuntimeWrapper.Application app = new RuntimeWrapper.Application();
Package pkg = app.LoadPackage(mySample, null);
Connections myConns = pkg.Connections;


individual ssis package runs and loads data, but fails to load when deployed


I have an SSIS package that reads a series of .csv files and loads them into a SQL Server table.

It works fine individually, but when I deploy it and try to run it from the SSISDB path, I get a warning that the files cannot be found; the job completes fine but nothing is loaded.

In my individual SSIS package, I specify which files to pick up in a parameter, and the "Files" section of the Foreach Loop container gets automatically populated with whatever parameter I enter plus .csv. Why is it not working when I deploy it?
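A likely explanation, sketched here as a hypothesis: once deployed, the package runs on the server under the SQL Agent (or proxy) account, so a folder that worked on your machine may not exist, or not be visible, there; relative paths and mapped drives are the usual culprits, and the Foreach Loop then quietly enumerates zero files, which is exactly the "succeeds but loads nothing" symptom. This Python sketch mimics what the enumerator does, so you can validate the parameter value the same way (folder and mask are the hypothetical parameter values):

```python
import glob
import os

def files_for_loop(folder: str, mask: str = "*.csv") -> list:
    """Mimic the Foreach File enumerator: expand folder + mask."""
    if not os.path.isabs(folder):
        # A relative folder resolves against the Agent's working directory
        # on the server, not against your project directory.
        raise ValueError("use an absolute path the service account can reach")
    return sorted(glob.glob(os.path.join(folder, mask)))

# An empty result here is exactly the silent "success with no rows" case.
```

Running this check with the deployed parameter value, on the server and under the job's account, makes the empty enumeration visible instead of silent.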

Got "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." when executing an SSIS 2008 package


I have an SSIS package that was migrated from DTS on SQL 2000 to SSIS 2008.

The package is designed to perform the following jobs:

  1. Do the data cleansing by truncating the tables in SQL Server
  2. Transform data from an Oracle database into SQL Server tables

The package was working fine until the server (Windows Server 2008 R2) on which SQL Server 2008 is installed had TLS 1.2 enabled and TLS 1.0 & TLS 1.1 disabled.

The error "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." is thrown when executing task #2 (Oracle to SQL Server data transformation). The data cleansing part completes successfully.

Provider for Oracle used = OraOLEDB

Provider for SQL Server = MSOLEDBSQL

Any help will be appreciated.


Calling Stored Proc from SSIS error but runs fine in SSMS


Hi All

I'm getting this error when I run a stored proc from SSIS, but it runs fine from SSMS:

SQL Process All Changes:Error: Executing the query "EXEC [ProcessAllChanges];" failed with the following error: "The statement has been terminated.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly

There are no parameters in or out, and no record sets to return.

Here is the stored proc. Nothing fancy. What am I doing wrong? It processes about 95M records.

By the way, it worked before and only just now started giving issues. We are running SQL Server 2016. It fails on the first [NOT EXISTS] query.

ALTER PROCEDURE [ProcessAllChanges]
AS
BEGIN
	SET NOCOUNT ON;

	-- log opening of execution
	DECLARE @execution UNIQUEIDENTIFIER = NEWID();
	DECLARE @task VARCHAR(MAX) = (SELECT OBJECT_NAME(@@PROCID));
	EXEC [util].[log_message] @task = @task, @message = 'start', @execution = @execution;


	DROP TABLE IF EXISTS [#versionsThatAreNew];
	
	-- gather up all versions in [intake] that aren't already present in exact state in [persist]
	SELECT [i].[KeyHash]
		,[i].[RowHash]
		,[i].[AddedOn]
	INTO [#versionsThatAreNew]
	FROM [intake].[tableA] AS [i]
	WHERE NOT EXISTS (
						SELECT 1 
						FROM [persist].[tableB] AS [r] 
						WHERE [r].[KeyHash] = [i].[KeyHash] 
						  AND [r].[RowHash] = [i].[RowHash]
					); -- where the record isn't already exactly there

	EXEC [util].[log_message] @task = @task, @message = '1', @execution = @execution;
	
	
	DROP TABLE IF EXISTS [#versionsThatNeedInsertOrUpdate];
	
	-- produce a set of all the operations that need to happen below, which are:
	--- 1. update any [persist] version's effective window when the keyhash also has a new version in [intake] (means a new version of an old record)
	--- 2. insert any [intake] version that isn't already in [persist]
	SELECT [n].[KeyHash]
		,[n].[RowHash]
		,-1 AS [RowID] -- they don't get RowIDs so put in a non-null placeholder. later we'll join on this column during the update to only hit modified records, making the -1 fall out of that operation.
		,[n].[AddedOn]
		,CAST('0001-01-01' AS DATETIME2(2)) AS [RowEffective]
		,CAST('9999-12-31 23:59:59.99' AS DATETIME2(2)) AS [RowExpiration]
	INTO [#versionsThatNeedInsertOrUpdate]
	FROM [#versionsThatAreNew] AS [n]
	
	UNION ALL -- we know these are disjointed sets because it's (what's not already in persist) + (what's in persist). so union all for performance.
	SELECT [r].[KeyHash]
		,[r].[RowHash]
		,[r].[RowID]
		,[r].[AddedOn]
		,CAST('0001-01-01' AS DATETIME2(2)) AS [RowEffective]
		,CAST('9999-12-31 23:59:59.99' AS DATETIME2(2)) AS [RowExpiration]
	FROM [persist].[tableB] AS [r]
	INNER JOIN [#versionsThatAreNew] AS [n]
		ON [r].[KeyHash] = [n].[KeyHash];
	EXEC [util].[log_message] @task = @task, @message = '2', @execution = @execution;

	
	DROP TABLE IF EXISTS [#keyHashesThatHaveMultipleVersionsSoTheyNeedCalculatedEffectiveWindows];
	
	-- the working set may have nondistinct KeyHashes in cases where a new version came in to replace an old version (intake.keyhash = persist.keyhash)
	--- for each of these, we can't use the defaulted effective windows that were put on everything in the last step.
	--- so we find just these, to feed into the next step, where we calculate consecutive windows per version.
	SELECT [t].[KeyHash]
	INTO [#keyHashesThatHaveMultipleVersionsSoTheyNeedCalculatedEffectiveWindows]
	FROM [#versionsThatNeedInsertOrUpdate] AS [t]
	GROUP BY [t].[KeyHash]
	HAVING COUNT(*) > 1;
	-- for multi-version keyhashes, sequence them by AddedOn to get consecutive, non-overlapping time windows.
	--- then update the working set to have those rowEffective and rowExpiration dates.
	WITH [recalc] AS (
		SELECT [v].[KeyHash]
			,[v].[RowHash]
			,[v].[AddedOn]
			,LAG([v].[AddedOn], 1) OVER (PARTITION BY [v].[KeyHash] ORDER BY [v].[AddedOn]) AS [PriorAddedOn]
			,LEAD([v].[AddedOn], 1) OVER (PARTITION BY [v].[KeyHash] ORDER BY [v].[AddedOn]) AS [NextAddedOn]
		FROM [#versionsThatNeedInsertOrUpdate] AS [v]
		INNER JOIN [#keyHashesThatHaveMultipleVersionsSoTheyNeedCalculatedEffectiveWindows] AS [kh]
			ON [v].[KeyHash] = [kh].[KeyHash]

	)
	UPDATE [v] -- update via the alias; naming the table here while it is aliased in FROM introduces a second, unjoined instance
		SET  [RowEffective] = CASE WHEN [r].[PriorAddedOn] IS NULL THEN '0001-01-01 00:00:00.00' ELSE [r].[AddedOn] END
			,[RowExpiration] = ISNULL([r].[NextAddedOn], '9999-12-31 23:59:59.99')
	FROM [#versionsThatNeedInsertOrUpdate] AS [v]
	INNER JOIN [recalc] AS [r]
		ON  [v].[KeyHash] = [r].[KeyHash]
		AND [v].[RowHash] = [r].[RowHash];

	EXEC [util].[log_message] @task = @task, @message = '3', @execution = @execution;



	BEGIN TRANSACTION;


	-- update from the working set any version that exists by RowID in the [persist] also.
	-- we could also join by [KeyHash]+[RowHash], but the table is likely ordered by [RowID] already so this could be quicker.
	UPDATE [r] -- update via the alias; naming [persist].[tableB] here while it is aliased in FROM introduces a second, unjoined instance
	SET [RowEffective] = [v].[RowEffective]
		,[RowExpiration] = [v].[RowExpiration]
	FROM [persist].[tableB] AS [r]
	INNER JOIN [#versionsThatNeedInsertOrUpdate] AS [v]
		ON [r].[RowID] = [v].[RowID];

	DECLARE @RowsUpdated BIGINT = @@ROWCOUNT;
	
	EXEC [util].[log_message] @task = @task, @message = @RowsUpdated, @execution = @execution, @category = 'Rows Updated'


	-- insert to [persist] any version from [intake] that matches to the working set by [KeyHash] and [RowHash].
	-- because the working set didn't include versions that were the same in [intake] and [persist], this avoids re-inserting existing versions.
	INSERT INTO [persist].[tableB] WITH (TABLOCK)
	SELECT [i].*
		,ISNULL([o].[batch_id], -1) AS [BatchID]
		,[v].[RowEffective]
		,[v].[RowExpiration]
	FROM [intake].[tableA] AS [i]
	INNER JOIN [#versionsThatNeedInsertOrUpdate] AS [v]
		ON  [i].[KeyHash] = [v].[KeyHash]
		AND [i].[RowHash] = [v].[RowHash]
	LEFT OUTER JOIN [util].[batch_open] AS [o]
		ON  1=1;

	DECLARE @RowsInserted BIGINT = @@ROWCOUNT;
	
	EXEC [util].[log_message] @task = @task, @message = @RowsInserted, @execution = @execution, @category = 'Rows Inserted'




	COMMIT TRANSACTION;


	EXEC [util].[log_message] @task = @task, @message = 'stop', @execution = @execution;

END;



How to implement sending an email with the error file as an attachment


Hi all, please guide me on how to implement sending an email with the error file as an attachment, and please attach some piece of code.

Thanks

Abhay

Package fails with 'violation of primary key' error message


The package fetches data from tables in database A on server A and loads them into tables in database B on server B.

This is an exercise from Brian Larson's book "Delivering Business Intelligence with Microsoft SQL Server 2016, Fourth Edition".

ERROR: Description: "Violation of PRIMARY KEY constraint 'PK_DimProductType'. Cannot insert duplicate key in object 'dbo.DimProductType'. The duplicate key value is (1)."
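The usual cause in this kind of exercise is re-running the load without emptying the target dimension, or a source query that returns the same key more than once; either way the INSERT tries to add key (1) a second time. Filtering incoming keys against the destination before inserting makes the load re-runnable. A Python sketch of the idea (the key column name `ProductTypeKey` is a hypothetical stand-in for the actual PK column of dbo.DimProductType):

```python
def rows_to_insert(incoming: list, existing_keys: set, key: str = "ProductTypeKey") -> list:
    """Keep only rows whose primary key is new, and also drop duplicates
    within the incoming batch itself."""
    seen = set(existing_keys)
    out = []
    for row in incoming:
        if row[key] in seen:
            continue        # would violate the PK, e.g. duplicate key value (1)
        seen.add(row[key])
        out.append(row)
    return out
```

In the SSIS data flow the same effect is typically achieved with a Lookup against the destination table, sending unmatched rows to the OLE DB Destination, or simply by truncating dbo.DimProductType before each run of the exercise.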

ODBC Source properties are missing in the Data Flow Task expressions dialogue


I am trying to create an SSIS package to transfer data from a MySQL database (ODBC Source) to SQL Server (ADO.NET Destination).

I need to use a dynamic query for the extract from the MySQL database. In the past, I have done this by setting the SqlCommand property as an expression via the Data Flow Task properties window. However, no ODBC source properties are shown in the expressions dialogue.

I have tried deleting and re-creating the data flow task, with no improvement.

This does not appear to be a known bug (or, at least, I haven't managed to find anything by Googling). Can anyone suggest a solution for this, please?

I am using Visual Studio Professional 2017 (version 15.9.3) with SSIS Designer version 15.0.900.40

Thanks,

Rob


Sending Text File to FTP Location through SSIS


Hi Guys,

I am designing an SSIS package that loads data from a SQL table to a flat file in an FTP location. I know it is possible to first create the file in a local folder and then move it to the FTP location with SSIS.

However, I want SSIS to create the file on the FTP server directly.
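The built-in FTP Task only transfers files that already exist, so creating the file on the FTP server directly means doing the upload yourself, typically from a Script Task, streaming the extract without writing a local file first. A Python sketch of the idea (host, credentials, and rows are placeholders):

```python
import csv
import io
from ftplib import FTP

def rows_as_csv_bytes(rows) -> bytes:
    """Render the extracted rows as CSV entirely in memory."""
    text = io.StringIO()
    csv.writer(text).writerows(rows)
    return text.getvalue().encode("utf-8")

def upload_rows(rows, host, user, password, remote_name):
    """Stream the in-memory CSV straight to the FTP server;
    no intermediate file on local disk."""
    payload = io.BytesIO(rows_as_csv_bytes(rows))
    with FTP(host) as ftp:              # hypothetical server/credentials
        ftp.login(user, password)
        ftp.storbinary("STOR " + remote_name, payload)
```

The same pattern translates to a C# Script Task using FtpWebRequest: build the file contents in a memory stream and write them to the request stream, skipping the local staging folder.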




User connecting to SQL Server: SSPI handshake failed with error code 0x8009030c, state 14


SSPI handshake failed with error code 0x8009030c, state 14 while establishing a connection with integrated security; the connection has been closed. Reason: AcceptSecurityContext failed. The Windows error code indicates the cause of failure. The logon attempt failed  [CLIENT: x.x.x.x]

This user does not exist in our company, so I cannot use this solution:

Go to: Local Security Policy -> Security Settings -> Local Policies -> User Rights Assignment -> Access this computer from the network -> add the user

What other alternatives are there to fix this issue?

Obtain data from a device using SSIS package


Hi All,

I want to automate the following task using an ETL package.

Right now we manually connect to a device using telnet: 

Run cmd

telnet some_ip_address port_#

Then we type command:

some_command

Then we get the response in cmd window.

What are my options for automating this process?
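SSIS has no telnet component, so the practical options are a Script Task (or an Execute Process Task invoking a script) that opens the TCP session, sends the command, and captures the reply for the data flow. If the device does not require real telnet option negotiation, a raw socket is enough; a Python sketch (address, port, and command are the placeholders from the question):

```python
import socket

def run_device_command(host: str, port: int, command: str,
                       timeout: float = 5.0) -> str:
    """Send one command over a raw TCP (telnet-style) session and return
    everything the device writes back until it closes or goes idle."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command.encode("ascii") + b"\r\n")
        chunks = []
        try:
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        except socket.timeout:
            pass  # device kept the session open; treat idle as end of reply
    return b"".join(chunks).decode("ascii", errors="replace")

# e.g. run_device_command("some_ip_address", 23, "some_command")
```

The returned text can then be parsed and pushed into a variable or a staging table; if the device actually negotiates telnet options (IAC sequences), a proper telnet client library would be needed instead of this raw-socket sketch.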

Thanks,


Remember to mark as an answer if this post has helped you.

SSIS Packages problem when executing the package via sql agent job step


Hello,

We have several SQL Agent jobs that start SSIS packages (stored in the package store on the same server).
They worked fine for the last few months. This weekend our SQL Server restarted due to Hyper-V problems. After this restart the SSIS packages no longer work when started via SQL Agent jobs. If I start a package via SSMS directly, without a SQL Agent job step, everything works fine.

If I start the job, it runs but it doesn't end. If I log on to the SQL Server I see the process "dtexec" in Task Manager, but it doesn't finish. I also have no log entries, etc. If I then try to stop the SQL Agent job in SSMS, I get an error in the SQL Agent error log: "[136] Job TEST SSIS reported: Unable to terminate process 1544 launched by step 1 of job 0x308211B401FBA641BC9F7762C3EB6868 (reason: Access is denied)"
But this error only appears if I try to stop the job.

What's wrong with the SQL Agent? Two days ago everything worked fine.

Thanks
Markus

SSIS programmatically add Aggregate Transform - Count Distinct


I am working on creating an Aggregate transform with aggregation type "count distinct" programmatically. I am able to create other aggregations like min, max, and count, but when it comes to count distinct I get the error below:

  • The component has detected potential metadata corruption during validation.
    Error at Data Flow Task - Load Count Dist [Aggregate - All [2]]: The "Aggregate - All.Outputs[Aggregate Output 1].Columns[col1]" is missing the required property "CountDistinctScale". The object is required to have the specified custom property.

I am unable to find the "CountDistinctScale" custom property; this custom property doesn't exist for other aggregations and magically appears when count distinct is selected. Is there a method I need to call to create the new custom property?

I understand there are not a lot of people who know how to create packages programmatically; please help me find someone with that knowledge, or suggest how I can get some help.

				IDTSComponentMetaData100 Aggregate = pipeline.ComponentMetaDataCollection.New();
                Aggregate.ComponentClassID = app.PipelineComponentInfos["Aggregate"].CreationName;
                // Get the design-time instance of the aggregate component
                var DesignAggregate = Aggregate.Instantiate();
                DesignAggregate.ProvideComponentProperties();        //design time

                Aggregate.Name = "AggregateComponent";               

                IDTSPath100 AggregatePath = pipeline.PathCollection.New();
                AggregatePath.AttachPathAndPropagateNotifications(pipeline.ComponentMetaDataCollection[Prev_Transform.Transformation_Name].OutputCollection[Prev_Transform.Output_Number], Aggregate.InputCollection[0]);


                // Update the metadata for the aggregate component
                DesignAggregate.AcquireConnections(null);
                DesignAggregate.ReinitializeMetaData();
                DesignAggregate.ReleaseConnections();

               
                // Mark the input columns to aggregate
                IDTSInput100 AggregateInput = Aggregate.InputCollection[0];
                IDTSInputColumnCollection100 AggregateInputColumns = AggregateInput.InputColumnCollection;
                IDTSVirtualInput100 AggregateVirtualInput = AggregateInput.GetVirtualInput();
                IDTSVirtualInputColumnCollection100 AggregateVirtualInputColumns = AggregateVirtualInput.VirtualInputColumnCollection;

                IDTSOutput100 AggregateoutputCollection = Aggregate.OutputCollection[0];

                // Note: input columns should be marked as READONLY
                foreach (IDTSVirtualInputColumn100 vColumn in AggregateVirtualInputColumns)
                {
					int sourceColumnLineageId = AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].LineageID;
					DesignAggregate.SetUsageType(AggregateInput.ID, AggregateVirtualInput, sourceColumnLineageId, DTSUsageType.UT_READONLY);

					// create a new output column
					IDTSOutputColumn100 newOutputColumn = DesignAggregate.InsertOutputColumnAt(AggregateoutputCollection.ID, 0,  vColumn.Name, string.Empty);
					
					// set the data type properties to the same values as those of the input column
					newOutputColumn.SetDataTypeProperties(AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].DataType, AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].Length, 0, 0, AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].CodePage);
					
					newOutputColumn.MappedColumnID = 0;
					for (int i = 0; i < newOutputColumn.CustomPropertyCollection.Count; i++)
					{
						IDTSCustomProperty100 property = newOutputColumn.CustomPropertyCollection[i];
						switch (property.Name)
						{
							case "AggregationColumnId":
								property.Value = sourceColumnLineageId;
								break;
							case "AggregationType":
								property.Value = 3;
								break;
							case "IsBig":
								property.Value = 1;
								break;
							case "AggregationComparisonFlags":
								property.Value = 0;
								break;
						}
					}
                }


Sam

Custom Data Flow Task throwing error when run from command prompt


Hi,

I developed a custom data flow task in .NET 2.0 using Visual Studio 2005. I installed it into the GAC using gacutil and also copied it into the pipeline directory. This task runs absolutely fine when I run it on my local machine, both in BIDS and using a script, in a Windows 2000 environment. However, when I deployed the package to a Windows 2003 server, the package fails at the custom task level. I checked the GAC in the windows\assembly directory and the assembly is present. I also copied the file into the pipeline directory and verified it was the correct pipeline directory by checking the registry. The assembly is still a Debug build. I looked up documentation on MSDN, but there is very little information about the errors I am seeing.

The error I get is pasted below. Can somebody please help me, as I am currently stuck and running out of ideas to fix this problem?

Code: 0xC0047067

Source: DFT Raw File DFT Raw File (DTS.Pipeline)

Description: The "component "_" (2546)" failed to cache the component metadata object and returned error code 0x80131600.

Code: 0xC004706C

Source: DFT Raw File DFT Raw File (DTS.Pipeline)

Description: Component "component "_" (2546)" could not be created and returned error code 0xC0047067. Make sure that the component is registered correctly.


