Channel: SQL Server Integration Services forum

Date Conversion from SQL to a .csv file


We are doing an explicit conversion in our SQL stored procedure:

CONVERT(VARCHAR, [#TempTable_Inventory_History].[ExpirationDate], 112) AS [Lot Expiry Date],

And within the SSIS package, in the Data Flow's OLE DB Source:

EXECUTE [Datapass].[InventoryExport846] ?

WITH RESULT SETS
(
    (
        [Lot Expiry Date] VARCHAR(8), ...
    )
)

The metadata coming out of the OLE DB Source, however, is DT_DBDATE.

Why, when I have explicitly converted to VARCHAR with style 112 (YYYYMMDD) and handled that within the EXECUTE ... WITH RESULT SETS?

The data touch point then extracts and writes the value to the .csv file as 2018-11-30.

I think I even tried a Data Conversion transformation within SSIS to work around this annoyance, and that didn't work.

I just don't understand why.

OK... I resolved it. I erroneously first defined the RESULT SET as DATE, then changed it to VARCHAR(8). The metadata NEVER updated. When I deleted the OLE DB Source and rebuilt it with the RESULT SET as VARCHAR(8), IT WORKED!!!

I guess the metadata is not intelligent enough to refresh itself.

Stupid!
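
For reference, a minimal sketch of the pattern that ended up working. The explicit (8) length on the CONVERT is an addition here for clarity; the actual fix in the post was re-creating the OLE DB Source so its metadata picked up the VARCHAR(8) result set.

-- Stored procedure output: an 8-character YYYYMMDD string
CONVERT(VARCHAR(8), [#TempTable_Inventory_History].[ExpirationDate], 112) AS [Lot Expiry Date],

-- OLE DB Source SQL command: declare the same type so the column metadata becomes DT_STR(8)
EXECUTE [Datapass].[InventoryExport846] ?
WITH RESULT SETS
((
    [Lot Expiry Date] VARCHAR(8), ...
));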


How to run a For Loop for 5 hours in SSIS

Hi, I have a question about SSIS:
how do I run a For Loop for 5 hours from the start of execution, using the For Loop container?
Initial expression: the current time
Evaluation expression: within the next 5 hours
Assign expression: every second

Can you please tell me how to write these expressions for the SSIS For Loop container? (A sketch follows below.)
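
A minimal sketch, assuming a package variable User::StartTime of type DateTime (the variable name is an assumption). The For Loop container's three expressions would then look roughly like this:

InitExpression:   @[User::StartTime] = GETDATE()
EvalExpression:   DATEDIFF("ss", @[User::StartTime], GETDATE()) < 5 * 60 * 60
AssignExpression: (can be left empty; the loop body simply runs repeatedly until the EvalExpression is false)

Nothing here paces the loop to once per second; if that is needed, put a short delay inside the loop body, for example an Execute SQL Task running WAITFOR DELAY '00:00:01'.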

SSIS and Integration Services catalog packages

SSIS - "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object ''. The duplicate key value is .", though there are no duplicate records.


Hi,

I am providing support to one of our clients, where we have jobs scheduled to load data from tables in the source database to the destination database via SSIS packages. The first load is a full load, in which we truncate all the tables in the destination and reload them from the source tables. From the next day onwards we perform an incremental load from source to destination, i.e., only the modified records, fetched using change tracking, are loaded into the destination. After the full load, when we run the incremental load, the job fails on one of the packages with the error "Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object '<tablename>'. The duplicate key value is <1234>.", even though there are no duplicate records. When we debug and run the failing package on its own, it runs successfully. We are not able to figure out why the package fails and then runs successfully the next day. I request your help in this regard.
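
For context, a generic sketch of the change-tracking incremental fetch pattern (the table, key column, and version handling are illustrative, not taken from the client's packages, and assume change tracking is enabled on the source table):

-- Illustrative only: rows changed since the last synchronized version
DECLARE @last_sync_version BIGINT = 0;   -- normally persisted from the previous load

SELECT ct.Id, ct.SYS_CHANGE_OPERATION, s.Name, s.Amount
FROM CHANGETABLE(CHANGES dbo.SourceTable, @last_sync_version) AS ct
LEFT JOIN dbo.SourceTable AS s
       ON s.Id = ct.Id;   -- LEFT JOIN keeps rows that were deleted from the source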

Thank you,

Bala Murali Krishna Medipally.

Non-SysAdmins have been denied permission to run DTS Execution job steps without a proxy account. The step failed.

Accessing data from Amazon Snowflake data warehouse


Hi Friends - I need to read some data from an Amazon Snowflake warehouse and load it into SQL Server. I am planning to implement this using SSIS. Please provide me with the link to install the drivers for it and the steps required to integrate the data into SQL Server.

Regards..

Unable to Deploy Package With Project DQS Connection Manager


I've run into an interesting issue when trying to deploy SSIS packages that leverage project level DQS connection managers.

We're able to deploy SSIS projects to our development environment and run the packages there with no issues. But when trying to deploy the exact same projects to our test environment, the deployment constantly fails at the same point: changing protection level.

After double and triple checking that the package protection level is the same as the project protection level, I decided to try removing the project level DQS connection manager and replacing it with a package level DQS connection manager. Voila! Deployed just fine with no issues.

I'm at a complete loss as to why 1) the same project(s) deploy fine to one server and not another, and 2) a DQS connection manager would have anything to do with the protection level.

I know that when deploying to a catalog, the protection level is changed to ServerStorage, but again, I'm not sure what a connection manager has to do with that.

The only difference I can see between the two servers is the version.

The server which has no deployment issues is at 14.0.3029.16

The server that is having issues is at 14.0.3035.2

Apparently 3035.2 is a security update meant to address a security flaw:

https://support.microsoft.com/en-us/help/4293805/security-update-for-remote-code-execution-vulnerability-in-sql-server

After reading this, I still don't see where this would impact deploying SSIS packages with a DQS connection manager.

We'd be fine with using a project level connection manager, but we have two DQS servers and package level connection managers don't travel well between environments.

If anyone has any insight into this or has seen this, please feel free to comment!

Thanks!


A. M. Robinson

How to merge 2 tables from 2 different databases into 1 table while keeping track of the source?


Hello,

My problem is that I want to add a Source_Id column so that, when merging the 2 source tables in SSIS, I can tell which source each row came from.

Data source: Customer 1, Customer 2

Customer1(Id, Name, Adress)

Customer2(Id, Name, Adress)

Destination: Customer

Customer (Id, Name, Adress, Source_Id)

1001 |  John    |  123, Abc, USA  | 1 

1002 |  sunny  | USA                  | 2
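
For reference, a minimal T-SQL sketch of the intended result (the Database1/Database2 names are assumptions; in an SSIS data flow the same effect comes from adding a Derived Column with a literal Source_Id to each source and then a Union All):

-- Tag each source with a literal Source_Id, then combine into the destination
INSERT INTO dbo.Customer (Id, Name, Adress, Source_Id)
SELECT Id, Name, Adress, 1 AS Source_Id FROM Database1.dbo.Customer1
UNION ALL
SELECT Id, Name, Adress, 2 AS Source_Id FROM Database2.dbo.Customer2;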

My English is not good, very sorry :(

Hope to get help!

Thanks,


Can anyone help with: How to load a large CSV with inconsistent double-quotes into SQL server?


I would like to find a way to import a large (above 1 GB, above 1 million rows) CSV file into SQL server fast.

Ideally the file's column structure would be detected dynamically (but let's say it's a known structure for now).

I have tried:

1) PowerShell Import-Csv with DataTables => too slow; it takes several hours for a 400K-row file.

2) PowerShell dbatools > Import-DbaCsvToSql => I found the correct regexp, but for larger files (wide columns) it is slower than Import-Csv.

3) BCP with both an XML and a non-XML format file. In both cases it fails because I don't know whether a column value will be enclosed in "" or not. Some values are in "", some are not, but missing values are always without "". I don't know if it's possible to use a format file in that case? (One alternative is sketched below.)
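
One more option worth trying, sketched under the assumption of SQL Server 2017 or later and an existing staging table dbo.Staging whose columns match the file (the table name and file path are made up): BULK INSERT with FORMAT = 'CSV' treats the double quote as an optional field quote, so quoted, unquoted, and empty values can coexist without a format file.

BULK INSERT dbo.Staging
FROM 'C:\data\bigfile.csv'
WITH (
    FORMAT = 'CSV',          -- RFC 4180-style parsing (SQL Server 2017+)
    FIELDQUOTE = '"',        -- quotes are honoured where present, optional otherwise
    FIRSTROW = 2,            -- skip the header row
    ROWTERMINATOR = '0x0a',
    TABLOCK                  -- allows a minimally logged, faster load
);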

Help with a BCP Format file that handles double-quoted values which may be blank

Can anyone help with creating a BCP format file for the following file structure?
The first row contains headers enclosed in double quotes.
All text values are enclosed in double quotes, but missing values are not.
Example of the file:


"ID","Name","Colour","LogDate"
41, "Orange", "Orange",2018-09-09 16:41:02.000
42, "Cherry, Banana","Red,Yellow",
43, "Apple",,2017-09-09 16:41:02.000

What I have is this, but it doesn't handle missing values that have no double quotes:

11.0
4
1       SQLCHAR             0       255       "\",\""     1     ID                SQL_Latin1_General_CP1_CI_AS
2       SQLCHAR             0       255       "\",\""     2     Name                 SQL_Latin1_General_CP1_CI_AS
3       SQLCHAR             0       255       "\",\""     3     Colour                  SQL_Latin1_General_CP1_CI_AS
4       SQLCHAR             0       255       "\r\n"     4     LogDate                      SQL_Latin1_General_CP1_CI_AS

Any ideas please?
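
One note first: a format file's terminators are fixed strings, so it cannot express a quote that is sometimes present and sometimes absent (and here some quoted values also contain commas). A workable pattern is to load the file with a CSV-aware method, for example BULK INSERT with FORMAT = 'CSV' on SQL Server 2017+ as sketched in the previous thread, into an all-VARCHAR staging table, and then do a typed insert. A sketch follows; the staging and target table names are assumptions.

-- Hypothetical typed insert from an all-VARCHAR staging table into the real table
INSERT INTO dbo.Fruit (ID, Name, Colour, LogDate)
SELECT
    TRY_CONVERT(INT, ID),
    NULLIF(LTRIM(RTRIM(Name)), ''),
    NULLIF(LTRIM(RTRIM(Colour)), ''),
    TRY_CONVERT(DATETIME2(3), NULLIF(LTRIM(RTRIM(LogDate)), ''))
FROM dbo.FruitStaging;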

Error: Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.


Hi 

I have an SSIS package that decrypts data, and it had been running successfully until yesterday. I have exhausted all my options, and nothing I researched online solved the problem.

I have a table with 5 columns that are encrypted with GPG and stored as varbinary data types. A Script Transformation then decrypts the data through C# into varchar columns. The input data type is DT_IMAGE and the output columns are DT_STR 8000. I am getting the following error. I changed the destination columns to varchar(max) and the package still fails. I changed DT_STR to DT_TEXT with no success at all. I tried a Data Conversion transformation, still no luck. Here are the error messages:

[Decryption MyTableName 1] [83]] Error: Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Decryption MyTableName 1" (83) failed with error code 0x80131600 while processing input "Input 0" (95). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[Source - MyTableName]  [217]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Source - MyTableName returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.




IN~

Export query result to multiple excel sheets using SSIS


Hi,

I want to export data from multiple SELECT statements to different tabs in the same Excel file (each SELECT result on a different tab) using an SSIS package. Can someone help me out with how to do it with an SSIS 2008 package? (One approach is sketched below.)
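
One common approach, sketched here rather than taken from the post: use a single Excel connection manager, pre-create one worksheet per result set with an Execute SQL Task that runs DDL against that Excel connection, and then add one Data Flow per SELECT with an Excel Destination pointed at the matching sheet. The sheet and column names below are assumptions.

CREATE TABLE [Orders] (
    [OrderId]      INT,
    [CustomerName] NVARCHAR(255),
    [OrderDate]    DATETIME
);

Each Excel Destination then picks the corresponding sheet (for example Orders$) in that same connection manager.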

Thanks

Parent/ child package


Hi -

I want to create a master package that would be scheduled to run every 5 minutes by a SQL Agent job. I want that master package to act as a scheduler that runs the child packages at specific times, so the master package can fetch the time at which to execute each child package and then execute it. What's the best way to do this? (One pattern is sketched below.)
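
One pattern, a sketch under assumptions rather than the only way: keep a small schedule table that the master package reads on each 5-minute run with an Execute SQL Task, feed the result to a Foreach Loop, and start each due child with an Execute Package Task (or catalog.start_execution for project-deployment packages). The table and column names below are hypothetical.

-- Hypothetical schedule table driving the master package
CREATE TABLE dbo.ChildPackageSchedule (
    PackageName SYSNAME NOT NULL,
    RunAtTime   TIME(0) NOT NULL,   -- time of day the child should start
    LastRunDate DATE    NULL        -- last calendar day it actually ran
);

-- Children that are due and have not run yet today (queried by the master each cycle)
SELECT PackageName
FROM dbo.ChildPackageSchedule
WHERE CAST(GETDATE() AS TIME(0)) >= RunAtTime
  AND (LastRunDate IS NULL OR LastRunDate < CAST(GETDATE() AS DATE));

The master would update LastRunDate after successfully starting each child.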

Thanks,

How to dynamically fetch cell-range data from an Excel source in SSIS?


Please check this link, guys; I have written up the process there.

https://docs.google.com/document/d/1LwCqFlx4XuJhVuqFKX4ukeehIc5MQ563x35Ffa0vDqQ/edit

SSIS Lookup not matching on equal datetime values when in partial or no cache mode


The Lookup is supposed to detect repeated records so duplicates are filtered out, and it works correctly in full cache mode. For performance I tried changing it to partial cache and to no cache, and the change breaks correctness: the same value in a datetime column fails to match, and repeats escape the filter into the destination table.

1) If I change back to full cache the problem goes away, but it will run out of memory when the data grows to a realistically massive size.

2) The other columns are text and numeric types, and they seem to be fine. As a test, if I stop comparing the datetime column, the Lookup matches as expected, but we really need that datetime column.

I suspect there might be something wrong with collation or something related; I am pretty new to SSIS.

The environment is SQL Server 2016, Visual Studio 2015, and Windows Server 2016, and the matching failure happens while debugging in Visual Studio. More information is available if needed; thanks a lot in advance.
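
If the cause turns out to be fractional-second precision differing between the pipeline datetime and the reference column when the comparison is pushed to the server in partial/no-cache mode (an assumption, not something confirmed here), one workaround sketch is to match on truncated values: use a query like the one below as the Lookup source and truncate the pipeline column the same way in a Derived Column beforehand. Table and column names are illustrative.

-- Illustrative Lookup query: drop fractional seconds before matching
SELECT KeyColumn,
       CONVERT(DATETIME2(0), DateTimeColumn) AS DateTimeColumn
FROM dbo.ReferenceTable;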



"Package execution completed.Click here to switch to design mode, or select stop debugging from debug menu"


Hi,

I am very new to SSIS. Since yesterday I have been facing this issue: "Package execution completed. Click here to switch to design mode, or select Stop Debugging from the Debug menu". I did not use any breakpoints, and I have also tried debugging another way: from the Debug menu I chose the Start Without Debugging option, but the message is still the same.



Visual Studio 2017 can't deploy SSIS project


I have Visual Studio 2017 version 15.4.2, SQL Server 2014, and SQL Server 2016 installed on my personal computer (Windows 10).

After I create an SSIS project I can't deploy it; the Next button is grayed out. What should I do?

Thank you in advance.

How to secure sensitive data for SSIS packages after deployment?


We have a SQL Server 2014 with SSIS. Three software departments of our organization need to deploy their SSIS packages to the same server. Each department is responsible for its own sensitive data and is not allowed to share the passwords (for Oracle and MySQL databases) included in its packages with the other departments, nor with the SQL administrator or SSIS_admin.

How can we manage to protect sensitive data in this scenario?


Oracle 64 Bit Driver Issue with BIDS


Hi All,

I am a SQL DBA, and I am currently facing an issue with one of our SSIS packages.

The requirement is to configure an ODBC connection for an Oracle DB instance and use it within the package. I created an Oracle TNS entry and mapped it to an ODBC connection. The ODBC connection shows up and works fine on its own.

It also comes up in the package, but when trying to run the package, the 32-bit Oracle drivers are being picked up.

TITLE: Connection Manager
------------------------------

Test connection failed because of an error in initializing provider. ERROR [IM003]

Specified driver could not be loaded due to system error 1114: A dynamic link library (DLL) initialization routine failed. (Oracle in OraClient11g_home1, c:\app\oracle\product\11.2.0\client_1\BIN\SQORA32.DLL).

------------------------------
BUTTONS:

OK
------------------------------

Requesting your help; I am sure someone must have faced this kind of issue before.


Balaji.G
