Channel: SQL Server Integration Services forum

Conversion failed when converting the varchar value ' ' to data type int

I have the following table which I have created via SQL:

CREATE TABLE SAMPLETABLE ( 
[SAMPLECOLUMN] varchar(50) 
) 

GO 

I then pull in the values for this sample column, along with others which I have omitted for simplicity, in my query like so:


SELECT null as [SAMPLECOLUMN]
FROM TABLEX


I then try to update it...

update SAMPLETABLE 
set 
SAMPLETABLE.SAMPLECOLUMN = (SELECT CASE WHEN TABLEJ.COLUMNJ IN (...) THEN 'EXAMPLESTRING' END FROM TABLEJ) 
from SAMPLETABLE 
where SAMPLETABLE.SAMPLECOLUMN IS NULL

When I run this update in Visual Studio I get the error "Conversion failed when converting the varchar value 'EXAMPLESTRING' to data type int." The batch of statements used for the update runs perfectly fine in SSMS. It is clear that there is a conflict of data types according to the error, but I don't know why there should be. The 'EXAMPLESTRING' is clearly a string which is what I want and the data type of SAMPLECOLUMN is varchar(50). 

I also get this same error if the value of the column that I am using to update the NULL consists of strings.

What am I doing wrong here?

Need the header of the file change dynamically


I am trying to create 10 flat files from a table in a database. The flat file data (more than 100 columns) will be delimited, with no column names. I am running this with a Foreach Loop, and I need the file name in the header of each file, but I am not able to change the file header dynamically depending on the file name. 

Please let me know if there is any way I can change the file header dynamically.
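
To illustrate what I'm after, this is the kind of header-writing step I have in mind: a Script Task inside the loop, run before the data flow, that writes the file name as the only header line so the Flat File Destination can then append the data rows (header row and overwrite both turned off). This is a rough sketch only; it drops into the standard Script Task template, and the variable name is a placeholder.

// Script Task sketch, run inside the Foreach Loop before the data flow.
// "User::CurrentFilePath" is a placeholder variable holding the current file path.
public void Main()
{
    string filePath = (string)Dts.Variables["User::CurrentFilePath"].Value;

    // Write the file name as the first (header) line of the file.
    System.IO.File.WriteAllText(
        filePath,
        System.IO.Path.GetFileName(filePath) + System.Environment.NewLine);

    Dts.TaskResult = (int)ScriptResults.Success;
}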

Thanks in Advance.


BALUSUSRIHARSHA

Run SSIS package from SQL Agent Job -

Hi All,

I've created an SSIS package in VS2019 with the Azure Feature Pack for 2017 installed. It gets data from an ODBC connection and stores it in Azure Data Lake Storage Gen2 using the Flexible File Destination. It works perfectly from VS.

I deployed it to my local SSISDB and tried to execute it, but it shows the following errors:

I can't deploy it to Azure Data Factory as there are limitations with my data source (Sybase), so the only way to automate my package is to host it on a local server inside SSISDB and then automate it using a SQL Server Agent job.

Waiting for your input.

Best Regards,

Mo



Regards, Mohamed Abdel Ghaffar | http://sharepointfoundation2010.blogspot.com

SSIS Script Component - Get details of an existing connection manager


I have a script component in an SSIS data flow in which I want to gain access to a pre-existing connection manager, but I can't seem to find a way to do this. Is it possible and, if so, how? I'm coding in C#, in case you want to provide any code samples.
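
For illustration, this is roughly what I'm attempting, assuming the connection manager has been added (and named) on the Script Component's Connection Managers page; the connection name and the cast are placeholders/assumptions on my part.

using System.Data.SqlClient;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

// Inside the Script Component's ScriptMain class. "MyConnection" is whatever
// name the connection was given on the component's Connection Managers page.
private SqlConnection sqlConnection;

public override void AcquireConnections(object Transaction)
{
    // The design-time connection manager added to the component.
    IDTSConnectionManager100 connMgr = this.Connections.MyConnection;

    // For an ADO.NET (SqlClient) connection manager this returns a SqlConnection;
    // an OLE DB connection manager would hand back a native OLE DB object instead.
    sqlConnection = (SqlConnection)connMgr.AcquireConnection(Transaction);
}

public override void ReleaseConnections()
{
    this.Connections.MyConnection.ReleaseConnection(sqlConnection);
}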

Thanks in advance.

SSIS job hangs during data flow from SQL Server to Oracle database


Hey guys. I've been wrestling with a problem for the past few months where my SSIS package hangs while loading data from a SQL Server table into an Oracle database. Unfortunately, it's random: sometimes it works, and when it doesn't, it hangs at different points.

This job used to work without this problem. I'm pretty stumped on what the issue is (network? driver? is there a workaround I could implement in the SSIS package?), and would greatly appreciate any suggestions.

Here's what I know:

  • It loads data from two SQL Server tables to two Oracle tables, one with about 350,000 records and the other with 900,000.
  • Connection Manager Provider for the Oracle connection: Native OLE DB\Oracle Provider for OLE DB. Uses a tnsnames.ora that we have for the server info.
  • The random hanging problem occurs when I try to run it while deployed on our Windows server (Server 2008). When I run the job from my own local machine, I do not get the issue.
  • When it hangs, nothing seems to happen. The destination table stops getting filled like it should, and the job continues to display as "Running". Only at 1 AM each morning does the server seem to stop it and mark the job as a failure.
  • When the job hangs, the last message is always: "Execute phase is beginning."
  • The TargetServerVersion is SQL Server 2012.

Please let me know if more information is needed, or what I should look into. Cheers!

SSDT version 15.8.0 broken links


This message is addressed to the SSIS support team. The installer for SSDT version 15.8.0 downloaded from here is broken. It fails to install, and inspecting the installation log shows the following error:

[34F8:6DA0][2020-04-21T02:34:57]i338: Acquiring package: VSTA2017, payload: VSTA2017, download from: https://go.microsoft.com/fwlink/?LinkId=2013980&clcid=0x409

[34F8:6DA0][2020-04-21T02:34:58]e000: Error 0x80070002: Failed attempt to download URL: 'https://go.microsoft.com/fwlink/?LinkId=2013980&clcid=0x409'

and another one:

[0C84:0C8C][2020-04-20T21:47:03]e000: Error 0x80070002: Failed attempt to download URL: 'https://go.microsoft.com/fwlink/?LinkId=2013979' to: 'C:\Users\test\AppData\Local\Temp\{4478FC35-BED8-402E-8E14-07C486FACFE4}\VCRedist2010'
[0C84:0C8C][2020-04-20T21:47:03]e000: Error 0x80070002: Failed to acquire payload from: 'https://go.microsoft.com/fwlink/?LinkId=2013979' to working path: 'C:\Users\test\AppData\Local\Temp\{4478FC35-BED8-402E-8E14-07C486FACFE4}\VCRedist2010'

--------------

We need your assistance.


SSIS Tasks Components Scripts Services | http://wwww.cozyroc.com/

Problem calling a MySQL stored procedure which returns a value


I want to call a stored procedure on a MySQL database from an "Execute SQL Task" and store the returned value in a user variable. However, I'm getting the error:

[Execute SQL Task] Error: Executing the query "CALL proc1(?);" failed with the following error: "OUT or INOUT argument 1 for routine dbname.proc1 is not a variable or NEW pseudo-variable in BEFORE trigger". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

The configuration is:

  • SQLStatement = CALL proc1(?);
  • Parameter Mapping:
    Variable Name = User::procResult
    Direction = Output
    Data Type = String
    Parameter Name = 0
    Parameter Size = -1

The MySQL statement is:

DELIMITER //
CREATE PROCEDURE proc1 (OUT rs VARCHAR(10))
BEGIN
    SELECT "Testing" INTO rs;
END //
DELIMITER ;

If I call a procedure which doesn't have any parameters, it works fine. It's only when I include the parameter that I get the problem.

Thanks.

SSIS Script Component - Add an error output to a transformation


Is there a way to mark an output from a transformation as an error path? I've seen posts which state that the "IsErrorOut" value needs to be changed to "True" and others which state that this option requires the "Uses Disposition" value to be set to "True".

However, both of these settings are greyed-out.
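
From what I can tell, the posts I found seem to be describing custom pipeline components rather than the Script Component, where the error output is created in the component's own design-time code, roughly along these lines (my reading only, not verified; names are placeholders):

using Microsoft.SqlServer.Dts.Pipeline;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;

// Sketch of how a custom synchronous transformation exposes an error output.
// Because this is set in the component's code, the corresponding properties
// are presumably read-only (greyed out) in the designer.
[DtsPipelineComponent(DisplayName = "My Transform", ComponentType = ComponentType.Transform)]
public class MyTransform : PipelineComponent
{
    public override void ProvideComponentProperties()
    {
        base.RemoveAllInputsOutputsAndCustomProperties();

        // UsesDispositions must be set before an error output is meaningful.
        ComponentMetaData.UsesDispositions = true;

        IDTSInput100 input = ComponentMetaData.InputCollection.New();
        input.Name = "Input";

        IDTSOutput100 output = ComponentMetaData.OutputCollection.New();
        output.Name = "Output";
        output.SynchronousInputID = input.ID;

        IDTSOutput100 errorOutput = ComponentMetaData.OutputCollection.New();
        errorOutput.Name = "Error Output";
        errorOutput.IsErrorOut = true;            // marks this output as the error path
        errorOutput.SynchronousInputID = input.ID;
    }
}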

Any ideas?


Use a vector variable as a Parameter in a Ole DB Source filter


Hi,

I am in a situation where I have a vector of integers (an SSIS object variable), and I need to do a SELECT against a really, really huge table (hundreds of millions of rows), but I only need the rows whose foreign key matches an entry in my vector. So what I wanted is to do something like this in the OLE DB Source:

select * from table where columnA in ( ? ) 

And map the '?' to my object-typed vector variable. I tried the above and it does not work. What can I do? Is it possible to use the vector in the OLE DB Source? If not, what approach would help me filter these items?

The other option I thought of was to do a Foreach Loop over my vector and run the SELECT for each item, but my table is huge and I don't want to run the SELECT multiple times.
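
For context, this is the kind of Script Task I was imagining to flatten the vector into a string that an expression on the OLE DB Source command could then use. It assumes the object variable was filled by a Recordset Destination or an Execute SQL Task with a full result set; all variable names are placeholders and this is only a sketch.

using System;
using System.Data;
using System.Data.OleDb;
using System.Text;

// Script Task sketch: read the object variable holding the ADO recordset,
// build "1,2,3,..." and store it in a string variable. An expression such as
//   "select * from table where columnA in (" + @[User::IdList] + ")"
// on the OLE DB Source's SqlCommand (or on a variable used as the source)
// could then consume it.
public void Main()
{
    DataTable ids = new DataTable();
    new OleDbDataAdapter().Fill(ids, Dts.Variables["User::IdVector"].Value);

    StringBuilder list = new StringBuilder();
    foreach (DataRow row in ids.Rows)
    {
        if (list.Length > 0) list.Append(",");
        list.Append(Convert.ToInt64(row[0]));   // integers only, so no quoting needed
    }

    Dts.Variables["User::IdList"].Value = list.ToString();
    Dts.TaskResult = (int)ScriptResults.Success;
}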

Best regards,

Luis

ispac file deployment


What is the best method of deploying an .ispac file: the Deployment Wizard, or the command prompt?

1. Right-clicking the project and selecting Deploy Project (deploying the ETL packages from Visual Studio).

2. Deploying the ETL.ispac file from the command prompt:

  i. Copy the ETL.ispac file to the D:\ETL\ folder.
  ii. Open a command prompt.
  iii. Type cd \Program Files (x86)\Microsoft SQL Server\130\DTS\Binn
  iv. Type ISDeploymentWizard.exe /S /SP:"D:\ETL\ETL.ispac" /DS:"<ServerName>" /DP:"/SSISDB/ETLs/QAIspac"

<ServerName> should be changed to the name of the server where it will be deployed.
"/SSISDB/ETLs/QAIspac" is the SSISDB folder where the project will be deployed.


Neil

connect to integration service using SSMS


I am using SSMS 18.4 to connect to an Integration Services instance of SQL Server 2017.

I got the error below:

Connecting to the Integration Services service on the computer "<ServerName>" failed with the following error: "Class not registered".
This error can occur when you try to connect to a SQL Server 2005 Integration Services service from the current version of the SQL Server tools. Instead, add folders to the service configuration file to let the local Integration Services service manage packages on the SQL Server 2005 instance.

For help, click: http://go.microsoft.com/fwlink/?LinkId=506689
------------------------------
Connecting to the Integration Services service on the computer "<ServerName>" failed with the following error: "Class not registered".

This error can occur when you try to connect to a SQL Server 2005 Integration Services service from the current version of the SQL Server tools. Instead, add folders to the service configuration file to let the local Integration Services service manage packages on the SQL Server 2005 instance.

So apparently the message is not right; the instance is 2017, not 2005. I tried to log in to see the maintenance plan packages.

I know there is the project deployment model for SSIS packages using the SSIS catalog, but we still have a lot of old packages that use file system storage, and the maintenance plans are still stored there.

Microsoft should still keep SSMS backward compatible with this. It is not a good approach that we have to install an old version of SSMS to open a 2017 Integration Services instance.

Thanks

ERROR: Method not found: 'System.String Microsoft.SqlServer.VSTAHosting.IVstaHelper.get_TemplateFilePrefix()'


Hi Guys,

I met a problem when I tried to edit Script Task.

Error Message: 

===================================

Cannot show Visual Studio 2010 Tools for Applications editor. (Microsoft Visual Studio)

===================================

Method not found: 'System.String Microsoft.SqlServer.VSTAHosting.IVstaHelper.get_TemplateFilePrefix()'. (Microsoft.SqlServer.ScriptTask)

------------------------------
Program Location:

   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask.get_ProjectTemplatePath()
   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTaskUI.ShowIDE()

I'm using the SQL Server 2012 edition. I tried to uninstall and re-install a few times, but it didn't work. I've also tried to install the same installation package on a different computer, and it worked just fine.

What can I do in order to fix this? Thanks.

visual studio 2019 can't find Data Conversion component in the ssis toolbox


I just started to use Visual Studio 2019 for SSIS, and I noticed there is no Data Conversion component in the SSIS toolbox. Some other components, like Aggregate, that were in the old version are also not in 2019. Do you know where I can find them, please?

Thanks a lot

tree20035@hotmail.com

Upgrade packages from 2008 to 2016


Hi, I have 1300+ packages. I want to upgrade all of these packages from Visual Studio 2010 to Visual Studio 2016 and SSDT 2016. I went through a few articles and found different mechanisms.

Approach 1 - Upgrade all packages. Is this a good approach? What are the pros and cons of this approach?

Approach 2 - Application.Upgrade() or Application.SaveAndUpdateVersionToXml(). If I go with this, what are the pros and cons?

With the above approaches, what changes will I see in the packages, and what will not change? I went through this article - https://blog.bartekr.net/2017/12/24/upgrading-ssis-projects-part-i/ - and it says approach 2 is good. One question: when I opt for the second approach, how and in what way would the XML (package file) be different? Could someone help me understand this better and advise on the best approach for my task?
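
For reference, this is the kind of loop I have in mind for the second approach. The folder path and the target-version enum value are placeholders/assumptions on my part, so please treat it only as a sketch.

using System;
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;

// Sketch of approach 2: load each .dtsx with the newer runtime (which upgrades
// it in memory) and save it back, updating the stored target version.
class UpgradePackages
{
    static void Main()
    {
        Application app = new Application();

        foreach (string path in Directory.GetFiles(@"D:\Packages", "*.dtsx"))   // placeholder folder
        {
            Package pkg = app.LoadPackage(path, null);   // package is upgraded on load

            // SaveAndUpdateVersionToXml is the method mentioned above; the
            // enum value for 2016 is my assumption.
            app.SaveAndUpdateVersionToXml(path, pkg, DTSTargetServerVersion.SQLServer2016, null);

            Console.WriteLine("Upgraded: " + path);
        }
    }
}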


CDC Not configured properly


Hi All, 

I am trying to configure my first CDC using the docs.microsoft.com example; however, I have some configuration problems.

I would like to know how I could upload the folder or files. These are training materials, so there are no security issues, as they are from the Microsoft documentation example.


Encrypt CSV files in SSIS with AES 256 encryption


Hi Everyone,

I am working on a migration project where we are migrating everything from Hadoop to SQL/SSIS. One of the tasks is creating CSV files and encrypting them with AES.

This is already done in Hadoop with shell scripting, and I am looking for a way to do the same in SQL Server/SSIS.

I need to replicate the following steps in SQL/SSIS; help will be appreciated (a rough sketch of the Script Task I'm imagining follows the shell script below).

Hadoop (shell script) using openssl, as below:

******************************************

#Encrypt files and generate keys
echo Encrypting data file with AES-256 encryption and generating encryption keys
if hadoop fs -cat $PAth/part* | openssl enc -aes-256-cbc -nosalt -pass pass:$passvar  -p -out File_Name_$namevar.csv.enc > aeskey.txt ; then
echo Success
else
echo AES Encryption Failed
exit 1
fi
#Read keys and insert into Control File
echo Reading encryption keyfile
if eval `cat aeskey.txt | tr -d " " ` ; then
echo Success
else
echo failed to read encryption keys
exit 1
fi
#Update Control file with encryption keys and filename
echo inserting keys in control file
if hdfs dfs -cat $Scriptsdir/control.xml | sed "s/__KEY/$key/g" | sed "s/__IV/$iv/g" | sed "s/__FILENAME/File_Name_$namevar.csv.enc/g" > File_Name_$namevar.xml ; then
echo Success
else
echo failed to read/update control file
exit 1
fi
#Encrypt Control File
echo Encrypting Control file with RSA encryption
if openssl rsautl -encrypt -inkey Biztalk_Pubkey -pubin -in Control_File_name_$namevar.xml -out Control_File_Name_$namevar.xml.enc ; then
echo Success
else
echo failed to encrypt control file
exit 1
fi

******************************************
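
And here is the rough Script Task sketch I'm imagining for the AES step. The paths, key, and IV handling are placeholders; note that openssl's "-pass pass:... -nosalt" derives the key and IV with an OpenSSL-specific routine that this sketch does not reproduce, so the keys would have to be generated and stored separately (the same way the shell script captures them into the control file).

using System.IO;
using System.Security.Cryptography;

// Sketch only: AES-256-CBC encryption of a CSV file from an SSIS Script Task.
// key must be 32 bytes and iv 16 bytes.
public static void EncryptCsv(string inputPath, string outputPath, byte[] key, byte[] iv)
{
    using (Aes aes = Aes.Create())
    {
        aes.KeySize = 256;           // AES-256, matching -aes-256-cbc
        aes.Mode = CipherMode.CBC;
        aes.Key = key;
        aes.IV = iv;

        using (FileStream input = File.OpenRead(inputPath))
        using (FileStream output = File.Create(outputPath))
        using (CryptoStream crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            input.CopyTo(crypto);    // stream the plaintext CSV through the encryptor
        }
    }
}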


Shakky

How to archive SQL database tables using SSIS


Hi All,

How can I archive SQL database tables monthly by using SSIS packages?

Conditions for the archive:

1. Need to track which table was archived.

2. How many records were archived.

3. When it happened.

MySQL/ODBC write performance


I am writing records to a MySQL table using an ODBC Destination and the performance is appalling. I've set the batch size to 1000, 10000 and 100000, but I can see that the row counter on the input feed always stops at 9,777-row increments. If I use MySQL as the *source*, performance is great and I don't see any delays or batching.

Has anyone got any ideas how to resolve this? I'm guessing there's a setting in the driver or configuration that needs changing but I haven't found anything obvious.

Failing that, does anyone have any recommendations for writing to MySQL from SSIS in a performant way?

Thanks.

Incorrect column width information in ODBC Source


I'm using an ODBC source to read from a MySQL database and the varchar column widths being reported are 3 times larger than reality:

CREATE TABLE `table_1` (
   `product` varchar(25) NOT NULL,
   ...

The MySQL collation is utf8_general_ci but creating the target columns as nvarchar results in an error.

I can't change the data type properties of the source task as they just get changed back again.

I'm guessing this is something related to Unicode, but how can I get the errors to go away? I can create the target columns three times larger than expected, but as this data will be sent back again, I don't want to get a similar error at the upload stage.
