Channel: SQL Server Integration Services forum
Viewing all 24688 articles

How to configure an SSIS package that uses a parameterized stored procedure as its data source and populates a SQL Server destination table


Hi,

I have to build a package that takes the result set of a parameterized stored procedure (2 input parameters) and pushes it into a SQL Server destination table. Can someone please help me configure such a package?
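One common way to wire this up (the procedure name below is a placeholder): in the data flow, use an OLE DB Source with Data access mode set to "SQL command" and call the procedure with parameter markers:

```sql
EXEC dbo.usp_GetOrders ?, ?
```

On the source's Parameters page, map two package variables to the `?` markers; then connect the source to an OLE DB Destination (or SQL Server Destination) pointed at the target table.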

Thank you.

 


DTS Package running on SQL Server 2008 R2 64 bit - can't find runtime download


The current system is running SQL Server 2008 R2 on a 64-bit machine. I am trying to edit a DTS package that was previously upgraded to work on SQL 2008 (not migrated to Integration Services; the DTS package was just imported). I receive the error "Install the special Web download 'SQL Server 2000 DTS Designer Components' to use this feature (Microsoft.SqlServer.DtsObjectsExplorerUI)".

I looked online for the web download, but the link is dead. I've checked whether backward compatibility is installed on this SQL Server and confirmed that it is.

Anyone know a way I can edit this DTS Package?

Stored procedure apparently not fully completing or committing before next task is executed


I believe I'm seeing this both when running a T-SQL script directly in SQL Server Agent and when executing Execute SQL Tasks in an SSIS package.

I have a stored procedure that does a lot of inserts into Table A, followed by SQL statements that act on those inserted rows in Table A.

The problem, both with a T-SQL step in SQL Server Agent and when running the SSIS package in SQL Server Agent,** is that the task running the stored procedure appears to insert, say, 5,000 rows and then pass control to the next Execute SQL statement when the stored procedure has not actually finished yet and will insert another 4,000 rows before it is done.

I know this because if I run the same T-SQL script in SSMS, the results are correct and as expected.

In any event, any thoughts appreciated.  Thanks.

** The T-SQL step and the SSIS package do the same thing; the code in the SSIS package just uses Execute SQL statements that were lifted out of the T-SQL step.

Linked Server joins mitigation


hi all,

Besides copying a table down from a remote server to the local server in order to join it to another table on the local server, is there a way in SSIS to take the results of a query on a remote server and use them as input to another query on the local server? I'm guessing I'm looking at Execute SQL Tasks or OLE DB sources/destinations with table parameters. Is this feasible?

thx much,
Cos



Custom Task Creation in SSIS


Hi,

I currently built some 40-odd SSIS packages to load data from 40 different files. Now we have a requirement to process another 40 files, but this time the requirement has changed: "Create only one custom component in SSIS which will read the 40 files and load the data into the database, rather than creating one SSIS package per file, so that if any more files come in the future we don't have to create a package for each one; the custom component will take care of the new file."

Please suggest how I should proceed.
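The core of such a generic loader can be sketched independently of SSIS: derive the table layout from each file's header so one piece of code handles any number of files. The sketch below uses Python with SQLite purely for illustration; the assumption that each file is delimited with a header row (and that the staging columns can all be text) is mine, not from the post.

```python
import csv
import sqlite3

def load_file(conn, table, path):
    """Load one delimited file into a staging table, creating the table
    from the file's header row so a single loader handles any file."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        # Column names come from the file itself, so new files need no new code.
        cols = ", ".join(f'"{c}" TEXT' for c in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        marks = ", ".join("?" for _ in header)
        conn.executemany(f'INSERT INTO "{table}" VALUES ({marks})', reader)
    conn.commit()
```

In SSIS terms, the equivalent would be a Foreach Loop over the file folder feeding a Script Task (or a true custom pipeline component) that does the header-driven load, so a new file means dropping it in the folder rather than building a new package.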

SSIS can't export varchar() to nvarchar()


I'm writing a pretty simple SSIS package in SSMS for exporting SQL table data to a plain text file. My source table has the following definition for the column Telephone:

[Telephone] [varchar](max) NULL,

The other columns in the table are NVARCHARs, and the file I'm preparing is also to be written in Unicode. But when the wizard runs the package it created, I get the following error:

Messages
Error 0xc020802f: Data Flow Task 1: The data type for "input column "Telephone" (121)" is DT_TEXT, which is not supported with Unicode files. Use DT_NTEXT instead and convert the data to DT_TEXT using the data conversion component.
 (SQL Server Import and Export Wizard)
 
Error 0xc0202094: Data Flow Task 1: Unable to retrieve column information from the flat file connection manager.
 (SQL Server Import and Export Wizard)
 
Error 0xc004701a: Data Flow Task 1: component "Destination - Participants-J_txt" (85) failed the pre-execute phase and returned error code 0xC0202094.
 (SQL Server Import and Export Wizard)
 

I don't understand what it wants from me. Could you please tell me what to do to create this file without hitting this error?

SSIS flat file connection manager with file encoded in UTF-8 and fixed width


Hi,

I have a flat file connection manager with a file encoded in UTF-8. I've set fixed width on it and entered all the column width information. The file contains data from multilingual sites, so some characters take 1 byte and some take 2 bytes.

I've noticed strange behaviour in the SSIS flat file manager: it seems to count bytes instead of characters, so if it finds a 2-byte character in a row it shifts the result by one position.
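To see the failure mode concretely, here is a minimal sketch (sample values made up) of why byte-based slicing misaligns UTF-8 fixed-width columns, and why slicing after decoding fixes it:

```python
# "ü" occupies 2 bytes in UTF-8, so this 13-character line is 14 bytes long.
raw = "Müller    123".encode("utf-8")

# Slicing the raw bytes by column width (10) loses one character per
# multibyte character, shifting every later column by one position.
# (Worse, a byte slice can land mid-character and fail to decode at all.)
by_bytes = raw[:10].decode("utf-8")      # only 9 characters survive

# Decoding first and slicing by character count keeps the columns aligned.
by_chars = raw.decode("utf-8")[:10]      # exactly 10 characters
```

A workaround users commonly report for this situation is to convert the file to UTF-16/Unicode before the fixed-width parse, so that the parser sees whole characters; treat that as a suggestion to verify rather than a documented fix.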

Could you please help me with this?

conditional split with date


Background: 15 years of Informatica, 2 weeks of SQL Server 2008 R2 - I need to do this for a POC.

I am using

(ISNULL(firstname) ? "$0$" : firstname) != (ISNULL(lkp_firstname) ? "$0$" : lkp_firstname)          (working well, no issues)

How do I write a similar comparison for a date field? Just the format please, no explanation.
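Format only, as requested - one way to write the same null-safe comparison for a date column in the SSIS expression language (column names are placeholders; the sentinel date is arbitrary):

```
(ISNULL(hire_date) ? (DT_DBTIMESTAMP)"1900-01-01" : hire_date)
    != (ISNULL(lkp_hire_date) ? (DT_DBTIMESTAMP)"1900-01-01" : lkp_hire_date)
```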

thx in advance


create user account using API function in SSIS


hi All

I need to call a web service function, AddUser, from SSIS. It creates a user account on a website:

http://www.livedrive.com/ResellersService/ResellerAPI.asmx?op=AddUser

I have a table in SQL Server from which I need to take all the input values needed for this API call; the .asmx link is attached above.

Kindly help me.
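The actual parameter names for AddUser are defined in the WSDL at the link above; the names below (Email, Password) and the service namespace are assumptions for illustration only. A Script Task could build and POST a SOAP 1.1 envelope along these lines for each row read from the table (sketched here in Python for brevity):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
API_NS = "http://www.livedrive.com/"   # assumed; confirm against the WSDL

def build_adduser_envelope(params):
    """Build a SOAP 1.1 request body for the AddUser operation from one
    row's worth of column values (parameter names are assumptions)."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{API_NS}}}AddUser")
    for name, value in params.items():
        ET.SubElement(op, f"{{{API_NS}}}{name}").text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# One envelope per row fetched from the SQL Server table:
xml_body = build_adduser_envelope({"Email": "user@example.com",
                                   "Password": "secret"})
```

The envelope would then be POSTed to the .asmx URL with Content-Type text/xml and the operation's SOAPAction header; in a real package this logic would live in a Script Task fed by an OLE DB source or a recordset variable.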

thanks

With Regards

BI_group

Foreach Loop error on variable


hi

I'm getting the errors below in a Foreach Loop; kindly suggest a solution.

Error: The type of the value being assigned to variable "User::Order_URL" differs from the current variable type.
 Variables may not change type during execution. Variable types are strict, except for variables of type Object.


Error: The type of the value being assigned to variable "User::Account_Info_URL" differs from the current variable type.
Variables may not change type during execution. Variable types are strict, except for variables of type Object.

Excel error 64-bit version of SSIS

I have a 64-bit system and installed SSIS on it.
How do I change the SSIS project to 32-bit? I have this error:

Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.

Error: 0xC00F9304 at Package, Connection manager "Excel Connection Manager": SSIS Error Code DTS_E_OLEDB_EXCEL_NOT_SUPPORTED: The Excel Connection Manager is not supported in the 64-bit version of SSIS, as no OLE DB provider is available.

Error: 0xC020801C at Data Flow Task, Excel Source [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC00F9304. There may be error messages posted before this with more information on why the AcquireConnection method call failed.

Error: 0xC0047017 at Data Flow Task, SSIS.Pipeline: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
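For reference, the usual remedy (assuming the package is being run from BIDS/SSDT) is to force 32-bit execution, where the Excel OLE DB provider is available:

```
Project → Properties → Configuration Properties → Debugging → Run64BitRuntime = False
```

When the package runs under SQL Server Agent instead, the equivalent is the "Use 32 bit runtime" checkbox in the job step's execution options, or invoking the 32-bit dtexec.exe directly.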


Joe

SQL PASS MDS/DQS: 31 January - Understanding Data Quality Services: Knowledge Base, Knowledge Discovery, Domain Management, and Third Party Reference Data Sets

PASS Master Data Services/Data Quality Services Virtual Chapter Meeting January 31st - 9 am PST / 12 pm EST

Register to Attend:  

https://clicktoattend.microsoft.com/en-us/Pages/EventDetails.aspx?EventID=167463

Understanding Data Quality Services: Knowledge Base, Knowledge Discovery, Domain Management, and Third Party Reference Data Sets

With the release of Data Quality Services (DQS), Microsoft innovates its Data Quality and Data Cleansing solutions by approaching them from a knowledge-driven standpoint. In this presentation, Joseph Vertido from Melissa Data will discuss the key concepts behind Knowledge Driven Data Quality and implementing a Data Quality Project, and will demonstrate how to build and improve your Knowledge Base through Domain Management and Knowledge Discovery. What sets DQS apart is its ability to provide access to Third Party Reference Data Sets through the Azure Marketplace. This access to shared knowledge empowers the business user to efficiently cleanse complicated and domain-specific information such as addresses. During this session, examples will be presented on how to access RDS providers and integrate them from the DQS Client.

A Data Quality Analyst at Melissa Data, Joseph Vertido is an expert in the field of data quality. He has worked with numerous clients in understanding their business needs for data quality, analyzing their architecture and environment, and recommending strategic solutions for how to successfully integrate data quality within their infrastructure. He has written several articles and gives frequent webinars implementing different techniques for data quality implementation. His efforts include research in the field of Data Quality for product development. Joseph also manages the Melissa Data MVP Program which is a network that caters to community leaders, speakers, consultants, and other experts.

PASS Master Data Services/Data Quality Services Virtual Chapter Meeting January 31st - 9 am PST / 12 pm EST

PASS Master Data Services/Data Quality Services Virtual Chapter Meeting January 31st - 9 am PST / 12 pm EST

There was an issue with our previous registration system.  Please accept my apologies.  Please register using the link below.

Register to Attend:  mailto://events@passmasterdata.onmicrosoft.com?subject=SQL%20PASS%20MDS-DQS%20VC%202013%2001%2031  Just click send.  
You will receive an email from events@passmasterdata.onmicrosoft.com containing instructions on how to join the conference.  Please add this address to your whitelist so it isn't considered spam.

Our conferences leverage Lync.  If you don't have the Lync Client installed on the computer where you will be attending the conference, please download Lync Attendee from http://www.microsoft.com/en-us/download/details.aspx?id=15755


SSIS 2012 Error - Cannot run on the installed edition of Integration Services.


We’re migrating from SQL Server 2008Pro (64-bit) to SQL Server 2012BI (64-bit). The new installation is on a brand new Windows Server 2008R2 server.

I have an SSIS package that I upgraded from 2008 to 2012 using SSDT (VisualStudio 2010) upgrade wizard. The package runs just fine when I run it on my development box, so I know my upgrade was successful. It’s been deployed to the Integration Services Catalog and a SQL Server Agent job was created to execute it on a schedule. BTW, my development machine is a 64-bit Win7 box.

When execution starts, I get the following error message: DFT - Extract data from HRMS:Error: The OS - Get HRMS Data cannot run on the installed edition of Integration Services. It requires Enterprise Edition (64-bit) or higher.

I get the same error whether I execute the package as a scheduled SQL Server Agent job or if I run it manually (Right-click on job and Execute)

Any help would be much appreciated.

Thanks Ron...

Script Task for connecting and FTP'ing files to another server


Hi,

Following the examples in this topic, I have attempted to get FTP working in my SSIS package, with no luck. If I use the FTP Task in SSIS and set up the connection, it works fine when I execute it. The problem I'm having, like the others, is that the FTP Task does not store the password. So, searching for another method of creating the FTP connection, I came across this thread. First I tried creating a dtsConfig file that stored the information for the FTP connection, but that didn't work. Next, with more success, I used the following code; however, it gives me a 550 access-denied error, which doesn't make sense to me since the FTP Task wizard works with the same parameters. Below is my code:

Public Sub Main()
    Try
        ' Create the connection to the ftp server
        Dim cm As ConnectionManager = Dts.Connections.Add("FTP Connection Manager")

        ' Set the properties like username & password
        cm.Properties("ServerName").SetValue(cm, "mydestinationservernameHere")
        cm.Properties("ServerUserName").SetValue(cm, "mydestinationUsernamehere")
        cm.Properties("ServerPassword").SetValue(cm, "mydestinationuserpasswordhere")
        cm.Properties("ServerPort").SetValue(cm, "21")
        cm.Properties("Timeout").SetValue(cm, "0")      ' the 0 setting will make it not time out
        cm.Properties("ChunkSize").SetValue(cm, "1000") ' 1000 KB
        cm.Properties("Retries").SetValue(cm, "1")

        ' Create the FTP object that sends the files and pass it the connection created above
        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))

        ' Connect to the ftp server
        ftp.Connect()

        ' Build an array of all the file names to be FTP'ed (in this case only one file)
        Dim files(0) As String
        files(0) = "thesourcefilelocation"

        ' FTP the file.
        ' Note: I had a hard time finding the remote path directory. I found it by mistake by
        ' creating both the FTP connection and task in the SSIS package, and it defaulted the
        ' remote path setting in the FTP task.
        ' True makes it overwrite an existing file; False says it is not transferring ASCII.
        ftp.SendFiles(files, "the destination file location for the destination user which is /", True, False)

        ftp.Close()
    Catch ex As Exception
        Dts.TaskResult = Dts.Results.Failure
    End Try
End Sub

I don't know why I'm getting the 550 access denied error. I really need some help with this.

thanks


Transfer Data from SQL2008 to Oracle

Hi,
I have an application on SQL 2008 R2.
There is a form in that application whose data has to be transferred to an Oracle database once the form is completed.

How can I achieve this? Can I use SSIS to transfer the data, and how can I trigger the transfer?

Thanks

Irshard Zahir

config file, is this a bug?


I'm playing around with XML config file and noticed this behavior.

When I have a config property name in my file that doesn't exist in my package, I get this error: "Cannot resolve a package path to an object in the package". For example, if I have a connection object named "DB_Conn1" in my package, and the config file has a property for something called "DB_Conn2", the package errors, saying there is no connection object named "DB_Conn2" in the package.

BUT, it doesn't always behave this way. If I go into the connection manager in my package and put in valid connection information, the package seems to ignore the non-existent property in the config file and just uses the connection info saved in the package.

This seems like inconsistent behavior to me. Does anyone know what I'm talking about?

Column Width When Importing Flat File with SSIS


I had an earlier problem with using BULK INSERT that involved all sorts of permissions issues that were just way too restrictive. I think I understand - vaguely - why those issues are there, but they are causing me all sorts of grief.

The problem was that the Import Wizard assigned a column output width that defaults to 50, even when the data is delimited (in my case with a tab). Fine: I can analyze the current data dump, determine the max width I need for each column, adjust the column widths in the wizard accordingly, and save the details as an SSIS package. But next month the widths may not be the same. That's why I was trying to go with BULK INSERT: I could just define the variable fields as varchar(max), and if the widths fluctuate, no big deal, the format can handle that. But there's no way to put "max" in the field width sections of the wizard - at least, not one that I've found.

So, someone suggested that I use SSIS. OK, that seemed interesting and I see that SSIS is pretty powerful. But, the Flat File Connection Manager still has the same problem as the Import Wizard did; I can't define variable length output column sizes.

How can I tell the Connection Manager that the output columns need to be variable length?

For that matter, since the file is defined as delimited and not fixed length, why do I even need to bother with defining column widths?

Thanx!

Passing a table into a script task


Hello:

I am very new to SSIS and Visual Basic as a whole. Does anyone know of a code snippet I can use in a script task in a data flow to pass all columns through a replace function? I am looking to remove delimiters that exist in our database from the extract data. For example, if "|" existed in an address, it would be removed rather than passed to the flat file, where it would create an extra field. I am reading from a flat file and writing to a flat file.
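A minimal sketch of the logic the script would apply (the delimiter and sample values are placeholders); in the package itself this would be a few lines of VB in a Script component sitting between the flat file source and destination:

```python
def scrub_row(fields, delimiter="|"):
    """Remove the output delimiter from every field so it cannot split a
    column when the row is written back out."""
    return [f.replace(delimiter, "") for f in fields]

# A row whose address field contains a stray pipe:
row = ["123 Main St | Apt 4", "Springfield"]
clean = scrub_row(row)   # pipe removed; field count unchanged
```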

Any help would be appreciated.

Thanks

Michael

Exporting PDF data using SSIS 2008


I am a newbie using SSIS.

I need to export data from a PDF file (312 pages) into a SQL Server 2008 table. Does anyone know how to accomplish this?


