Channel: SQL Server Integration Services forum
Viewing all 24688 articles
Browse latest View live

Execute Process Task quote encloses arguments


Is there any way to stop the SSIS Execute Process Task from enclosing the arguments in quotes?

This is what the script task echoes to me via the Dts.Events.FireInformation method - it is in the format I need.

-File "C:\Development\KP Integration\KP Integration\ADExport.ps1" "C:\Test\ADUsers.csv"

By the time the Execute Process Task consumes it, it quotes the entire expression: there is a " prefix before -File and a trailing quote at the end of the statement, as shown below.

[Execute Process Task] Error: In Executing "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe""-File"C:\Development\KP Integration\KP Integration\ADExport.ps1" "C:\Test\ADUsers.csv"" 

Again, I've verified the quotes are in the right places within the arguments, but the task seems to qualify the entire argument string as one long quoted string rather than taking the value as-is, which causes an exception.
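For clarity, the task is configured with the executable and the arguments as two separate properties (paths as above); the failing command line appears to re-quote both of them:

```
Executable:  C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Arguments:   -File "C:\Development\KP Integration\KP Integration\ADExport.ps1" "C:\Test\ADUsers.csv"
```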

Thanks,

Morgan


Transaction and error management


I have two different problems, both related to transactions. Although the use-cases are basic, I can't find simple solutions.

Before somebody suggests using RetainSameConnection = true with manual BEGIN TRAN / COMMIT / ROLLBACK: this is NOT a solution, because my data flow requires multiple connections and SSIS will NOT honor my RetainSameConnection demand.

[1] Is it possible to exclude a read-only connection manager from transactions?
I ask because I have a connection manager that cannot enlist in DTC -- and doesn't need to. But it makes my package fail.
My current solution is to use two data flows, one for reading (no transaction) and one for writing (inside a transaction), passing the data between them through a raw file. This is ugly and inefficient.

[2] Inside my transactional flow, I would like to log error rows into a special table for further error analysis. This seems a basic requirement to me, but it turns out to be hard to do. The DTC seems to be all-or-nothing, so if I redirect error rows to a DB destination, they are enlisted in the transaction and rolled back after the failure of the data flow.
Possible solutions could be: use another connection manager outside the DTC if possible (same as [1], seems impossible); or exclude a destination task from the DTC (also seems impossible).
The only solution I see: write the errors to a non-transactional destination (i.e. a raw file) and save them after the rollback. Like [1], this is ugly and inefficient.
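Conceptually, what I'm after for [2] would look like this (a sketch only; a whole task can be kept out of the ambient transaction via TransactionOption = NotSupported, but my error rows originate inside the transactional data flow itself):

```
Sequence Container .............. TransactionOption = Required
 |- main Data Flow .............. TransactionOption = Supported     (enlists in the transaction)
 |- error-log Data Flow ......... TransactionOption = NotSupported  (stays outside it)
```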

What is the recommended/right way of doing this?

SSIS Most efficient way to do multiple lookups


Let's say that I have a table of 60,000 patients, that include columns such as Date of Birth, Primary Care Physician, Insurance Type, Collection Status, and Balance. 

If I wanted to do a lookup to find the associated invoice for a particular visit, I would want to first try to match on multiple criteria, but if there wasn't a "perfect" match, I would like to try to match on fewer criteria.  For example, if the 60,000 records try to match on Date of Birth, Primary Care Physician, Insurance Type, and Collection Status, and 45,000 get matched, then 15,000 would go to the No Match output.  So then I would want to try to do the SAME lookup against the SAME reference dataset for the remaining 15,000 on, say, Date of Birth, Insurance Type, and Balance.

The way I have it set up is with multiple lookups, each no-match output leading to the next lookup's input, working downward in terms of relevance. The issue is that it takes very long to run, because the reference source has around 2,000,000 records and each of the lookups seems to cache that table separately, even though it is the same for all of them. How can I "share" that reference table across all of the lookups?
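In T-SQL terms, the tiered matching I'm describing would look roughly like this (a sketch only; the Invoices table and column names are illustrative, not my real schema):

```sql
-- Tier 1: match on four criteria; Tier 2 (fewer criteria) runs only for rows Tier 1 missed.
SELECT p.PatientID,
       COALESCE(m4.InvoiceID, m3.InvoiceID) AS InvoiceID
FROM   Patients p
OUTER APPLY (SELECT TOP (1) i.InvoiceID
             FROM   Invoices i
             WHERE  i.DateOfBirth = p.DateOfBirth
               AND  i.PrimaryCarePhysician = p.PrimaryCarePhysician
               AND  i.InsuranceType = p.InsuranceType
               AND  i.CollectionStatus = p.CollectionStatus) AS m4
OUTER APPLY (SELECT TOP (1) i.InvoiceID
             FROM   Invoices i
             WHERE  m4.InvoiceID IS NULL            -- only the "no match" rows fall through
               AND  i.DateOfBirth = p.DateOfBirth
               AND  i.InsuranceType = p.InsuranceType
               AND  i.Balance = p.Balance) AS m3;
```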

Issue with a Data Flow Task in SSIS


Hi all,

I am using an OLE DB source and an OLE DB destination inside a Data Flow Task, but each time I run the project it returns a different number of rows: sometimes 219 rows, sometimes 90 rows, etc.

I am using the query below; it works fine when I run it directly on the SQL Server.

SELECT Orders.OrderDate, Customers.PostalCode, Orders.EmployeeID, Products.ProductID,
       Orders.ShipVia AS shipperid,
       SUM([Order Details].Quantity * [Order Details].UnitPrice) AS [Total amount],
       SUM([Order Details].Quantity * [Order Details].UnitPrice * [Order Details].Discount) AS Discount,
       SUM([Order Details].Quantity) AS [Unit Sales],
       NorthwindDW.dbo.DimTime.TimeKey
FROM dbo.Orders
     INNER JOIN dbo.[Order Details] ON Orders.OrderID = [Order Details].OrderID
     INNER JOIN dbo.Customers ON Orders.CustomerID = Customers.CustomerID
     INNER JOIN dbo.Products ON [Order Details].ProductID = Products.ProductID
     INNER JOIN NorthwindDW.dbo.DimTime ON Orders.OrderDate = NorthwindDW.dbo.DimTime.OrderDate
GROUP BY Orders.OrderDate, Customers.PostalCode, Products.ProductID, Orders.EmployeeID, Orders.ShipVia,
         NorthwindDW.dbo.DimTime.TimeKey

SQL PASS MDS/DQS: 31 January - Understanding Data Quality Services: Knowledge Base, Knowledge Discovery, Domain Management, and Third Party Reference Data Sets

PASS Master Data Services/Data Quality Services Virtual Chapter Meeting January 31st - 9 am PST / 12 pm EST

Register to Attend:  https://readytogo.microsoft.com/en-us/MPE/Pages/Preview.aspx?CurrentEventID=167463

Understanding Data Quality Services: Knowledge Base, Knowledge Discovery, Domain Management, and Third Party Reference Data Sets

With the release of Data Quality Services (DQS), Microsoft advances its Data Quality and Data Cleansing solutions by approaching them from a knowledge-driven standpoint. In this presentation, Joseph Vertido from Melissa Data will discuss the key concepts behind Knowledge-Driven Data Quality and implementing a Data Quality Project, and will demonstrate how to build and improve your Knowledge Base through Domain Management and Knowledge Discovery. What sets DQS apart is its ability to provide access to Third Party Reference Data Sets through the Azure Marketplace. This access to shared knowledge empowers the business user to efficiently cleanse complicated and domain-specific information such as addresses. During this session, examples will be presented on how to access RDS Providers and integrate them from the DQS Client.

A Data Quality Analyst at Melissa Data, Joseph Vertido is an expert in the field of data quality. He has worked with numerous clients to understand their business needs for data quality, analyze their architecture and environment, and recommend strategic solutions for successfully integrating data quality within their infrastructure. He has written several articles and gives frequent webinars on techniques for data quality implementation. His efforts include research in the field of data quality for product development. Joseph also manages the Melissa Data MVP Program, a network that caters to community leaders, speakers, consultants, and other experts.

For Each Loop error on variable


Hi,

I'm getting the error below in a For Each Loop; kindly suggest a solution.

Error: The type of the value being assigned to variable "User::Order_URL" differs from the current variable type.
 Variables may not change type during execution. Variable types are strict, except for variables of type Object.


Error: The type of the value being assigned to variable "User::Account_Info_URL" differs from the current variable type.
Variables may not change type during execution. Variable types are strict, except for variables of type Object.
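As the message itself notes, variable types are strict except for type Object, so I assume the mapped variables would have to be declared like this (sketch):

```
User::Order_URL          DataType = Object
User::Account_Info_URL   DataType = Object
```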

Visual Studio 2010 SSDT - Integration Services project deployment including packed SQL files


Hello,

I'm developing huge SSIS packages which often have to run SQL Tasks containing a lot of plain SQL statements. For developing the SQL statements I'm using SQL Server Data Tools, which supports debugging, syntax highlighting, and IntelliSense (none of which is supported for SQL Tasks in SSIS). This is very important because the SQL files can be very large. Later I copy the resulting SQL statements into the SQL Task of the SSIS package. Deployment to SQL Server 2012 is performed from Visual Studio.

I'm searching for a solution to reference SQL files in a SQL Task. I know the method of using flat file connections or direct file connections in the SQL Task. This is possible, but then I need a directory on our SQL Server where the productive SQL files are stored (the path can be configured via parameters and variables, that's OK), and development will always be on the productive system (unless I copy the statements out of the SQL files).

The best solution, in my opinion, would be to reference SQL files with relative paths. At build time the SQL files would be packed into the resulting .ispac file. During deployment with the wizard (or in Visual Studio) the SQL files would either be stored on the file system (a predefined directory) or in the SSISDB database. As a result I'd have a complete solution packed into a single .ispac file which contains all parts and can be easily transferred and stored. The development of my SQL statements and the SSIS packages could be done with one tool, SSDT. Versioning could be done with Team Foundation Server because all parts would be in one VS solution. But this functionality doesn't exist in SSDT or MS SQL Server 2012.

The folder structure of an SSIS project contains a folder "Miscellaneous". I've tried storing a SQL file in it, but opening it for editing results in a crash (and restart) of Visual Studio (using Visual Studio 2010 Ultimate with SP1 and Microsoft SQL Server Data Tools installed). This must be a bug, no? At the moment the only possibility for me is to add another project (like C#) to the VS solution to contain the SQL files. Then running, debugging, and editing the SQL files is possible, and I'm using only one IDE and one solution to develop SQL statements and SSIS.

Does anybody have a solution for this scenario? Why doesn't Microsoft implement such functionality? How can I extend the functionality of .ispac files to store other files (like .sql, .txt, and so on)? Is it possible to run SQL statements stored in SSISDB tables from SSIS?

How to change the data types for names in SSIS


Hi All,

I need help.

I extracted data from a China server into an Excel file. This file contains names in the Chinese language, so when I load the data from the Excel file into a table using SSIS, the names display as '?' in the table. Is there any way to resolve this problem using SSIS? I loaded the data from the Excel file to the table with an SSIS package. Is there any need to use a transformation? Please, can anybody give the solution?
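I suspect the '?' means the data is being forced through a non-Unicode type somewhere; if so, I assume the destination column would need to be Unicode, something like this (sketch; the table and column names here are made up):

```sql
-- Chinese characters need a Unicode (NVARCHAR) column, and DT_WSTR in the data flow;
-- loading them into a VARCHAR column silently converts them to '?'.
ALTER TABLE dbo.Employees ALTER COLUMN EmployeeName NVARCHAR(100);
```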

Cheers

LD


Flat File CM doesn't read the columns after consecutive pipes, i.e. a blank column (||)


We are supposed to load a pipe-delimited flat file through an SSIS 2010 task using the Flat File Connection Manager. We faced a weird scenario where a row has a blank column. Below is an example:

EMP_ID|EMP_NM|JOB_TITL|TERRITORY|ADDR|CITY|ST|ZIP|MANGR

123|ABC||X12345|128 Rolling ST|Rochester Hills|MI|59309|MXG

567|XYZ||Y13425|523 Grace Avenue|Greenville|GA|48459|MXG

SSIS loads the columns as usual up to "XYZ", and then all of the remaining columns in the row get populated as blank, except the last column, "MXG".

The underlying database is SQL Server 2010.


Creating a Custom SSIS Task (in 2008R2)


Hi

In this forum

http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/f7ffe0d5-1817-435d-8e63-0b3e80de385f

I was given reference to

http://linchpinpeople.com/2013/01/creating-a-custom-ssis-2012-task-notes-from-my-experience/

It is for SQL 2012

I did all steps for SQL2008R2 in VS 2012 Express

I see these messages:

D:\vs_prj\MyFirstTask>"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\gacutil.exe" -if "C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Tasks\MyFirstTask.dll"
Microsoft (R) .NET Global Assembly Cache Utility.  Version 4.0.30319.17929
Copyright (c) Microsoft Corporation.  All rights reserved.

Assembly successfully added to the cache

D:\vs_prj\MyFirstTask>"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\gacutil.exe" -if "C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Tasks\MyFirstTaskUI.dll"
Microsoft (R) .NET Global Assembly Cache Utility.  Version 4.0.30319.17929
Copyright (c) Microsoft Corporation.  All rights reserved.

Assembly successfully added to the cache

But I see no MyFirstTask item in the BIDS toolbox.

I tried refreshing the toolbox - no success.

1) Should this solution work for SQL 2008 R2?

2) If yes - any ideas why MyFirstTask is not in the toolbox, and how to get it there?

Unable to start execution of step 1 (reason: Error authenticating proxy IAM\xyz, system error: Logon failure: unknown user name or bad password.). The step failed.


Please help me with the permanent solution for the following error.

Unable to start execution of step 1 (reason: Error authenticating proxy IAM\arao, system error: Logon failure: unknown user name or bad password.).  The step failed.

I created an SSIS package and scheduled it to run daily at 7am. Until yesterday it was working fine. But yesterday I changed my Windows password, and this morning I got the above error. So I changed the credential account password (which is used by the proxy account) to the new password, and it started working fine again.

But I know this is not a permanent solution, since Windows passwords change periodically. What is the permanent solution to fix this issue? Please help.

 

Getting a variable value in an Expression


Hi friends,

I have one variable in SSIS.

I need to get its value in an Expression.

I'm using:

"SELECT lead_id,DATE_FORMAT(entry_date,'%d-%m-%Y %h:%m:%s')as entry_date,
DATE_FORMAT(modify_date,'%d-%m-%Y %h:%m:%s') as modify_date,status,user,vendor_lead_code,source_id,list_id,
gmt_offset_now,called_since_last_reset,
phone_code,phone_number,title,first_name,middle_initial,last_name,address1,address2,
address3,city,state,province,postal_code,country_code,gender,
alt_phone,email,security_phrase,
comments,called_count,DATE_FORMAT(last_local_call_time,'%d-%m-%Y %h:%m:%s')as last_local_call_time
,rank,owner,entry_list_id
FROM asterisk.vicidial_list l
where lead_id <= @[User::maxlead_id]  (DT_I1)"

but I did not get values during evaluation.

I know I'm making a mistake with the quotes; kindly help me.
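From what I can tell, the variable has to be concatenated outside the string literal with a cast, something like this (sketch; I'm unsure of the cast length):

```
"SELECT lead_id, ...
 FROM asterisk.vicidial_list l
 WHERE lead_id <= " + (DT_WSTR, 12)@[User::maxlead_id]
```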

SSIS Data Profiling


Hi Guys,

I have used the Data Profiling Task and noticed that numeric and text columns were not captured in the profile output.

Please advise. Thanks.


/R.

Running Parts of a Package Depending on Day of Week


Hello,

I have recently been thrown in at the deep end, with very little SSIS knowledge, to decode an existing SSIS package.

The package currently sits on a schedule that runs 7 days a week, but it produces an output file on only 5 of those days. Looking at the history of the job, it takes, give or take, the same amount of time to run each day. I was wondering if there is something clever you can do in SSIS that says: on a weekday run this block of code, and on a weekend don't.

Any suggestions of where to look would be great.
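Something like a precedence constraint with an expression is what I imagine (sketch; assuming the default setting where DATEPART("dw", ...) returns 1 for Sunday through 7 for Saturday):

```
DATEPART("dw", GETDATE()) >= 2 && DATEPART("dw", GETDATE()) <= 6
```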

Thanks

Andy

Lineage IDs by number instead of name


Hi,

I am moving our SSIS project to another server.

I made a clean install of SQLServer 2012 + SSDTSetup + CozyRoc SSIS+

If I try to open a data flow on the new server, I get "Error loading package: Failed to load data flow objects".

I traced the problem, with a simple test project, to the following:

old server:

name="NewInputLineageIDs">
                  <arrayElements
                    arrayElementCount="14">
                    <arrayElement
                      dataType="System.Int32">#{Package\Data Flow Task\OLE DB Source.Outputs[OLE DB Source Output].Columns[Address]}</arrayElement>

New Server:

    name="NewInputLineageIDs">
                  <arrayElements
                    arrayElementCount="14">
                    <arrayElement
                      dataType="System.Int32">56</arrayElement>

So the new server doesn't use name-based lineage IDs.

But I cannot get the new server to use name-based lineage IDs instead of numbers.

The only difference between the servers (that I can spot) is that the new server has the Visual Studio 2010 Integrated Shell while the old one has the Visual Studio Isolated Shell (and the new server has a couple of newer software releases).

So I'm a little bit lost; any help is appreciated.

Thanks for the response!


what is considered sensitive data in ssis packages?

When you set the ProtectionLevel on the package, what data are you protecting? Passwords and what else?

How do I preserve legacy DTS packages in sql 2005 after uninstall/install

Hi
I have to uninstall SQL Server 2005 Enterprise Edition and then install SQL Server 2005 Standard Edition. I have 74 legacy DTS packages in the instance. I would like to know the best method of importing the packages back after I install Standard Edition.

How to read gmail attachment in SSIS as a Data Flow Source?

Hi,

I have a package in SSIS which does all the cleaning and loads the data into a SQL Server table. But at the moment the input source (a CSV file) is provided manually and saved at a location for the package to pick it up from. I receive this CSV file in our Gmail account as an attachment.

I am looking for a way for SSIS to read the email and fetch the attachment, so that this manual step can be removed.

Do I need to have some kind of plugin installed, or can it be done using built-in options in SSIS?

Any help would be much appreciated.

Thanks

H

Connection times out before drop second table executes


I have an SSIS package which I generated from the Export Data wizard. It has a series of drop table, create table, data flow task, drop table, create table, data flow task items in it, each working on 5 tables: it drops 5 tables, creates those 5 tables, then loads data into them; then it drops 5 different tables, creates them, and loads data into those. The process repeats 29 times until all tables in my database are transferred.

The first drop works fine, the first create works fine, and the first data flow works fine. Somewhere along the way, though, my connection seems to get lost, so the second drop fails -- which doesn't cause the package to fail, but when the create tables step runs, it fails because the tables already exist, and then the package dies. I just found out that the "partner" I am exporting to discovered their edge firewalls are terminating stale connections, and thinks that may be the issue. I'm not sure how my connection is stale when I finished transferring data 2 seconds ago, but I need to find a way around it.

The workaround suggested for my client connection is to add tcpKeepAlive=true. I'm not sure where I could do this in my ADO.NET connection. I set "Default Command Timeout" to 0 to hopefully allow it to stay open, but that didn't work. Is there another setting I can tweak to keep the connection alive while my lengthy data flow runs? Or can/should I add another task of some sort before each drop table task that re-establishes the connection? Advice appreciated!

Thanks in advance!

mpleaf


Package variable set in a script component is sometimes filled in and sometimes not


I know this is an odd question.  Let me explain.

I set my package variable in a script component's PostExecute. I am able to view it using a message box just after I set it in the same script component. The last step in my control flow is a File System Task that uses the variable to rename an output file, with the variable as part of the filename. This all works perfectly fine. Using a breakpoint after my DFT (post-execute), the variable is filled properly.

Here's my problem: I want to use the variable in a Derived Column (in the same DFT where I set it) to set a column in the output file. It is coming out blank. I put a script component just prior to the Derived Column to display the variable (message box), and the variable is blank.

This is very strange, because the variable is used in the last step of the control flow to rename the file, and there it works properly.

What could be happening? Do I not have access to the package variable anywhere after it is set?
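My current understanding of the timing, which would explain the behavior (sketch):

```
DFT pre-execute : variable value is read/locked  -> Derived Column is bound to the OLD value
rows flow       : Derived Column still uses that old value
PostExecute     : script component writes the NEW value
File System Task: runs afterwards, sees the new value -> rename works
```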

Thanks for any and all help.


shash


