Channel: SQL Server Integration Services forum

need help


Here is the scenario.

I have to select one row from the source and then pass that row to 5 data flows (each data flow's destination has a different structure), so that those 5 data flows will write 5 rows into one target file. Next I will pass the second row from the source to those data flows to write another 5 rows into the target file. This will continue for all the rows in my source.

Note: I want to use a query for source like select col1,col2,col3,col4,rw from source where rw= @variable

This variable needs to be incremented to fetch each row one by one. But I am not sure how to do this and achieve my requirement.
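A sketch of one way to shape this (assuming the variable is incremented by a For Loop container and mapped to the ? parameter of the OLE DB Source; the ordering column inside ROW_NUMBER is a placeholder):

SELECT col1, col2, col3, col4, rw
FROM (
    SELECT col1, col2, col3, col4,
           ROW_NUMBER() OVER (ORDER BY col1) AS rw   -- placeholder ordering column
    FROM dbo.source
) AS numbered
WHERE rw = ?;   -- ? mapped to the package variable the loop increments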

Appreciate your help on this.


what version of dtsx is it?


I have some SSIS package dtsx files in 2005 and in 2010 versions. Is it possible to open the metadata inside a dtsx file to tell which version of dtsx it is? I don't think the content will say whether it's for 2010, 2005, or 2008.

 

 

Send Mail Task Email Format SSIS 2012


I have a 2012 SQL Server stored procedure that concatenates rows of data into a single record. Each record is separated by the literal string 'crlf'. When the data is returned to my 2012 SSIS package, the returned data is set to a string variable named 'emailMessage'. The variable is examined in an expression, and all instances of 'crlf' are replaced with line breaks ('\n'). This is done so that each record will be contained on a separate line in the email message.

The problem that I have is that the proc performs some formatting to make sure spacing of the data is correct, but the results render differently between email clients like MS Outlook. If I take the results I get in MS Outlook, and put them in notepad, everything lines up correctly in notepad. In Outlook, the 'columns' of data do not line up correctly.

Is there a way, with the Send Mail Task, that I can control whether or not a message is sent as text or html?

NOTE: I'm only returning records through the Send Mail Task this way because the client will not allow for script tasks, nor the use of the db_sendmail system proc.

the following is my procedure:

--******************************

CREATE PROCEDURE dbo.ReturnNames

@ConcatRecordLocal varchar(8000) OUTPUT

as

set nocount on

--Need to know what the max length of every field, and column name:
declare
 @MaxLenFirstName int
 ,@FirstNameColumnNameLen int
 ,@MaxLenMiddleName int
 ,@MiddleNameColumnNameLen int
 ,@MaxLenLastName int
 ,@LastNameColumnNameLen int


--Set the length of the names of the columns:
set @FirstNameColumnNameLen = 9
set @MiddleNameColumnNameLen = 10
set @LastNameColumnNameLen = 8

--get the max length of the column data:
select
 @MaxLenFirstName = max(len(firstName))
 ,@MaxLenMiddleName = max(len(middleName))
 ,@MaxLenLastName = max(len(lastName))
from
dbo.Names


--if the name of the column is longer than any of the data in it, use the length of the column name:
if @FirstNameColumnNameLen > @MaxLenFirstName
 begin
  select @MaxLenFirstName = @FirstNameColumnNameLen
 end

if @MiddleNameColumnNameLen > @MaxLenMiddleName
 begin
  select @MaxLenMiddleName = @MiddleNameColumnNameLen
 end

if @LastNameColumnNameLen > @MaxLenLastName
 begin
  select @MaxLenLastName = @LastNameColumnNameLen
 end

--use char to make string trimming easier:
declare @ErrorTable TABLE
( NameID int identity(1,1) not null
 ,firstname char(25) not null
 ,middleName char(25) not null
 ,lastname char(25) not null
)

--variables to assist with building a string of records:
declare
 @RecordCount int
 ,@Counter int
 ,@ConcatRecord varchar(7000)
 ,@CRLFIndicator char(4)

set @Counter = 1 --@ErrorTable's identity column starts at 1
set @ConcatRecord = ''
set @CRLFIndicator = 'crlf' -- will use this in SSIS to find and replace with a line break.

insert @ErrorTable
(firstname, middlename, lastName)
select firstname, middlename, lastname from dbo.names

--this drives the number of times the 'while' loop executes:
select @recordCount = count(*) from @ErrorTable

--initialize @ConcatRecord with field names:
select @ConcatRecord = 'firstName ' + 'middleName ' + 'lastname ' + @CRLFIndicator


--build the one record of concatenated rows of data:
while (@Counter<= @recordCount)
 begin

  select @ConcatRecord = @ConcatRecord
   + LEFT(firstname, (@MaxLenFirstName + 1))
   + LEFT(middleName, (@MaxLenMiddleName + 1))
   + LEFT(lastName, (@MaxLenLastName + 1))
  from @ErrorTable
  where NameID = @Counter

  --append the row terminator placeholder:
  select @ConcatRecord = @ConcatRecord + @CRLFIndicator

 set @counter = @counter + 1

 end

--return the result:
select @ConcatRecordLocal = @ConcatRecord

--***********************************************

Thank you for your help.

cdun2


The connection "" is not found. This error is thrown by Connections collection when the specific connection element is not found


I've got a package which reads a text file into a table and updates another.  I set up configurations so that I could import it into the SSIS store on both my dev and live servers.  Now, I'm getting this error.  I tried removing the configs and am still getting it.

I've been through each step and everything looks okay.  Does anyone have any idea (a) what's wrong, (b) how to localise the error or (c) get any additional information?  Or do I just have to recreate the package from scratch?

 

TITLE: Package Validation Error
------------------------------

Package Validation Error

------------------------------
ADDITIONAL INFORMATION:

Error at PartnerLinkFlatFileImporter: The connection "" is not found. This error is thrown by Connections collection when the specific connection element is not found.

Error at PartnerLinkFlatFileImporter [Log provider "SSIS log provider for SQL Server"]: The connection manager "" is not found. A component failed to find the connection manager in the Connections collection.

 (Microsoft.DataTransformationServices.VsIntegration)

------------------------------
BUTTONS:

OK
------------------------------

What's the best way to get a MySQL database into SQL Server 2012 Reporting Services?


I'm trying to get a MySQL database into SQL Server 2012 reporting services as a shared data source, and getting absolutely nowhere after a day.

I've tried a couple of options. First, my specs:

Machine: Workstation running Windows 7 64 bit, SQL server: SQL Server 2012 Enterprise (evaluation), MySQL version: 5.1.68-cll, ODBC version: MySQL ODBC 5.2 driver (both 5.2a and 5.2w)

Ok, now my attempts:

  1. Create an ODBC connection and use this directly as a data source in MS SSRS.

This worked well initially, after a little mucking around I was able to get the ODBC connection working as a data source in the Report Manager interface.

The problem came when I went to use this data source in Report Builder 3.0. I was able to set this as the data source (connection tested fine). However, whenever I came to build a dataset from the data source, I always came up with this error: "ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified".

I tried every solution I could find to fix this, no luck.

Then I came across a post that said using a connection like this in report builder is always problematic and limited, and the best way to do this was to create a linked server then use this as the data source.

So that was method 2:

I have been trying to create a linked server in MS SQL Management Studio to the MySQL database. I've entered all the details according to what info I could find on the web, but it always hangs when I create it. The green circle just sits there spinning, and stays like that till I crash the whole program.

The linked server shows up in the server browser when I restart again, but when I test the connection it freezes management studio again, and the only way out is to crash it.
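(For reference, the kind of definition involved is roughly the following; every name here is a placeholder, and the DSN would be a 64-bit system DSN created with the MySQL ODBC 5.2 driver.)

EXEC master.dbo.sp_addlinkedserver
     @server     = N'MYSQL_LNK',      -- placeholder linked server name
     @srvproduct = N'MySQL',
     @provider   = N'MSDASQL',        -- OLE DB provider for ODBC
     @datasrc    = N'MyMySQLDsn';     -- placeholder system DSN

EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'MYSQL_LNK',
     @useself     = N'FALSE',
     @rmtuser     = N'mysql_user',    -- placeholder credentials
     @rmtpassword = N'mysql_password';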

I would really appreciate some guidance on this, I'm only fairly new to all this and this has just about driven me crazy.


Parallelism in SSIS Data Flow


I'm busy working on performance optimisation in my SSIS packages. I have done quite a few things which have proved very useful indeed. Now I'm at the point where I load my fact table. I have a series of Lookup components which work on data with a count of more than 200 million rows. I was wondering if it is possible to split this data so that the Lookup exercise can be done in parallel. Below are the two options; I'm not sure which one is best, or whether they work at all.

  1. To use a single Data Flow Task (DFT) for my Lookups. In that DFT I will have a single OLEDB source component. The next component in that pipeline will be a Conditional Split Transformation where I will define that rows be split into three separate partitions based on date (months). From this point on, all the transformation components will be similar and they will all load into the same destination table even though using different destination components.
  2. To use three separate Data Flow Tasks. Partitioning of rows will be performed at the source query level. Everything else will be identical except for the WHERE clause in the source queries, where the partitions are specified (see the sketch after this list).
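A rough sketch of what the three source queries in option 2 might look like, assuming the partitioning is by calendar month on a date column (the table, columns, and months below are placeholders, not taken from the actual packages):

-- Data Flow Task 1 source
SELECT FactKey, CustomerKey, Amount, TransactionDate
FROM dbo.FactStaging
WHERE TransactionDate >= '20130101' AND TransactionDate < '20130201';

-- Data Flow Tasks 2 and 3 use the same query with the next ranges,
-- e.g. >= '20130201' AND < '20130301', then >= '20130301' AND < '20130401'.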

I would like the option which makes the most of parallelism. Reading around from different sources I have come across factors like buffers, threads, and execution trees which should be considered under these circumstances. Unfortunately I'm not quite familiar with those things, and for that reason I have decided to post this question for advice.

Many thanks,


Mpumelelo

SSIS Error: DTS Package already exists with different ID in this category


Hi

I am trying to deploy a DTS package on SQL 2000 and am getting this error: "DTS Package 'Package_name' already exists with a different ID in this category".

any solution for this?

Thanks


Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah


SSIS Script Task Error


Hi

I get this error when I execute my SSIS package.

   at System.RuntimeMethodHandle._InvokeMethodFast(IRuntimeMethodInfo method, Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeType typeOwner)  at System.RuntimeMethodHandle.InvokeMethodFast(IRuntimeMethodInfo method, Object target, Object[] arguments, Signature sig, MethodAttributes methodAttributes, RuntimeType typeOwner)   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)   at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTATaskScriptingEngine.ExecuteScript()

The script that I use is this (it works perfectly fine when I manually set the variables [static]):

string _strFtpUserName = (string)Dts.Variables["User::FtpUserName"].Value;
string _strFtpPassword = (string)Dts.Variables["User::FtpPassword"].Value;
string _strFtpServer = (string)Dts.Variables["User::FtpServer"].Value;
string _strFtpPort = (string)Dts.Variables["User::FtpPort"].Value;
string _strFtpRemoteFolder = (string)Dts.Variables["User::RemoteFolder"].Value;
string _strLocalWorkingPath = (string)Dts.Variables["User::LocalWorkingPath"].Value;
string _strRemoteBackup = (string)Dts.Variables["User::RemoteBackup"].Value;

// Build the WinSCP script: open the connection, download the remote folder,
// move it to the remote backup location, then close and exit.
string _strConfigFile = "option batch abort\r\n" +
    "option confirm off\r\n" +
    "open " + _strFtpUserName + ":" + _strFtpPassword + "@" + _strFtpServer + ":" + _strFtpPort + "\r\n" +
    "option transfer automatic\r\n" +
    "get " + _strFtpRemoteFolder + " " + _strLocalWorkingPath + "\r\n" +
    "mv " + _strFtpRemoteFolder + " " + _strRemoteBackup + "\r\n" +
    "close\r\n" +
    "exit\r\n";

// Recreate the script file (File and StreamWriter require "using System.IO;").
if (File.Exists("C:/Program Files (x86)/WinSCP/SSIS-SFTP-Get-Config.txt"))
{
    File.Delete("C:/Program Files (x86)/WinSCP/SSIS-SFTP-Get-Config.txt");
}

StreamWriter file = new StreamWriter("C:/Program Files (x86)/WinSCP/SSIS-SFTP-Get-Config.txt");
file.WriteLine(_strConfigFile);
file.Close();

Dts.TaskResult = (int)ScriptResults.Success;

If anyone could please assist: that is all the information the error message gives me. I have checked around on other forums and I can't seem to find a solution. It is not a rights issue, as I have domain admin rights.


How to Take Multiple Fields In the Same Record and Put Them in Multiple Records in the Same Field

In Excel this would be a Transposition. I have Codes in up to 9 Fields of one record  (MemberID1, Code1 [123], Code2 [456], Code3 [789]; MemberID2, Code1 [432]; MemberID3, Code1 [543], Code2 [798]). {Semicolons are record separators}. This data needs to be loaded in a Table that has MemberID and Code. So the data sets I presented would look like: (MemberID1, Code [123]; MemberID1, Code [456]; MemberID1, Code [789]; MemberID2, Code [432]; MemberID3, Code [543]; MemberID3, Code [798]).  How do I do this in SSIS?
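If the source is a table, one way to do this at the source query level (rather than with SSIS's Unpivot transformation) is a CROSS APPLY over the code columns; a sketch, with the table and column names assumed from the description above:

SELECT s.MemberID, v.Code
FROM dbo.SourceCodes AS s                              -- placeholder source table
CROSS APPLY (VALUES (s.Code1), (s.Code2), (s.Code3)    -- list out to Code9 in practice
            ) AS v(Code)
WHERE v.Code IS NOT NULL;                              -- drop the empty code slots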

MOVE SSIS Package from one server to another automatically on regular Basis


Hi guys

Can you suggest ways to move SSIS packages automatically, on a regular basis, from one server to another?

Thanks


Please Mark As Answer if it is helpful. \\Aim To Inspire Rather to Teach A.Shah



SQL Job using Attunity Teradata connector failing


I have a 2008 SSIS package pulling data from Teradata using the Attunity connector. The packages are developed on my laptop and called through a job from the SQL 2008 box. The following are installed on the Win 2003 Server SP2, 64-bit SQL 2008 box:

1) Shared ICU Libraries For Teradata nt-x8664
2) Teradata GSS Client nt-x8664
3) Teradata cliv2 nt-x8664
4) ODBC Driver for Teradata (64 bit)
6) TPT Operator support library (poslc000_x8664)
7) TPT Load Operator (plodc000_x8664)
9) TPT Update Operator (pupdc000_x8664)
10) TPT Stream Operator (pstmc000_x8664)
11) TPT API (papic000_x8664)

It works fine in dev, but when we moved to UAT I get an error saying:
Executed as user: NA\svc-sqlinst09-clst81. Microsoft (R) SQL Server Execute Package Utility  Version 10.0.2520.0 for 64-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    Started:  4:35:26 PM  Error: 2009-11-09 16:35:28.85     Code: 0xC0014005     Source:       Description: The connection type "MSTERA" specified for connection manager "Teradata Connector 1" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.  End Error  Error: 2009-11-09 16:35:28.85     Code: 0xC0010018     Source:       Description: Error loading value "<DTS:ConnectionManager xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:Property DTS:Name="DelayValidation">0</DTS:Property><DTS:Property DTS:Name="ObjectName">Teradata Connector 1</DTS:Property><DTS:Property DTS:Name="DTSID">{0C558FD4-924C-4D23-841B-3C77F" from node "DTS:ConnectionManager".

Creating connection managers dynamically in SSIS 2012


Hi,

Is it possible to create connection manager objects dynamically at runtime while executing SSIS packages? Please note that I'm not talking about assigning connection strings dynamically to connection managers; I need to create the connection manager objects dynamically at the project level. The reason I want to create the connection managers dynamically is that the number of databases I need to access depends on my execution instance, and an operations table contains the list of all the databases/servers and tells me which task needs which database/server. I might be dealing with, say, 3 data connections or 15 data connections depending on my execution instance. I also have a lot of parallel invocation and hence cannot just create one (or a few) connection managers and keep changing their connection strings. The ideal scenario is that I create as many connections as needed at the beginning, and my SQL tasks can dynamically pick up whichever connection is relevant, with the connection manager name assigned using a variable/expression.

I'm using project deployment mode in SQL 2012.

SSIS, SQL Server Agent, SQL Server Agent Service Account


SSIS2008

Created a package in BIDS 2008 and deployed it to the SQL 2008 MSDB. The package resides in DEV, and its function is to go to PROD and refresh a table from PROD. It works under my login (sysadmin privileges on both machines) in BIDS 2008.

Created a job and ran it; it fails because the SQL Server Agent service account is not in PROD.

DEV's SQL Server Agent Service Account - SQLSvcDev

PROD's SQL Server Agent Service Account - SQLSvcProd

So I went back to the Job Step properties; "Run as:" is a dropdown, and it only contains SQL Server Agent Service Account.

Is there a way to change Run as? If not, any suggestions on how to fix this issue if adding the service account username (SQLSvcDev) is not an option on the PROD box?

Thanks.


Programmatically map large number of output columns for database insert?


I'm parsing XML from each of 8 web service requests. The responses to these 8 requests return a total of 450 columns.

With a small number of columns, the logical way to insert columns into SQL Server would be to use a Script Component, defining the output columns and mapping them to an OLEDB connection or something. I simply can't imagine doing all of this work manually for 450 columns. I'm inclined to just write a C# console script to handle this job. The requirement for the job, however, is to use SSIS.

As such, what is a better approach for SSIS - if SSIS is even the right tool?

Would it be wiser to use a Script Task, executing insert statements on each record as the XML is deserialized?

Reading the MSDN documentation for connecting data flow components dynamically seems like a possible approach, but clunky, isn't it? More like programmatically trying to force the GUI to do what's desired rather than a lean parse/insert process in code.

http://msdn.microsoft.com/en-us/library/ms136086.aspx

Welcoming suggestions.

Thanks!

XML


Hi,

I got a complex XML file and was asked to load the data into multiple tables dynamically.

I used an XML Task and XSLT to generate a new XML file.

Then I used an XML Source and manually generated the XSD by clicking the 'Generate XSD' button; it works fine with the expressions I set up, but only for that particular XML file.

Now I have another XML file with the same elements but different data. I don't want to go into the XML Source and click the Generate XSD button every time. Is there a better way, where the user just needs to upload an XML file, the XSD is generated dynamically (using XSLT), and the data is loaded into the SQL database?

Thanks in advance.

Vinni


SSIS Convert DateTime


Hi

I have col1 (string), col2 (datetime), col3 (string). I want to derive a new column using the 3 columns.

col1=1222222

col2=2012-05-11 00:00:00.000

col3=abcd

I want output like below.

1222222May 11 2012 12:00AMabcd

How do I implement this logic in a Derived Column Transformation?
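For what it's worth, the desired output matches the format T-SQL produces with CONVERT style 100 ("Mon dd yyyy hh:miAM"), so if the conversion can be pushed into the source query instead of a Derived Column, a sketch would be (the table name is a placeholder; col1 and col3 are assumed to already be strings):

SELECT col1 + CONVERT(varchar(20), col2, 100) + col3 AS DerivedValue
FROM dbo.SourceTable;
-- e.g. '1222222' + 'May 11 2012 12:00AM' + 'abcd' = '1222222May 11 2012 12:00AMabcd'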

Thanks

CMK

 

How to truncate three tables before inserting the records into destination tables in SSIS


How to truncate three tables before inserting the records into destination tables in SSIS

I added an Execute SQL Task and hooked it to the Data Flow Task.

I double-clicked on the Execute SQL Task and wrote the line below.

SELECT 'TRUNCATE TABLE ' + tblproduct1 + tblcustomert1 + tblcustomert1 

After debugging the package, I am getting the errors below:

1. Error: Incorrect syntax near ','.

2. 

[Execute SQL Task] Error: Executing the query "TRUNCATE TABLE tblproduct,
TRUNCATE TABLE tblcust..." failed with the following error: "Incorrect syntax near ','.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

3. Task Execute SQL Task failed

And I am importing Excel data into three destination tables.

Actually, without the Execute SQL Task, the package runs successfully. But the actual number of Excel rows is 10224; if I run it several times, records keep getting added to the tables, like 10224 * 5.

So, even if it is run many times, the tables should still show 10224 records.

So now I need to truncate the destination tables before importing the records from Excel into the three destination tables.
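A minimal sketch of what the Execute SQL Task's SQLStatement could contain instead (no SELECT wrapper, and the statements separated by semicolons rather than commas; the table names are taken from the post, with the third one assumed since tblcustomert1 is listed twice):

TRUNCATE TABLE dbo.tblproduct1;
TRUNCATE TABLE dbo.tblcustomert1;
TRUNCATE TABLE dbo.tblcustomert2;   -- assumed name of the third destination table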


Just getting started and needing help


I've completed the SSIS tutorial on Microsoft's web site and pretty much duplicated it inside my first project, as it completely fits. I need to extract from a flat file source, look up student, instructor, test, and date records in DIM tables, and finally write out testScores and aggregates to a few FACT tables. I've created all the connection managers I need and created the flat file source and lookup objects in my data flow task. At this point I'm forced to veer from the tutorial, as I need to run a query against another database to obtain Student and Instructor data when they don't exist in the appropriate DIM tables. Can anyone provide a list of source, transform, and/or destination objects I would need to accomplish the following:

If Student doesn't exist in DIM table Then

Query production database for student bio information

using result set from query above, insert new record into dimStudent

End If

Continue processing flat file.
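For the "insert new record into dimStudent" step, a rough sketch of the query that could sit behind the no-match path (for example in an OLE DB Command); all table and column names here are assumptions, not from the tutorial:

-- Hypothetical insert for a student that was not found in the dimension;
-- the ? parameter would be mapped to the student key from the no-match row.
INSERT INTO dbo.dimStudent (StudentBusinessKey, FirstName, LastName)
SELECT s.StudentID, s.FirstName, s.LastName
FROM ProductionDb.dbo.Students AS s
WHERE s.StudentID = ?;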

Thanks in advance

Script help: sp_addlinkedserver - SQL Server 2008 R2 to Oracle DB (add a linked server to an Oracle DB)


Added the Oracle driver to SQL Server (from the Oracle web site).

The SP failed (note: server and login text have been changed to protect security):

USE [master]
GO
--Create the Oracle Linked Server:  

EXEC sp_addlinkedserver @server, @srvproduct, @provider, @datasrc
EXEC sp_addlinkedserver 'FinderView2' , 'Oracle', 'OraOLEDB.Oracle', ' mr.world:1521/Dens249'
EXEC sp_addlinkedsrvlogin @rmtsrvname, @useself, @rmtuser, @rmtpassword
EXEC sp_addlinkedsrvlogin 'FinderView2', 'FALSE', 'AUserName', 'APassword'      

<result> Msg 137, Level 15, State 2, Line 2
Must declare the scalar variable "@server".
Msg 137, Level 15, State 2, Line 4
Must declare the scalar variable "@rmtsrvname".
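The first and third EXEC lines reference variables (@server, @rmtsrvname, and so on) that are never declared, which is what Msg 137 is complaining about; they appear to be parameter templates left in the script. A sketch of the same calls using named parameters instead (values copied from the script above, credentials still placeholders):

EXEC sp_addlinkedserver
     @server     = N'FinderView2',
     @srvproduct = N'Oracle',
     @provider   = N'OraOLEDB.Oracle',
     @datasrc    = N'mr.world:1521/Dens249';

EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'FinderView2',
     @useself     = N'FALSE',
     @rmtuser     = N'AUserName',
     @rmtpassword = N'APassword';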

Also: the actual view for this password is RMWell_Reg; is that missing somewhere?


Rx

SSIS package under SBS 2008 validation takes a long time

We have a very simple SSIS package that basically generates a Text file from a view.  We are running this under SBS 2008 which is bundled with SQL Server 2005.  When running either inside or outside of Visual Studio the package takes almost 90 seconds to validate.  The actual execution is fractions of a second as we watch it under the debugger.  I have seen several posts related to slowness in the development environment, but this happens running it embedded in a stored procedure using:

 
dtexec /sq packagename

What is curious is that if I export the package and install it on another test machine, it runs in less than a second. The test machine is running Server 2008 R2.

http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/cf39e553-36ab-47db-a1d7-f4587ed89cea/ sounds similar, but I am not sure how to identify the "SQL Agent service account user" to attempt this solution.

We are using a SQL Command in the SSIS package to select one column out of the view.  All very simple.

B. Warner
