Channel: SQL Server Integration Services forum
Viewing all 24688 articles

Script Task to create a .CSV file and save it on SFTP

I have a SQL query stored in a variable. Using this variable, I want to create a .csv file and save it on an SFTP server. I do not want to use any third-party tools. I would really appreciate it if anyone could help me with the code, as I do not know how to code.

SUM and GROUP BY and...another field

I need to do a SUM and GROUP BY, but I also need a descriptive field. Is that even possible?
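For reference, the usual pattern (a sketch, using hypothetical Orders/Customers tables) is either to add the descriptive field to the GROUP BY, or to wrap it in an aggregate such as MAX() when it is functionally dependent on the grouping key:

```sql
-- Sketch: total per customer, keeping a descriptive name column.
SELECT o.CustomerID,
       MAX(c.CustomerName) AS CustomerName,  -- or add c.CustomerName to the GROUP BY
       SUM(o.Amount)       AS TotalAmount
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
GROUP BY o.CustomerID;
```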

Connecting to SFTP through WinSCP using SQL Agent


Hi Everyone,          

I have an SSIS package which connects to an SFTP site and downloads files. The package runs perfectly fine when executed through BIDS or the command prompt.

To connect to SFTP, I am using WinSCP, invoked through a Script Task.

But when I execute the package as a SQL Agent job, it fails to connect to SFTP through WinSCP.

I get the following error saying the server's host key was not found in the cache:

----------------------------------------------------------------------------------------------------

The server's host key was not found in the cache.
You have no guarantee that the server is the computer you think it is. 
The server's dss key fingerprint is:  ssh-dss 1024 c1:70:cb:3b:27:16:10:ed:a5:c8:e5:ba:8f:1b:ec:ad 
If you trust this host, press Yes.
To connect without adding host key to the cache, press No.
To abandon the connection press Cancel. 
Continue connecting and add host key to the cache? (Y)es, (N)o, C(a)ncel, (C)opy Key: Cancel
Host key wasn't verified!
Authentication failed.

Please let me know how this issue can be resolved

Thanks in advance,

Raksha
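For reference, the most common cause of this error is that WinSCP's host-key cache is stored per Windows user, so a key cached in an interactive session is not visible to the account the SQL Agent job runs under. One workaround (a sketch; the server name and paths are hypothetical, the fingerprint is the one from the error message above) is to supply the expected host key explicitly in the WinSCP script so no cache lookup is needed:

```
# winscp.com script sketch: pass the host key on the open command.
open sftp://user:password@sftp.example.com/ -hostkey="ssh-dss 1024 c1:70:cb:3b:27:16:10:ed:a5:c8:e5:ba:8f:1b:ec:ad"
get /remote/path/*.* C:\local\path\
exit
```

Alternatively, log on interactively as the SQL Agent service account once and accept the key, so it lands in that account's cache.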




Flat file import issues with delimited and fixed width: suggestions please


I'm trying to export a wide table (50 columns) from SQL Server to a flat file, and then load that flat file into SQL Server on a different system.

The problems I'm facing are:

1. Most columns hold Unicode data, 2 of which are NTEXT, so I had to use varchar(max) or else I get conversion errors.

So the approach I took was to convert all the Unicode columns to varchar(length) or varchar(max) and extract the data as a tab-delimited file using DTS. But when I try to read the file back with DTS/SSIS to insert it into SQL Server, it is all messed up.

I tried to read it as fixed length (by reading the source definition), but I didn't know how to calculate the row length since it has 2 varchar(max) fields.

How can I properly accomplish this task: from SQL to flat file first, and from flat file back to SQL?

I'd appreciate any help and suggestions.

Foreach Loop Teradata Macro using variable

 

Hi All,

 

I am running a Teradata macro inside a Foreach Loop, and I need to pass it a variable from the loop. I used a Script Task to get the value from the object variable, but the format is:

Example:

"26". The macro does not recognize this; it needs the value to be '26' or 26.

Can anybody tell me how I can achieve this?
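For reference, the surrounding double quotes can be stripped before the value reaches the macro; a sketch, assuming a hypothetical string variable User::MacroParam, using an SSIS expression (e.g. in a Derived Column or as the expression on a second variable):

```
REPLACE(@[User::MacroParam], "\"", "")
```

The same can be done inside the Script Task with `.Trim('"')` on the string pulled from the object variable.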

Thank you in advance!!!

DHowe



Composite key constraint in SSIS


Hi All,

I have created an SSIS package to move data from one table to another in the same database using an OLE DB connection. Now I want to enforce a composite key constraint while moving the data, as follows:

Table A has the following contents:

Col1  Col2  Col3
1     a     b
2     c     d
3     a     b

So, while moving the data, I want to verify the contents of Col2 and Col3 (Col2 + Col3), i.e., a composite key made of Col2 and Col3. In this case I want to move only row 2; the data of rows 1 and 3 should go to the error log file.

I am trying to use a Lookup here, but no luck yet. Can anybody help me achieve this?
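For reference, one way to pre-flag the duplicate composite keys before the data flow (a sketch, assuming the source is dbo.TableA and window functions are available) is to compute a per-key count in the source query and route rows with a Conditional Split:

```sql
-- Sketch: count rows per (Col2, Col3) pair. Rows with DupCount = 1 go to
-- the destination; rows with DupCount > 1 can be redirected to the error
-- log file via a Conditional Split on DupCount.
SELECT Col1, Col2, Col3,
       COUNT(*) OVER (PARTITION BY Col2, Col3) AS DupCount
FROM dbo.TableA;
```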

Thanks in advance,

Sanket 

Create a subfolder with FTP task


Hi

Quick question... In FileZilla, if I create '/toto/tata' from the root, I get a new folder 'toto' and a subfolder 'tata'.

If I load a variable with the value '/toto/tata' and use it to create a new folder from an FTP Task, it doesn't work... Why?

I hope the answer isn't that we first have to create 'toto' and then 'tata'...

No Nightmare please :-)

Passing the same input parameter twice in an Execute SQL Task


Hi all, I want to insert some values into 3 different tables in SQL Server. An Execute SQL Task is used to populate the three tables. Here is the SQL statement:

DECLARE @Dt AS DATE

SET @Dt = ?

INSERT INTO TABLE1 SELECT ?, COL2, COL3 FROM TABLE_A

INSERT INTO TABLE2 SELECT ?, COL2, COL3 FROM TABLE_B

The input parameter is mapped as follows:

Variable name: User::EffectiveDate
Direction: Input
Data type: DATE
Parameter name: 0
Parameter size: -1

User::EffectiveDate is a datetime variable.

When the package is executed, it throws an error.

[Execute SQL Task] Error: Executing the query " " failed with the following error: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

I am not sure what I am doing wrong here. If anyone could point me in the right direction, I would really appreciate it.
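For reference, with an OLE DB connection each `?` placeholder is a separate ordinal parameter, even when several of them should receive the same value, so a statement with three `?` marks needs three parameter mappings (names 0, 1, 2), all pointing at User::EffectiveDate. Alternatively, bind the variable once and reuse the local variable (a sketch of the statement above):

```sql
-- Sketch: map User::EffectiveDate once, to parameter name 0,
-- and reuse the local variable in every INSERT.
DECLARE @Dt AS DATE;
SET @Dt = ?;

INSERT INTO TABLE1 SELECT @Dt, COL2, COL3 FROM TABLE_A;
INSERT INTO TABLE2 SELECT @Dt, COL2, COL3 FROM TABLE_B;
```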

Thanks


shamen



Web service Task - Value pair as variable issue


Firstly... I am not a .NET developer.

We have a scenario in which an Execute SQL Task sends a set of rows with two columns, as an object variable, to a Foreach Loop container. The Foreach Loop container holds the Web Service Task and runs it on each iteration.

The Web Service Task accepts a single variable, but I need to provide the above two columns as a pair to this complex type... any help is appreciated...


Running a package with a configuration file


Hi folks:

I built an SSIS package that adds to and updates a database. In BIDS, I used the connection manager to point the lookups and database access at the test database; all working fine. Now I am trying to redirect the SSIS package to run against the production database without having to recode all the lookups and database accesses. After reading some posts, I set up a config file in BIDS, which BIDS does read when I execute in debug mode, but I cannot get the package to run against the production database no matter what permutation or combination I use. Thoughts?

SSIS 2012 URGENT: trouble passing a variable value to a data flow; it doesn't work the same as in 2008 R2


Hi All,

I am trying to EXEC a Teradata macro, which should work the same as a stored procedure. In the OLE DB source I have set the variable to have the following value:

EXEC OUTCOMES_APP_OWN.GET_RA_MEMBER_MACRO (30000,20130101,20130201);

The variables used in this statement:

var_Plan        30000
var_Start_Date  20130101
var_End_Date    20130201

I have set up the variables prior to this data flow using a Script Task (this piece has been tested).

The problem is that when I run the package, the literal values seen above are what the macro uses, not the values I have set the variables to. In 2008 R2, all I needed to do was have the variable values in the EXEC statement point to the correct variables; is something else needed in 2012?

Thanks for your help in advance!!!


DHowe

Visual Studio 2012 and SSDT 2012


Hello and Happy New Year!

I have been using BIDS for a long time. We just upgraded our Visual Studio to 2012, and I also just downloaded SSDT for 2012. Is there a doc or blog anywhere that explains how to import BIDS 2008 SSIS projects? I don't really know where to get started.

Thanks
Mike


Mike Kiser

In SSIS, how to load one-to-many relational data


Hi,

I have a destination in which we have 9 columns, grouped in threes, i.e., each set of 3 columns is related. Previously the source DB had the same structure, but now the source schema has changed: instead of 9 columns in one row, there are 3 rows of 3 related columns each, in a separate table.

Can someone give me an idea how to map these 9 columns from the new table to the old table?

Below are the schemas of the source & destination tables:

Destination table schema

Source tables schema & bridge table detail (has a 1-to-many relationship with the source table)

DS

Creating a Source with the Script Component - Problem with OutputBuffer


Hello everyone,

I hope some of you can help with this issue. I'm trying to pull users from LDAP into a SQL Server database. I've created an SSIS package that:

  1. Pulls the LDAP users into a package variable called LDAPUsers of type Object (string[]), using a Script Task
  2. Truncates the target table
  3. Inside a data flow, using a Script Component (transformation) as the source, pulls each record from the object variable and adds it to my output (only 1 column). The destination is an OLE DB destination pointing to my SQL Server database.

The issue I'm facing is that for one LDAP group, let's say Group A, where I have around 60,000 users, I only get 57,344 rows inserted into my table, while for Group B, with only 1,004 users, I get 0 rows written.

I did a quick experiment: I changed DefaultBufferMaxRows to 100,000 and executed my package to pull the users from Group A (60,000), and the result was the same as for Group B: 0 rows written. If I set the value back to 10,000, then I get the 57,344 users. I tried downsizing DefaultBufferMaxRows to 1,000 to see if I could get the users from Group B, but it didn't work.

I'm not sure what I am doing wrong; if it weren't working at all, I'd get 0 rows written all the time, not just with Group B.

In my code below you will find a counter, which I placed just to check how many times this line is executed:

EmployeeIdsOutputBuffer.AddRow();

If I use a breakpoint at the end of the method, I can see that i is equal to the total number of users in each group (A or B), so the rows are being added to the EmployeeIdsOutputBuffer output.

using System;
using System.Collections;
using System.Data;
using System.Linq;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    DataTable dataTable = new DataTable();
    string[] ldapUsers;

    public override void PreExecute()
    {
        base.PreExecute();
        dataTable.Columns.Add("employeeid");
        // Copy the object variable (string[]) into a local array.
        ldapUsers = ((IEnumerable)this.Variables.LDAPUsers)
            .Cast<object>()
            .Select(x => x.ToString())
            .ToArray();
    }

    public override void CreateNewOutputRows()
    {
        int i = 0;
        foreach (string ldapUser in ldapUsers)
        {
            DataRow row = dataTable.NewRow();
            row["employeeid"] = ldapUser;
            dataTable.Rows.Add(row);
            i++;
        }

        i = 0;
        foreach (DataRow row in dataTable.Rows)
        {
            EmployeeIdsOutputBuffer.AddRow();
            EmployeeIdsOutputBuffer.employeeid = row["employeeid"].ToString();
            i++;
        }
        Console.WriteLine(i);
    }
}

Thanks.


Ricardo Barona Application Developer


File System Task - using a wildcard in variable


Hi, when using the "delete file" operation in the File System Task, I get an error stating that I have an incorrect path when I use the wildcard symbol. Is there a way around this, or do I have to spell out the whole name of each file in the directory? I cannot use the "delete directory contents" operation because I have two subdirectories in my directory. I get the error when trying to use the following:

\\devbox\df\*.csv

When I spell out the whole name of the file, it works. Any suggestions?


LISA86


Column cannot convert between unicode and non-unicode string data types


Hi,

I have an SSIS package which runs absolutely fine from the development studio, but when I run the same package from a SQL Server Agent job, it throws the following error:

"Column cannot convert between unicode and non-unicode string data types"

Any idea about such behavior?

Using Credentials with OPENROWSET


I am trying to use OPENROWSET to access an Excel file located on a shared drive, in order to import the data. I cannot place the file on the SQL Server machine, so I need to access it via the shared drive. I had assumed a credential would provide the needed access to that file, but I cannot determine the syntax. The statement below gives me the error: "Incorrect syntax near the keyword 'with'. If this statement is a common table expression or an xmlnamespaces clause, the previous statement must be terminated with a semicolon." How can I access the file on the shared drive using OPENROWSET?

INSERT INTO tbl_tempImport 
   SELECT * 
   FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                   'Excel 12.0;Database=//vm/folder/file.xlsx;HDR=YES',
                   'SELECT * FROM [Sheet1$]')
   WITH CREDENTIAL MyCredential

EDIT: Attempt as suggested by RSingh:

I attempted using a proxy. Created a CREDENTIAL named SharedDriveAccess and a proxy named SharedDriveAccess.

The SQL below yields the error:

Msg 102, Level 15, State 1, Line 7
Incorrect syntax near 'SharedDriveAccess'.

INSERT INTO tbl_tempImport 
   SELECT * 
   FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                   'Excel 12.0;Database=//vm/folder/file.xlsx;HDR=YES',
                   'SELECT * FROM [Sheet1$]') EXECUTE AS 'SharedDriveAccess'


EDIT: Attempt as suggested by pituach:

I attempted including the credentials in the OPENROWSET connection string. Error:

OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" returned message "The Microsoft Access database engine cannot open or write to the file '\\dcr-kpm-ms-01\McareWorklistTempFiles\tempImport.xls'. It is already opened exclusively by another user, or you need permission to view and write its data.".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".


INSERT INTO tbl_tempImport 
   SELECT * 
   FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                   'Excel 12.0;Database=//vm/folder/file.xlsx;HDR=YES;UID=user;PWD=pass',
                   'SELECT * FROM [Sheet1$]')
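For reference, ad hoc OPENROWSET has no per-statement credential clause, so neither WITH CREDENTIAL nor EXECUTE AS can be attached to it; the ACE provider opens the file under the account the SQL Server service runs as. The usual approach (a sketch; it assumes that service account is granted read access to the share, and note the backslash UNC path) is:

```sql
-- Sketch: no credential syntax; the share must be readable by the
-- SQL Server service account. Backslashes in the UNC path.
INSERT INTO tbl_tempImport
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=\\vm\folder\file.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
```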


SSIS Fact table load


Hi Experts,

I need your valuable suggestions.

I am loading a fact table which has 16 lookups to get the surrogate keys from the dimensions. The volume is around 3 million rows.

It is taking a lot of time to load the table (around 3-4 hours). Is it because of the large number of lookups in a single package?

Can you please give me some design tips to reduce my load time? Also, I have 6 date fields to look up against the Date dimension. Is there any way to get the connection once and reuse it? I came to know about the cache manager for that. Can anyone please help me understand it?

Thanks

mukejee

script ssis - cdc packages


Hi all,

I have setup a SSIS package that reads CDC tables and uses the CDC-related tasks to transfer changes to a non-SQL Server destination.

This works fine for 1 table, but I need to do the same for approx. 5,000 tables spread over 100 databases on 5 instances.

Does anyone have any experience with scripting this kind of SSIS package?

Thanks


Geert Vanhove DCOD ------ http://geertvanhove.wordpress.com/

WCF and other ways to move data from application database


Hello everybody,

I'm thinking about an efficient way to transport data from our "calculation application" to the DWH and would appreciate some help.
The "calculation application" imports data about financial positions, instruments, ratings, etc. every night in order to evaluate the corresponding risks. The imported data plus the calculated risk estimations are persisted in the application database for 10 days. After each calculation, the risk estimations (facts) and the corresponding dimensions are also loaded into the DWH using SSIS packages, where the data is persisted for a much longer time.
There is also some logic implemented in stored procedures in the application database. An example would be a mapping of positions to the organisational structure based on some business rules. This logic is needed during the calculation, and additionally these stored procedures are called when loading the DWH, because the DWH also needs this information. The mapping result loaded into the DWH is relatively big: around 1 million records per load.
Now comes the change: the application developers want to move the logic from the stored procedures into application code and expose it over .NET WCF. This works fine for them because their whole codebase is .NET, they mostly don't need the complete mapping result but rather a small part of it, and they can build a good in-process cache for such data.
For the DWH I have come up with 2 options:
a) The SSIS packages are rewritten so that they use a WCF service instead of a plain old OLE DB data source.
b) The application pushes the mapping result into the DWH staging area after the imports are done (for instance, with SqlBulkCopy).

Regarding (a): I know it's technically doable, but I have my doubts whether it will be fast and reliable enough (it will definitely be slower than the T-SQL way of doing things). Please share your experience if you have seen such things in production.
Any other ways of accomplishing the same task (moving the data into the DWH) are also very welcome :)
