Channel: SQL Server Integration Services forum
Viewing all 24688 articles

Oracle connection failure


Hi,

I am trying to load an Oracle table into SQL Server.

I created a connection manager for Oracle and Test Connection succeeded, but when I use that connection manager in my Execute SQL Task, it throws this error:

[Oracle Source [1]] Error: The AcquireConnection method call to the connection manager XXXX failed with error code 0x80004005.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.

The connection manager type is MSORA

Can you please help me with this?



Automation for Teradata to SQL Server PDW transformation


Hi all,

We are working on a transformation project where the requirement is to migrate both objects and data from Teradata to SQL Server PDW.

Is it possible to automate some part of the work when transforming from Teradata to SQL Server PDW?

Your suggestions would be very helpful.

Thanks,

Dinesh

 

PDW Destination Adaptor


Hi All,

I've heard a lot about PDW and the destination adapter it provides for SSIS. Does anybody know whether this adapter comes as part of the Parallel Data Warehouse configuration, or is it an independent download? Can someone point me to the right spot?

FTP task runs fine from SSIS Package Store, always times out in scheduled job

  • SQL Server 2012
  • VS 2010 SQL Server data tools
  • FTP Connection Manager
  • port 21, chunk size 1 KB, passive mode = false, saved plain-text password
  • Downloads 1 csv file to local directory on SQL Server box

This always works when run from the Package Store, and always fails with a timeout error when run as a scheduled job.

Message
Microsoft (R) SQL Server Execute Package Utility
Version 11.0.2100.60 for 64-bit

Started:  11:04:59 AM
Error: 2014-03-27 11:05:31.12
   Code: 0xC001602A
   Source: xxx Connection manager "FTP Acme1"
   Description: An error occurred in the requested FTP operation. Detailed error description: The operation timed out
End Error
Error: 2014-03-27 11:05:31.14
   Code: 0xC002F304
   Source: Acme FTP Task
   Description: An error occurred with the following error message: "An error occurred in the requested FTP operation. Detailed error description: The operation timed out".
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started:  11:04:59 AM
Finished: 11:05:31 AM
Elapsed:  31.949 seconds

SQL Server Agent is set to log on as my domain account, and the package history says "logged on as" my account.

Any suggestions?  Thank you

Unpivot table between two dates


Hi everyone

I have a calendar table with two date columns, the start and end dates, which indicate the start and end of an appointment. As you can see below, my holiday starts on 12th July and ends on 24th July 2013.

However, when I use it in a report, it displays the holiday only on the first day of the holiday (12th July).

How can I convert the table so that there is one row for each day between the two dates, as below?
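The conversion being asked for is a date-range expansion: each start/end pair becomes one row per covered day. A minimal sketch of that logic in Python (outside SSIS; the column and value names are assumptions based on the post):

```python
from datetime import date, timedelta

def expand_range(start, end):
    """Yield one date per day from start to end, inclusive."""
    day = start
    while day <= end:
        yield day
        day += timedelta(days=1)

# One appointment row becomes one output row per covered day.
appointment = {"label": "Holiday", "start": date(2013, 7, 12), "end": date(2013, 7, 24)}
rows = [(appointment["label"], d)
        for d in expand_range(appointment["start"], appointment["end"])]
```

In T-SQL the same expansion is usually done by joining the appointment table to a dates/numbers table on `DateValue BETWEEN StartDate AND EndDate`.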

Thanks in advance..


KAdir



How to get flat file size & row count with only a file name mask


Hi All,

Question: How do I get a flat file size and row count with only the file name mask?

Background: Since the FTP server files I want to download have the date and time in the file name (example: "somefile_2014-03-22-12-15-32.csv"), I use a mask like "somefile_2014-03-22-*.csv" to automatically download every file for a given day. (Yes, they occasionally split the total contents into multiple files.) We run the download package every day, but occasionally our processing must wait a day or two, and looping through dates is how we automatically download older files that weren't downloaded on the days they were generated.

Since we don't want to import files out of order, the entire download process must stop when a file row count or file size is below the expected threshold amount.  If the count or size is not up to expectations, I set a "kill switch" variable that prevents further file downloads in the following loop executions.  No problems there.

Since I have the file mask and folder location, I assume: 1) the easiest method is to use a Script Task in my loop, 2) I need only loop through every file that meets the name mask and accumulate the row counts and file sizes, 3) setting variables to these values would allow me to later make decisions based on those values as well as log those values, like whether or not to hit the kill switch.

Easy, right?  Except I don't know how to do this in C#.  I imagine I can figure it out using web resources, but could somebody please tell me if I am on the right track?  I see this post (http://social.msdn.microsoft.com/Forums/sqlserver/en-US/7f82245c-3bbf-4de0-bd82-dc458891002a/how-to-loop-flat-text-files-which-have-different-row-count?forum=sqlintegrationservices) where Reza Raad suggests using a loop and setting a flat file connection manager to the file in each, THEN checking the file line count, etc.  This seems like excessive work for what I need to accomplish.
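For what it's worth, the accumulation described above needs no connection manager at all: loop over the files matching the mask and sum the sizes and line counts directly. A sketch of that logic in Python (in a C# Script Task the equivalents would be along the lines of `Directory.GetFiles` and `File.ReadLines`; the folder and thresholds below are made up):

```python
import glob
import os

def totals_for_mask(folder, mask):
    """Return (total_bytes, total_rows) across every file matching the mask."""
    total_bytes = 0
    total_rows = 0
    for path in glob.glob(os.path.join(folder, mask)):
        total_bytes += os.path.getsize(path)
        with open(path) as f:
            total_rows += sum(1 for _ in f)  # count lines without loading the whole file
    return total_bytes, total_rows

# Hit the "kill switch" when the totals fall below the expected thresholds.
size, rows = totals_for_mask("downloads", "somefile_2014-03-22-*.csv")
kill_switch = size < 1024 or rows < 100  # hypothetical thresholds
```

The two totals map naturally onto two package variables that the later loop executions and logging can read.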

Would appreciate any help!

-Eric B.

Loading multiple .csv files into a table using SSIS


Hi,

I have a requirement where 200+ CSV files must be loaded into a Netezza table using SSIS.

The issue I am facing is that the files have different numbers of columns; for example, file 1 has columns A, B, C and file 2 has columns C, D, E. My target table has all columns from A to E.

But when I use a For Each Loop container, only the file whose filepath+filename I specified in the loop variable gets loaded. No data is loaded from the rest of the files, yet the package executes successfully.
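Separate from the loop issue (the Flat File connection manager's ConnectionString needs an expression driven by the loop variable, or only the design-time file is ever read), the varying-column part can be handled by reading each file with its header and padding every row to the full target column list. A sketch of that alignment in Python, with the A-E column names taken from the post:

```python
import csv

TARGET_COLUMNS = ["A", "B", "C", "D", "E"]

def load_aligned(path):
    """Read one CSV (with a header row) and pad each row to the full target column list."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        # Columns the file doesn't have come back as None; map them to "".
        return [[row.get(col) or "" for col in TARGET_COLUMNS] for row in reader]
```

Every file then yields rows of the same shape, so a single destination mapping works for all 200+ files.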

Any help is appreciated.

Regards,

VT

Which SSIS Data Conversion type is needed for Excel destination when the source SQL text field contains semicolons


Hello,

I am using Visual Studio in a SQL Server 2012 environment for an SSIS package designed to output my sql query to excel.

I am having trouble with the Data Conversion section of my data flow for one field in my source SQL query. (Everything exports to Excel fine when I un-check this field in the Data Conversion box, so I am clearly not handling this field correctly.)

The field contains a compilation of values separated by semicolons. I am trying to use the Unicode string [DT_WSTR] data type (I tried lengths of 50, 100, and 200), but it fails.

Here is an example of this field's contents for 1 record:

BH   and HH Highlight Access; Quality/Risk/UR Staff; MTH and Leadership; Base   Group; Meaningful Use; Meaningful Use - BH; BH and HH Scorecard Access
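One way to rule out length as the cause is to measure the longest value the source column actually holds; the sample record above is well under 200 characters, so DT_WSTR(200) should not truncate it. A throwaway check of that kind (plain Python, not SSIS; the sample list is hypothetical):

```python
# Hypothetical sample of the column's values, pulled from the source query.
values = [
    "BH and HH Highlight Access; Quality/Risk/UR Staff; MTH and Leadership; "
    "Base Group; Meaningful Use; Meaningful Use - BH; BH and HH Scorecard Access",
]

# The DT_WSTR length only needs to cover the longest value in the column.
max_len = max(len(v) for v in values)
```

If the longest value fits comfortably in the lengths already tried, the failure is likely a type mismatch or truncation setting rather than the length itself.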

Any suggestions?

Thanks for your assistance

DaveDVF




Logic to import data from n number of CSV to n number of tables dynamically


I have a C:\source folder which can contain any number of CSV files.

Suppose, I have 2 CSV files in this folder

I need to select first CSV and put all records into first SQL table and then same for 2nd CSV (from 2nd CSV to 2nd table) 

I don't know the number of columns or the column details of the CSVs.

How can we achieve this? Which tasks can we use?

Note: I tried a For Each Loop task with a Data Flow Task inside it, but I am confused about how to configure the destination table.
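Because a Data Flow's column metadata is fixed at design time, the unknown-columns part normally forces a Script Task (or generated SQL) approach: read the header, create the table from it, then insert the rows. A minimal sketch of that idea in Python, using an SQLite connection as a stand-in for the real destination (all names hypothetical):

```python
import csv
import sqlite3

def load_csv_to_new_table(conn, csv_path, table):
    """Create a table whose columns come from the CSV header, then insert all rows."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first line names the columns
        col_defs = ", ".join(f'"{c}" TEXT' for c in header)
        conn.execute(f'CREATE TABLE "{table}" ({col_defs})')
        placeholders = ", ".join("?" for _ in header)
        conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', reader)
    conn.commit()
```

A Foreach Loop would call this once per file, deriving the table name from the file name.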


-Vaibhav Chaudhari

SQL PDW Destination Adapter hanging


Hi,

I have used a SQL PDW Destination Adapter component in my SSIS package to copy data from a SQL OLE DB table to a PDW table. As a sample test, my source contains just 2 rows of data.

My package is stuck at "Execute phase is beginning".

I am able to insert records into the target PDW table through Nexus.

Any idea why I am facing this issue? Please help if I am missing some configuration or setting here.

Thanks.

Database Documentation tool



Hi,

Does anyone have experience using a tool that documents SSIS packages? I know there are a few on the market. Is there something I could use to document a package in detail, focused on extracting complex query logic?

 


Best Regards, Arun http://whynotsql.blogspot.com/

100 packages for 100 separate files in SSIS 2012


Hi,

I am new to this Microsoft Forum.

I have worked for the last 3 years in MSBI (mainly SSIS and SSAS). Recently I joined a new company and was assigned to a new project. Here we first have to build an Operational Data Store from 50 source files (about 100 tables), then pull the data into a data warehouse and later a cube.

My question is: my architect is telling me to create a separate package for each source file from source to staging (one-to-one source-to-staging mapping), about 100 packages connected through a master package. Then, when we pull data from staging to Dev, there would be 30 destination tables, so 30 more packages!

I have not designed any ETL in this way.

The files can be segregated into daily, weekly, monthly, and quarterly, so I would have created a daily package, a monthly package, and so on. Each package would include both the source-to-staging and staging-to-development phases.

His points are that the testing team can work separately on each package, and multiple developers can work on separate files at the same time. Can anyone tell me how feasible this design is, and its advantages/disadvantages? Would it really be good to manage 100-odd packages for one database when I could achieve this with 6 packages?

thanks, Rims

Conditional split editor: How to set allowable data values


Environment: SQL Server 2008 R2
Purpose: Data validation and proper handling. If the loaded data does not match our requirements, redirect those rows to a flat file, while the valid rows go to a staging table for the loading process.
Problem: How do I set the allowable data values in the Conditional Split editor (or with any SSIS transformation) for the following items?

Field#1
"R" , "I" , "II", "III", "IV" ,"5" or "5.0", "5.1" , "5.2",
"5.3" , "5.4" , "5.5" , "5.6" , "5.7", "5.8", "5.9", "6.0", "6.1" ,"6.2" , "6.3" , "6.4"
"6.5", "6.6" , "6.7" , "6.8", "6.9"

Field#2
0,0+,1,1+,2,2+,3,3+,4,4+,5

Field#3 SSN is numeric and length not > 9

Field#4
 [A,B,C,D, Z]

Field#5
1,2,3,4,A,C,D,F,H,M,N,O,X,Z

Below is a picture of my attempt; it failed for some of the fields, and for the rest I didn't know how to address them.

Please help
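In the Conditional Split itself, each of these checks is one Boolean expression per field (for example `[Field4] == "A" || [Field4] == "B" || ...`), chained together with `&&`. The same validation logic, sketched in Python for clarity, with the allowed values taken from the post (field names are assumptions):

```python
import re

# Allowed-value sets, transcribed from the post.
FIELD1_ALLOWED = ({"R", "I", "II", "III", "IV", "5"}
                  | {f"5.{i}" for i in range(10)}
                  | {f"6.{i}" for i in range(10)})
FIELD2_ALLOWED = {"0", "0+", "1", "1+", "2", "2+", "3", "3+", "4", "4+", "5"}
FIELD4_ALLOWED = {"A", "B", "C", "D", "Z"}
FIELD5_ALLOWED = set("ACDFHMNOXZ") | {"1", "2", "3", "4"}

def row_is_valid(row):
    """True when every field passes; invalid rows would be redirected to the flat file."""
    return (
        row["Field1"] in FIELD1_ALLOWED
        and row["Field2"] in FIELD2_ALLOWED
        and re.fullmatch(r"\d{1,9}", row["Field3"]) is not None  # SSN: numeric, length <= 9
        and row["Field4"] in FIELD4_ALLOWED
        and row["Field5"] in FIELD5_ALLOWED
    )
```

In the split, the valid-row condition goes in one output and the default output feeds the flat file destination.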



Disable logging when the log path does not exist


I use this expression to initialize the log connection: "\\\\" + @[User::Server] + "\\" + @[User::Path] + "\\" + @[System::PackageName] + ".log".

In some cases the shared folder may not exist, so I should not generate a log file.

I put \Package.Properties[LoggingMode] = 2 (Disabled) in the Set Values tab of the job properties, intending to disable logging.

But it still checks the log connection and fails. How can I make it not check the log connection when \Package.Properties[LoggingMode] = 2 (Disabled)?
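One possible workaround, independent of the LoggingMode question, is to test at runtime whether the share exists and store the result in a Boolean variable that gates the logging. The path check itself is trivial; sketched here in Python (a C# Script Task would use something like `System.IO.Directory.Exists`), with hypothetical server and share names:

```python
import os

def log_path_available(log_folder):
    """True when the folder the log expression points at actually exists."""
    return os.path.isdir(log_folder)

# Built the same way as the package expression: \\Server\Path
unc = "\\\\" + "SomeServer" + "\\" + "logs"  # hypothetical server and share
logging_enabled = log_path_available(unc)
```

The variable can then drive which branch of the package configures the log connection.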

Lookup transformation from AdventureWorks sample giving error


I am following the AdventureWorks sample on MSDN and am on step 9 of the package-testing phase. I can't seem to fathom what the problem might be: I have mapped the keys to the correct columns, but it is still giving me an error. The error details are:

SSIS package "Lesson 1.dtsx" starting.
Information: 0x4004300A at Extract Sample Currency Data, SSIS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Extract Sample Currency Data, SSIS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Extract Sample Currency Data, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Extract Sample Currency Data, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Extract Sample Currency Data, Extract Sample Currency Data [1]: The processing of file "C:\Users\bishal.TRIANGLE\Desktop\sample adventworks\Integration Services\Tutorial\Creating a Simple ETL Package\Sample Data\SampleCurrencyData.txt" has started.
Information: 0x400490F4 at Extract Sample Currency Data, Lookup Currency Key [26]: component "Lookup Currency Key" (26) has cached 14 rows.
Information: 0x400490F5 at Extract Sample Currency Data, Lookup Currency Key [26]: component "Lookup Currency Key" (26) has cached a total of 14 rows.
Information: 0x402090E2 at Extract Sample Currency Data, Lookup Currency Key [26]: The component "Lookup Currency Key" (26) processed 14 rows in the cache. The processing time was 0.016 seconds. The cache used 728 bytes of memory.
Information: 0x402090E4 at Extract Sample Currency Data, Lookup Date Key [51]: The component "Lookup Date Key" (51) succeeded in preparing the cache. The preparation time was 0.001 seconds.
Information: 0x4004300C at Extract Sample Currency Data, SSIS.Pipeline: Execute phase is beginning.
Information: 0x402090DE at Extract Sample Currency Data, Extract Sample Currency Data [1]: The total number of data rows processed for file "C:\Users\bishal.TRIANGLE\Desktop\sample adventworks\Integration Services\Tutorial\Creating a Simple ETL Package\Sample Data\SampleCurrencyData.txt" is 1097.
Error: 0xC020901E at Extract Sample Currency Data, Lookup Date Key [51]: Row yielded no match during lookup.
Error: 0xC0209029 at Extract Sample Currency Data, Lookup Date Key [51]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "component "Lookup Date Key" (51)" failed because error code 0xC020901E occurred, and the error row disposition on "output"Lookup Match Output" (53)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Extract Sample Currency Data, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Lookup Date Key" (51) failed with error code 0xC0209029 while processing input "Lookup Input" (52). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at Extract Sample Currency Data, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Extract Sample Currency Data, Extract Sample Currency Data [1]: The processing of file "C:\Users\bishal.TRIANGLE\Desktop\sample adventworks\Integration Services\Tutorial\Creating a Simple ETL Package\Sample Data\SampleCurrencyData.txt" has ended.
Information: 0x40209314 at Extract Sample Currency Data, Lookup Date Key [51]: The component "Lookup Date Key" (51) has performed the following operations: processed 1 rows, issued 1 database commands to the reference database, and performed 0 lookups using partial cache.
Information: 0x402090DF at Extract Sample Currency Data, Sample OLE DB Destination [76]: The final commit for the data insertion in "component "Sample OLE DB Destination" (76)" has started.
Information: 0x402090E0 at Extract Sample Currency Data, Sample OLE DB Destination [76]: The final commit for the data insertion  in "component "Sample OLE DB Destination" (76)" has ended.
Information: 0x4004300B at Extract Sample Currency Data, SSIS.Pipeline: "component "Sample OLE DB Destination" (76)" wrote 0 rows.
Information: 0x40043009 at Extract Sample Currency Data, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: Extract Sample Currency Data
Warning: 0x80019002 at Lesson 1: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Lesson 1.dtsx" finished: Failure.
The program '[7056] Lesson 1.dtsx: DTS' has exited with code 0 (0x0).

Any help would be greatly appreciated.


"dtutil": how do I tell packages which configuration file to use after deployment?


Hello All, 

Trying to achieve this feature, 

I am using DOS commands to automate my deployment process, and I was glad to find dtutil. However, it doesn't give you any way to specify which configuration file the SSIS packages should use after deployment.

Deployment Method: file deployment

Configurations used: an XML file and a SQL table. The XML file tells the packages which DB connection to use to look up the configuration table.

Who can share some thoughts on this?

 

 


Derek

Does the Analysis Services Processing Task honor dynamic connections?


Simple question: I have an SSIS package which contains an Analysis Services Processing Task. It went well in the dev environment. Now I've deployed to production, where the connection manager is set by configuration tables; however, it seems that the connection manager in production is still pointing to the dev environment. Any workaround?

thanks


--Currently using Reporting Service 2000; Visual Studio .NET 2003; Visual Source Safe SSIS 2008 SSAS 2008, SVN --

How to use lengthy query in a "SQL Command from Variable"

My Oracle SQL query is more than 14,000 characters long, so how can I use it with "SQL command from variable"?

Sarvan

How to add comments in Data Flow


Hi Experts

Is there any way or option to add some text or comments about each data flow component that I use in my package? For example:

If I use a Lookup transformation, I want to put in a comment like "the purpose of this lookup is...". Please advise.

Regards

Muz

SSIS - Practice Packages


Hi,

I have SQL Server 2012 installed, and I have the AdventureWorks and Northwind databases.

I am looking for some practice T-SQL and SSIS exercise packages for those databases. Please provide links or download paths.

