Channel: SQL Server Integration Services forum

iterate through Folders


Hello friends

Need help

My current path:

C:\Data\Application \2015\abc 2015.csv

I am using a Foreach Loop container with the Foreach File enumerator, and the data for this file loads fine.

Now we have a new file for the current year, stored as:

C:\Data\Application \2016\abc 2016.csv

So under C:\Data\Application we now have:

2015\abc 2015.csv

2016\abc 2016.csv

Now I need to iterate through both folders, read the files, and insert the data into a single table.

Please help with the steps.
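As an illustration only, and outside the Foreach Loop approach described above, a plain T-SQL sketch that loads both yearly files into one table could look like the following; the staging table, column layout, delimiter and exact file paths are assumptions, not details taken from the post.

-- Hypothetical staging table; adjust the columns to match the CSV layout.
CREATE TABLE dbo.AbcStaging
(
    Col1 varchar(100) NULL,
    Col2 varchar(100) NULL,
    Col3 varchar(100) NULL
);

-- One BULK INSERT per yearly folder (paths assumed; FIRSTROW = 2 assumes a header row).
BULK INSERT dbo.AbcStaging
FROM 'C:\Data\Application\2015\abc 2015.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

BULK INSERT dbo.AbcStaging
FROM 'C:\Data\Application\2016\abc 2016.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);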


No Rows From Source


Hi all

My source is Oracle eTRM. I can pull data from all tables and views except a few; the permissions and connection strings are the same for every package.

But for some views and tables I cannot see any data through the OLE DB Source, even though the data actually exists.

I cannot see the data in the OLE DB Source when I hit Preview (it shows only the columns, with empty rows), but the data is there on the Oracle server.


ADKR

Usage of Attunity drivers - Microsoft Connectors v2.0 for Oracle


Dear,

One basic question: I know there are many third-party transformations and other tasks available. However, I was asked not to use any third-party transformations or tasks because of security concerns.

However, the Oracle connectors by Attunity are available for download from the Microsoft site itself, and I understand they are considerably faster than other connectors. So my question is: is it fine to use them in a production environment, and do they create any security issues?

Apologies if this is not a valid question.

Microsoft Connectors v2.0 for Oracle

Thanks!

Sham

SSIS + Slow performance on SCD component on a compressed dimension table with only 150.000 rows


Hi all,

I have a dimension table with only 150,000 rows. I'm using an SCD component which is running far too slowly (at least 15+ minutes). Other dimensions that are a lot bigger run faster. I have no idea why.

The SCD query behind this:

(@P1 uniqueidentifier)SELECT [BK_VISIT], [CAT_VISIT_STATUS], [DTT_VISIT], [TX_NAME], [TX_VISIT_STATUS_NAME],[FK_DATE], [FK_INVOICE_DATE], [FK_POS],[FLG_STATE_CODE],[FLG_STATUS_CODE] FROM [pos].[TD_VISIT] WHERE ([BK_VISIT]=@P1)


My only thought so far is that the uniqueidentifier key is slowing this down. Any experience with this?

EDIT : apologies, can someone move this to the SSIS section?
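Since the SCD component fires that parameterized lookup once for every incoming row, a missing index on BK_VISIT would force a scan per row, which fits the symptoms. A minimal sketch of the kind of index that might help, assuming BK_VISIT is not already indexed:

-- Assumes BK_VISIT is not already the clustered key or covered by another index.
CREATE NONCLUSTERED INDEX IX_TD_VISIT_BK_VISIT
    ON pos.TD_VISIT (BK_VISIT);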

Data flow task is hanging while loading data from Oracle source to SQL Server destination


Hi,

We designed a staging SSIS solution in which 50 child packages run through a master package.

The problem is that in some child packages, while loading data from an Oracle source table to a SQL Server destination table in a data flow task, the remaining data stops inserting. For example, if there are around 300,000 rows in the source Oracle table, then after roughly 100,000 rows have been inserted into the SQL Server destination table, the row count shown in the data pipeline stops increasing (no more data is inserted), the package stays in executing mode, and no error appears in the progress log. We did not have this problem earlier; it now occurs randomly (it may happen in any package and we cannot predict which).

Note 1: after the data flow task hangs, if we stop execution manually and start the master package again, data inserts fine in the affected child package for some time (the hanging package starts again because of our package design). Most of the time I observe that data initially inserts fine and then, after some time, rows stop inserting (the problem occurs in the middle of execution, not at the start).

Note 2: Task Manager shows that RAM is not full; physical memory is at 56% and CPU usage at 17%. We have also configured some properties (DefaultBufferMaxRows = 40,000 and DefaultBufferSize = 8 GB, with 16 GB of RAM on the server, as suggested online), set Rows per batch to 10,000 in some packages that load large volumes of data, and unchecked Table lock in the OLE DB destination in all packages.

What is the solution for this?
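One hedged way to see whether the destination insert is simply blocked (rather than the Oracle source hanging) is to query the DMVs on the destination SQL Server while the data flow appears stuck; this is only a diagnostic sketch, not a fix:

-- Run on the destination SQL Server while the data flow appears stuck;
-- a non-zero blocking_session_id points at the session holding the lock.
SELECT  r.session_id,
        r.status,
        r.command,
        r.wait_type,
        r.wait_time,
        r.blocking_session_id
FROM    sys.dm_exec_requests AS r
WHERE   r.session_id <> @@SPID;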


tsrkreddy


SSIS


SSIS flat file issue.

How do I skip the header records?

Flat file:

#Web log for Wise Owl site for first 5 hours of 23rd April 2013
#Confidential - not for redistribution in any form
#Fields: date time cs-uri-stem cs-uri-query s-port cs-username c-ip cs-version cs(User-Agent) cs(Cookie) cs(Referer) cs-host sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken 

The flat file source should pick up the columns as:

date | time | cs-uri-stem | cs-uri-query | s-port | cs-username | c-ip | cs-version | cs(User-Agent) | cs(Cookie) | cs(Referer) | cs-host | sc-status | sc-substatus | sc-win32-status | sc-bytes | cs-bytes | time-taken
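In the Flat File connection manager this is usually handled with the Header rows to skip setting (here, 3) plus manually defined columns. Purely as an illustration of the same idea in plain T-SQL, with the table name and log path assumed, skipping the three '#' lines looks like this:

-- Hypothetical target table and file path; FIRSTROW = 4 skips the three '#' header lines.
BULK INSERT dbo.WebLog
FROM 'C:\Logs\wiseowl_20130423.log'
WITH (FIELDTERMINATOR = ' ', ROWTERMINATOR = '\n', FIRSTROW = 4);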

Code: 0xC0047038 SSIS Error Code DTS_E_PRIMEOUTPUTFAILED

Hi All,

My environment is SQL Server 2008 R2. I get the following message when I try to schedule a package via SQL Server Agent. The same package, when run from another MS SQL Server machine, runs just fine. I've tried changing the DFT's BufferTempStoragePath property, but no luck.

Warning message before the error message:

DFT STG_MOB_FACT, Not enough storage is available to complete this operation.

Error:

Executed as user: MZERO\dw_service_acct. Microsoft (R) SQL Server Execute Package Utility Version 10.50.1600.1 for 64-bit Copyright (C) Microsoft Corporation 2010. All rights reserved.

Started: 4:30:23 PM

Error: 2010-12-07 16:30:25.36 Code: 0xC0047038 Source: DFT STG_MOB_FACT SSIS.Pipeline Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE Source STG_MOB_FACT" (1) returned error code 0x8007000E. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error

DTExec: The package execution returned DTSER_FAILURE (1).

Started: 4:30:23 PM Finished: 4:30:25 PM Elapsed: 1.389 seconds.

The package execution failed. The step failed.

Please help!!!
GBM

Oracle Password not working after 24hrs


As the title says, my password for any Oracle connection seems to 'expire' after 24hrs. This is using either the OLE DB provider from Oracle, or the Attunity provider.

For both providers, I can change the password in Oracle SQL Developer and then set up a connection in my SSIS project: test the connection successfully, execute a few tasks, add some additional tasks, etc., and it's all good.

I lock my computer, come back the next morning and... Invalid Username/Password. Deleting the connection and creating a new one with the same TNS gives the same result.

In Oracle SQL Developer I can still access the database with my old credentials, including creating a brand-new connection off the same TNS. Unfortunately, the only way to get in via SSIS is to change my password and update my connection manager\config\variable\whatever in the SSIS package. Checking and changing 32/64-bit runtimes, restarting Visual Studio, restarting the dev machine, etc. does nothing.

So.... help? What the hell can this be?


The Coggs Machine


#TempTable for Email Message


I have created a #TempTable for storing an email message and am then trying to UPDATE the #TempTable_EmailMessage contents with the email message. When I attempt to run it, I get:

UPDATE failed with the following error:

"Invalid object name '#TempTable_EmailMessage'. Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly

The ResultSet property is set to None. Is this some sort of pre-validation eror since I'm creating the #TempTable on the fly? Do I need to create a ##TempTable instead???

Thanks for your review and am hopeful for a reply.
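A local #temp table created in one Execute SQL Task is only visible to later tasks if they reuse the same session, so the usual pattern is RetainSameConnection = True on the connection manager (often with DelayValidation = True on the tasks), or alternatively a global ##temp table. A minimal sketch of the two statements, with the message column assumed:

-- Execute SQL Task 1 (connection manager with RetainSameConnection = True)
CREATE TABLE #TempTable_EmailMessage (EmailMessage nvarchar(max) NULL);
INSERT INTO #TempTable_EmailMessage (EmailMessage) VALUES (N'');

-- Execute SQL Task 2 (same connection manager, hence the same session and temp table)
UPDATE #TempTable_EmailMessage
SET    EmailMessage = N'Load completed at ' + CONVERT(nvarchar(30), GETDATE(), 120);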

SSIS to Azure: Erratic error "Login Timeout expired"


Hi Community

We have several SSIS packages transferring data from a local SQL Server to an Azure server. The packages are executed one after another by a SQL Agent job, which is scheduled hourly.

The job is configured to retry a package that fails up to 3 times, at 3-minute intervals, before moving to the next step.

Unfortunately, we face very inconsistent login timeout issues with these packages. There is not much else running on the Azure server, but we still receive the following errors several times a day, sometimes in a row or for hours at a time:

Message

Executed as user: NT Service\SQLSERVERAGENT.

Microsoft (R) SQL Server Execute Package Utility  Version 12.0.2000.8 for 64-bit  Copyright (C) Microsoft Corporation. All rights reserved.   

Started:  4:00:00 AM 

Error: 2016-01-15 04:03:05.82    

Code: 0xC0202009    

Source: RVVDataTransfer_Material Connection manager "OneBitAzureDB"  Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.  An OLE DB record is available. 

Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description:"Login timeout expired".  An OLE DB record is available. 

Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description:"A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".  An OLE DB record is available. 

Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Named Pipes Provider: Could not open a connection to SQL Server [53]. ". 

End Error 

Error: 2016-01-15 04:03:05.83    

Code: 0xC020801C    

Source: Transfer Material Data Destination tbl_Material [11]     Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "OneBitAzureDB" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed. 

End Error 

Error: 2016-01-15 04:03:05.83    

Code: 0xC0047017    

Source: Transfer Material Data SSIS.Pipeline     Description: Destination tbl_Material failed validation and returned error code 0xC020801C. 

End Error 

Error: 2016-01-15 04:03:05.83    

Code: 0xC004700C    

Source: Transfer Material Data SSIS.Pipeline     Description: One or more component failed validation. 

End Error 

Error: 2016-01-15 04:03:05.84    

Code: 0xC0024107    

Source: Transfer Material Data      Description: There were errors during task validation. 

End Error 

DTExec: The package execution returned DTSER_FAILURE (1). 

Started:  4:00:00 AM 

Finished: 4:03:05 AM 

Elapsed:  185.579 seconds. 

The package execution failed.

 

Sometimes the packages execute without any issues; then, from one minute to the next, the login timeout error occurs, and it disappears again suddenly after hours, minutes or even seconds.

While the packages raise this error, we are still able to log in to the Azure server with Management Studio without any noticeable delay or error message.

We have also already changed the provider used for the connection from ADO.NET to OLE DB, without success.

Any further hint would be very much appreciated.

Issue in importing data from Excel SSIS


Hi,

I am trying to import data from an Excel file with multiple sheets, named

Sheet1

Sheet2, etc.

I am following the approach mentioned here

http://www.techbrothersit.com/2013/12/ssis-read-multiple-sheets-from-excel.html

When I try to load, my SSIS package errors out because it expects a sheet named Sheet1_ or Sheet2_; however, my file has no hidden sheets.

Any suggestions?
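For what it's worth, the OLE DB/ACE provider addresses worksheets with a trailing $ (for example [Sheet1$]); names ending in an underscore usually refer to named ranges rather than the sheets themselves. A hedged T-SQL illustration of the naming convention, assuming the ACE provider is installed, ad hoc distributed queries are enabled, and a made-up file path:

-- Path and sheet name are assumptions; requires the ACE OLE DB provider.
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\MyWorkbook.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');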

SSIS XML source root attributes missing


When I add an XML Source and point it to the XML file I am trying to load, I do not see the root element's attributes.

I have tried the XML Source's own generate-XSD option as well as Trang (a nice command-line tool)...

In either case I get the three other elements but not the root.

Here is the top of the XSD:

<?xml version="1.0" encoding="UTF-8"?>

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">

  <xs:element name="Company">

    <xs:complexType>

      <xs:sequence>

        <xs:element ref="Positions"/>

      </xs:sequence>

      <xs:attribute name="AsOfDate" use="required" type="xs:NMTOKEN"/>

      <xs:attribute name="Date" use="required"/>

      <xs:attribute name="Description" use="required"/>

      <xs:attribute name="GeneratingSystem" use="required"/>

      <xs:attribute name="ImportID" use="required"/>

    </xs:complexType>

  </xs:element>

  <xs:element name="Positions">

    <xs:complexType>

      <xs:sequence>  ....... and so on.. 

 

I am trying to get at the AsOfDate attribute.

Now, I did see a way to maybe use an XML Task, but these files are large and the task seems to scan the whole file, which takes far too long.

 

It really seems like a bug within the XML Source component... 

Please tell me there is a faster or better way.
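If adding an Execute SQL Task step is acceptable, one hedged workaround for just the root attribute is to shred it in T-SQL instead of the XML Source; the file path below is an assumption, and note that SINGLE_BLOB still reads the whole file into the variable:

-- Load the file and pick out only the root attribute; path is assumed.
DECLARE @xml XML;

SELECT @xml = CAST(BulkColumn AS XML)
FROM   OPENROWSET(BULK 'C:\Data\Company.xml', SINGLE_BLOB) AS src;

SELECT @xml.value('(/Company/@AsOfDate)[1]', 'nvarchar(30)') AS AsOfDate;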

 

Thanks.

 

 

Create XML file From SQL Table in SSIS


Hello

I want to create an XML file every day, with today's date in the file name, from a SQL table in SSIS 2008 R2.

I tried writing a SQL query and created the XML file, but the format of the file is different from what I want. I have inserted a screenshot of the XML format I want.
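Since the screenshot of the desired format is not visible here, only a generic sketch is possible: FOR XML PATH gives fairly fine control over element and attribute names (the table and column names below are assumptions), and the dated file name is then usually built with an expression on the file connection manager or a package variable.

-- Hypothetical table and columns; shape elements and attributes with PATH/ROOT as needed.
SELECT  o.OrderId      AS '@Id',
        o.OrderDate    AS 'OrderDate',
        o.CustomerName AS 'Customer/Name',
        o.TotalAmount  AS 'TotalAmount'
FROM    dbo.Orders AS o
FOR XML PATH('Order'), ROOT('Orders');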

Thanks in advance

table partition and parallel load in SSIS


1. Load data for different batches into the same set of tables in parallel using SSIS 2012.

Steps:

a) Created 4 filegroups and 4 .ndf files.

b) The 1st .ndf file points to the C drive.

c) The 2nd .ndf file points to the D drive.

d) The remaining 2 .ndf files also point to the D drive.

3. Created a partition function for the values 10, 20, 30 (RANGE RIGHT on BatchId, which is a column in the target table).

4. Created a partition scheme over the 4 filegroups.

5. Created the target table on the partition scheme, partitioned by BatchId.

6. While loading data in parallel for 2 batches, i.e. BatchId 1 to 10 and 11 to 20, locks occur (a T-SQL sketch of this setup follows below).
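A T-SQL sketch of the layout described above; the filegroup, function, scheme and table names are assumptions. With RANGE RIGHT and three boundary values there are four partitions, matching the four filegroups.

CREATE PARTITION FUNCTION pfBatchId (int)
    AS RANGE RIGHT FOR VALUES (10, 20, 30);

CREATE PARTITION SCHEME psBatchId
    AS PARTITION pfBatchId TO (FG1, FG2, FG3, FG4);

CREATE TABLE dbo.TargetTable
(
    BatchId   int          NOT NULL,
    SomeValue varchar(100) NULL      -- remaining columns omitted
) ON psBatchId (BatchId);

-- If lock escalation to table level is what blocks the parallel batch loads,
-- escalating to the partition level instead may help (worth testing first).
ALTER TABLE dbo.TargetTable SET (LOCK_ESCALATION = AUTO);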

Please suggest the best approach.

SqlServer 2014 - SSIS Package Execution - version 14.0 script not supported


I developed an SSIS project with SQL Server Data Tools (SSDT) 2015 which uses the Script Task. The package runs successfully in SSDT, but when it was deployed to SQL Server 2014 and executed, it failed with the following error message. Please help.

Error:

There was an exception while loading Script Task from XML: System.Exception: The Script Task "ST_88e6571b80144dd3b19beae856cbbab1" uses version 14.0 script that is not supported in this release of Integration Services. To run the package, use the Script Task to create a new VSTA script. In most cases, scripts are converted automatically to use a supported version, when you open a SQL Server Integration Services package in %SQL_PRODUCT_SHORT_NAME% Integration Services.

   at Microsoft.SqlServer.Dts.Tasks.ScriptTask.ScriptTask.LoadFromXML(XmlElement elemProj, IDTSInfoEvents events)


Best practices about using SSIS project parameters in a prod environment - SSIS 2012


Hi,

I'd like to manage some SSIS projects with little administrative effort for the SSIS developers and with separate management of the prod and dev environments.

To do this, I plan to use a set of project parameters bound to the properties of the connection managers used in an SSIS project.

To test the use of a project parameter:

a. I've created a project parameter to specify a SQL Server connection string;

b. I've associated this parameter to the connection string of the desired connection manager;

c. I've deployed the SSIS project on the SSIS catalog;

d. I've created a PROD environment for the SSIS project deployed on the SSIS catalog;

e. I've created, for this PROD environment, variables with the same names and values as the project parameters created and used in the deployed SSIS project;

f. selecting the deployed project in the SSIS catalog, for a specific parameter I've chosen "Use environment variable", assigning the corresponding environment variable created above;

g. I've checked the Environment option and specified the PROD environment for the specific job step in order to use the project parameter.

The value of the environment variable for the connection string doesn't override the value of the project parameter created inside SSDT. I'd like to manage the connection strings for the connection managers once for the prod environment, while leaving the SSIS developers free to specify the values of the project parameters.
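For reference, steps d through f can also be scripted against SSISDB, which makes it easy to check that the parameter really ends up referenced ('R') rather than overwritten with a literal value; the folder, project, parameter and variable names below are assumptions.

-- Create the PROD environment and a variable holding the connection string.
EXEC catalog.create_environment
     @folder_name = N'MyFolder', @environment_name = N'PROD';

EXEC catalog.create_environment_variable
     @folder_name = N'MyFolder', @environment_name = N'PROD',
     @variable_name = N'CM_SqlConnectionString', @data_type = N'String', @sensitive = 0,
     @value = N'Data Source=PRODSRV;Initial Catalog=DM;Integrated Security=SSPI;';

-- Reference the environment from the deployed project.
DECLARE @ref_id bigint;
EXEC catalog.create_environment_reference
     @folder_name = N'MyFolder', @project_name = N'MyProject',
     @environment_name = N'PROD', @reference_type = 'R',
     @reference_id = @ref_id OUTPUT;

-- Bind the project parameter to the environment variable (value_type 'R' = referenced).
EXEC catalog.set_object_parameter_value
     @object_type = 20, @folder_name = N'MyFolder', @project_name = N'MyProject',
     @parameter_name = N'CM_SqlConnectionString',
     @parameter_value = N'CM_SqlConnectionString', @value_type = 'R';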

Any help in defining the right best practices for using project parameters in prod and dev environments?

Many thanks

Can I reference and use a .NET Framework 4.5 signed DLL in a BIDS script task?

I have created a security DLL in .NET Framework 4.5 and I am using it in our web application, which is on .NET Framework 4.5. We have SQL Server 2008 R2 and I created an SSIS package using BIDS; for security reasons we need to use this DLL in the SSIS package developed in BIDS. My question is: is it possible to use a DLL developed in .NET Framework 4.5 in BIDS, or do I need to build my DLL against .NET Framework 3.5 and use it in BIDS?

Rebuild Index Task how to reuse connection


Dear all,

I am developing an SSIS project to transfer data to a data mart. As the last step I want to rebuild the indexes on the tables that received new data. But I have a very strange situation: the Rebuild Index Task does not "see" the defined connections (please see the attached pictures). Is this a normal situation, or am I doing something wrong?
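The Rebuild Index Task only accepts certain connection manager types (ADO.NET connections to SQL Server, as far as I can tell), so an OLE DB connection manager will not show up. If that is the blocker, a hedged fallback is an Execute SQL Task running plain ALTER INDEX statements over the existing OLE DB connection; the table names here are assumptions.

-- Rebuild all indexes on the freshly loaded data mart tables (names assumed).
ALTER INDEX ALL ON dbo.FactSales   REBUILD;
ALTER INDEX ALL ON dbo.DimCustomer REBUILD;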

Load data from SharePoint list view using SSIS package


Hi All,

Can anyone please give me the configuration for the SSIS SharePoint List Source component to load data from a SharePoint view? I am able to load data from the list, but not from the view.


Loop 10 SQL statements to insert data into a single table


Hi Friends 

I have 10-15 DFTs (Data Flow Tasks), each basically fetching data from a source and inserting it into a destination table. Every query has its own business logic, but the destination table is the same.

Instead of these 15 Data Flow Tasks, I want to use a single Data Flow Task that loops from Query 1 to Query 10 and inserts the data into the destination table, since the table is the same for all 10 queries.

Please help me out with this task.
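One hedged way to keep a single data flow is to combine the ten queries into one source statement when their column lists match; the table, column and filter names below are assumptions. (An alternative is a Foreach Loop over a variable holding each query text, with the OLE DB Source set to "SQL command from variable".)

-- All branches must return the same column list in the same order.
INSERT INTO dbo.DestinationTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3 FROM dbo.SourceA WHERE IsActive = 1           -- business logic 1 (assumed)
UNION ALL
SELECT Col1, Col2, Col3 FROM dbo.SourceB WHERE LoadDate >= '20160101' -- business logic 2 (assumed)
UNION ALL
SELECT Col1, Col2, Col3 FROM dbo.SourceC WHERE Region = 'EMEA';       -- business logic 3 (assumed)
-- ...one UNION ALL branch per remaining query.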



Abhishek Parihar BI Consultant (Please mark the post as answered if it answers your question)




