Channel: SQL Server Integration Services forum
Viewing all 24688 articles

How to know the source connection is closed after a successful run of an SSIS package


Hi Team,

I am loading data from Oracle to SQL Server, so I am connecting to the Oracle source. How can I check whether the source connection is closed after the package runs successfully?

As far as I know, the SSIS engine closes the connection automatically once the task is complete.

But I need proof. Can anybody help me here?
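One way to get that proof is to look at the session list on the Oracle side before, during, and after the package run. A minimal sketch, assuming you have rights to query V$SESSION; the filter values are placeholders for your environment, and the PROGRAM value depends on how the package is executed (dtexec, ISServerExec, or Visual Studio):

```
SELECT sid, serial#, status, program, machine, logon_time
FROM   v$session
WHERE  program LIKE 'DTExec%'   -- or: machine = 'MY_SSIS_HOST'
ORDER  BY logon_time DESC;
```

If the session disappears shortly after the package completes, the connection was closed. Note that connection pooling in the provider can keep the physical session alive for a while even after SSIS has released it, so a lingering idle session is not necessarily a leak.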

Thanks,


Using $expand query option with OData, SSIS and SharePoint 2013

I recently had an issue trying to expand the lookup & user columns in a SharePoint 2013 on-prem list which I'm importing into a SQL Server 2012 database.

I had thought that an OData query would be the answer, and whilst I can achieve it via a URL in my browser, it doesn't seem to be accepted by the OData adapter in SSIS.

For example, the simple URL below returns the Item ID, Name and Email Address of the last person to modify the item:

https://<URL>/_vti_bin/listdata.svc/<LISTNAME>?$select=Id,ModifiedBy/Name,ModifiedBy/WorkEmail&$expand=ModifiedBy

It works fine in the browser, but when I use the same URL in the OData adapter I get the error below. I did try the _api URL, but that doesn't supply an ATOM Service Document, so it is classed as an invalid feed by the OData adapter.

Can anyone assist?

An error occured when reading the OData feed. (Microsoft.SqlServer.IntegrationServices.DataFeedClient)
------------------------------
Program Location:

   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedODataReader.InterceptODataException[T](Func`1 function)
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedODataReader.ReadNextODataEntry()
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.BufferedRowsReader.FetchNextRow(IRow reuseRow)
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.BufferedRowsReader.MoveNext()
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedDataReader.Read()
   at Microsoft.DataTransformationServices.DataFlowUI.ODataConnectionPage.PreviewButton_Click(Object sender, EventArgs e)

===================================

An error was read from the payload. See the 'Error' property for more details. (Microsoft.Data.OData)

------------------------------
Program Location:

   at Microsoft.Data.OData.Atom.BufferingXmlReader.ReadNextAndCheckForInStreamError()
   at Microsoft.Data.OData.Atom.BufferingXmlReader.ReadInternal(Boolean ignoreInStreamErrors)
   at Microsoft.Data.OData.Atom.BufferingXmlReader.Read()
   at System.Xml.XmlReader.SkipSubtree()
   at System.Xml.XmlReader.Skip()
   at Microsoft.Data.OData.Atom.ODataAtomEntryAndFeedDeserializer.ReadFeedContent(IODataAtomReaderFeedState feedState, Boolean isExpandedLinkContent)
   at Microsoft.Data.OData.Atom.ODataAtomReader.ReadAtEntryEndImplementation()
   at Microsoft.Data.OData.ODataReaderCore.ReadImplementation()
   at Microsoft.Data.OData.ODataReaderCore.ReadSynchronously()
   at Microsoft.Data.OData.ODataReaderCore.InterceptException[T](Func`1 action)
   at Microsoft.Data.OData.ODataReaderCore.Read()
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedODataReader.ReadNextODataEntryInternal()
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedODataReader.<ReadNextODataEntry>b__6()
   at Microsoft.SqlServer.IntegrationServices.DataFeedClient.DataFeedODataReader.InterceptODataException[T](Func`1 function)

Need to convert all the fields from the Oracle ADO.NET source in the data flow


Hi All,

I am eager for advice on the situation I am in.

I have one table in an Oracle database in which the fields are declared as VARCHAR2(40 BYTE), VARCHAR2(80 BYTE), etc. There are 30 such string fields, and this table is my ADO.NET source for Oracle.

When I check, all 30 of these fields show DT_WSTR output columns in the Advanced Editor of the source.

My destination is SQL Server, where I have these 30 fields as VARCHAR datatypes.

There are two options:

1] Put a Data Conversion in the data flow for the 30 fields.

2] Or double the size of the SQL table columns and make them NVARCHAR (not sure if this will remove the conversion error I am getting right now).

I am aware of the overhead of a Data Conversion transformation over 30 fields for a daily load of one month of order data.

Please advise.
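If the destination stays VARCHAR, the casts can also be done in a single Derived Column instead of 30 separate Data Conversion outputs. A minimal sketch of the cast expressions, assuming a 1252 code page and the Oracle byte lengths (the column names are placeholders):

```
(DT_STR, 40, 1252)SomeColumn40
(DT_STR, 80, 1252)SomeColumn80
```

Alternatively, changing the destination columns to NVARCHAR avoids the Unicode-to-non-Unicode conversion entirely; the length does not need to double, since NVARCHAR(40) already holds 40 characters, which is at least what VARCHAR2(40 BYTE) can hold.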

Thank you 

 


Dilip Patil..

SQL AGENT SSIS JOBS IN HUNG STATE


SQL SERVER VERSION:Microsoft SQL Server 2014 (SP2-CU10-GDR) (KB4052725) - 12.0.5571.0 (X64)
   Copyright (c) Microsoft Corporation
    Enterprise Edition: Core-based Licensing (64-bit) on Windows NT 6.3 <X64> (Build 9600: )
OS version - WINDOWS SERVER 2012 R2 STANDARD

SSIS - SQL 2014 - Microsoft SQL Server 2014 (SP2-CU10-GDR) (KB4052725)

Over the last few months we have faced this issue 2-3 times.

We have observed that jobs which are scheduled using T-SQL complete successfully; the issue is with SQL jobs which run SSIS packages, and those get halted some time after restarting the Integration Services service.

The SSIS jobs work for a while, then hang and won't do anything. Can anyone tell me what the issue could be? Why do only the SSIS jobs hang while the others work fine?


Jay

SSIS Passing a Parameter into a Lookup


The value I want to look up the ID for is not in the source table, so I must pass it in as a parameter.

So basically I just want to take a string parameter and look up the numeric ID for that value in another table (not the source table), so that I can put the returned ID into the destination table.

Is this possible?

CGI TO SQL INTEGRATION


We use an application which stores all its data on a local machine. The application has a centralized server located at the application provider's site. The provider gives access to the application's local web services using CGI.

That is, we have a local IP address and port number to access the application's web service portal, but it uses CGI: the request goes to the application provider's server, that server gets the data from our locally hosted application, and it then sends the data to the local web service to which we have access.

I am trying to integrate this with SSIS: I want to get data from the web service, dump it into our SQL database, and then query it using business intelligence tools (according to our requirements).

I have tried many steps but have not been able to get anything working in SSIS.

The service provider has just given us some script URLs like this:

http://server:port/cgi-bin/queries1.cgi
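Since the endpoint is just an HTTP URL, a Script Task can fetch it directly and stage the response for a data flow. A minimal sketch, assuming .NET 4.5+ is available for HttpClient; the host, port, and script path are placeholders for the provider's values, and in a real package they would come from SSIS variables:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class CgiFeed
{
    // Builds the request URI from its parts; kept separate so it can
    // be tested without hitting the network.
    public static Uri BuildUri(string host, int port, string script)
    {
        return new UriBuilder("http", host, port, script).Uri;
    }

    // Downloads the raw CGI response body. In a Script Task the result
    // would be written to an SSIS variable or staged to a file for a
    // Data Flow to consume and load into SQL Server.
    public static async Task<string> FetchAsync(Uri uri)
    {
        using (var client = new HttpClient())
        {
            return await client.GetStringAsync(uri);
        }
    }
}
```

The shape of the response (XML, JSON, delimited text) determines the parsing step that follows; the provider's documentation for the .cgi scripts would be needed there.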

I have a problem when I start running my SSIS project. I'm working with Visual Studio 2017 (SQL Server 2017). Help please :)

TITLE: Microsoft Visual Studio
------------------------------

Failed to start project

------------------------------
ADDITIONAL INFORMATION:

Nom de répertoire non valide [Invalid directory name] (Microsoft.DataTransformationServices.VsIntegration)

------------------------------
BUTTONS:

OK
------------------------------

SQL Job Failed on data type not match


I have a simple Data Flow Task which pulls data from MySQL to a SQL Server database, using a DataReader Source with the .Net Providers\Odbc Data Provider. The task runs fine on my machine, but it fails when I put it in a SQL job. Below is the error:

Source: import temp_tbl DataReader Source [1]     Description: The data type of "output column "project" (25)" does not match the data type "System.String" of the source column "project". 

The source data type is varchar. In the Advanced Editor for the DataReader Source, the external column data type is DT_WSTR, so I changed it to DT_STR, but the job still fails. To troubleshoot further, I pulled only the columns with integer data types and the job ran fine. I even used a Data Conversion, but it still fails.
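Since the mismatch only shows up under the Agent, a common cause is the job using a different ODBC driver version or bitness than the development machine, so the two environments report different metadata. One workaround is to force a deterministic type in the source query itself; a sketch, assuming the source table really is `temp_tbl` (MySQL syntax):

```
SELECT CAST(project AS CHAR(25)) AS project
FROM temp_tbl;
```

It is also worth trying the "Use 32 bit runtime" option on the Agent job step, so the job runs with the same driver bitness that the designer used.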

Really appreciate any inputs.  Thanks in advance.


Ash_



Global variable in Execute SQL Task comes with quotes


Hello,

I have a global variable that represents a file name, and I would like to use this variable inside an Execute SQL Task when I'm writing a query.

The query I have in the Execute SQL Task is: CREATE TABLE SCHEMA.?

This is supposed to create a table inside the Netezza system which is where my OLEDB is connected to. (The connection is successful)

The problem that I'm getting when I look at the output is: ERROR: 'CREATE TABLE SCHEMA.'TableName' ' Expecting an identifier found a "keyword".

As you can see, it adds quotation marks around my table name. In the Parameter Mapping tab of the Execute SQL Task Editor, I set the variable's datatype to NVARCHAR, and my variable is stored as a string. How can I access the value of the variable without the quotation marks?
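The quoting happens because parameter markers are sent as data values, and the driver quotes data values; identifiers such as table names generally cannot be parameterized in DDL at all. The usual workaround is to stop using `?` and instead build the whole statement with a property expression on the task's SqlStatementSource property. A sketch, with a placeholder column list:

```
"CREATE TABLE SCHEMA." + @[User::TableName] + " (Col1 INT)"
```

Because the variable is concatenated into the statement text before it reaches Netezza, no quoting is applied to the table name.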

Row count


Hi All,

Is there a way in an SSIS package to check the number of rows loaded between dates? For example, I have an SSIS package that loads data from a source (SQL Server table) to a target (SQL Server table). It is scheduled to run every night.

01/01/2019: loaded 1001 rows

01/02/2019: 500 rows

01/31/2019: 0 rows

Is this information stored somewhere in hidden system tables? Just curious. How can I capture this information?
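This is only recorded automatically if the package is deployed to the SSIS catalog, where views such as catalog.executions and catalog.execution_data_statistics in SSISDB keep per-execution information. Otherwise, the usual pattern is a Row Count transformation in the data flow writing to a package variable, plus an Execute SQL Task after the data flow that logs the value to your own audit table. A hypothetical shape for that table:

```
CREATE TABLE dbo.LoadAudit
(
    LoadDate    date          NOT NULL,  -- run date
    PackageName nvarchar(260) NOT NULL,  -- which package loaded
    RowsLoaded  int           NOT NULL,  -- value from the Row Count variable
    CONSTRAINT PK_LoadAudit PRIMARY KEY (LoadDate, PackageName)
);
```

Querying rows between two dates is then a plain WHERE clause on LoadDate.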

Thanks

Flat File Connection Manager Adding Blank Columns


Good afternoon, all -

I am using a Flat File Connection Manager to read data from a pipe-delimited .txt file with 80 thousand or so rows.

Unfortunately, when I preview the data or look at it using the Columns option in the CM Editor, I see that it is adding 68 empty columns to the end of each row.

At first, I thought the client might have sent data with a gob of pipes at the end of each line and the CM was interpreting these as columns. But, looking at the raw data, I saw this wasn't the case. However, I did notice that none of the rows had a final delimiting pipe. Now, I don't know if this matters, but I thought that, perhaps, SSIS has some sort of buffer that it uses to import each row and, if a final delimiter isn't there, it interprets this as additional blank columns until the buffer is filled.

So, my thought was to add a pipe to the end of each record using a Script Task. Unfortunately, my C#-fu is poor and I hoped that someone had already met this problem and could give me an assist. Of course, this is predicated on the presumption that my guess about what's happening is correct. If I'm wrong, then such an idea won't work. If that's the case, does anyone know why the Flat File Connection Manager might be adding additional empty columns to the end of my data?
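On the off chance the missing final pipe really is the cause, a Script Task along these lines would terminate every record before the Flat File Source reads the file. This is only a sketch: the paths would normally come from package variables, and it may turn out the simpler fix is just deleting the 68 extra columns from the connection manager's column list.

```csharp
using System;
using System.IO;
using System.Linq;

public static class PipeFixer
{
    // Appends a trailing pipe to a record unless it already ends with one.
    public static string Terminate(string line) =>
        line.EndsWith("|") ? line : line + "|";

    // Rewrites the file with every record pipe-terminated.
    // File.ReadLines streams the input, so ~80k rows is no problem.
    public static void FixFile(string inputPath, string outputPath)
    {
        File.WriteAllLines(outputPath,
            File.ReadLines(inputPath).Select(Terminate));
    }
}
```

In the Script Task's Main, a call like `PipeFixer.FixFile(inFile, outFile)` before the Data Flow Task runs would do the rewrite.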

Thanx in advance for any assistance!

SSDT (Visual Studio Data Tools 2015): accessing an Oracle DB through a 32-bit driver


Hi, I am creating a package to connect to Oracle and write the output to an Excel file. I am using SSDT 2015. For this I created an OLE DB Source with a connection to the Oracle database using a 32-bit driver (ODTwithODAC121010). The connection has been tested and is working. But when I place a simple SQL statement in the OLE DB Source and click Preview, I get the following error:

The system cannot find message text for message number 0x80040e51 in the message file for OraOLEDB.
(OraOLEDB)

Note: I configured the 32-bit ODBC driver to connect to the Oracle DB and it is connecting. Not sure why Preview gets an error. Has anyone seen this before?


SSIS Transfer SQL Server Objects task runs more than twice as slow as a SQL Agent job compared to running from BIDS/SSDT


Hi

I have a dev and a live server. I created a Transfer SQL Server Objects task in SSIS that pulls tables from a source server to the dev (destination) server and saved it as a package. I then copied this package and changed the destination server to the live server.

When I run the package against dev and against live, both run successfully, and both servers have identical specs. But when run as a SQL Agent job from each server, it takes more than twice as long on the live server as on the dev server. I have traced the job on both servers and the actual copying of data seems to take pretty much the same time, but the live server spends half of its time trying to get the connection or doing validation. All I can say is that it works on both servers, just more than twice as slow on live. The two SSIS jobs are exactly the same, except that the SMO destination connections differ (each pointing to its own server).

If I run the task from Integration Services on the live server, it still takes nearly 3 times longer.

While monitoring the live server, I can see Activity Monitor showing a session in the SSISDB database in a suspended state with a wait type of SLEEP_TASK. I can see this on the dev server too, but on dev the Transfer Objects task begins after a few seconds, whereas it takes about 4 minutes before it kicks into action on the live server.

No other jobs are running against either server while I am doing this.

Both servers are Windows Server 2012 R2 and SQL Server 2016.

Can anyone help me understand why it is so much slower on one than the other, and maybe suggest where I can go to resolve the issue?

Thanks for any help....


Regards David McGarry




Error importing existing package


In Visual Studio 2015, we have a standalone package (.dtsx file) with a VB script task. When opening the package, we can view the script task without any issues. If we try to import the package into a project (using the same version of Visual Studio), we get the following incompatibility error. Why can it be opened standalone but not imported into a project?

There was an exception while loading Script Task
from XML: System.Exception: The Script Task "somename" uses
version 15.0 script that is not supported in this release of Integration Services.


'ISServerExec.exe not found' error while deploying package or executing previously deployed package.


The file is not available in that location either.

The error is in SQL Server 2014.



Script task to generate dynamic tables from flat files for any delimiter type

Case statement to check for delimiters

Hi, please help me write C# code to find delimiters using a case statement inside a for loop, and assign the corresponding value to a variable.
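A minimal sketch of that shape: walk the first line of the file once, count candidate delimiters with a switch (C#'s case statement), then pick the most frequent one. The set of candidate characters is an assumption; extend the switch for any others you expect.

```csharp
using System;

public static class DelimiterSniffer
{
    // Returns the most frequent candidate delimiter in the given line,
    // typically the file's header row.
    public static char Detect(string headerLine)
    {
        int commas = 0, pipes = 0, tabs = 0, semis = 0;
        for (int i = 0; i < headerLine.Length; i++)
        {
            switch (headerLine[i])
            {
                case ',':  commas++; break;
                case '|':  pipes++;  break;
                case '\t': tabs++;   break;
                case ';':  semis++;  break;
            }
        }
        int max = Math.Max(Math.Max(commas, pipes), Math.Max(tabs, semis));
        if (max == 0)
            throw new InvalidOperationException("No known delimiter found.");
        if (max == commas) return ',';
        if (max == pipes)  return '|';
        if (max == tabs)   return '\t';
        return ';';
    }
}
```

In a Script Task, the result can be assigned to a package variable (e.g. a hypothetical `User::Delimiter`) that the rest of the package reads.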

Adding a CRLF to the header of a flat file - SSIS 2016


Hi,

For an SSIS 2016 package, in order to produce a txt file with one header row, some body rows and one footer row, I'd like to put a carriage return + line feed inside the header of the flat file destination that writes the footer row.

I've run some tests but without any results.

Any suggestions, please? Thanks
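If the footer is written by its own Flat File Destination, one approach is to set that component's Header property through an expression, since SSIS expression string literals support the \r and \n escape sequences. A sketch, assuming the header text is held in a hypothetical variable `User::FooterHeaderText`:

```
@[User::FooterHeaderText] + "\r\n"
```

The Header custom property of a Flat File Destination is expressionable from the Data Flow Task's Expressions collection, so the extra line break is appended at run time.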

dynamically load data using a mapping table


Hello All, 

Need expert advice on this.

Currently we get a file with a fixed number of columns, and it gets loaded into an OLTP system.

The ask we got, to make this code change more efficient, is: how easily can we handle a file with new columns being added every iteration, loading from the file into the OLTP system?

Right now, every time the file gets a new format with new columns, SSIS changes to add those columns, which is very little work.

But all the sprocs need to change with the new columns and new validations, plus data lookups for FKs, before loading into OLTP; this is where I see the majority of the work and want to reduce it.

So I have this idea stuck in my brain: I could use a mapping table, i.e. the file format columns and which OLTP columns match them, and change the sprocs in such a way that all we have to do is keep adding new columns to the mapping table and the sprocs take care of loading into the OLTP system.

Is this a viable approach, or does anybody have better solutions? Please share.
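The mapping table is a viable and common pattern for metadata-driven loads: the sprocs generate their INSERT and validation statements dynamically from the map, so adding a file column becomes a data change instead of a code change. A hypothetical shape for the table:

```
-- Hypothetical mapping table driving the load: add a row when the
-- file gains a column, instead of editing every stored procedure.
CREATE TABLE dbo.ColumnMap
(
    FileColumn   sysname NOT NULL,  -- column name in the incoming file
    TargetTable  sysname NOT NULL,  -- OLTP table to load into
    TargetColumn sysname NOT NULL,  -- matching OLTP column
    IsActive     bit     NOT NULL DEFAULT 1,
    CONSTRAINT PK_ColumnMap PRIMARY KEY (FileColumn, TargetTable)
);
```

The trade-off is that the dynamic SQL in the sprocs is harder to debug than static statements, and FK lookups still need per-column rules, which could live in extra columns on this table.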


