Channel: SQL Server Integration Services forum

SSIS Update Column on Database from the current recordset in memory


Hello All

I have a package that runs after a previous insert is made to the database. After that insert, another data flow task executes, does another select, and combines different things to update a specific column in the same table. Once I get the result I want, how can I update the same records in the physical table with the values I have in the recordset that is in memory?
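For what it's worth, a common pattern here is to land the recalculated rows in a staging table from the data flow and then run a set-based UPDATE back against the original table in an Execute SQL Task (an OLE DB Command transformation can do the same row by row, but is usually slower). A minimal T-SQL sketch with hypothetical table and column names:

    -- dbo.TargetTable is the table the earlier insert wrote to; dbo.Staging holds the
    -- recalculated values produced by the data flow. Both names are hypothetical.
    UPDATE t
    SET    t.SpecificColumn = s.SpecificColumn
    FROM   dbo.TargetTable AS t
    JOIN   dbo.Staging     AS s
           ON s.BusinessKey = t.BusinessKey;   -- join on whatever uniquely identifies the rows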

thanks in advance!


Execute dynamically referenced SSIS packages


Hello, 

I have a master package with an Execute Package Task. I need to execute, dynamically, an SSIS package stored in the same project.

But I get an error in the Execute Package Task ("package is not specified").

I use Project Reference as the reference type.
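If the Execute Package Task keeps refusing the dynamic reference, one workaround (assuming the project is deployed to the SSISDB catalog) is to start the child package by name through the catalog stored procedures, for example from an Execute SQL Task. A minimal T-SQL sketch with hypothetical folder, project, and package names:

    -- Hypothetical folder, project, and package names; the package name could come from an SSIS variable.
    DECLARE @execution_id bigint;

    EXEC SSISDB.catalog.create_execution
         @folder_name     = N'MyFolder',
         @project_name    = N'MyProject',
         @package_name    = N'ChildPackage.dtsx',
         @use32bitruntime = 0,
         @execution_id    = @execution_id OUTPUT;

    -- Run synchronously so the caller waits for the child package to finish.
    EXEC SSISDB.catalog.set_execution_parameter_value
         @execution_id,
         @object_type     = 50,
         @parameter_name  = N'SYNCHRONIZED',
         @parameter_value = 1;

    EXEC SSISDB.catalog.start_execution @execution_id;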

Got "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." when execute a SSIS 2008 package


I have an SSIS package which was migrated from SQL 2000 DTS to SSIS 2008.

The package is designed to perform the following jobs:

  1. Do the data cleansing by truncating the tables in SQL Server
  2. Transform data from the Oracle database into SQL Server tables

The package was working fine until the server (Windows Server 2008 R2, where SQL Server 2008 is installed) enabled TLS 1.2 and disabled TLS 1.0 & TLS 1.1.

Error "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." will be thrown when executing task 2 (Oracle to SQL server transformation). Data cleansing part was done successfully.

Provider for Oracle used = OraOLEDB

Provider for SQL Server = MSOLEDBSQL

Any help will be very much appreciated.

Native Client 11.0 error when importing.


I am getting the below message when importing data into a database on a remote server:

This just started happening after the installation of Windows 365.  Any ideas?

client unable to establish connection
TCP Provider: An existing connection was forcibly closed by the remote host. (Microsoft SQL Server Native Client 11.0)

------------------------------
Program Location:

   at System.Data.OleDb.OleDbConnectionInternal..ctor(OleDbConnectionString constr, OleDbConnection connection)
   at System.Data.OleDb.OleDbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionFactory.CreateNonPooledConnection(DbConnection owningConnection, DbConnectionPoolGroup poolGroup, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionClosed.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionInternal.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
   at System.Data.OleDb.OleDbConnection.Open()
   at Microsoft.SqlServer.Dts.DtsWizard.DTSWizard.GetOpenedConnection(WizardInputs wizardInputs, String connEntryName)
   at Microsoft.SqlServer.Dts.DtsWizard.Step2.OnLeavePage(LeavePageEventArgs e)

.csv load, target table has more rows than source


This is weird: I have loaded a series of .csv files into one SQL Server table, where each file belongs to one day of the month.

When I check the counts for one of the days, the target table has more rows than the source csv file.

1. How can I find out why the target table has more rows? (See the duplicate-check sketch below.)

2. In the csv everything is text, so right now my table has even the date columns as char. Can I have date columns in my target table when loading from csv?
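A quick way to see where the extra rows came from is to group the loaded rows by the columns that should be unique within a file and look for duplicates (a file loaded twice, for example, will show every key twice). A sketch with hypothetical table and column names:

    -- Hypothetical names: adjust the grouping to whatever should be unique per csv row.
    SELECT  FileDate, KeyColumn1, KeyColumn2, COUNT(*) AS DuplicateCount
    FROM    dbo.CsvTargetTable
    GROUP BY FileDate, KeyColumn1, KeyColumn2
    HAVING  COUNT(*) > 1;

For question 2, the usual approach is to keep the flat-file column as text and convert it in the data flow (Data Conversion transformation or a Derived Column cast), so the destination column can be a real date type.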

Calling oracle statements from Script Task


Hi

I am executing Oracle statements from a Script Task in my SSIS 2008 package:

// Get the connection from the SSIS connection manager, then open a separate
// OleDb connection built from its connection string.
OleDbConnection oraConnection = Dts.Connections["ABCConnection"].AcquireConnection(Dts.Transaction) as OleDbConnection;

oraConnection = new OleDbConnection(oraConnection.ConnectionString);
oraConnection.Open();

// Run the count query and read the single scalar result.
DataSet ds = new DataSet();
OleDbDataAdapter adapter = new OleDbDataAdapter("select count(*) from emp", oraConnection);
adapter.Fill(ds);

Int32 dataCount = Convert.ToInt32(ds.Tables[0].Rows[0][0]);

oraConnection.Close();

My code is returning 0 records, but I have data in that table.

Can you please help me with this?

Thanks.

Populate and Import From Webform


Good morning, all -

I am working on an SSIS tool to import a csv file from a web source. The data to be imported is produced by a web form where I select the particular fields, the date range, and the source database.

Unlike other sites I import data from, this one does not retain the field selection from the prior day's run, so I have to go through the whole list of checkboxes and check the ones I need individually. This is tedious and error-prone: a checkbox may be mistakenly checked or a needed one missed.

Is there any way to have SSIS programmatically fill out the web form and then import the csv?

Thanx in advance for any assistance!

DTLoggedExec Util


I am working on a new logging framework and trying to move our current packages and processes away from hardcoded configs and connection strings. I've upgraded the DTLoggedExec utility to work with MSSQL 2012 and 2014 and would like to build a lot of my framework around it.

I like that it runs as an external process, is lightweight, and allows changes outside of the package.

The drawback is that right now it provides only two log providers (Console or CSV). I'd prefer to log to the MSSQL database directly so that, for a long-running process, I can leverage SQL/SSRS reports from a dashboard perspective. Currently I have to wait for the process to finish before I can load the files into the database.

Before I go down this road, has anyone else run into this situation or written their own custom LogProvider? Any input would be appreciated!


Employee Dimension Truncated every day in Data Warehouse


I am developing a new data warehouse, and my source tables for the employee dimension get truncated every day and reloaded with all history: updates, deletes, and new inserts.

The columns which track these changes are effective date & effective sequence. We also have an audit table which helps us determine which records were updated, inserted, and deleted each day by comparing today's table with the previous day's.

My question is how I can do an incremental load on the table in my staging layer so the surrogate key, which is an identity column, remains the same. If I truncate my final dimension, I get new surrogate keys each time, and that messes up my fact table.

I found a way to keep track of the identity column: DBCC CHECKIDENT ('.dimEmployee', RESEED, 1). Does this approach have any loopholes or scenarios it does not capture? I will also truncate and reload the fact table, which uses the employee key.
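For comparison, one way to avoid reseeding altogether is to keep the dimension persistent and apply the daily changes as a set-based upsert, so existing surrogate keys are never regenerated. A minimal T-SQL sketch with hypothetical object and column names:

    -- Hypothetical objects: stg.Employee is reloaded from the source every day,
    -- dbo.DimEmployee keeps its identity-based surrogate key (EmployeeKey).
    MERGE dbo.DimEmployee AS tgt
    USING stg.Employee    AS src
          ON src.EmployeeID = tgt.EmployeeID               -- business key
    WHEN MATCHED AND (src.EffectiveDate     <> tgt.EffectiveDate
                   OR src.EffectiveSequence <> tgt.EffectiveSequence) THEN
        UPDATE SET tgt.EffectiveDate     = src.EffectiveDate,
                   tgt.EffectiveSequence = src.EffectiveSequence
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (EmployeeID, EffectiveDate, EffectiveSequence)
        VALUES (src.EmployeeID, src.EffectiveDate, src.EffectiveSequence)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;                                             -- or flag the row as deleted instead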

How to check SSIS failed jobs error messages


Hi MSDN, I am unable to find the exact SSIS error message either from the job history or from the SSIS reports. Could you help me with this?
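If the packages are deployed to the SSISDB catalog, the detailed error text is usually easier to find by querying the catalog views directly than through the job history. A sketch (message_type 120 corresponds to errors):

    -- Most recent executions and their error messages from the SSIS catalog.
    SELECT TOP (50)
           e.execution_id,
           e.package_name,
           e.start_time,
           m.message_time,
           m.message
    FROM   SSISDB.catalog.executions         AS e
    JOIN   SSISDB.catalog.operation_messages AS m
           ON m.operation_id = e.execution_id
    WHERE  m.message_type = 120              -- 120 = Error
    ORDER BY e.start_time DESC, m.message_time;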



how to compare columns structure before processing files in ssis

Hi, I have a doubt in SSIS.

I want to load 3 different metadata files into 3 different tables every day using SSIS.
Day by day the file names arriving in the source path are different.

For sample files:
1st day the files are a.txt, b.txt and c.txt
2nd day the files are xy.txt, yz.txt and za.txt
3rd day the files come with some other different names.
The above 3 days are only sample file names; the names never come exactly the same, they are different each day.
The files also contain different columns for the 3 different files,
so I need to compare the column structure (source to target) before processing the source files using SSIS.


source path:  c:\test\
source path has files: 1) a.txt
                       2) b.txt
                       3) c.txt
a.txt file structure : id | name
b.txt file structure : id | name | deptno
c.txt file structure : id | name | location | sal | deptno
Target has 3 tables  : 1) emp      with structure : id | name
                       2) emp_dept with structure : id | name | deptno
                       3) emp_sal  with structure : id | name | location | sal | deptno

Sometimes the file names come in with different names; at that point, before processing, we need to compare the source column structure to the target column structure and then process the file.

Here are the steps I followed in SSIS:

In the control flow I dragged in one data flow task, and in the data flow I used 3 different flat file sources and created 3 different flat file connections. (A second way: I took 3 data flow tasks, and in each data flow task I created one flat file source connection and configured it to the destination table.)

Then I configured the 1st flat file source to the emp table, the 2nd flat file source connection to emp_dept, and the 3rd flat file source connection to emp_sal.

After that I used a File System Task to move the processed files into an archive folder.

It works fine when I run it on the first day.

If I run the same package the next day, the package fails because the file names come in different and the file names configured in the flat file connections no longer exist in the source path.

Day by day the file names are different. Can you please tell me how to compare the column structure before loading to the table, and how to decide which source file needs to load into which table? Please tell me how to achieve this task in SSIS.
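One way to decide which target table a given file belongs to is to read the file's header line (for example in a Script Task), then compare it against each target table's column list from the system catalog. A sketch using the sample tables above, assuming SQL Server 2017 or later for STRING_AGG:

    -- @HeaderLine stands in for the header read from the incoming file; the matching
    -- table (if any) is the one this file should be loaded into.
    DECLARE @HeaderLine nvarchar(4000) = N'id|name|deptno';

    WITH TargetColumns AS
    (
        SELECT TABLE_NAME,
               STRING_AGG(COLUMN_NAME, '|') WITHIN GROUP (ORDER BY ORDINAL_POSITION) AS ColumnList
        FROM   INFORMATION_SCHEMA.COLUMNS
        WHERE  TABLE_NAME IN (N'emp', N'emp_dept', N'emp_sal')
        GROUP BY TABLE_NAME
    )
    SELECT TABLE_NAME
    FROM   TargetColumns
    WHERE  ColumnList = REPLACE(@HeaderLine, N' ', N'');   -- ignore stray spaces around the delimiters

The data flow itself would still need one branch (or one child package) per distinct file structure, since data flow column metadata is fixed at design time.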




           


What should be the recovery model for SSISDB, and what data does SSISDB store?


Please provide answers to the following questions:

1. What should the recovery model configuration of SSISDB be? (See the query sketch after this list.)

2. What data does SSISDB store?

3. Is it required to back up SSISDB? If yes, what should be the method of storing the backup?
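For reference, SSISDB holds the deployed projects and packages, environments and parameter values, and the execution/operation history, so it is normally included in backups. The current recovery model can be checked and changed with a couple of plain statements; whether FULL or SIMPLE is appropriate depends on your restore requirements, and restoring SSISDB to another server also involves the database master key per the documented "Backup, Restore, and Move the SSIS Catalog" procedure. A sketch (the backup path is hypothetical):

    -- Check the current recovery model of SSISDB.
    SELECT name, recovery_model_desc
    FROM   sys.databases
    WHERE  name = N'SSISDB';

    -- Change it if point-in-time restore of SSISDB is not required.
    ALTER DATABASE SSISDB SET RECOVERY SIMPLE;

    -- A regular full backup of the catalog database.
    BACKUP DATABASE SSISDB TO DISK = N'X:\Backups\SSISDB.bak';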

While populating computed field data from a SharePoint list into a table through an SSIS package, getting an error


While populating computed field data from a SharePoint list into a table through an SSIS package, I get the error below:

cannot access a disposed object.
Object name:'System.Net.HttpWebResponse'.

Please let me know how to resolve this issue.

Thanks in Advance.

SSIS Data Flow Task hanging


I am busy running a package that brings in data from a number of MariaDBs. I built the package with SSIS and it has been running fine for a few months. Last Friday the package started hanging on one of the data flows. It normally hangs on one particular data flow (the first) but occasionally hangs on another. It doesn't always hang on the same table in each flow, and if you keep trying to run it, it eventually works. To test the issue I picked one table that it hangs on often and created a package to load just that table. Sometimes it hangs and sometimes it works. When it hangs it doesn't always hang at the same point. The data in the table doesn't appear to be an issue, but there may be something that I am missing.

There is no error message in the logging (see below). The output from the debugging in Visual Studio states: "SSIS package [path] finished: Canceled". Does anyone have any ideas as to how I can fix this?

Tried:

Change transactions from required to supported

Change validate external metadata to false

Running in visual studio and SSMS

Separating the load of the table into different tasks

Increasing the size of the columns that the data is inserted into

Setting delay validation to false
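One more thing that may be worth checking while the flow is in that state is whether the destination session is simply blocked on the SQL Server side, since lock waits on the target table can look exactly like a hang from the SSIS end. A minimal sketch to run against the destination server during a hang:

    -- Sessions that are currently blocked, and which session is blocking them.
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time,
           r.command,
           DB_NAME(r.database_id) AS database_name
    FROM   sys.dm_exec_requests AS r
    WHERE  r.blocking_session_id <> 0;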

last lines in the logging:

Execute phase is beginning.  
PrimeOutput will be called on a component. : 2 : ODBC Source : ODBC Source
Data flow engine will call a component method. : 2 : ODBC Source : PrimeOutput
Rows were provided to a data flow component as input. :  : 148 : ODBC Source Output : 87 : OLE DB Destination : 100 : OLE DB Destination Input : 2691 : OLE DB Destination : Paths[ODBC Source.ODBC Source Output] : OLE DB Destination.Inputs[OLE DB Destination Input] : 
Data flow engine will call a component method. : 87 : OLE DB Destination : ProcessInput
Data flow engine has finished a call to a component method. : 87 : OLE DB Destination : ProcessInput : 131988512852893389 : 131988512853483038
Rows were provided to a data flow component as input. :  : 148 : ODBC Source Output : 87 : OLE DB Destination : 100 : OLE DB Destination Input : 2691 : OLE DB Destination : Paths[ODBC Source.ODBC Source Output] : OLE DB Destination.Inputs[OLE DB Destination Input] : 
Data flow engine will call a component method. : 87 : OLE DB Destination : ProcessInput
Data flow engine has finished a call to a component method. : 87 : OLE DB Destination : ProcessInput : 131988512855951629 : 131988512856551276
Rows were provided to a data flow component as input. :  : 148 : ODBC Source Output : 87 : OLE DB Destination : 100 : OLE DB Destination Input : 2691 : OLE DB Destination : Paths[ODBC Source.ODBC Source Output] : OLE DB Destination.Inputs[OLE DB Destination Input] : 
Data flow engine will call a component method. : 87 : OLE DB Destination : ProcessInput
Data flow engine has finished a call to a component method. : 87 : OLE DB Destination : ProcessInput : 131988512861458464 : 131988512862327957
Rows were provided to a data flow component as input. :  : 148 : ODBC Source Output : 87 : OLE DB Destination : 100 : OLE DB Destination Input : 2691 : OLE DB Destination : Paths[ODBC Source.ODBC Source Output] : OLE DB Destination.Inputs[OLE DB Destination Input] : 
Data flow engine will call a component method. : 87 : OLE DB Destination : ProcessInput
Data flow engine has finished a call to a component method. : 87 : OLE DB Destination : ProcessInput : 131988512867584956 : 131988512869313954
Rows were provided to a data flow component as input. :  : 148 : ODBC Source Output : 87 : OLE DB Destination : 100 : OLE DB Destination Input : 2691 : OLE DB Destination : Paths[ODBC Source.ODBC Source Output] : OLE DB Destination.Inputs[OLE DB Destination Input] : 
Data flow engine will call a component method. : 87 : OLE DB Destination : ProcessInput
Data flow engine has finished a call to a component method. : 87 : OLE DB Destination : ProcessInput : 131988512872702027 : 131988512873901325


How to test for SUB (ctrl Z)


Hello,

I have an SSIS package that loads a text file into a table.  Interspersed in the text file are header lines which I am using a Conditional Split to test for.  The last line of the file (when I look at it in Notepad++) shows SUB.  How can I test for that in my Conditional Split in order to ignore it?


SSIS vs Azure Data Lake Analytics


Hi guys, 

Just a professional market job question: 

Do you think that Azure Data Lake Analytics (besides Azure Data Factory) is going to replace SSIS one day?

I am using SSIS to load data into Azure after having transformed it. Everything works, but the more I do, the more I think I need to update my knowledge of these tools.

SSIS All Execution Report #Error


Hello community,

I was troubleshooting some issues in an SSIS package in SQL Server 2016, but when I open the All Executions report, it shows me the following error on the report:



Does somebody know why this is happening and how to fix the issue? Sometimes I just wait or close SSMS and then everything starts working as usual, but I would like to have more information about this.

Regards,

Issues running SSIS package remotely


Hi all,

I have an SSIS package that runs fine when logged in directly to the Windows database server.

I am running Windows 2016 Server and SQL Server 2016.

However, if I try to run the same SSIS package remotely from my Windows PC with SSMS, it fails with "cannot write data file" errors.

OLE error: cannot open the data file "\\myshare\ssis\staging"

This is weird because my user account has full privileges and so forth. What can I check to solve this issue so that I can get the SSIS package to run remotely from SSMS on my PC?

Need help with specific scenario


I need to export spreadsheet data into a SQL table. For this purpose, I am using the Excel Source from the SSIS toolbox to read the data. Before I insert rows into the destination table, I would like to create the table first; its name is stored in a variable. How can I achieve this? I tried using the OLE DB Command, but there's no way I can provide a variable to the SQL Command parameter.
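One approach is to put an Execute SQL Task ahead of the data flow and build the CREATE TABLE statement from the variable, either through an expression on SqlStatementSource or with dynamic SQL as below. A minimal sketch; the table name value and the column list are only illustrative:

    -- @TableName stands in for the SSIS variable that holds the destination table name.
    DECLARE @TableName sysname = N'ImportedSheet1';
    DECLARE @Sql nvarchar(max) =
        N'CREATE TABLE dbo.' + QUOTENAME(@TableName) + N'
          (
              Id    int            NULL,
              Col1  nvarchar(255)  NULL,
              Col2  nvarchar(255)  NULL
          );';
    EXEC sys.sp_executesql @Sql;

The OLE DB Destination in the data flow can then pick up the same variable through its "Table name or view name variable" data access mode.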

Can someone please suggest an approach?

Invalid character value for cast specification


I have a Data Flow task that simply copies data from table A to table B (there is a Derived Column transformation in between, however).

All the columns in table A are varchar(255). The columns in table B vary: floats, decimals, etc.

However, I am getting a bazillion errors when I run this, each of them to the effect that:

Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available.  Source: "Microsoft SQL Native Client"  Hresult: 0x80004005  Description: "Invalid character value for cast specification". An OLE DB record is available.  Source: "Microsoft SQL Native Client"

I don't understand why this is happening... I've successfully run similar scenarios with other packages, i.e. moving data from one table to another even though the tables have different data types.
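To narrow down which values cannot be cast, one option is to probe the varchar source columns with TRY_CONVERT (available from SQL Server 2012 on). A sketch with hypothetical names, using float as the example target type:

    -- Rows whose text value cannot be converted to the destination type.
    SELECT *
    FROM   dbo.TableA
    WHERE  SomeNumericColumn IS NOT NULL
      AND  TRY_CONVERT(float, SomeNumericColumn) IS NULL;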

 

Help

Thanks
