Channel: SQL Server Integration Services forum
Viewing all 24688 articles

Excel decimal values truncated in SSIS package


Hi All,

I'm facing a decimal-value truncation issue in my package. I'm using Excel 2010, and one of the columns in the Excel file contains both string and numeric values.
Excel displays only the rounded values, but when you click a cell it shows 8 decimal places. SSIS picks up only the rounded values and fails to retrieve the full decimal values. In my case I need all 8 decimal places in my SQL table. Please suggest a way to resolve this issue.

I tried adding and removing IMEX=1 in the Excel connection properties. It didn't work either way.
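One common workaround (a suggestion, not part of the original post) is to force the driver to read the mixed column as text by combining IMEX=1 with a full-sample TypeGuessRows setting, then casting back to a numeric type in the data flow. A sketch of the connection string, assuming the ACE provider and a hypothetical file path:

```
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\input.xlsx;
Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1";
```

IMEX=1 only helps when the driver's row sampling actually sees mixed types, so the TypeGuessRows registry value (under the Access Connectivity Engine\Engines\Excel key) may also need to be set to 0 to sample all rows. A Derived Column expression such as `(DT_NUMERIC,18,8)ColumnD` (column name hypothetical) can then restore the full 8 decimal places.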

SSIS package:  



Excel Sheet: Column D has string and numeric values


Oracle Destination - standard edition


I need a free Oracle destination component for SSIS that works with Standard Edition. I know of the components from CozyRoc and Attunity, but those are either paid or Enterprise-only. Does anyone know of any?

 

SQL 2012 - Connection Manager not listed in Source Assistant


Hi,

Using SQL Server 2012: I started a new project, created an ADO.NET connection manager, created a Data Flow task, and in the Data Flow tab added a Source Assistant. When prompted to choose a Connection Manager, my existing connection manager is not listed. Why not?

thanks!

Martin

Installation question with different SQL versions

I have a physical SQL Server machine with both SQL 2012 and SQL 2016 installed. I had a request to install Integration Services on the server, so I installed the 2016 version. Now the vendor tells me I need it for the 2012 version. Some research online turned up a post saying I should not install an earlier version when a later version is already installed. My thought was to uninstall the 2016 Integration Services and install the 2012 version, but I am not sure that is the correct course of action. I am not a SQL admin (we do not have one at my company) but have been tasked with this. Any suggestions would be greatly appreciated.

Changing the data type of a column in an OData data source


Hi,

I'm trying to import an OData stream via an OData Source component. The stream contains a column with decimals in it (like 0.42). The OData Source component defines this column as DT_NUMERIC with precision 38 and scale 0.

For some reason the decimal part of the number is cut off during the import. I have tried a Derived Column with a type cast to decimal(38,2) and a Data Conversion transform.

I also tried changing the data type at the most logical place for me: right at the source, on the external columns of the OData Source component, but SSIS displays the error "Property value is not valid" and then:

Error at Retrieve Werkorders [TDO Werkorders [246]]: The data type of output columns on the TDO Werkorders cannot be changed, except for DT_WSTR and DT_NTEXT columns.

Error at Retrieve Werkorders [TDO Werkorders [246]]: System.Runtime.InteropServices.COMException (0xC020837D)
   at Microsoft.SqlServer.IntegrationServices.OData.ODataSource.SetOutputColumnDataTypeProperties(Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostSetOutputColumnDataTypeProperties(IDTSManagedComponentWrapper100 wrapper, Int32 iOutputID, Int32 iOutputColumnID, DataType eDataType, Int32 iLength, Int32 iPrecision, Int32 iScale, Int32 iCodePage)"
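The error text itself hints at a possible workaround (a suggestion, not from the original post): the OData Source does allow output columns to be changed to DT_WSTR. One option is therefore to set the column to DT_WSTR at the source, so no precision is lost on the way in, and cast it back afterwards in a Derived Column. A sketch, with WerkorderAmount as a hypothetical column name:

```
(DT_NUMERIC,38,2)WerkorderAmount
```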

I have been looking at this for over a day now. Does anyone have a clue?

Thanks in advance. Regards,

Peter de Hoog


Get part of the Excel data in SSIS.

I have created an SSIS package. In the Excel source, I want to read only part of the data. How can I achieve this?

How to handle errors in the execution of an SSIS package?

In my package, I have an Execute SQL Task and a Data Flow Task.
If the Execute SQL Task fails, I do not want the Data Flow Task to execute.
Please advise. Thank you.

Date field conversion issue in SSIS when integrating from DB2 to SQL


I am working on data integration between DB2 and SQL Server using an SSIS package. A SQL query fetches the records, and I do a lookup against a DB2 query where the mapping is on a column of the Date data type. In SQL Server the field is a date, but DB2 always returns it as a string. I tried converting the field to a date in DB2, but it still comes back as a string. I am new to DB2 and not sure how to cast the field so that both sides of the mapping are dates. The conversion has to happen in the lookup query, so I can't use a Data Conversion transform. I tried this syntax:

CAST(DATE_ISSUED AS DATE) AS DATE_ISSUED. When I check the output columns in the Advanced Editor, this field still shows as a string even though I cast it as a date.

Why does the SSIS Advanced Editor always show a date field coming from DB2 as a string?

My source (SQL) query has to do the lookup against DB2 because I have a date-range condition. The reverse will not work for me: DB2 holds a huge amount of data I don't need, so I only want to pull the rows that match my source query.
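Two things may be worth checking here (suggestions, not from the original post). First, SSIS caches column metadata, so after changing the lookup query the metadata usually has to be refreshed before the Advanced Editor reflects the new type. Second, some DB2 providers only return a true DATE when the query forces it; the standard DB2 DATE() scalar function can do that in the lookup query. A sketch, with hypothetical schema, table, and key-column names:

```sql
SELECT key_col,
       DATE(DATE_ISSUED) AS DATE_ISSUED  -- force a DATE result, not a string
FROM   myschema.mytable
```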

Any help is highly Appreciated !!






Why is it impossible to get PHP+SQL Server+IIS working on Windows 10 PC?

In my case, I'm on Windows 10 Pro, trying to get PHP and MSSQLSERVER to run with IIS. I tried everything on https://docs.xxxx and failed. I then got here: https://www.microsoft.com/en-us/sql-server/developer-get-started/php/windows, followed the steps, and still failed. Now I have reset my PC and am hoping to find help here.

Chieki


After upgrade from SQL Server 2016 to 2017 problem with File System Task (Delete file) with error file being used by another process


Hi,

I have an Agent job with two SSIS packages. The first package creates a couple of files, and the second package imports the files into the database and then deletes them. The files are placed on a share.

The package was working fine with SQL Server 2016 and had been running for a couple of years without any issues.

This week I upgraded the server to SQL Server 2017, and now the same packages fail about 3 times out of 10.

The error occurs in the second package, when the file should be deleted. The error message is: "Delete Stage File: Error: An error occurred with the following message: "The process cannot access the file '\\share\filearea\indata\file.raw' because it is being used by another process.".

I can see in procmon.exe that only ISServerExec.exe accesses the file when the SQL Agent job runs the SSIS packages. It seems the ISServerExec.exe process from the first SSIS package in the job is not releasing the file before the second ISServerExec.exe process tries to delete it.

I have seen others hit the same error and work around it with delays in the package. However, my packages worked without issues for a long time; the problem only started after the upgrade from 2016 to 2017.

Has anyone seen a similar issue after an upgrade, or when running on SQL Server 2017?

Stefan

Split the file


My SSIS package downloads text files from FTP, for example 5 files.

After the download I want to split any large file into smaller files. If a file's size is more than 600 MB, I want to split it into 6 files.

Please suggest whether any task in SSIS can do this, or any other way to achieve it.
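SSIS has no built-in file-splitting task, so one option (a suggestion, not from the original post) is to call an external splitter from an Execute Process Task or a Script Task. Where the standard Unix split utility is available (for example via Git Bash or WSL on Windows), a minimal sketch with a hypothetical file name is:

```shell
# Split big.raw into chunks of at most 100 MB each; the pieces are
# written as part_aa, part_ab, ... in the current directory.
split -b 100m big.raw part_
```

The chunk size passed to -b could be computed from the downloaded file's size (for example one sixth of it for a 600 MB file) before invoking the command.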

 

Unable to create an Excel Connection Manager: not recognized as a valid connection type.


Suddenly, after the most recent Microsoft update (the day before yesterday), SSIS no longer recognizes Excel as a valid connection manager type. The SSIS packages had been working well for over a year. Here is the error message:

TITLE: Microsoft Visual Studio
------------------------------

The new connection manager could not be created.

------------------------------
ADDITIONAL INFORMATION:

The connection type "EXCEL, {8FD37A45-01A4-210C-6C6D-575139717076}" specified for connection manager "{F118BC96-4456-4B60-A69F-1E69A7BACCFF}" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.

------------------------------

I tried creating a new Excel connection manager, and that didn't work either.

I'm running SQL Server 2017 and the most current version of SSDT (just downloaded and installed as an attempt to fix this issue).  The machine is 64-bit Windows 2012 R2.

I'm running Microsoft Office 365, but again, things have been running perfectly until today with SSIS accessing Excel.

My guess is that something, perhaps in yesterday's Windows update, corrupted whatever it is that SSDT needs to connect to an Excel data source, but that's only a guess.

Any possible solution?  Is there something I need to reinstall?  Please advise.



Why SSIS


Why SSIS?

 

Is there a need for it?? The question is where and in which circumstances, isn't it??

 

I have been around BI for quite some years and stumbled into MS conferences, where MS and others have tried to convince my poor brain. But every time they have failed, so I started to wonder if the fault was mine. Have I grown too old???

 

Having recently installed a lot of new software on my hard drive, including SSIS, I decided to give it a new chance. I also invested a lot of time in webcasts where convincing people tried to get me going...

 

Still I wonder: is there a need for it? I am not capable of seeing it, so please help!! Have I gone blind or mad?

 

All I have done so far in my life with SQL should now be done in SSIS? And I start asking why! Is it better? (No.) Is it faster? (Not for me, but maybe for the computer.) Is it more maintainable? (No, just more messy.) Is it graphical? (Yes, but that doesn't mean easier to use than T-SQL. I get totally confused when it comes to context and overview. When I need to see a detail, I have to investigate it with at least five clicks and try to find it in a lot of hidden places, where plain code makes much more sense.)

 

AND when I look at a 'Showplan' in T-SQL, it looks like the very thing I am now expected to construct manually...

 

SQL = Explain what you want

SSIS = Explain how to do it, i.e. a Showplan.

 

Am I right or am I wrong?

 


Stopping a Data Flow Task quickly when Excel source columns aren't correct?


I have a Data Flow Task consisting of a source Script Component that reads an Excel file. That data is fed into a transformation Script Component that verifies the columns are correct, reorders them, and assigns them to the destination columns. The data then runs through a conversion task and on to the destination component.

When the transformation Script Component looks at the first row and detects an invalid file, I want to abort the whole process.

The problem is that returning immediately from Input0_ProcessInputRow() takes just as long as importing a valid file. Nothing I do speeds up exiting the execution of the package. Even throwing an exception as the first statement of Input0_ProcessInputRow() doesn't speed it up.

What can I do to make error feedback faster?

Getting a Derived column error


Hello,

I created a package and after writing 89,478 rows to the destination I am getting an error message.

Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "Derived Column.Outputs[Derived Column Output].Columns[bill_dt_Derived]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

I checked the source file and there are no blanks or NULL values. The destination column type is smalldatetime and my source column type is string.

This is how I build my derived column (I take only the date part from the source):

(DT_DBDATE)TOKEN(bill_dt," ",1)
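A defensive variant of this expression (a suggestion, not from the original post) guards the cast so that values that are empty after trimming become NULL instead of failing the component; alternatively, the error row disposition on the derived column can be set to redirect bad rows rather than fail. Using the column name from the post:

```
TRIM(bill_dt) == "" ? NULL(DT_DBDATE) : (DT_DBDATE)TOKEN(bill_dt," ",1)
```

Redirecting the error output to a file or table would also reveal exactly which row after the first 89,478 does not parse as a date.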

Thanks,

Ali.

Import Excel file


I have two multi-sheet Excel files with the same content but different extensions: PersonalInformation.xls and PersonalInformation.xlsx. Each file has two sheets, named People and State. I use two separate, parallel tasks to count the number of records in each sheet and show the result in a message box.

When I set PersonalInformation.xls in the Excel connection manager and run the package, execution completes and a message box shows the number of records in each sheet, People and State.

When I set PersonalInformation.xlsx in the Excel connection manager and run it, the package keeps executing and no message box is ever shown.

The images below show the case of reading the rows of each sheet in PersonalInformation.xlsx.

Where is the problem? Why does the second package not complete?


[Execute SQL Task] Error: The value type (__ComObject) can only be converted to variables of type Object.


Hi, we run 2017 Standard. The tail end of my single-row result-set query is shown below in the code block, slightly anonymized. I'm getting an error mapping the output (command) to an SSIS string variable called extractCommand, presumably because of the varchar(max) portion. This error follows the first in the progress pane.

[Execute SQL Task] Error: An error occurred while assigning a value to variable "extractCommand": "The type of the value (DBNull) being assigned to variable "User::extractCommand" differs from the current variable type (String). Variables may not change type during execution. Variable types are strict, except for variables of type Object.
".

The query being generated dynamically ends up close to 21k bytes long for 22 locations. The generating query runs fine in SSMS, and the generated query runs fine in the database where it is intended to run.

It's my understanding so far that SQL won't even run this unless the first item in the string_agg function has a data type big enough to hold the result (thus the casts). The generated query ends up looking something like this; the number of conditions in parentheses separated by "or" is not necessarily the same for each location...

select ... from location1_replicated.dbo.tbl where (id=... and dt >... and dt <= ...) or (...) or (...) union all select ... from location2_replicated.dbo.tbl where (...) ....

The compound predicates were built separately, one row for each location in #predicates.

Does anybody know of an elegant way around this problem?

select substring(command, 12, len(command) - 11) command
from
(
    select string_agg(
               cast(' union all select ' as varchar(max)) + '''' + a.loccode + '''' +
               ' loccode,* from ' +
               case when a.loccode = 'abc' then 'xyz' else a.loccode end +
               '_replicated.dbo.tbl where ' + b.[predicate], '') command
    from #whichlocs a
    join #predicates b
      on a.loccode = b.loccode
) x
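One likely explanation (a suggestion, not from the original post): if the inner query produces no rows, or the aggregate is NULL, substring() returns NULL, and the Execute SQL Task then tries to assign DBNull to the String variable, which raises exactly the reported error. Coalescing the final value to an empty string guarantees a String is always returned:

```sql
select isnull(substring(command, 12, len(command) - 11), '') as command
from
(
    -- inner string_agg query unchanged
    ...
) x
```

The other common workaround is to map the result set to a variable of type Object instead of String and convert it afterwards.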

 

Retain connection, avoid re-loading data in ForEach Loop Container while processing in-memory


I have been scratching my head over this design for quite some time now and I think it is time to ask for help.

I have designed an SSIS package which uses a ForEach Loop Container (FELC) to enumerate over an ADO recordset. Inside the FELC, I have a flat file source, a conditional split, an aggregate transform, and a destination. For each enumeration, the source loads the same flat file and uses the enumerated values to split the table before aggregation.

For performance purposes, I would like to (1) retain the connection in the FELC to avoid creating it at each enumeration, (2) avoid fetching the same input table repeatedly, and (3) keep the table in memory for processing.

To retain the connection, I understand that I could use other sources such as OLE DB, ADO.NET, or Excel with the option "RetainSameConnection". However, these will not satisfy my constraints of processing in memory or not reloading the data at each enumeration.

To keep data in memory I have found two potential solutions: send the flat file table to a recordset or to an in-memory table. With the in-memory table and an OLE DB source, the connection would be retained, but I have not found a way to avoid fetching the table at each enumeration.

As for the recordset, I could load the flat file into a recordset before the FELC and use the recordset as the source, via a script component, within the FELC. That does look like the best solution.

My questions:

(1) Am I missing any potential solutions that satisfy my 3 constraints? I am flexible on the type of FELC, or even on another approach than a FELC, as long as performance is high.

(2) Is there a way to avoid a source being reloaded at each enumeration in a FELC?

(3) If I use a script component and a recordset, can it satisfy my constraints of (a) retaining the connection and (b) avoiding loading the data at each enumeration? If yes, how do I achieve that in the script?

Many thanks in advance,


