Channel: SQL Server Integration Services forum

SSIS package with dynamic database connection


Hi,

I am trying to fetch data from a table in each of 10 tenant databases dynamically. I have used a Foreach Loop Container to iterate over the databases one by one and load the data into a destination table.

Please find below a screenshot of the package.


I have created the variables "Database Name", "Server Name", "UserName", and "Password" and set their values. In an Execute SQL Task I fetch all the tenant database names. In the "Database Name" variable I have set a default database. The package runs successfully the first time, but on the second iteration of the Foreach Loop Container it fails and gives this error:

"[OLE DB Source [93]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Dabasename" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed."

I have read many threads saying to set DelayValidation to True at the Data Flow and package level. I did that too. I also deleted the old connection manager and created a new one. I am still getting this error. Is there any other way to sort out this issue?
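For reference, a minimal sketch of the kind of statement the Execute SQL Task could use to collect the tenant database names; the LIKE filter is only an assumption about how the tenant databases are actually named:

-- Sketch: list the tenant databases for the Foreach Loop to iterate over.
-- The name filter is an assumption; adjust it to the real naming convention.
SELECT name
FROM sys.databases
WHERE name LIKE 'Tenant%'
ORDER BY name;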


Not able to view the Excel sheets in the Excel Connection Manager in SSDT 2017


Hi All,

I am using Windows 10 and Visual Studio 2019.

I am using SSDT (15.9.28307.960) for VS 2017.

For the Excel Connection Manager I am getting this error: "SSIS Excel Connection Manager failed to Connect to the Source".

I uninstalled my Office 365 and the 64-bit AccessDatabaseEngine.exe, and installed the 32-bit AccessDatabaseEngine.

I tested in SSDT 2017 and it is working fine.

After that I am not able to install Office 365 and the 64-bit AccessDatabaseEngine; the installer says there is a 32-bit Office product present.

So I tried via the command prompt with the /passive parameter; it still fails.

Hence I need to remove the 32-bit engine and install the 64-bit one to get my Office 365 running again.

Does anybody have a solution to this?

thanks



xp_cmdshell problem

I'm trying to start an SSIS package by using xp_cmdshell.
When I execute the statement it just hangs.

To find the source of the problem I've done the following:
exec master.dbo.xp_cmdshell 'dir c:\' : no problem
exec master.dbo.xp_cmdshell 'c:\testdir.bat' (where testdir.bat contains a "mkdir c:\test" command) : no problem

I put the dtexec command with all its parameters in a testdtexec.bat file on the server and executed it from the server: no problem

When I then run the command:
exec master.dbo.xp_cmdshell 'c:\testdtexec.bat'
-> a "cmd" process is started (under a user that has admin privileges)
-> a "dtexec" process is started (under the same user that runs SQL Server Agent)
and then it hangs forever
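For illustration only, this is roughly the pattern being used, written so that dtexec's console output is captured into a table and can be inspected once the call returns; the package path and options below are placeholders, not the real ones:

-- Sketch: run dtexec through xp_cmdshell and keep its console output.
-- The package path and the /Rep option are hypothetical placeholders.
DECLARE @output TABLE (line nvarchar(4000));

INSERT INTO @output
EXEC master.dbo.xp_cmdshell 'dtexec /F "C:\packages\MyPackage.dtsx" /Rep E';

SELECT line FROM @output WHERE line IS NOT NULL;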

Any hints would be appreciated

 

Is there a way to execute specific containers inside a package in Integration Services Catalogs or in SSIS?

Hi All,
I have daily ETL routines, and sometimes a package fails to run. If the failure occurs only in a specific container within my package, is there a way to rerun just the failed container rather than the entire package? It could be done in the Integration Services Catalog, with SSMS, or directly in SSIS.
Can anybody help me?

MS SQL Server Latest Certification Exam Codes


Hi,

Can you please share the latest MS SQL Server certification exam codes? I have seen 70-761 and 70-762; are these exams related to the MS SQL Server 2016 or 2019 version?

Thank you.

Error : System.ArgumentException: Value does not fall within the expected range. Running the rebuild process of the project.


Hi experts,

I wonder if anybody can give me an idea of how to get rid of this issue.

When I try to rebuild the project, it runs for a while but in the end gives me this error:

Applied active configuration to 'Project.params'.
Error : System.ArgumentException: Value does not fall within the expected range.
   at Microsoft.SqlServer.Dts.Runtime.Interop.ProjectInterop.ReferencePackage(Package package, String packageLocation)
   at Microsoft.SqlServer.Dts.Runtime.PackageItem.Load(IDTSEvents events)
   at Microsoft.SqlServer.Dts.Runtime.PackageItem.get_Package()
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.IncrementalBuildThroughObj(IOutputWindow outputWindow)
   at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.BuildIncremental(IOutputWindow outputWindow)

Build complete -- 1 errors, 0 warnings
========== Rebuild All: 0 succeeded, 1 failed, 0 skipped ==========

Is there any way I can resolve this issue?

Thanks a lot in advance.

SQL Server: Overflow of float data type during simple transfer operation...


I have an SSIS package that copies data from one database to another. The tables are created using an identical script and contain a number of float columns with default precision.

Is it possible for an overflow error to occur while copying the data because a value in the source column (float) is larger than the space available in the target column (also float, with the same default precision)?

SSIS 2012 Project Deployment Wizard "Out of memory" Exception


Hello,

I encounter an "Out of Memory" exception when trying to deploy SSIS projects to the SSISDB catalog using the deployment wizard invoked through SSMS. A screenshot is attached. The error occurs at the "Changing protection level" step, and when I click the "Failed" status, a pop-up message shows the out-of-memory exception. This happens when large packages (combined dtsx file size over 50 MB) are in the project, although the *.ispac file is under 3 MB.

I notice that I can successfully deploy the same project (*.ispac file) using the 64-bit deployment wizard, which I have to invoke manually (C:\Program Files\Microsoft SQL Server\110\DTS\Binn\ISDeploymentWizard.exe). I tested this with increasing dtsx file sizes, and when the combined dtsx file size reaches around 500 MB, the 64-bit deployment wizard also fails.

My desktop is Windows 7 64-bit and has 8GB RAM.

I would like to hear from others who have encountered the same error and whether they have found workarounds.

Thanks



Find dataflow task taking the longest


Hi All,

I have a data-flow task with several transformations. It runs for 4 hours.

Is it possible to find which transformation is taking the longest?

I am looking for a query that I can run against the SSIS catalog database (SSISDB).
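As a starting point, a sketch of the kind of query that can be run against SSISDB. It assumes the package is executed from the catalog with logging level Performance or Verbose (otherwise catalog.execution_component_phases is not populated); the execution id below is a placeholder.

-- Sketch: total active time per data flow component for one catalog execution.
-- Requires logging level Performance or Verbose; the execution id is a placeholder.
DECLARE @execution_id bigint = 12345;

SELECT  package_name,
        task_name,
        subcomponent_name,
        SUM(DATEDIFF(ms, start_time, end_time)) AS active_time_ms
FROM    SSISDB.catalog.execution_component_phases
WHERE   execution_id = @execution_id
GROUP BY package_name, task_name, subcomponent_name
ORDER BY active_time_ms DESC;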

TIA


Perform transformation on No-SQL database using ETL


Hi,

I want to perform transformations on Cosmos DB data and build a data warehouse using an ETL tool such as SSIS or Azure Data Factory.

Are there any specific scenarios where we can use ETL with a NoSQL database?

Thanks,

[Execute SQL Task] Error: The value type (__ComObject) can only be converted to variables of type Object.


Hi, we run SQL Server 2017 Standard. The tail end of my single-row result-set query is shown below in the code block, a bit anonymized. I'm getting an error mapping the output column (command) to an SSIS string variable called extractCommand, presumably because of the varchar(max) portion. This error follows the first one in the progress pane.

[Execute SQL Task] Error: An error occurred while assigning a value to variable "extractCommand": "The type of the value (DBNull) being assigned to variable "User::extractCommand" differs from the current variable type (String). Variables may not change type during execution. Variable types are strict, except for variables of type Object.
".

The dynamically generated query ends up being close to 21K bytes in length for 22 locations. The generating query runs fine in SSMS, and the generated query runs fine in the database where it is intended to run.

It's my understanding so far that SQL won't even run this unless the first item in the STRING_AGG function has a data type big enough to hold the result (hence the casts). The generated query ends up looking something like this, and the number of parenthesized conditions separated by "or" is not necessarily the same for each location...

select ... from location1_replicated.dbo.tbl where (id=... and dt >... and dt <= ...) or (...) or (...) union all select ... from location2_replicated.dbo.tbl where (...) ....

The compound predicates were built separately, one row for each location in #predicates.

Does anybody know of an elegant way around this problem?

select substring(command, 12, len(command) - 11) as command
from
(
    select string_agg(
               cast(' union all select ' as varchar(max))
               + '''' + a.loccode + ''''
               + ' loccode,* from '
               + case when a.loccode = 'abc' then 'xyz' else a.loccode end
               + '_replicated.dbo.tbl where '
               + b.[predicate], '') as command
    from #whichlocs a
    join #predicates b
        on a.loccode = b.loccode
) x

 

SSIS Failed to Establish Connection with Attunity Teradata connector 5.0 on VS2017 Prof


Hi All,

I need advice on the Attunity Teradata connector 5.0 with VS 2017 Professional.

Environment:

OS: Windows 10

Microsoft Visual Studio Professional 2017 with SSDT 15.8.0

Attunity Teradata connector 5.0 installed on the PC, both 32-bit and 64-bit.

When I try to create a connection manager with the Attunity Teradata connector, I get the error below:

===================================

Error at MyTeradatabse [Connection manager "Teraconn"]: Failed to establish an ODBC connection with the database server. Verify that the Teradata ODBC Driver for Windows x86 is installed properly. SqlState = IM002 Message = [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

 (Microsoft Visual Studio)

===================================

Test connection failed

===================================

Error HRESULT E_FAIL has been returned from a call to a COM component. (Microsoft.SqlServer.DTSRuntimeWrap)

------------------------------
Program Location:

   at Microsoft.SqlServer.Dts.Runtime.ConnectionManager.AcquireConnection(Object txn)
   at Attunity.IntegrationServices.DataFlowUI.TeraDataUI.TeraConnectionDialog.testConBtn_Click(Object sender, EventArgs e)

===================================

Error HRESULT E_FAIL has been returned from a call to a COM component. (Microsoft.SqlServer.DTSRuntimeWrap)

------------------------------
Program Location:

   at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManager100.AcquireConnection(Object pTransaction)
   at Microsoft.SqlServer.Dts.Runtime.ConnectionManager.AcquireConnection(Object txn)

I have tried to test the connection with Run64BitRuntime = False and still could not connect. I would really appreciate any advice.

Jay

Character set issue


I have an SSIS package which uses the Oracle Provider for OLE DB to retrieve data from an Oracle database.
One of the columns that I'm selecting contains coordinate (polygon) information. It comes across as a CLOB (DT_NTEXT) in a text polygon representation: POLYGON((-1 1, -3 3, -1 1)). (That's just an example; the actual values I'm receiving have many more points, and the strings are very long.)
When I run the package on my local machine it works fine and shows the polygons as described above.
When I deploy my package to another server, the values come out like this: 䰀夀䜀 (Chinese characters...?)
The package is reading the data from the same Oracle source.
Any ideas on how it can be fixed? I cannot convert the column itself to another format, since it can be very long (more than 8,000 characters).

Problems with Excel Formatting in SSIS Output


Hi,

I'm in the process of upgrading a lot (100+) of SSIS packages to a newer version of Visual Studio. While upgrading I am also taking the opportunity to update any packages using xls files to xlsx.

Most of the jobs take data from a SQL query, export it to Excel, and email the file to end users, so formatting is a key aspect. I'm not bothered about fonts/colours/borders etc., but what is important is that numbers are exported as numbers and that dates are sometimes required to be date/time, just date, or just time.

I had previously controlled this by using a template xls file with a hidden dummy row under the header containing example formatting. When the data was exported, the process would "magically" follow the formats set out in the dummy row. Now that I've moved over to xlsx this only partially happens. If I put a date in the dummy row, the export brings it out as a date (rather than text; not using the dummy row at all dumps everything out as text values that don't sort or filter correctly), but it's always in DD/MM/YYYY format, even if the dummy row specified DD/MM/YYYY HH:MM. xls files worked perfectly in this regard. Even pre-formatting the columns the way I would like doesn't have the desired effect, as the export process ignores any pre-existing formats.

It's looking like my only option is to roll back to using xls, which feels like a step backwards I wasn't really wanting to take. Is anyone able to give me any advice on how best to control formatting when outputting to Excel? I've seen suggestions on other threads about adding scripts to format the data after export, but doing this to 100+ packages isn't realistic when it's actually not adding much value over the process I currently have with xls files.

Regards,

Andy

Mapping numeric to nvarchar eliminates leading zero


I am working with SSIS in Visual Studio 2015 and SQL Server 2017 (14.0.1000.169). The problem I'm facing occurs when mapping a numeric data type column in SSIS to an nvarchar column in a SQL table. The value of the column in SSIS does contain a leading zero (0.07), but when mapped to the table the leading zero is eliminated (.07).

When I look at the data viewer just before the insert into the table, the values are fine (0.07, 0.08).

The metadata just before the insert into the table is also fine; both columns are DT_NUMERIC (4,2).
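One possible workaround, sketched here under the assumption that the conversion can be moved into the source query (table and column names are hypothetical): SQL Server's own numeric-to-string conversion keeps the leading zero, so the column arrives in SSIS already as a string.

-- Sketch: convert to a string in the source query instead of letting SSIS
-- do the numeric-to-string conversion. Names are hypothetical.
SELECT CAST(amount AS nvarchar(10)) AS amount_text   -- gives '0.07', not '.07'
FROM dbo.SourceTable;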

Please advise, and thank you in advance. Best Regards.


How to solve the SSIS error "component OLE DB destination has no input or all of its input are already connected to other output"?


problem

How do I solve the error "component OLE DB destination has no input or all of its input are already connected to other output" in SSIS?

I have an Excel 2007 sheet and SQL Server 2012 with Visual Studio 2010.
I have already finished the SSIS package, but when I execute the package no data is imported from the Excel sheet to SQL Server 2012, although the Excel sheet has rows.
When I double-click the Excel source I can see the data inside the Excel sheet.
When I click the OLE DB destination I can see the data inside the table in SQL Server 2012.

So what do I do to solve this error, please?

Also, no data is imported into the SQL Server table.

What I have tried:



CREATE TABLE [dbo].[EMP_A](
    [Id] [int] NULL,
    [Name] [nvarchar](500) NULL,
    [Dept] [varchar](10) NULL
) ON [PRIMARY]

GO

SET ANSI_PADDING OFF
GO

ALTER TABLE [dbo].[EMP_A] ADD  DEFAULT ('IT') FOR [Dept]
GO


Upgrading an SSIS package - with database collation change


Hi, 

We have an ERP database and, based on that, a DW ETL process. It loads data from the ERP to the DW.

Here is the issue: we have upgraded the ERP and would like to continue the DW load. However, the ERP database has moved from non-Unicode to Unicode. This causes issues in the SSIS package, which complains about code page mappings.

Can I somehow force SSIS to ignore Unicode and keep it Latin2 or the like?

I would like to avoid touching the packages, which are huge.

tx,

gabor

If the first task fails, the next task should continue to run.

I am new to SSIS. I have two tasks in my package.
I executed this package. The first task failed, and the next task did not execute.
But I want to achieve the following result:
if the first task fails, the next task should still run.

How to split a single column into multiple columns using SSIS


Hi Team,

Is it possible to split a single column into multiple columns using an SSIS package? Please help with the same.

For example:

Input data:

historyId      remarks
197     OldBalanceAmount: 916.4413 || OldHoldAmount: 0.0000 || oldFundAmount: 0.0000 || oldTemporaryAmount 0.0000 ||NewBalanceAmount: 916.4413 || NewHoldAmount: 0.0000 || NewFundAmount: 0.0000 || NewTemporaryAmount 15000.0000

Output (one output row, shown here as column name / value pairs):

historyId: 197
OldBalanceAmount: 916.4413
OldHoldAmount: 0
oldFundAmount: 0
oldTemporaryAmount: 0
NewBalanceAmount: 916.4413
NewHoldAmount: 0
NewFundAmount: 0
NewTemporaryAmount: 15000
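A minimal sketch of one way to do the split on the SQL side, assuming SQL Server 2016 or later (for STRING_SPLIT) and hypothetical names dbo.History(historyId, remarks). Each name/value pair lands on its own row and could then be pivoted into columns; the same parsing could also be done in an SSIS Script Component instead.

-- Sketch: break the remarks string into one row per name/value pair.
-- '||' is first collapsed to a single '~' because STRING_SPLIT needs a
-- one-character separator. Table and column names are hypothetical.
SELECT  h.historyId,
        LTRIM(RTRIM(s.value)) AS name_value_pair
FROM    dbo.History AS h
CROSS APPLY STRING_SPLIT(REPLACE(h.remarks, '||', '~'), '~') AS s
WHERE   LTRIM(RTRIM(s.value)) <> '';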


Thanks Bala Narasimha

Referencing Foreign Keys using LookUp Transformation


Hello everybody,

I'm asking for help on the following topic. For my Data Warehouse, I have the following database structure (star schema):

1: Dimension Table: Employee
Columns: emp_id (PK), emp_number, emp_name, and a few more

2: Dimension Table: Project
Columns: pro_id (PK), pro_name, pro_orderNumber, and a few more

3: Dimension Table: Time
Columns: time_id (PK), month, year, date (dd-mm-yyyy)

4: Fact Table: Hours Worked
Columns: emp_id (FK), pro_id (FK), time_id (FK), totalHoursWorked

Also, I have two Excel sheets as source data. The first one (I call it S1, ~300 rows) contains the details for the projects and employees, and the second one (S2, ~7000 rows) contains the number of hours worked per project per employee, with a timestamp.

I want to insert the number of hours each employee worked on each project at a given timestamp into the fact table, referencing the existing primary keys in the dimension tables. If an entry is not already present in the dimension tables, I want to add a new entry first and use the newly generated id.
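For illustration only, the intended fact load corresponds roughly to the set-based query below; dbo.Staging is a hypothetical table holding the combined Excel rows, the dimension and fact table names are placeholders that follow the schema above, and the actual package does this with Lookups and Merge Joins instead.

-- Sketch: resolve the surrogate keys and aggregate hours into the fact table.
-- dbo.Staging(emp_number, pro_name, work_date, hours_worked) and the
-- dimension/fact table names are hypothetical placeholders.
INSERT INTO dbo.FactHoursWorked (emp_id, pro_id, time_id, totalHoursWorked)
SELECT  e.emp_id,
        p.pro_id,
        t.time_id,
        SUM(s.hours_worked) AS totalHoursWorked
FROM    dbo.Staging      AS s
JOIN    dbo.DimEmployee  AS e ON e.emp_number = s.emp_number
JOIN    dbo.DimProject   AS p ON p.pro_name   = s.pro_name
JOIN    dbo.DimTime      AS t ON t.[date]     = s.work_date
GROUP BY e.emp_id, p.pro_id, t.time_id;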

Using SSIS, I created three Data Flow Tasks to insert distinct values into the dimension tables (using GROUP BY, since S1 and S2 contain a lot of duplicate rows). A fourth Data Flow Task (see image below) is used to insert data (the total hours worked) into the fact table, and this is where I am running into problems:

https://social.msdn.microsoft.com/Forums/getfile/1546795 (Microsoft doesn't let me insert images for some reason)

At the very end, the foreign keys are matched (see image below, Data Viewer before the destination); however, they are not matched to the hours worked (which are passed by Multicast 4 in the image above):

https://social.msdn.microsoft.com/Forums/getfile/1546797

Lastly, the number of rows being inserted increases forever (see image below; I assume because of the Merge Joins). I previously used one Union All transformation instead of the three Merge Joins, but that resulted in the foreign keys being transferred in separate rows instead of merged into the same one.

https://social.msdn.microsoft.com/Forums/getfile/1546798

Sorts 0-4 use the project name, employee number, and timestamp as sorting columns (as these are passed all the way through), and Merge Joins 1-3 use the same columns to merge.

My intention was the following:

Merge Join 1: Combine project_id_Final and employee_id_Final

Merge Join 2: Combine time_id_Final and totalHoursWorked

Merge Join 3: Combine project_id_Final, employee_id_Final, time_id_Final and totalHoursFinal and insert into destination.

Do I have the right approach and am just missing a small detail?

I also tried using a single Data Flow Task to populate the dimension and fact tables at once by joining the two sources. While the references were correct, it resulted in a separate entry in each dimension table for each row of the source.


I understand that this scenario might be confusing for everybody else, but thank you for taking the time to look at it! Any help is greatly appreciated.
