Channel: SQL Server Integration Services forum

SSIS 2012 package crashes while saving


We have created an SSIS package with more than 400 components in a single Data Flow Task (DFT).

While saving the package we get the error below.

Failure saving package.

Additional Information:

Exception of type 'System.OutOfMemoryException' was thrown (Microsoft.SqlServer.ManagedDTS).

When we tried to save the package, we observed that the Visual Studio process had grown to more than 2 GB.

How can we resolve this issue? We could split the package into smaller ones, but we would prefer not to, because we have so many variables and columns in the flow.

Please suggest. 


Run a T-SQL query / Stored Procedure against 250 databases.


Hi Experts,

I have a list of 250 database servers whose databases share a common structure.

The requirement is to run a stored procedure that exists on each of those databases and push the output results into a SQL table in a custom database on my reporting server.
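One way to approach this (whether inside an SSIS Script Task or as a standalone utility) is to loop over the server list in code, run the procedure on each database, and stream each result set into the reporting table with SqlBulkCopy. The sketch below is only an illustration of that pattern; the server list, the procedure name dbo.usp_GetStats, the target table dbo.ProcResults, and the connection strings are placeholders I made up, not names from this post.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

class CollectFromManyDatabases
{
    static void Main()
    {
        // In practice this list would come from a config table or file (250 entries).
        var sources = new List<Tuple<string, string>>
        {
            Tuple.Create("SERVER01", "AppDb"),
            Tuple.Create("SERVER02", "AppDb")
            // ...
        };

        // Connection string of the reporting server/database (placeholder).
        const string targetConnStr =
            "Data Source=REPORTINGSRV;Initial Catalog=ReportingDb;Integrated Security=SSPI;";

        foreach (var src in sources)
        {
            var srcConnStr = string.Format(
                "Data Source={0};Initial Catalog={1};Integrated Security=SSPI;",
                src.Item1, src.Item2);

            using (var srcConn = new SqlConnection(srcConnStr))
            using (var cmd = new SqlCommand("dbo.usp_GetStats", srcConn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                srcConn.Open();

                // Stream the procedure's result set straight into the reporting table.
                using (var reader = cmd.ExecuteReader())
                using (var bulk = new SqlBulkCopy(targetConnStr))
                {
                    bulk.DestinationTableName = "dbo.ProcResults";
                    bulk.WriteToServer(reader);
                }
            }
        }
    }
}

The same pattern can also be built natively in SSIS with a Foreach Loop container iterating over a list of server names and an expression-based connection manager, which keeps the catalog's logging and restartability features.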

Thanks,

Eswar




SSIS Data load issue with CLOB datatype.


Hello,

We are facing an issue while loading one column, KOMMENTARE (CLOB datatype), from an Oracle database to a SQL Server database using SSIS 2008.

For that CLOB column we have used DT_NTEXT in our package.

The data load normally works properly, but it suddenly fails with the errors below.

[Read from ORACLE PMT_ABC_TAB1] Error: Failed to retrieve long data for column "KOMMENTARE".

[Read from ORACLE PMT_ABC_TAB1[1]] Error: There was an error with output "OLE DB Source Output" (11) on component "Read from ORACLE PMT_ABC_TAB1" (1). The column status returned was: "DBSTATUS_UNAVAILABLE".

Could you please let us know the reason for this kind of issue?

Please help us resolve this issue.

Thanks in advance! :-)

Regards,

Pradeep.

SQL 2014 Enterprise Attunity v3 for Oracle Import Package to SSISDB Error


I have successfully installed the Attunity v3 (32bit and 64bit) on Windows 2012 (Oracle 11g already installed) for SQL 2014 (Enterprise). I can run the project locally but if I attempt to import the package to SSISDB, I get this error:

The managed pipeline component "AttunitySSISOraSrc.8" could not be loaded.  The exception was: Could not load type 'AttunitySSISOraSrc.8' from assembly 'Microsoft.SqlServer.PipelineHost, Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'..

I'm still awaiting activation on the Attunity forum, else I would ask there.

Any ideas?

SQL 2016 Tabular Model processing fails when running from Integration Services package


When I run an SSIS package to do a full process of a SQL 2016 Tabular Model deployed on SQL 2016 with the compatibility level set to 1200, it fails and gives me the following error:

"[Analysis Services Execute DDL Task] Error: This command cannot be executed on database 'TabularDatabaseName' because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database."

How do I change the processing to use the Tabular APIs so that I can process it correctly?

When I process this model manually within SSAS, it works 100% correctly.
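For what it's worth, a compatibility level 1200 database stores tabular metadata, so it generally has to be processed with a TMSL command or through the Tabular Object Model (TOM) rather than the older XMLA/AMO process command the Execute DDL Task appears to be sending. Below is a minimal TOM sketch (my own illustration, not from this thread); the instance name is a placeholder and the Microsoft.AnalysisServices.Tabular client library is assumed to be installed.

using System;
using Microsoft.AnalysisServices.Tabular;

class ProcessTabularModel
{
    static void Main()
    {
        var server = new Server();
        server.Connect(@"Data Source=localhost\TABULAR;"); // placeholder instance

        Database db = server.Databases.FindByName("TabularDatabaseName");
        if (db == null)
            throw new Exception("Database not found on the server.");

        // Request a full refresh of the whole model and send it to the server.
        db.Model.RequestRefresh(RefreshType.Full);
        db.Model.SaveChanges();

        server.Disconnect();
    }
}

I believe the 2016 version of the Analysis Services Execute DDL Task can also accept a TMSL (JSON) refresh script against a tabular-metadata database, which would avoid external code, but I have not verified that in this exact scenario.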

Problem opening the project: "Make sure the application for the project type (.dtproj) is installed."


Hi,

I have been working on some SSIS packages for a while. Today, while trying to create a new connection, I hit an error that said BIDS had to be closed, so I closed it. Later, when I opened BIDS again and tried to open my project (.sln) from the File menu to continue the half-finished package, an error box popped up showing the path to my project location on the first line, followed by:

 

"Make Sure the application for the project type (.dtproj) is installed."

 

I checked some forums where people suggested installing SP1. I tried that, but SP1 fails to install (I don't know if that is because I had already installed SP2, which I installed earlier as the cure for a different problem).

 

Has anyone here faced such a problem before?

I'd really appreciate it if the experts here could suggest a fix for this problem.

 

thanks,

Ravi

SSIS executable against SQL 2016 Express?

I need to convert an MS Access DB to SQL Server. I am planning on deploying SQL Server 2016 Express with Advanced Services. I know that older versions of SQL Express did not support Integration Services, but I have been unable to figure out whether 2016 does.

Exception setting Script Component Output Columns


Hi,

I have been trying to add a Script Component to my SSIS data flow, but I am getting an "Object reference not set to an instance of an object" exception when I try to set the value of one of the output columns.

I have tried stripping down my code to a really basic example and I still get the issue. What I am doing is as follows.

I have an OLE DB Source which just selects data from a table, and that goes straight into my script component. The source produces just one row. I have passed both columns through, and they are both set to read-only.

Then I am adding one output column with a data type of int (screenshots of the output and column properties omitted).

I have also added one connection manager, which I am not currently using in the script. The Script Component code is all the default boilerplate; the only thing I have added is one line that sets my Column1 output column to 1. The main.cs class is as follows.

public class ScriptMain : UserComponent
{
    /// <summary>
    /// This method is called once, before rows begin to be processed in the data flow.
    ///
    /// You can remove this method if you don't need to do anything here.
    /// </summary>
    public override void PreExecute()
    {
        base.PreExecute();
        /*
         * Add your code here
         */
    }

    /// <summary>
    /// This method is called after all the rows have passed through this component.
    ///
    /// You can delete this method if you don't need to do anything here.
    /// </summary>
    public override void PostExecute()
    {
        base.PostExecute();
        /*
         * Add your code here
         */
    }

    /// <summary>
    /// This method is called once for every row that passes through the component from Input0.
    ///
    /// Example of reading a value from a column in the row:
    ///  string zipCode = Row.ZipCode
    ///
    /// Example of writing a value to a column in the row:
    ///  Row.ZipCode = zipCode
    /// </summary>
    /// <param name="Row">The row that is currently passing through the component</param>
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        /*
         * Add your code here
         */

        Row.Column1 = 1;
    }
}

When I run the code I get the following exception (debugging shows it is thrown when setting Row.Column1 = 1):

Object reference not set to an instance of an object.
   at Microsoft.SqlServer.Dts.Pipeline.PipelineBuffer.set_Item(Int32 columnIndex, Object value)
   at ScriptMain.Input0_ProcessInputRow(Input0Buffer Row)
   at UserComponent.Input0_ProcessInput(Input0Buffer Buffer)
   at UserComponent.ProcessInput(Int32 InputID, String InputName, PipelineBuffer Buffer, OutputNameMap OutputMap)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponent.ProcessInput(Int32 InputID, PipelineBuffer buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)

I can't work out why it isn't working. I have followed several different examples of how to write a script component, and what I am doing looks fine (?), but I always get the same issue. I do need the script component for the transformation I am actually working on (more complex than just setting a column to 1), but I can't even get it working for the simplest case.

I am really hoping someone is going to point out an obvious issue in something I have done :)

I am using Visual Studio 2015, the latest SQL Server Data Tools build (14.0.60629.0), and SQL Server 2012.

Let me know if anything else is required to help me, 

cheers,

Rob


[OLE DB Destination [471]]: The source and target columns must be in the same order in the INSERT BULK statement


[OLE DB Destination [471]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "The source and target columns must be in the same order in the INSERT BULK statement.".

I started getting the above message a few days ago.

The same package has run fine for a month, so I am not sure what is causing this.

The target is Azure SQL Data Warehouse, and when I choose the normal mode of data insertion it gives an error that the table does not exist. I think it has to be due to the schema call that the task makes before running.

Does anyone have any idea about this error?

Thanks,

Dilkush


Thanks, Dilkush Patel Microsoft SQL Developer Support Microsoft Corporation

Destination control as a source


I import data from a CSV file into a SQL table, and then use that imported table as a source to load a table on another server.

It looks like a destination component cannot be used as a source component.

Is there any way to use the destination as a source in the same package, or do I need two packages to do the task?

Thanks again for your information and help,

Regards,

Sourises,

Error 1071607685 when importing data: fixed, but now a performance issue


I got error 1071607685 while inserting a CSV file into a SQL Server table.

I found the following link and followed its instructions to change DefaultBufferMaxRows from 10,000 to 1 and the AccessMode to OpenRowset, and it works: SSIS inserts all the records.

However, SSIS now seems to process one record at a time, so importing 1.8 million records went from a few seconds to a few hours.

I would like to learn how to fix the original issue without such a huge performance hit.

https://sqljgood.wordpress.com/2014/09/29/ssis-troubleshooting-error-1071607685/

Your information and help are greatly appreciated.

Regards,

Sourises,

ODBC source not working, returns no data


I am trying to move some data from an accounting app (MAS 200) to a data mart. I have Crystal Reports using ODBC (RDO) that work, so I used the same connection in my SSIS package. It looks like it runs OK, except that when I query my destination SQL table, it is empty. My legacy system is fairly old. Does anyone know where I should start looking to debug this?

Thanks,

Mike

Maintenance Plans


The databases involved have a lot of transactions, which is why we have decided to go with the following approach. The plan involves daily full backups, a monthly full backup, differentials, and transaction log backups:

- Daily full backup at 6 PM, stored on a NAS.
- Monthly full backup at 6 PM on the last day of every month, also stored on the NAS.
- Differential backup every hour, every day between 8 AM and 5 PM, stored on the local machine.
- Transaction log backup every 15 minutes, Monday-Friday 8 AM-5 PM, also stored locally.

Each sub-plan has a maintenance task to clean up old files. Cleanup works as follows: daily (full) backups are removed after 14 days, each monthly (full) backup is retained for 1 year, and the differential and transaction log backups are retained for 1 full day.

So the questions..

A differential requires a full backup to which it can be applied, and the same goes for the transaction logs. With the cleanups removing all (.bak and .trn) files after 24 hours, does the whole thing just break after the first 24 hours?

Are the differential and transaction log backups based on the daily full backup that resides on the NAS?

Does a history cleanup task change any of this, for example by affecting the system's knowledge of when to remove old .bak and .trn files?

What issues will I face using this backup solution?

Changing Behavior of Large Object Data Types in CDC


In the documentation at https://msdn.microsoft.com/en-us/library/bb500305.aspx there is a statement that if a large object data type column is not included in the UPDATE statement, then that column will be NULL in the Operation = 3 row of the CDC change table.

Is there any way to override this behavior?

Thanks,

Jamie Irwin


Jamie Irwin Senior Software Engineer Silvics Solutions, LLC

SSIS SQL Agent job: Could not connect error

Could not connect:
TCP error code 10060: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.

System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException:
A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
I'm trying to schedule the job through SQL Server Agent and it's failing with this error. Any idea what's causing it?

Company is moving from PeopleSoft to Workday. How do I continue using my SSIS package?


Currently I have an SSIS package which extracts data from multiple sources such as .dat files, SharePoint lists, a SQL Server database, etc. The SQL Server database in turn gets its data from the PeopleSoft database.

Now our company has decided to move from PeopleSoft to Workday. The historical data remains in PeopleSoft, but new data (1 year) will be loaded into Workday.

So I am thinking about how to get employee information into my SSIS package. Is there a connector I can use to connect to Workday and load the data into a SQL Server database?
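There is no built-in Workday source in SSIS, but Workday exposes its data through web service APIs (SOAP Workday Web Services and a REST API), so one common approach is to call those APIs from a Script Task and stage the response in SQL Server, or to use a third-party Workday SSIS connector. A rough sketch of the Script Task idea follows; the endpoint URL, tenant path, credentials, and staging table are hypothetical placeholders, not real values, and the actual endpoint and payload shape depend on your Workday tenant and integration setup.

using System;
using System.Data.SqlClient;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class WorkdayStagingLoad
{
    static void Main()
    {
        // All of these values are placeholders.
        const string workdayUrl = "https://example.workday.com/ccx/api/v1/mytenant/workers";
        const string user = "integration_user";
        const string password = "********";
        const string stagingConnStr =
            "Data Source=MYSQLSERVER;Initial Catalog=Staging;Integrated Security=SSPI;";

        using (var http = new HttpClient())
        {
            // Basic authentication header built from the integration account credentials.
            var token = Convert.ToBase64String(Encoding.UTF8.GetBytes(user + ":" + password));
            http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            // Pull the raw payload (JSON here; the SOAP API would return XML).
            string payload = http.GetStringAsync(workdayUrl).GetAwaiter().GetResult();

            // Stage the raw payload; a later data flow can shred it into columns.
            using (var conn = new SqlConnection(stagingConnStr))
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.WorkdayRaw (LoadedAt, Payload) VALUES (SYSUTCDATETIME(), @p)", conn))
            {
                cmd.Parameters.AddWithValue("@p", payload);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
}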

Any ideas? This is critical for the continuity of this project. Any help is highly appreciated. Thanks in advance.

DTSX with "memory"


Hi,
Every month I have to use a DTSX package to load a table in a SQL database from a CSV-format file.
I used the SQL Server Import and Export Wizard to build the DTSX file, and everything went well.
The following month I used the Execute Package Utility to run the monthly update again.
I put the new data in a new file with the same name, the same path, and the same structure and format.
This new file has more records.
Strangely, the table is updated with only the same number of records as the previous month.
I checked everything I could remember.
I cleared the table (ran a delete so that only one record remained, then closed SQL Server Management Studio).
I even used another workstation with Windows 8, and the result is the same.
I have to build a new DTSX file to load the new records.
I even compared the new DTSX file with the old one and I can't find any differences.
I asked colleagues of mine and they can't find any reason for this either.
Can someone give me any hints about this?

I'm using Microsoft SQL Server Management Studio v. 10.50.1600.1,

Microsoft SQL Server 2008 R2,

and the Import and Export Wizard (32-bit) to build the DTSX file, with DTExecUI v. 2.0 to execute it.

Thanks

João


Joao Simplicio Rodrigues

SSIS CDCSplitter component error


Hi,

I am working on an SSIS project that contains a CDC Splitter component. The package is generated using BIML. When I run the project I get the following error:

Error: 0xC0047062 at DFT Incremental load_Source1, CDCSplitter [92]: System.ArgumentException: Value does not fall within the expected range.
   at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBuffer100.DirectRow(Int32 hRow, Int32 lOutputID)
   at Attunity.SqlServer.CDCSplit.CdcSplitterComponent.ProcessInput(Int32 inputId, PipelineBuffer buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)

My BIML code for the CDC Splitter is:

<CustomComponent Name="CDCSplitter"
                    ComponentClassId="{874F7595-FB5F-40FF-96AF-FBFF8250E3EF}"
                    ComponentTypeName="Attunity.SqlServer.CDCSplit.CdcSplitterComponent, Attunity.SqlServer.CDCSplit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=aa342389a732e31c"
                    ContactInfo="Attunity Ltd.; All Rights Reserved; http://www.attunity.com;"
                    UsesDispositions="true"
                    Version="2"
                    ValidateExternalMetadata="false"><Annotations><Annotation AnnotationType="Description">Directs a stream of net change records into different outputs based on the type of the change (Insert, Delete and Update). This allows specific handling for different types of change records.</Annotation></Annotations><InputPaths><InputPath Identifier="Input" OutputPathName="CDCSource.Output" ><InputColumns><InputColumn SourceColumn="__$start_lsn" /><InputColumn SourceColumn="__$operation" /><InputColumn SourceColumn="__$update_mask" /><InputColumn SourceColumn="ColID"  /><InputColumn SourceColumn="ColA"  /><InputColumn SourceColumn="ColB"  /><InputColumn SourceColumn="ColC"  /></InputColumns></InputPath></InputPaths><OutputPaths><OutputPath Name="InsertOutput"><Annotations><Annotation AnnotationType="Description">Output type - Insert.</Annotation></Annotations><CustomProperties><CustomProperty Name="OutputType" DataType="Int32"
                                        TypeConverter="Attunity.SqlServer.CDCSplit.OutputType, Attunity.SqlServer.CDCSplit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=aa342389a732e31c">0</CustomProperty></CustomProperties><OutputColumns><OutputColumn Name="__$start_lsn" /><OutputColumn Name="__$operation" /><OutputColumn Name="__$update_mask" /><OutputColumn Name="ColID" DataType="Int32"/><OutputColumn Name="ColA" DataType="String" Length="10" /><OutputColumn Name="ColB" DataType="DateTime"  /><OutputColumn Name="ColC" DataType="Int32"  /></OutputColumns><ExternalColumns /></OutputPath><OutputPath Name="UpdateOutput"><Annotations><Annotation AnnotationType="Description">Output type - Update.</Annotation></Annotations><CustomProperties><CustomProperty Name="OutputType" DataType="Int32"
                                        TypeConverter="Attunity.SqlServer.CDCSplit.OutputType, Attunity.SqlServer.CDCSplit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=aa342389a732e31c">1</CustomProperty></CustomProperties><OutputColumns><OutputColumn Name="__$start_lsn" /><OutputColumn Name="__$operation" /><OutputColumn Name="__$update_mask" /><OutputColumn Name="ColID" DataType="Int32"  /><OutputColumn Name="ColA" DataType="String" Length="10" /><OutputColumn Name="ColB" DataType="DateTime"  /><OutputColumn Name="ColC" DataType="Int32"  /></OutputColumns></OutputPath><OutputPath Name="DeleteOutput"><Annotations><Annotation AnnotationType="Description">Output type - Delete.</Annotation></Annotations><CustomProperties><CustomProperty Name="OutputType" DataType="Int32"
                                        TypeConverter="Attunity.SqlServer.CDCSplit.OutputType, Attunity.SqlServer.CDCSplit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=aa342389a732e31c">2</CustomProperty></CustomProperties><OutputColumns><OutputColumn Name="__$start_lsn" /><OutputColumn Name="__$operation" /><OutputColumn Name="__$update_mask" /><OutputColumn Name="ColID" DataType="Int32" /><OutputColumn Name="ColA" DataType="String" Length="10"/><OutputColumn Name="ColB" DataType="DateTime" /><OutputColumn Name="ColC" DataType="Int32" /></OutputColumns></OutputPath><OutputPath Name="ErrorOutput" IsErrorOutput="true"><Annotations><Annotation AnnotationType="Description">Output type - Error.</Annotation></Annotations><CustomProperties><CustomProperty Name="OutputType" DataType="Int32"
                                        TypeConverter="Attunity.SqlServer.CDCSplit.OutputType, Attunity.SqlServer.CDCSplit, Version=1.0.0.0, Culture=neutral, PublicKeyToken=aa342389a732e31c">3</CustomProperty></CustomProperties><OutputColumns><OutputColumn Name="__$start_lsn" /><OutputColumn Name="__$operation" /><OutputColumn Name="__$update_mask" /><OutputColumn Name="ColID" DataType="Int32"  /><OutputColumn Name="ColA" DataType="String" Length="10" /><OutputColumn Name="ColB" DataType="DateTime"  /><OutputColumn Name="ColC" DataType="Int32"  /></OutputColumns><ExternalColumns /></OutputPath></OutputPaths></CustomComponent>

Please help me resolve this issue.

Project Deployment changing package version


Hi

I have added a project to the Integration Services catalog in SQL 2012 Standard. The package was developed in Visual Studio 2010 and deployed to the IS catalog from Visual Studio.

The package version is 6, but if I export the project from the IS catalog, the XML shows the package as version 8, and Visual Studio can't open it. Is there some problem with the Integration Services catalog in 2012?

If so is there a fix?

Is it the same in 2014?

Any help appreciated

Regards

Andy


CRM 4, SQL Server and .Net developer using C#

System.MissingMethodException get_ComponentMetaData


Hi All,

I have developed a custom data flow component for SSIS. When I try to drop it onto the designer I get the following error:

Error at Process Changes [Person Crm Destination [39]]: System.MissingMethodException: Method not found: 'Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSComponentMetaData100 Microsoft.SqlServer.Dts.Pipeline.PipelineComponent.get_ComponentMetaData()'.
   at SSISCrmDestination.PersonCrmDestination.ProvideComponentProperties()
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProvideComponentProperties(IDTSManagedComponentWrapper100 wrapper)

I've tried attaching the debugger, but it doesn't hit any breakpoints.

Any advice appreciated

Cheers

Stokesy
