Channel: SQL Server Integration Services forum

SQL Server Data Tools - Cannot show the editor for this task


Hi,

I cannot edit SQL tasks or Script Tasks in Visual Studio Data Tools. This is the error I get:

Cannot show the editor for this task. (Microsoft Visual Studio)
Value does not fall within the expected range. (mscorlib)

------------------------------
Program Location:

Server stack trace:
   at Microsoft.SqlServer.Dts.Runtime.PersistImpl.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events)
   at Microsoft.SqlServer.Dts.Runtime.DtsContainer.SaveToXML(XmlDocument& doc, XmlNode node, IDTSEvents events)
   at Microsoft.SqlServer.IntegrationServices.Designer.Undo.DtsPersistSnapshot.GetSnapshotHelper(IDTSPersist dtsPersist)
   at System.Runtime.Remoting.Messaging.StackBuilderSink._PrivateProcessMessage(IntPtr md, Object[] args, Object server, Object[]& outArgs)
   at System.Runtime.Remoting.Messaging.StackBuilderSink.AsyncProcessMessage(IMessage msg, IMessageSink replySink)

Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.EndInvokeHelper(Message reqMsg, Boolean bProxyCase)
   at System.Runtime.Remoting.Proxies.RemotingProxy.Invoke(Object NotUsed, MessageData& msgData)
   at System.Func`2.EndInvoke(IAsyncResult result)
   at Microsoft.DataTransformationServices.Design.DesignUtils.RunOnMtaThreadFunc[TArg1,TResult](Func`2 function, TArg1 arg1)
   at Microsoft.SqlServer.IntegrationServices.Designer.Undo.DtsPersistSnapshot.GetSnapshot(IDTSPersist dtsPersist)
   at Microsoft.DataTransformationServices.Design.Undo.UndoController.GetFullSnapshot(Object obj)
   at Microsoft.DataTransformationServices.Design.Undo.UndoController.GetSnapshot(Object component, MemberDescriptor memberDescriptor)
   at Microsoft.DataTransformationServices.Design.Undo.UndoController.ObjectSnapshotManager.TrySaveSnapshot(MemberDescriptor memberDescriptor)
   at Microsoft.DataTransformationServices.Design.Undo.UndoController.SnapshotManager.TrySaveSnapshot(Object obj, MemberDescriptor memberDescriptor)
   at Microsoft.DataTransformationServices.Design.Undo.UndoController.OnComponentChanging(Object sender, ComponentChangingEventArgs e)
   at System.ComponentModel.Design.ComponentChangingEventHandler.Invoke(Object sender, ComponentChangingEventArgs e)
   at System.ComponentModel.Design.DesignerHost.System.ComponentModel.Design.IComponentChangeService.OnComponentChanging(Object component, MemberDescriptor member)
   at Microsoft.DataTransformationServices.Design.DtrPackageDesigner.DoDefaultActionForTask(TaskHost task)

I've tried repairing/uninstalling/re-installing/applying latest service packs etc. Nothing works.

I've also tried with VS 2012 and 2014. Same error.

Note that I've installed the same product on other machines using the same DVD and everything works fine, so I doubt it's a case of corrupt media.

I'm running SQL Server 2012 Standard SP1 on Windows 7 Pro and have full admin rights on the machine.

Here's the Visual Studio info:

Microsoft Visual Studio 2010
Version 10.0.40219.1 SP1Rel
Microsoft .NET Framework
Version 4.5.50938 SP1Rel

Installed Version: IDE Standard

Microsoft Visual Basic 2010   01011-532-2002361-70528
Microsoft Visual Basic 2010

Microsoft Visual C# 2010   01011-532-2002361-70528
Microsoft Visual C# 2010

Microsoft Visual Studio Tools for Applications 3.0   01011-532-2002361-70528
Microsoft Visual Studio Tools for Applications 3.0

Microsoft Visual Web Developer 2010   01011-532-2002361-70528
Microsoft Visual Web Developer 2010

MySQL for Visual Studio   1.1.4
Data design and management tools for MySQL.  Copyright © 2007-2014 Oracle, Inc.

SQL Server Analysis Services   
Microsoft SQL Server Analysis Services Designer
Version 11.0.5058.0

SQL Server Integration Services   
Microsoft SQL Server Integration Services Designer
Version 11.0.5058.0

SQL Server Reporting Services   
Microsoft SQL Server Reporting Services Designers
Version 11.0.5058.0

Visual Studio 2010 Shell (Isolated) - ENU Service Pack 1 (KB983509)   KB983509
This service pack is for Visual Studio 2010 Shell (Isolated) - ENU.
If you later install a more recent service pack, this service pack will be uninstalled automatically.
For more information, visit http://support.microsoft.com/kb/983509.

SQL Prompt 6
For more information about SQL Prompt, see the Red Gate website at
http://www.red-gate.com
For customer support, call 1-866-733-4283.
Copyright © 2006–2009 Red Gate Software Ltd

Any help would be much appreciated.

Thanks

Nico



changing database name


I created a database using SSMS and named it

70-461

I am trying to rename it using an ALTER statement, but it won't let me. I also tried using SSMS, but that wasn't possible either.

USE master;
GO
ALTER DATABASE 70-461
Modify Name = querying ;
GO
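Since the database name starts with a digit and contains a hyphen, it is not a regular identifier and has to be delimited. A sketch of what should work, assuming the new name is meant to be "querying":

USE master;
GO
ALTER DATABASE [70-461]
MODIFY NAME = [querying];
GO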

NOT NULL constraint


I'm trying to drop a column that is defined as NOT NULL.

I am trying to use this query, but I don't know the constraint name. Is it NOT NULL, or something else?

alter table dbo.Robust2 drop constraint [    xxxx  ]
GO
alter table dbo.Robust2 drop column BESTSCORE3
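For what it's worth, NOT NULL itself is not a named constraint, so there is nothing to drop for it; the usual blocker is a DEFAULT (or CHECK) constraint bound to the column. A sketch for finding and dropping a default constraint first (the constraint name in the DROP is hypothetical; substitute whatever the query returns):

-- Find any default constraint bound to the column
SELECT dc.name
FROM sys.default_constraints AS dc
JOIN sys.columns AS c
  ON c.object_id = dc.parent_object_id
 AND c.column_id = dc.parent_column_id
WHERE dc.parent_object_id = OBJECT_ID(N'dbo.Robust2')
  AND c.name = N'BESTSCORE3';

-- Drop that constraint, then the column
ALTER TABLE dbo.Robust2 DROP CONSTRAINT [DF_Robust2_BESTSCORE3];  -- hypothetical name
GO
ALTER TABLE dbo.Robust2 DROP COLUMN BESTSCORE3;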

Thanks

How to capture MSMQ logs

In our application we are using MSMQ. How can we capture the MSMQ logs?

Insert NULL value from a table to another using Execute SQL Task on OLEDB connection


Hi everybody!

Can you help me solve a problem that I have not been able to handle, please?

I am trying to do a simple thing: transfer data from one table to another, and I am not able to handle inserting NULL values into the destination.

To explain the problem, I reproduced it in a very basic sample project. Here is the project structure:

Tables :

Source table

This is the source datatable


Destination table

This is the destination datatable


Control Flow

The control flow works this way:

  • Get the source data from the source datatable on SQL Server
  • For each record, insert a new record in the destination datatable


Data Flow Task

The Data Flow Task works this way:

  • Get the source data from the source datatable on SQL Server
  • Put all of the data in a resultset
  • The resultset is stored in a package variable "DataStored" (Object)


OleDB datasource

The OLE DB data source retrieves all the records from the source datatable


Resultset Destination

The destination resultset stores all the records from the source datatable


Foreach loop

The foreach loop iterates on each record of the resultset

There are 2 variables in the package to store the fields to copy:

  • Field1 (int)
  • Field2 (int)

The foreach loop maps the "Field1" and "Field2" variables to the second and third index (the zero index is the primary key)


Execute SQL Task

The Execute SQL Task inserts each record into the destination datatable. The query needs the 2 parameters that I put into the variables (Field1 and Field2)


The variables are mapped to the "?" parameters.

And here is the problem! When Field2 from the source datatable is NULL, the package fails because of a conversion exception.

I understand that the variable type is a LONG one, and that is why it fails. But I don't know how to make it work...
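One workaround that fits this setup, sketched with the table names as placeholders: since an Int64 package variable cannot hold NULL, substitute a sentinel value in the source query and turn the sentinel back into NULL in the Execute SQL Task's INSERT. This assumes -1 can never be a real Field2 value.

-- OLE DB source query: replace NULL with a sentinel so the Int64 variable can hold it
SELECT Id,
       Field1,
       ISNULL(Field2, -1) AS Field2
FROM dbo.SourceTable;

-- Execute SQL Task statement: the variables map to the two ? parameters,
-- and the sentinel is converted back to NULL on insert
INSERT INTO dbo.DestinationTable (Field1, Field2)
VALUES (?, NULLIF(?, -1));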

I believe this is easy to do for SSIS experts, but I am a newbie...

Can you help me ?

Thank you.

Best regards.



Matteo, .NET Developer and System Engineer

Multicast producing double rows in one pipe only


SQL Server and SSIS are both version 2008 R2

I had an existing SSIS package that was copying 6000 rows from our IBM database to a SQL Server database daily.  Now I need to copy the same data to a second SQL Server so I thought using the Multicast tool would be a perfect solution.

I deleted the connector between the conversion widget and the original data destination then added a multicast widget between the conversion widget and the original destination.  Then added a new destination and connected a second pipe from the multicast tool to that.

When I run it, the SSIS interface and the log tab both indicate that 6000 rows were written to both destinations.  However, when I look at those destination tables the new destination has 12000 rows in it.  Every row appears twice.  The original destination has the correct 6000 rows in it.

If I temporarily delete the first (original) destination then the new destination gets the correct 6000 rows.  I can delete both pipes and re-create them and no matter which order I do it, the original destination gets the correct number of rows and the new destination gets double that.  The only way I can get the new destination to get the correct row-count is when it is the only destination hooked up.

If I add an IDENTITY column to the new destination table the duplicate rows are not added in any particular order (assuming IDENTITY value corresponds to insertion order).  That is, IDENTITY 2 is not a duplicate of IDENTITY 1, nor is IDENTITY 6001. The values seem completely arbitrary.

I have tried multiple settings for rows-per-batch (even 1) with no change in behavior.  The destination table has no keys, indexes, constraints, or triggers. It's currently just a "bucket of data".  It was created by scripting the original destination table and running that script on the second server.
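A quick sanity check on the pattern described above is to count copies per row in the new destination; a diagnostic sketch, with the table and key names as placeholders:

SELECT SomeNaturalKey, COUNT(*) AS copies
FROM dbo.NewDestinationTable
GROUP BY SomeNaturalKey
HAVING COUNT(*) > 1;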

Row-by-Row processing in data flow task


Hi to all

I want to ask you: how can I process one row at a time in the transformation components of a data flow task?

For Example, we have the following components in data flow task :

Derived Column ----> Ole DB Command_1 ----> Ole DB Command_2

I want Ole DB Command_1 to receive the first row and execute its SQL command (an INSERT), then Ole DB Command_2 to receive the same row and execute its SQL command (an INSERT).

Then Ole DB Command_1 receives the second row and executes its SQL command (INSERT), then Ole DB Command_2 receives the same row and executes its SQL command (INSERT).

Then Ole DB Command_1 receives the third row and executes its SQL command (INSERT), then Ole DB Command_2 receives the same row and executes its SQL command (INSERT).

.... And so on... until the last row.

Instead, right now Ole DB Command_1 receives n rows and executes n INSERTs... then Ole DB Command_2 receives the same n rows and executes n INSERTs.

How can I achieve row-by-row processing in Ole DB Command_1 and Ole DB Command_2?
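For reference, the OLE DB Command transformation already executes its statement once for every row that reaches it; what each of the two commands runs per row is just a parameterized INSERT. A minimal sketch, with hypothetical table and column names:

-- Statement configured in Ole DB Command_1, executed once per incoming row
INSERT INTO dbo.TargetTableA (Col1, Col2)
VALUES (?, ?);

-- Statement configured in Ole DB Command_2, executed once for the same row
INSERT INTO dbo.TargetTableB (Col1, Col2)
VALUES (?, ?);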

thanks in advance.


SSIS package won't run as SQL Agent job


Hello,

I have an SSIS package that writes the results of a query to a flat file.  The package runs within Visual Studio, and it runs when it is deployed to the MSDB database and I execute it manually, but when I schedule it in a SQL Agent job it fails with the following error message:

Started:  5:02:00 PM  Error: 2013-01-09 17:02:02.12     Code: 0xC020200E     Source: Data Flow Task Flat File Destination [551]     Description: Cannot open the datafile "R:\XXXXXX\FinalReports\BRICLast24Hours.txt".  End Error  Error: 2013-01-09 17:02:02.13     Code: 0xC004701A     Source: Data Flow Task SSIS.Pipeline     Description: component "Flat File Destination" (551) failed the pre-execute phase and returned error code 0xC020200E.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  5:02:00 PM  Finished: 5:02:02 PM  Elapsed:  1.625 seconds.  The package execution failed.  The step failed.,00:00:02,0,0,,,,0

I've granted full control permissions on this .txt file to the SQL Agent service account and my Windows account.  The job step that runs the SSIS package runs under a proxy I created that uses my Windows credentials. I have administrator privileges on the server I run the job from.  I tried a UNC path rather than a drive letter.  I can't think of much more I can try to get this to run as scheduled.  Any info would be greatly appreciated.
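One thing worth double-checking is which proxy and credential the failing step actually resolves to, since a mapped drive such as R:\ is normally not visible in the non-interactive session the step runs under (which is why the UNC path is usually the right direction). A diagnostic sketch; the job name is a placeholder:

SELECT j.name AS job_name,
       s.step_name,
       p.name AS proxy_name,
       c.credential_identity
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
LEFT JOIN msdb.dbo.sysproxies AS p ON p.proxy_id = s.proxy_id
LEFT JOIN sys.credentials AS c ON c.credential_id = p.credential_id
WHERE j.name = N'MyFlatFileExportJob';  -- hypothetical job name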

Thanks in advance.


Server name in Parameter

I am writing an SSIS package, and there are Development, Production and UA servers.
Since I am developing the package in the Development (Dev) area, every time I run the query in a different environment I have to change the server name, for example: Select * from [Servername].[Database name].[tablename]. I would like to use a variable or parameter instead. Any ideas? Or how can I write a stored proc for this?
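One common way to avoid editing the query per environment, sketched with placeholder object names, is to hide the environment-specific server behind a synonym that is created differently on each server, so the package (or a stored procedure it calls) always runs the same statement:

-- Created once per environment, pointing at that environment's server
CREATE SYNONYM dbo.MyRemoteTable
    FOR [DevServerName].[DatabaseName].[dbo].[TableName];

-- The package query (or stored procedure body) never changes
SELECT * FROM dbo.MyRemoteTable;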

Vijay Patel


Why would we connect one 'Data Flow Task' to another 'Data Flow Task' in Control Flow?


I found an example in my company's SSIS package folders. In the Control Flow, it has one data flow connected to the other. Both of them import data from flat files and then export it to a database. I think these two are kind of at the same level, and I cannot see any reason to connect them together.

Thanks & Happy Thxgiving,

Gavin 

Clearing CDC table from SSIS 2012


Hi Team,

I want to clean the CDC tables once my data has been moved from the CDC tables to the actual destination table in SQL Server 2014.

Is there any way I can clear the CDC tables from SSIS?

Or, if I truncate the CDC tables as if they were normal tables, would it impact the functioning of my CDC controls in SSIS?
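For what it's worth, the documented way to purge change rows is the CDC cleanup procedure rather than a TRUNCATE or DELETE against the cdc.* tables directly, since manual deletes can leave the capture instance's LSN bookkeeping out of sync. A sketch, assuming a capture instance named 'dbo_MyTable' (a placeholder) and that everything up to the current maximum LSN has already been loaded:

DECLARE @max_lsn binary(10) = sys.fn_cdc_get_max_lsn();

-- Removes change rows up to @low_water_mark and advances the capture instance's start LSN
EXEC sys.sp_cdc_cleanup_change_table
     @capture_instance = N'dbo_MyTable',
     @low_water_mark   = @max_lsn,
     @threshold        = 5000;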

Thanks,

Chirag


rs.LoadReport syntax within Script Component of SSIS task


Hello,

Please, what do Row.path, Row.name, etc. stand for?

How should I enter the folder path and report name? What about this:

/FolderName/ReportName? I cannot get rid of syntax errors whatever I try. If possible, please give an example.

Thanks

// Row.path, Row.name, Row.reportformat, Row.folderPath and Row.Extension are the input
// columns of the Script Component's input buffer; for example Row.path might hold
// "/FolderName/" and Row.name "ReportName".
string extension, mimetype, encoding;
string[] streamIds;
SC_7bb1aae362d44e26a741ccbbecdc6e18.csproj.ReportExecution2005.Warning[] warnings;
byte[] results;
// LoadReport expects the full catalog path of the report, e.g. "/FolderName/ReportName"
rs.LoadReport(Row.path + Row.name, null);
results = rs.Render(Row.reportformat, null, out extension, out mimetype, out encoding, out warnings, out streamIds);
// Dispose the stream so the rendered bytes are flushed and the file handle is released
using (System.IO.FileStream writer = System.IO.File.OpenWrite(Row.folderPath + "\\" + Row.name + "." + Row.Extension))
{
    writer.Write(results, 0, results.Length);
}

? or ¿ or � appearing in place of special characters


Hi there,

I am working on an ETL importing data from Oracle to SQL Server using SSIS 2008.  The Oracle source data contains special characters which appear as ? or ¿ or � in Toad.  When I open the source task and use a SELECT statement (select companyname from test), I am getting these values (? or ¿ or �) in the source itself.

Characterset=Latin1_General_CI_AS

I tried changing DT_STR to DT_WSTR on the source task, but I get an error. I need help with this.

Using SSIS to load data into one SQL table from multiple DBF files of different formats (Visual Studio 2012)

I am fairly new to SQL Server development and have a task at hand with which I need assistance. I am reading a folder full of DBF files. These DBF files could have 4 different file layouts. All the columns I need are in these files, but the columns are not in the same order.  Ultimately, I need the data from these DBF files to be inserted into a single SQL table.

LineageID of an input column has an another value in runtime


Hi all,

I upgraded my own data flow transformation component to SQL 2012. My component has one input and one asynchronous output.

I found the following unexplained behavior: the value of an input column's LineageID at runtime differs from the value that is currently set (which I can check at design time using the Advanced Editor). E.g. the component's input has only one column, named "Address", and its LineageID equals 10. But while debugging the component at runtime, I see that the input contains only one column named "Address", yet its LineageID equals 11. How is this possible?

Could somebody explain what this is, or give me advice on how to avoid this behavior? And in general, how is it possible that the LineageID of the same column has different values at design time and at runtime?

PS. Versions of this component for SQL 2005 and 2008 work without this problem.

Igor


Need help on flat file


I have 1000 records in a text file.

The text file is in this format:

[{"ID":1,:"Name":"test1","State":"AP"},{"ID":2,:"Name":"test2","State":"KN"}{"ID":3,:"Name":"test3","State":null}]

load bulk data using transformations


Hi all,

I have a problem loading bulk data from Oracle to Teradata using an SSIS package. We have bulk data in Oracle and we need to transfer it into Teradata using SSIS. We have more than 10 million records.

Thanks,

Use Merge Join instead of Lookup?


I am inserting from a stage table into a dictionary table using the Lookup Transformation.  I need to change this Lookup Transformation to use a Merge Join, where the derived columns CreateDate (insert path) and UpdateDate (update path) work the same as they did with the Lookup Transformation.  Please advise or illustrate. Thanks
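This is not the Merge Join transformation itself, but for comparison, the same insert-or-update pattern can be expressed as a single T-SQL MERGE that stamps CreateDate on new rows and UpdateDate on changed rows. A sketch with hypothetical table and column names:

MERGE dbo.DictionaryTable AS tgt
USING dbo.StageTable AS src
      ON tgt.BusinessKey = src.BusinessKey
WHEN MATCHED THEN
    UPDATE SET tgt.SomeValue  = src.SomeValue,
               tgt.UpdateDate = GETDATE()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, SomeValue, CreateDate)
    VALUES (src.BusinessKey, src.SomeValue, GETDATE());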

how to split list of columns into 2 tables in SSIS 2012?


Hi,

I have 200 columns in the source. Now I want to split these columns: some into Destination A and some more columns into Destination B. I tried to use Multicast, but it copies all the columns. Any help would be appreciated. Thanks in advance.

Let's assume I have columns A, B, C, D, E.

I want to move columns A, B, D into Destination A and columns A, C, D, E into Destination B. Please help me implement this logic (a sketch follows below).
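In the data flow this usually comes down to mapping a different column subset in each destination after the Multicast (the extra columns are simply left unmapped). Expressed as plain T-SQL, the intended result looks roughly like this, with placeholder table names:

-- Destination A receives only A, B, D
INSERT INTO dbo.DestinationA (A, B, D)
SELECT A, B, D
FROM dbo.SourceTable;

-- Destination B receives only A, C, D, E
INSERT INTO dbo.DestinationB (A, C, D, E)
SELECT A, C, D, E
FROM dbo.SourceTable;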

SSIS Package developed in SQL Server 2008 R2 Using BIDS 2008 is not working in SQL Server 2012

I am working in a product-based company.
We have given our customers a prerequisite that the SQL Server version should be 2008 or later.
I have developed and deployed my SSIS package on SQL Server 2008 R2 with BIDS 2008.
I have used a "Script Component" in my package.
We take only the DTSX package file to the customer sites and run it daily using a SQL Agent job.
For customer environments using SQL 2008 or 2008 R2, my package runs fine.
For customer environments using SQL 2012, I am facing the issue below:

The component metadata for "Script Component, clsid {874F7595-FB5F-40FF-9BAF-FBFF8250E3EF}" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.

Still, the same package runs fine for some of the customers on SQL 2012.
I am getting the above error only for a few customers.

Can someone please guide me on how to proceed? Is there any solution that does not require upgrading the package to SQL 2012? We need to maintain only one package for all customers.