Channel: SQL Server Integration Services forum
Viewing all 24688 articles

Lookup with more than one column in SSIS


Hi,

My application's SSIS package should check whether a row exists in a target table before inserting a row into that table. A Cache Transform holds the contents of the target table, and the package uses a Lookup transformation in Full cache mode to identify whether the row exists. This works fine when there is only one column to match (e.g. check for a lookup match on a specific ID). However, when more than one column is involved in the lookup (e.g. provide both ID and Code to the Lookup cache to check whether the row exists), the package is unable to do the lookup. It gives a design-time error:

Cannot map the Lookup column, 'ProcurementCodeId', to an input column because the Lookup column is not an index column. Only index columns can be mapped when the Lookup transformation is configured to use a Cache connection manager.

Could you please help me with a solution to this problem?


Discussing ETL development on MSFT platform


Hi Guys,
this is more of a discussion point than a question.

Over the last 10 years I have pioneered many new ideas. One of them is SeETL. This idea is going to change the future of ETL development.

The question is "are ETL developers on MSFT platform interested in what we have done"? 

Feel free to ask questions or comment on this thread about what we have done. I think this is as good a place as any to discuss SeETL and how it can be used in conjunction with SSIS.

These two videos will give you an idea of what we have done. I will explain in a few words below. But there is a LOT of information available about what we have done on our web site. Some people will love us, some people will hate us. Those who will hate us are the ones who build ETL for a living and do not want faster, better, cheaper ways of building ETL to become widely known. There are plenty of those guys around. Luckily they do not control budgets.

http://www.youtube.com/watch?v=3_wXs-vjZNk&feature=plcp
http://www.youtube.com/watch?v=E0PaMJIA7II&feature=plcp
Basically, we have cut the cost of ETL development roughly in half, even when a vendor ETL tool is used. I know that sounds like a "tall story", but it is true.

My background: I have been in IT for 30 years building large batch systems, and ETL is just one more large batch system. I have been in BI for 21 years now. I built my first ETL tool in 1995, and it was used in many accounts to give me a competitive advantage when selling deals. So I know what I am talking about.

Starting in 2002 we migrated all our ETL knowledge into SeETL. It has developed over the last 10 years. It gives us a competitive advantage in ETL development to the tune of about 50%. It started out when I was talking to Ralph Kimball, who suggested: "if you are so smart, write an article about all the features a good ETL tool should have in it and I will publish it." That article, which never did get published, formed the initial design specs for SeETL.

SeETL allows us to completely prototype the ETL subsystem prior to actually using a vendor ETL tool. At our clients, what we have been doing is building the ETL subsystems in SeETL, getting everything to work, getting the data models settled, doing all the testing on the prototype, and once the client is happy, then migrating the ETL to the client's specified ETL tool. We have built a suite of tools that massively help us build data warehouses at our clients.

This is a client we did where we migrated the SeETL ETL to Informatica. http://www.youtube.com/watch?v=jBJWe9FxRxs&feature=plcp

We have ways and means to migrate our ETL to DataStage and Informatica very quickly. This all works on any standard database: DB2, Oracle, Sybase IQ, Netezza, MySQL are all proven platforms. But we are an MSDN shop and we develop on the MSFT platform.

We have offered IBM and Informatica the opportunity to hire us to write an interface that will generate mappings for their tools. Up to them if they want to accept it.

I have contacted some old friends at MSFT to make the same offer to MSFT for SSIS...but I guess here is as good a place as any to mention that too.

The basic concept is very simple.

Forget about GUIs and build ETL directly from the mapping workbook, using VB to read the workbook and generate whatever is necessary.
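To make that concept concrete, here is a minimal sketch in Python of reading mapping rows and generating INSERT ... SELECT statements. The column layout below is a hypothetical illustration, not SeETL's actual workbook format:

```python
import csv
import io

# Hypothetical mapping rows: each row maps one source column to one
# target column; real mapping workbooks would carry far more metadata.
MAPPING_CSV = """target_table,target_column,source_table,source_column
dw.DimCustomer,CustomerKey,stg.Customer,cust_id
dw.DimCustomer,CustomerName,stg.Customer,cust_name
"""

def generate_insert_select(mapping_csv):
    """Generate one INSERT ... SELECT per (target, source) table pair."""
    rows = list(csv.DictReader(io.StringIO(mapping_csv)))
    by_table = {}
    for r in rows:
        by_table.setdefault((r["target_table"], r["source_table"]), []).append(r)
    statements = []
    for (tgt, src), cols in by_table.items():
        tgt_cols = ", ".join(c["target_column"] for c in cols)
        src_cols = ", ".join(c["source_column"] for c in cols)
        statements.append(
            f"INSERT INTO {tgt} ({tgt_cols})\nSELECT {src_cols}\nFROM {src};"
        )
    return statements

for stmt in generate_insert_select(MAPPING_CSV):
    print(stmt)
```

The point of the approach is that the spreadsheet is the single source of truth and the generator is trivial to extend: add a column to the workbook, add a line to the generator.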

We have an older C++ engine, but we can now also generate SQL directly. That means we are somewhat competitive with SSIS, because we can do most of what SSIS can do. However, clients use SSIS for many more reasons than just getting the ETL done. There are also issues of support, etc. So we see SeETL happily cohabiting with ETL tools: some people are using SeETL in production, but most are using it only in development.

SeETL covers much more than just the ETL today. Anyone who takes a deeper look into the reports we provide out of the box will see that.

I am now in the process of turning the whole idea of "write mappings and other directives in a spreadsheet and generate what is needed" into a widely accepted one. I fully realise that I am going to be called "crazy" for this idea. I was also called "crazy" in 93-95 when I was talking about star schemas. But this idea is going to transform the world of BI, just like star schemas did.

I very much own the idea of putting all the mapping definitions into a workbook and generating what is needed out of the workbook. This idea was invented in 2004 so there is more "prior art" than anyone can poke a stick at. We have quite a few big clients who have witnessed what we have been able to do. But as a small firm talking to people there is great resistance to this idea as per normal.....people who disbelieve new ideas are very committed to their disbelief...rather like sailors who said the world was flat.

So it is time to get this idea "out there" and widely accepted and adopted. And what better way than to offer some useful "free ware" and let people guide how the product evolves. The usual "free ware" standards apply. The software is free and the support is paid at normal support rates (which are lower than MSFT in any case).

What we are doing is making BI more accessible to more companies by lowering the cost of developing BI. This is good for everyone except people who make money out of coding ETL. The ideas I am bringing forward will put a lot of those guys out of work, just like computers put a lot of bookkeepers out of work. One caveat: you can't do this if you are in the German-speaking region. We have an exclusive reseller in the German-speaking region, so you have to talk to them first.

SeETL for prototyping and design time work will remain free. There are lots of very useful reports and you can build your complete prototype. It will be a matter of your honour that you do not deploy the prototype into production. The only time we look for SeETL software fees is deployment to production. If a client develops in SeETL and deploys in SSIS and does not call us for support? Hooray for them.  Good luck to them.  (We actually make our real money out of selling data models not SeETL.)

In production we are looking to bring the price of ETL down as SeETL is more widely accepted. If SeETL is not widely accepted we will continue to sell it for its current price point of EUR20K to EUR50K to deploy into production for one company. A rounding error.

So it is really up to "the best and brightest" to decide for themselves if they want to use SeETL in production or just use it in development and deploy SSIS into production. 

So...that is the proposal. The innovative idea is that GUIs slow us down when building ETL, data models, etc., and that creating the input for these in an Excel workbook and generating everything needed is a good idea. I know it is a good idea because at the end of a project our clients usually get the ETL in a vendor ETL tool and the data model in a data modelling tool. The whole end product looks "normal" and they switch over to "normal" support. A few clients use our tools for everything, and they retain their advantage by doing that.

The point being? We get our clients to the end point faster and cheaper than anyone else.

So....Feel free to discuss this idea on this thread....good...bad...or otherwise. I am very interested in comments. I am interested to see if there are some guys here who are the "best and brightest" who would like to try out what it is we have done and evaluate it for themselves.

I am very much into "put the idea out there and see what happens." So far, not much. But that is exactly what selling star schemas was like in 93-95; by 2000 everyone had "discovered" star schemas thanks to Ralph Kimball and his books. We will see the same again with this tool, I think.

Best regards


Peter Nolan




How to remove reference to a connection that does not exist


I'm getting the following error and I have no connection with that ID. How can I remove the reference to this connection? I must have had a connection with this ID in the past but removed it.  Thanks

The feed myPackage has failed due to Cannot find the connection manager with ID "{363662E0-B6D4-4F1C-B917-2115A9F0D919}" in the connection manager collection due to error code 0xC0010009. That connection manager is needed by "runtime connection "OleDbConnection" (25)" in the connection manager collection of "component "OLE DB Source" (16)". Verify that a connection manager in the connection manager collection, Connections, has been created with that ID.
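Since a .dtsx package is plain XML, one way to track down the stale reference is to search the package file for the GUID and then delete or remap the element that still points at it. A hedged sketch of the search step; the XML fragment below is illustrative only, not taken from an actual package:

```python
# Hypothetical fragment of a .dtsx file; in practice you would read the
# package file's text with open(path).read().
DTSX_FRAGMENT = '''
<connections>
  <connection refId="Package.ConnectionManagers[OleDbConnection]"
              connectionManagerID="{363662E0-B6D4-4F1C-B917-2115A9F0D919}" />
</connections>
'''

def find_guid_references(xml_text, guid):
    """Return the 1-based line numbers where the GUID string appears."""
    return [i for i, line in enumerate(xml_text.splitlines(), start=1)
            if guid.lower() in line.lower()]

hits = find_guid_references(DTSX_FRAGMENT,
                            "{363662E0-B6D4-4F1C-B917-2115A9F0D919}")
print(hits)
```

Once the lines holding the orphaned ID are located, the component that still references it can be repointed at a valid connection manager in the designer, which rewrites the reference.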

Script not working


Hi,

I am trying to use the following script to check a file's date.

The script should succeed if the file was created today; otherwise it should fail.

But it is not working at all. Please help.

Public Sub Main()
        ' Requires Imports System.IO at the top of the script,
        ' and User::UserFilePath in the task's ReadOnlyVariables list.
        Dim fi As New FileInfo(Dts.Variables("User::UserFilePath").Value.ToString())

        ' Fail if the file does not exist.
        If Not fi.Exists Then
            Dts.TaskResult = ScriptResults.Failure
            Return
        End If

        ' Compare date parts only. Note: CreationTime keeps its original
        ' value when an existing file is overwritten, so LastWriteTime may
        ' be the better property to check.
        If fi.CreationTime.Date = DateTime.Now.Date Then
            Dts.TaskResult = ScriptResults.Success
        Else
            Dts.TaskResult = ScriptResults.Failure
        End If
    End Sub

Please let me know where it's going wrong!

Thanks !


--------------------------- Radhai Krish | Golden Age is no more far | --------------------------


Visual Studio 2010 (Data Tools) gets stuck when opening an SSIS solution (SQL Server 2012)


Hello

Since I installed SQL Server 2012 (ever since the beta, up to release) I've had problems with SSIS. Whenever I forget to close the package before I close the solution, it gets stuck in the "Preparing Solution ..." phase when I open the solution again. To solve this I have to open Task Manager, force-close Visual Studio (which is taking 80-90% CPU), then open it again.

After I open the solution again I get an error message that says "An error was encountered while opening associated documents the last time this solution was loaded. Document load is being skipped during this solution load in order to avoid that error."

I get this error whenever I open an SSIS solution, and I find it rather annoying.

Has anyone had this problem and solved it?

w. Regards

Marcus Hansson

SQL Job Output in another server


Hi,

I am using MSSQL 2008, and using BIDS 2008 I have created a package that outputs in Excel format; the output path is on another server (00.00.00.00\export\). The package runs and I get the result when I run it in BIDS, but when I try this using a SQL Server Agent job it displays the error:

------------------------------------------------------------------------------------

Started:  7:42:15 AM
Error: 2012-10-30 07:42:16.66
   Code: 0xC0202009
   Source: Package1 Connection manager "DestinationConnectionExcel"
   Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft JET Database Engine"  Hresult: 0x80004005  Description: "The Microsoft Jet database engine cannot open the file '\\192.168.10.219\Export\SourceExcel\Export_Test.xls'.  It is already opened exclusively by another user, or you need permission to view its data.".
End Error
Error: 2012-10-30 07:42:16.67
   Code: 0xC00291EC
   Source: Preparation SQL Task 1 Execute SQL Task
   Description: Failed to acquire connection "DestinationConnectionExcel". Connection may not be configured correctly or you may not have the right permissions on this connection.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).

--------------------------------------------------------------------------------

When I search in Google, the cause appears to be user permissions/access denied, but my server is in a workgroup and I have given all the permissions. Is there any special permission needed to export the Excel file?

Please guide me to overcome this issue and many thanks in advance.

Regards,

Samsul hudha .M.Y


Problem running the DTSX package


Hi All,

I've created a very simple package and imported it into Integration Services "Stored Packages">"File System"

I can execute it from there and it runs perfectly.

When I create a job in SQL Server Agent and select that package to run from the SSIS Package Store - it does not complete all the steps.

The first step is a truncate-table step; that gets run. But the second step, which imports data from a SharePoint list to a SQL table, does not run.

Can someone help me with this?


SSIS 2012 Package Parameters


Hi,

I have an SSIS 2012 package that I have written; I have created parameters, and inside the package I have created variables. I wish to call the package and pass information into the parameters, not the variables. Is there a way to do this? All the posts that I have read seem to suggest passing information to the variables and not the parameters.

Thanks for your help,

Steven


SSIS OLE DB Source - Query is fast but procedure lasts forever


Hi everyone!

I have a simple package: one OLE DB Source and one OLE DB Destination.

An interesting problem happened: when I execute the query in the OLE DB Source component, it executes and fills the table in 5 minutes. When I create a stored procedure from that same query and call exec proc_name, it does not work and gets stuck in "Pre-Execute phase is beginning." for 1, 2, ... 5 hours!

So far I've tried the following:

  • delete / recreate the procedure
  • create procedure with new name
  • DBCC FREEPROCCACHE on database where procedure is created (procedure queries the data from 3 databases)
  • SET FMTONLY OFF at the beginning of the procedure and also before calling the procedure (before exec proc_name)
  • grant execute on the procedure 

and nothing works for me. Does anyone have a clue what the problem could be? Is there any way to trace what is going on and why this is happening?

Thanks in advance,

Miljan

SSIS: Flat file to DB2 loading


Hi,

My flat file date is in the format 2012-10-01 (a date datatype). For converting I am using this expression:

RIGHT((DT_WSTR,4)DATEPART("yyyy",[CColumn 20]),4) + RIGHT("0" + (DT_WSTR,2)DATEPART("mm",[CColumn 20]),2)

After that I get the format 201210 (Unicode string). Now I need to load it into the DB2 database.

I want to import it into the AS/400 DB2 as 201210.
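For reference, the same YYYYMM conversion can be sketched in Python. This only illustrates the intended logic (parse the ISO date, keep year and month); it is not something SSIS itself runs:

```python
from datetime import datetime

def to_yyyymm(date_text):
    """Convert an ISO date string like '2012-10-01' to 'YYYYMM'."""
    return datetime.strptime(date_text, "%Y-%m-%d").strftime("%Y%m")

print(to_yyyymm("2012-10-01"))  # -> 201210
```

The SSIS expression above achieves the same result; the remaining work is mapping the resulting string to the DB2 column's type in the destination.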

Vasu


vasu

Execute sql task failing


Hi All,

I have a package that loads data from Oracle into a SQL Server database. In the control flow I have an Execute SQL task, then a data flow task, and then another Execute SQL task. The Execute SQL task after the data flow task is failing with the following errors:

Error: The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.

[Execute SQL Task] Error: Executing the query "update [DW[CIP_APP].[TBL] set DATE_REVISED" failed with the following error: "The statement has been terminated.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

Error: The conversion of a varchar data type to a datetime data type resulted in an out-of-range value.

In the EXECUTE SQL TASK thats failing i have an UPDATE Statement:

UPDATE [DW].[APP].[tbl]  
SET    date_revised = CONVERT(DATETIME, date_revised, 120), 
       date_non_bid = CONVERT(DATETIME, date_non_bid, 120), 
       date_bid_orig = CONVERT(DATETIME, date_bid_orig, 120), 
       date_bid_current = CONVERT(DATETIME, date_bid_current, 120), 
       date_final_acceptance = CONVERT(DATETIME, date_final_acceptance, 120) 
UPDATE [DW].[APP].[tbl] 
SET    date_est_start = CONVERT(DATETIME, date_est_start, 120), 
       date_est_finish = CONVERT(DATETIME, date_est_finish, 120), 
       date_act_start = CONVERT(DATETIME, date_act_start, 120), 
       date_act_finish = CONVERT(DATETIME, date_act_finish, 120), 
       date_orig_start = CONVERT(DATETIME, date_orig_start, 120), 
       date_orig_finish = CONVERT(DATETIME, date_orig_finish, 120), 
       date_per_submittal = CONVERT(DATETIME, date_per_submittal, 120), 
       date_per_approval = CONVERT(DATETIME, date_per_approval, 120), 
       date_idem_approval = CONVERT(DATETIME, date_idem_approval, 120) 
UPDATE [DW].[APP].[tbl1]
SET    date_bid_current = CONVERT(DATETIME, date_bid_current, 120) 
UPDATE [DW].[APP].[tbl2]
SET    date_bid_current = CONVERT(DATETIME, date_bid_current, 120) 
UPDATE [DW].[APP].[tbl3] 
SET    date_contract = CONVERT(DATETIME, date_contract, 120) 
UPDATE [DW].[APP].[tbl4]
SET    date_approval = CONVERT(DATETIME, date_approval, 120) 
ALTER TABLE [DW].[APP].[tbl]
  ALTER COLUMN date_approval DATETIME 
ALTER TABLE [DW].[APP].[tbl1]
  ALTER COLUMN date_contract DATETIME 
ALTER TABLE [DW].[APP].[tbl1]
  ALTER COLUMN date_paid DATETIME 
ALTER TABLE [DW].[APP].[tbl2]
  ALTER COLUMN date_bid_current DATETIME 
ALTER TABLE [DW].[APP].[tbl3]
  ALTER COLUMN date_bid_current DATETIME 
ALTER TABLE [DW].[APP].[tbl4]
  ALTER COLUMN date_revised DATETIME 
ALTER TABLE [DW].[APP].[tbl5]
  ALTER COLUMN date_est_start DATETIME 
ALTER TABLE [DW].[APP].[tbl5]
  ALTER COLUMN date_est_finish DATETIME 
ALTER TABLE [DW].[APP].[tbl5]
  ALTER COLUMN date_act_start DATETIME 

As per the error message, I think something is wrong in the conversion of date_revised, which I underlined in the code. Can anyone please tell me what is wrong with the conversion?

In the Oracle DB the date is in the format '13-MAY-98', and on SQL Server it's in the format '1998-05-13 12:18:17'.
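A common cause of this error is a value that parses as a date but falls outside the DATETIME range (SQL Server's DATETIME starts at 1753-01-01), or a string that is not a date at all. As an illustration only (the sample values below are hypothetical, not the poster's data), this Python sketch flags offending values; the equivalent check could be done in T-SQL with ISDATE before converting:

```python
from datetime import datetime

DATETIME_MIN = datetime(1753, 1, 1)   # SQL Server DATETIME lower bound

def find_out_of_range(values, fmt="%Y-%m-%d %H:%M:%S"):
    """Return the values that either fail to parse or fall below the
    SQL Server DATETIME minimum -- the rows that would trigger the
    'out-of-range value' error during CONVERT."""
    bad = []
    for v in values:
        try:
            parsed = datetime.strptime(v, fmt)
        except ValueError:
            bad.append(v)       # not a parseable date at all
            continue
        if parsed < DATETIME_MIN:
            bad.append(v)       # parses, but outside DATETIME's range
    return bad

sample = ["1998-05-13 12:18:17", "0201-01-01 00:00:00", "not a date"]
print(find_out_of_range(sample))
```

Running a check like this against the staged varchar column before the UPDATE would identify which rows break the conversion.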

Thanks


Best practice for extracting csv files to update a database


Our scenario is this:

A program saves a CSV file to a network directory every several minutes. The program is connected to a workcell which creates parts (we are a manufacturing company). The file has fields such as part number, creation timestamp, passed, failed, etc. It contains one or more records (usually one) that need to be loaded into a SQL Server table. This load needs to occur as soon as the file is saved, or as close to that as possible, to keep the processes working. There are dependencies on this information.

Would SSIS be the best practice for doing this?

Thank you for your help,

Fred

Dataflow task with MultiCast transformation giving "Connection failure" error while using stored procedure


Hi,

I have a stored procedure which pulls around 30 million records from multiple tables. My SSIS dataflow task's design is as given below.

When I try to call this stored procedure in the OLE DB source, SSIS throws a connection failure error. But if I put the same query inline in the OLE DB source, there is no error and it works perfectly. I started getting this error after introducing the Multicast and Derived Column. With one source and one target, SSIS works fine with the same stored procedure.

Appreciate your valuable help.

Thanks,

Prasad M

Execute Package Task doesn't refresh connection before executing


Hello,

I'm struggling with an issue on Execute Package Task in SSIS (SQL Server 2008R2)

The context is the following: I'm trying to execute a package stored in SQL Server MSDB table from a "Master" package. We have different environments (DEBUG, DEV, PROD, ...), each of them can execute different builds of the same package (while v3 is still under development in DEV, PROD executes v2, ...)

Which means that the connection used by my "Master" package is dynamically changed to reference the right SQL Server instance (and so MSDB database), depending on the current environment of execution.

My issue is the following: after several tests, I'm pretty sure that the Execute Package Task doesn't use the newly configured connection, but instead keeps the value which is stored in the package (meaning the one used while developing the package).

To change the connection property, I set the value of variable within a script task, and this variable is then used to set the value of the ServerName property of my MSDB connection through an expression.

I actually figured out this issue when I saw in my log table in PROD that the build number matching my DEV package was incorrectly executed in my PROD environment, and the errors were matching some new developments I had just done.

Has anyone ever faced such issue?

I can send you a test project which reproduces the error if needed.


Configuration of integration services in a Cluster


Hi All,

I am trying to configure Integration Services in a cluster environment. I have added it as a resource, and SQL Integration Services shows online status. I am able to connect to Integration Services, but I am not able to expand or open msdb from the connected Integration Services.

I have followed this link http://msdn.microsoft.com/en-us/library/ms345193(SQL.105).aspx.

That link explains how to configure it, but what is the group, and where do I find it in Cluster Administrator?

Can someone please suggest how I can proceed to a successful configuration?

Thanks for your time,

Sharath Reddy



How to expose TaskHost properties using a Script Task


Hi

I have built a custom logging feature through the use of the Execute SQL Task's SQLStatementSource property, using dynamic SQL that utilises System variables and event handlers.

There are some things that are not available through System variables, such as task result, task status, etc., and I would like to capture these details for the tasks in my packages.

I have learned of the TaskHost class - http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.dts.runtime.taskhost.aspx - and want to know: is it possible to get the following properties using a script task on an event handler?

ExecutionDuration

ExecutionResult

ExecutionStatus

StartTime

StopTime

I would like to capture these properties and put them into variables using the script task; then my dynamic SQL within the SQLStatementSource property can read them as variables and output them when an event handler fires.

This is what I would like to do, and I want to know whether what I have described is possible.

Any pointers to help me get started would be good, as I am still getting familiar with scripting and the Integration Services object model.

Thank you

Login Failed on Data Flow Task


I cannot figure out why my login keeps failing in a data flow task. This is the error message I'm getting:

TITLE: Microsoft Visual Studio
------------------------------

Error at PACKAGE_NAME [Connection manager "Connection_Name"]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E4D.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E4D  Description: "Login failed for user 'USER2'.".

Error at Data Flow Task [OLE DB Source [1]]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Connection_Name" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.

 

------------------------------
ADDITIONAL INFORMATION:

Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)

------------------------------

Formatting When Exporting Data to Excel


In my SSIS package, I'm exporting data using the Excel Destination. When I open the Excel file, the columns are not expanded until I expand them, and fields that look numeric but were strings in the database have green triangles in the cells informing me that numbers are stored as text. Is it possible in SSIS to specify that the columns expand automatically when opening the Excel sheet, and to suppress the notification that a numeric field is stored as text? I wasn't sure if anything could be done before the Excel sheet is created in SSIS, or if this would have to be handled in Excel itself.

Thanks

Index was outside the bounds of the array Error : SSIS SQL Agent Job


Hello,

We have an SSIS package which executes a SQL Agent Job task to run the replication Snapshot Agent. It was working fine until now.

Now we have migrated to SQL Server 2012, and we changed the connection string of the distributor server (which was earlier on 2005 and is now migrated to SQL 2012) on which the SQL Agent job invoked by this package resides. When I run the modified SSIS package from BIDS it runs fine, but when it is deployed to SQL Server and run through a SQL Server job, it throws an error.

Source: Start Snapshot of ABC Description: The Execute method on the task returned error code 0x80131508 (Index was outside the bounds of the array.)

The environment is as below.

The Integration Services instance on which the SSIS package is deployed, and the SQL job that executes the package, run on SQL Server 2008 R2 (10.50.1600).

All connection in SSIS are pointing to SQL Server 2012.

Let me know if you have encountered such an error, and the solution.

Thanks in Advance,

Malkesh Sheth

Process CSV Files with Invalid Characters -SSIS


Hi, 

I am processing a CSV file with the following sample data:

"SubscriberID","JobID","EventDate","Email","RejectCategory","RejectSubCategory"
"698766271","48282952","2012-10-29 02:21:07.460000000","recknedde@woh.rr.com","Soft bounce","Mailbox Full"
"801320836","48282952","2012-10-29 02:25:43.117000000","marksasaldana68@bresnan.net","Unknown bounce","Unknown"
"892661671","48282952","2012-10-29 02:23:57.700000000","burgeaaasddrmum@comcast.net","Technical/Other bounce","Server Too Busy"
"907644011","48282952","2012-10-29 02:25:39.733000000","griffedddsan85@hotmail.com","Technical/Other bounce","Network Error"
"909267592","48282952","2012-10-29 02:25:20.610000000","alishaaasaqwea19@comcast.net","Technical/Other bounce","Server Too Busy"

When I process the CSV file, is there a way in SSIS to get rid of the double-quote (") characters? I do not need them loaded into my destination table; I need to scrub them at the SSIS level. Is there a possible way to scrub them in SSIS?
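For what it's worth, setting the Text qualifier to " in the Flat File connection manager makes SSIS strip the surrounding quotes for you. As an illustration of qualifier behavior, here is a small Python sketch using a subset of the sample rows above (columns trimmed for brevity):

```python
import csv
import io

# Two of the sample lines, with only the first three columns.
SAMPLE = '''"SubscriberID","JobID","EventDate"
"698766271","48282952","2012-10-29 02:21:07.460000000"
'''

# With quotechar set, the reader strips the text qualifier automatically,
# so the values arrive without surrounding double quotes.
rows = list(csv.reader(io.StringIO(SAMPLE), delimiter=",", quotechar='"'))
print(rows[1][0])  # -> 698766271
```

The Flat File source does the same thing once the qualifier is configured, so no extra scrubbing step (e.g. a Derived Column with REPLACE) should be needed.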

Please advise...

Thanks


EVA05
