
Channel Description:

All questions related to SSIS, transforms/data flow, control flow, and other related topics.

older | 1 | .... | 125 | 126 | (Page 127) | 128 | 129 | .... | 944 | newer

    0 0

    I have a SQL 2008 R2 application database, which also integrates SQL queries with MS Index Server via a linked server. The MS Index Server currently indexes c. 1,000,000 files. 

    I need to move the application to Windows Server 2012, however this does not support MS Index Server. What can I do?

    1) Do I have to also install MS Search Server? (any limitations)

    2) If so, then do I simply use linked server provider "Microsoft OLE DB Provider for Search"?

    3) If so, then how do I configure it?

    There is very little recent information about this anywhere on the net. I was forced to try this driver previously, after upgrading to Windows Server 2008/SQL Server 2008, but it didn't work; as far as I can see nobody got it to work, and nobody had any response from MS. Fortunately, somebody eventually realised that many of us depend on some sort of file search server with SQL Server integration, and simply re-enabled Index Server in Windows Server 2008. With any luck the same thing will happen in Windows Server 2012 (what's your problem with Index Server?). But in the meantime, it would be useful if somebody could tell us how to keep our applications alive.

    0 0

    Hi -

    This is sql 2008 r2.

    I've got an ssis package that i am using to simply copy all the data from 20 tables in one database to tables in another database, on the same server.

    I split the table import operations into 5 separate data flows (based on some suggestions from a question I posted yesterday).

    The problem I am seeing is that a data flow might have 3 or 4 operations (copying data from table to table); 2 or 3 of them finish, and 1 or 2 just sit there, with the displayed row count never changing after the first few minutes. In fact, I left the package running overnight, and 10 hours later the package was in the same state.

    I ran sp_who2 to see if there were any blocking issues, and I noticed that there were two. Each was a "Select * from TableA" issued by the SSIS package, and it was blocking its own Insert operation!

    There is nothing else running on this server. Sometimes the table that SSIS hangs on has just 1,000 rows, and other times it is a few million.

    Is there something else I can check to help troubleshoot the problem?

    thanks for any help.

     - will

    0 0


    Here is an unusual situation, which I did not find covered on any forums online.  I can access the Integration Services MSDB folder on the prod server, but not on the test server.

    The parameters of the environment I am using are:

    local machine: SQL Server 2008 R2

    test server and prod server: Windows Server 2008 R2 Enterprise with SQL Server 2008 R2 installed on both.

    I have an admin account, which has sysadmin rights and is in the administrators group. I can view the tables in msdb System Tables --> dbo.sysssispackages and dbo.sysssispackagefolders in the Database Engine.

    I connect to Integration Services through a remote desktop connection to the test server: SQL Server Management Studio --> Object Explorer --> Connect --> Integration Services --> enter admin credentials. I can access the MSDB folder on the prod server, but not on the test server. There should be no need to modify MsDtsSrvr.ini.xml, because a colleague can access the MSDB folder with his admin account, which is in the same administrators group, and we are using the same environment.

    When I try to create a new folder in MSDB or expand the node to view existing folders, I get the error "The SQL Server instance specified in SSIS service configuration is not present or is not available. [...] Login failed for user '[my admin user]' (MsDtsSrvr) ". I don't think the error message identifies the issue properly, because there is a working default instance installed on the server.

    I can provide more details if you need to help me resolve this issue, just let me know.


    0 0

    I am connecting to an Access 2007 database using an OLE DB source and the Microsoft Office 12.0 Access Database Engine provider. I have to perform a data conversion on many fields to DT_STR or DT_TEXT because they are coming in as Unicode data types. I then write the data to a SQL Server 2008 table using an OLE DB destination. There are no errors or warnings on any of the components, but when I run the SSIS package, I get the following error messages:

    Error: 0xC0202009 at Data Flow Task, OLE DB Destination [2055]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.

    An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".

    Error: 0xC0209029 at Data Flow Task, OLE DB Destination [2055]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination Input" (2068)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (2068)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

    Error: 0xC0047022 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (2055) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (2068). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

    Error: 0xC02020C4 at Data Flow Task, OLE DB Source [35]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (35) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

    Can someone please assist me?

    Thank you,


    0 0
  • 12/18/12--06:33: SSIS CSV Import
  • Using 2008 R2... I have a data flow that imports a csv file to a table in sql server. The flat file connection manager uses {CR}{LF} for the row delimiters. 

    I have setup event handlers for the data flow OnError and OnTaskFailed to report any issues back to a log table. 

    This has been running fine for the past few weeks; however, today the flat file contained {CR} as row delimiters rather than {CR}{LF}. This resulted in no data being imported, but the data flow processed successfully and the event handlers were not hit.

    How can I instruct the data flow to fail when the specified delimiters in the file are incorrect? 
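    One workaround (not from the original post) is to validate the file's row delimiters before the data flow runs, and deliberately fail the package when they don't match. Below is a minimal sketch of that check in Python; the function name and the expected {CR}{LF} delimiter are assumptions for illustration, and in SSIS the equivalent logic would typically live in a Script Task placed ahead of the data flow.

```python
def has_expected_row_delimiter(path, expected=b"\r\n"):
    """Return True only if the file's rows end with the expected delimiter.

    A file that uses bare CR (or bare LF) row endings leaves stray
    delimiter bytes inside the chunks produced by splitting on the
    expected delimiter, which is what we detect here.
    """
    with open(path, "rb") as f:
        data = f.read()
    if not data:
        return False  # an empty file is treated as invalid
    for chunk in data.split(expected):
        # Any leftover CR or LF inside a chunk means the file is not
        # consistently delimited by `expected`.
        if b"\r" in chunk or b"\n" in chunk:
            return False
    return True
```

    A Script Task running this kind of check could raise an error (so OnError fires and gets logged) instead of letting the flat file source silently read the whole file as a single row, or zero rows.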

    0 0

    Just playing around with SSIS 2012 deployment, and I was wondering whether it is somehow possible to deploy a project to another server (my company laptop, in another domain) with SQL Server authentication. I can browse to it, but I can't change the authentication mode.

    Please mark the post as answered if it answers your question | My SSIS Blog: |Twitter

    0 0

    I have a parent package and a child package, and I am passing the child's connection string from the parent package. One of the connections uses a user name (app_db11) and password.

    When I have ProtectionLevel set to EncryptSensitiveWithUserKey it works, but when I change it to DontSaveSensitive it fails in the data flow task at that connection, with the error: Login failed for user 'app_db11'.

    Any idea what is wrong?

    0 0


      I built an SSIS package to load data from some flat files into an SQL Server 2005 database.  The data is loaded into a set of tables on a staging database.  After that it is manipulated and loaded into the data warehouse and the Analysis Services cube is processed.  I developed this package on a test Server.  It uses an XML Configuration file to configure the paths to the various flat files.  The configuration file also contains the destination server names (SQL Server, SSIS Server and SSAS server) and database names.

      By default, the package looks to the local D: Drive for the flat files.  After testing, this was deployed to Production.  I modified the configuration file to point to a network share for the source flat files and to the production servers.  I manually ran the SSIS package via SSMS and it ran successfully.  Here's where things get weird.  I created an SQL Agent Job to run this every night.  It runs successfully (no errors and pulling data from the flat files on the network) except that it's still loading data into the test server!

      My first thought was that maybe the SQL Server Agent's service account (which is a domain account, not the default NetworkService account) might not have access to the C:\Program Files\Microsoft SQL Server\90\DTS\Packages\ folder in order to read the XML configuration file, and therefore it is falling back to the values that I used when I developed the package (i.e. pointing at the test server). But if that were the case, I would expect it to also try to find the source files on the D: drive instead of the network share that I indicated in the config file.

      So I'm at a loss.  How could this package run fine on the Production server when I run it directly, but switch back to the test server when the SQL Agent launches it?



    0 0


    What's the easiest way to also deploy my Development environment, with all its variables, to my acceptance/production environment?


    0 0

    I have a package with several data flow tasks. These tasks are failing, as far as I can tell, based upon the phase of the moon and the position of Mars. Every time something fails I get the same message.

    An OLE DB error has occurred.
    Error code: 0x80004005. 
    An OLE DB record is available. 
    Source: "Microsoft SQL Server Native Client 10.0" 
    Hresult: 0x80004005  Description: "Syntax error, permission violation, or other nonspecific error".

    "component "[insert component name]" failed validation and returned validation status "VS_ISBROKEN". 

    This is what is happening. We'll call them DFT 1, 2, and 3. DFT 2 is dependent on DFT 1's successful execution. DFT 3 has no dependencies.

    1. I run the package. DFT 1 fails. DFT 3 works.

    2. I fix DFT 1. I run the package. DFT 1 runs. DFT 2 fails. DFT 3 works.

    3. I run the package and do nothing to DFT 2. DFT 1 and 2 work. DFT 3 fails.

    4. I run the package. DFT 1 and 2 work. DFT 3 fails.

    5. I recreate DFT 3. I run the package. DFT 3 runs. DFT 1 fails.

    6. I run the package. DFT 1 works. DFT 2 fails. DFT 3 runs.

    7. I run the package. DFT 1 works. DFT 2 works. DFT 3 fails.

    It's like DFT whack-a-mole.

    Has anybody seen this behavior before?

    0 0

    Is it possible to centralize SSISDB data from multiple servers into a single one-stop SSISDB? Primarily, I am interested in viewing execution information from various SSIS 2012 servers in one central location.



    0 0
  • 12/18/12--14:08: SSIS Error
  • I get this error when I execute an SSIS package on the IST server:
    "An error occurred while adding the managed SSIS type library to the script host. Verify that the DTS 2000 runtime is installed"

    Can anyone help?

    0 0

    I've recently upgraded several packages to SSIS 2012, and I'm finding that the row counts shown in the data flow while debugging are entirely nonsensical. My data flow goes something like this at the moment:

    OLE DB Source - 138,292 rows

    RowCount - 325,974 rows

    Lookup - 730,972 rows

    Derived Column - 9,878 rows

    Lookup - 29,634 rows

    RowCount - 148,170 rows

    OLEDB Destination

    This is very different from what I saw in 2008 R2. The first row count and lookup should most certainly not exceed the number of rows that have been read from the source at any given time. Are these counts being displayed counting something different in 2012 than what they counted in 2008 R2?

    I can also see, when it goes into post-execute, that in both the Output window and the Progress tab, the number of rows that have actually been written to the OLE DB Destination is an entirely different number than what is shown flowing through the data flow pipeline while debugging (251,1095).

    Another oddity - the data flow tasks never turn green, even after the control flow shows that the data flow task has completed successfully.

    0 0


    I have searched for a possible fix, but no luck thus far.

    Issue: SSIS 2008 Script Task

    Package location: server with a 32-bit OS – running Windows 2007.

    Description: When I try to edit (click the Edit Script button) in a Script Task or a Script Component, I get the following error:

    TITLE: Microsoft Visual Studio

    Cannot show Visual Studio 2008 Tools for Applications editor.


    The VSTA designer failed to load:  "System.Runtime.InteropServices.COMException (0x80004005): Error HRESULT E_FAIL has been returned from a call to a COM component.

       at VSTADTEProvider.Interop.VSTADTEProviderClass.GetDTE(String bstrHostID, UInt32 dwTimeout)

       at Microsoft.SqlServer.VSTAHosting.VSTAScriptingEngine.EnsureDTEObject()" (Microsoft.SqlServer.VSTAScriptingLib)

    Have tried the following with no luck:

    Deleting the Script Task and re-adding it.

    Creating a new package and adding the Script Task… no good!

    Please advise what other things I can try here.


    0 0

    I have had this happen over and over again, and I am sure many others have, too. I try to use an SSIS package as a template, with connection managers pointing at a sample database. The package works fine as is. When I try to convert it to use another database, I put in the new server name, user name, and password, and select a database. I test the connection and everything is fine. All other components, expressions, and variables are fully converted to the new database. All data destination components are converted over to use the new table. I save the package. There is no evidence whatsoever, from any GUI, dialog box, or property that I can see, of the old connection information. Then I run the package, end up with a login timeout error, and look at my connection manager: ALL the old values are back, as if I never changed anything. What am I missing here? It seems it is useless to create a template; when I try to reconfigure one, it is all lost.

    All I really want to do is adapt the same package to import different CSV files into their matching tables. Just change the file connections and sources, and change the destinations and their connectors. Any help will be vastly appreciated.

    0 0

    We receive only one file daily. My source is a fixed-width file with 100 records. Those 100 records need to be routed into 10 tables; the record type can be identified from the first 3 bytes of every record. The destination tables' column counts differ: one table has 3 columns, another has 30.

    My plan is to use a Conditional Split to route the records into 10 staging tables with 2 columns each; from there we would apply our logic to split out the other columns and send them on to the master tables.

    Alternatively, is it possible to do this using a Script Component?

    Please give your suggestions.
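    The Conditional Split approach described above amounts to bucketing each record by its 3-byte type prefix. Here is a minimal sketch of that routing logic in Python (the function name and sample record values are assumptions for illustration); in SSIS, each bucket would correspond to one Conditional Split output, or one output of a Script Component, feeding its own staging table.

```python
def route_records(records):
    """Bucket fixed-width records by their first 3 characters (the record type)."""
    buckets = {}
    for rec in records:
        key = rec[:3]  # record-type identifier carried in the first 3 bytes
        buckets.setdefault(key, []).append(rec)
    return buckets

# Hypothetical sample: "001" records go to one staging table, "002" to another.
sample = ["001Alice      ", "002Widget   30", "001Bob        "]
routed = route_records(sample)
```

    With 10 record types, this yields 10 buckets, each of which can then be parsed with its own column layout before loading the master tables.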



    0 0

    I have a SharePoint document library. I DO NOT HAVE ACCESS TO SHAREPOINT INTERNALS, though I have full control of the library. I can't write custom workflows, etc., because I can't register them. It's MS Office SharePoint 2007, a standard Excel 2007 document library. My users create a new Excel 2007 (xlsm) workbook, add data, and save it. There are several important columns as well (Status, Approver, CompanyID, etc.)

    Right now, I have an Access procedure that uses the library as a linked table to get all the detail (Excel file name, and Server Properties.) It selects the workbooks that should be processed, processes the rows one at a time, leaving error messages on each row, then saves the revised workbook with revised status, etc. back to the SP server.

    Can I use SSIS to import the (perhaps already validated...) workbooks into SS2k12? How?

    Of course, I'd rather do the whole shebang in SSIS, but it seems a bit of a reach...right?


    0 0

    Hi all,

    Currently our ETL sometimes fails, but the error message is very general. I want to use SSRS to show the ETL's steps, so that from the report I will know which steps passed and which failed. Can anyone share a sample or some useful links? Every comment is welcome. Really, thanks in advance.

    0 0
  • 12/18/12--23:17: SSIS
  • How do I resolve this SSIS pipeline error: "Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available."

    ===================IN GOD I TRUST=================

    0 0

    I am facing a memory problem while using SSIS 2008 R2 SP1 (32-bit), with an error like:




    Microsoft acknowledges the error at the above link and points us to the appropriate Cumulative Update, but it is for SSIS 2008. I am working with SSIS 2008 R2, so which Cumulative Update should I install to get the problem solved?
