Channel: SQL Server Integration Services forum

Package to load Excel file started failing with error "External table is not in the expected format."


Hi Gurus,

I am facing a strange issue and cannot see what could be causing it. I have already checked this thread and some other blog posts around the same issue, but my case is slightly different in nature, hence this new thread. So here it goes:

I have two similar SSIS 2012 packages that load two different Excel (2016) files into SQL tables. Each package has a Foreach Loop container that processes the files in a source folder (if present) and moves them into an archive folder after loading. If no file is found, the package completes successfully with a message saying no files were found. The packages are executed through a Windows Task Scheduler job. This had been working fine until last week, when we loaded a new correction file. The file was loaded into the database correctly and moved to the archive folder as expected, but the job threw the error alert below and logged the same in the log file (variable names have been changed for security purposes).

Error: 2018-08-17 07:00:27.31
   Code: 0xC0202009
   Source: PackageName Connection manager "Excel Source"
   Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft Access Database Engine"  Hresult: 0x80004005  Description: "External table is not in the expected format.".
End Error
Error: 2018-08-17 07:00:28.23
   Code: 0xC020801C
   Source: Copy in Database - Data Flow Task Excel Source [202]
   Description: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Source" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.
End Error

We have been receiving the same error every day since then, even when there is no file to process in the source folder. In my case the error occurs only on the Prod server. Interestingly, it happens for only one of the two packages, which have almost the same logic apart from the two different file formats, and both run from the same job. Also, the task works fine in my SSDT and in the lower environments. I have already checked that the delay validation property for the Excel source and the Foreach Loop container, and the 'Run64bitRunTime' property, are set to True. This server was also rebooted once after the last file was processed, and there is no task in a hung state. SQL Server and SSIS are version 2012, the Excel file is 2016, and the MS Excel driver on the server is 14.00.7180.5000 for 64-bit and 6.03.9600.17415 for 32-bit. However, I don't think that matters, because the other job runs just fine and the lower environment has the same configuration and runs the same job without any issues. I am using the following expression to dynamically set the Excel connection manager:

"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + @[User::FilePath] + ";Extended Properties=\"Excel 12.0 XML;HDR=NO;IMEX=1\";"

I am sure this is some silly small thing, but somehow I am not able to catch it. Hoping to get some help with your valuable suggestions on how to fix this issue.
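For reference, with a hypothetical value of C:\Source\Correction_201808.xlsx in @[User::FilePath] (the real path is set by the Foreach Loop), the expression above should evaluate to:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Source\Correction_201808.xlsx;Extended Properties="Excel 12.0 XML;HDR=NO;IMEX=1";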


HTH,
Cheers!!
Ashish
Please mark it as Answered if it answered your question or mark it as Helpful if it helped you solve your problem.


Error: 0xC0202009, OLE DB Destination [39]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.


Hi All,

When I try to run my SSIS package, I get the error below.

SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.

Can you please help me troubleshoot this error?

Thanks in advance.

Thanks,

Naga


Thanks, Naga

SSIS Data Flow Pipeline Stops and success is displayed - yet missing 800,000 rows


I'm using the most recent version of 2017 Professional with Oracle OLE DB drivers installed in July 2018, with a very simple OLE DB connection to an Oracle view on the source side. I'm sending the data directly to a clean, flat SQL Server table with no indexes, constraints, nothing. There are no transformations or any other SSIS objects between the source and destination, just a pure pipeline with two OLE DB connections. I have done no tuning and there are no parameters being used. I want the entire view to populate the SQL Server table. I am getting 130,000 of the 1.1 million rows in the source Oracle view, and the package completes with no errors, except a shared memory warning that appears even when I limit the row count at the view to 500 rows (so I think that error message is an SSIS bug: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/2f2d9883-59ba-4356-ad68-05e67537e749/ssispipeline-warning-warning-could-not-open-global-shared-memory-to-communicate-with?forum=sqlintegrationservices#2f2d9883-59ba-4356-ad68-05e67537e749 ).

Things tried:

We expanded the table space on the destination database, resulting in the same row count at the point of completion.

Things not tried:

Tuning buffers, since speed was not an issue and the 130k rows made it through fine

Monitoring pipeline errors

My questions are these:

  1. What could cause this view, and another like it (that is about 800 columns wide), to stop without reporting any error in the "Execution Results" to show what interrupted the data flow?
  2. With ONLY the source and destination OLE DB connections tied to each other, can I FORCE it to resume after a data error instead of stopping at the point of error (if that is what it is doing)?
  3. Is error monitoring a requirement in order to FORCE a RESUME of the package processing when an error occurs?

Thanks in advance for your help with this.


Les Draper





Data Flow - varchar(max) to varchar(max)


I am trying to move data from one SQL Server to another. The source table contains a varchar(max) column and the destination table is defined with a varchar(max). In SSIS I can see the source column of the data flow defined as DT_TEXT (normal for SSIS), and the destination column, when mapped, is also DT_TEXT. When I run the package, it fails with the following error:

[OLE_DST - RunTicket (BULK LOAD) [25]] Error: An error occurred while setting up a binding for the "templateId" column. The binding status was "DT_TEXT".
[OLE_DST - RunTicket (BULK LOAD) [25]] Error: An error occurred while setting up a binding for the "equipmentId" column. The binding status was "DT_TEXT". The data flow column type is "DBBINDSTATUS_UNSUPPORTEDCONVERSION". The conversion from the OLE DB type of "DBTYPE_IUNKNOWN" to the destination column type of "DBTYPE_VARCHAR" might not be supported by this provider.

Any ideas?


SSIS - combine 3 tables from 3 different sources into one


I want to combine 3 tables from 3 different sources, each bringing in millions of rows, into one table for daily use. Is there any way I can bring the data into a temporary holding place using some task in SSIS, so that it consumes fewer resources and runs faster when combining into the one table?

Excel destination create table


It's been a few years since I used SSIS and I am a little rusty.

I have an OLE DB data source and an Excel destination, a blank Excel file.

I am trying to use an Execute SQL Task to create the sheet in the Excel file.

I create a data flow task and map my source to my destination.

I then take the CREATE TABLE SQL code that is generated automatically and copy and paste it into the Execute SQL Task.

I configure the task to use the Excel connection manager.

When I execute the task, I get the following error:

[Execute SQL Task] Error: Executing the query "CREATE TABLE `Excel Destination` (
    `BusinessEn..." failed with the following error: "The Microsoft Jet database engine could not find the object 'Excel Destination'.  Make sure the object exists and that you spell its name and the path name correctly.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

Isn't the whole point to create the sheet called Excel Destination?

What am I missing?

Thanks

Unable to deploy SSIS after VS 2017 update 15.8.0


I just installed the 15.8.0 update to Visual Studio 2017, and now I am unable to deploy SSIS projects and packages to the SSIS Catalog from within Visual Studio 2017. The Deployment Wizard does still work from the command line, however.

I have repaired SQL Server Data Tools. I have repaired Visual Studio 2017. I have uninstalled and reinstalled SQL Server Integration Services Projects. The end result is the same:

ETA: This error occurs after the "Connect" button is clicked on the "Select Destination" page of the Wizard.

===================================

Could not load file or assembly 'Microsoft.SqlServer.Management.IntegrationServicesEnum, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The parameter is incorrect. (Exception from HRESULT: 0x80070057 (E_INVALIDARG)) (mscorlib)

------------------------------
Program Location:

   at System.Reflection.RuntimeAssembly._nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks)
   at System.Reflection.RuntimeAssembly.nLoad(AssemblyName fileName, String codeBase, Evidence assemblySecurity, RuntimeAssembly locationHint, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks)
   at System.Reflection.RuntimeAssembly.InternalLoadAssemblyName(AssemblyName assemblyRef, Evidence assemblySecurity, RuntimeAssembly reqAssembly, StackCrawlMark& stackMark, IntPtr pPrivHostBinder, Boolean throwOnFileNotFound, Boolean forIntrospection, Boolean suppressSecurityChecks)
   at System.Reflection.Assembly.Load(AssemblyName assemblyRef)
   at Microsoft.SqlServer.Management.Sdk.Sfc.ObjectCache.LoadAssembly(String fullName)
   at Microsoft.SqlServer.Management.Sdk.Sfc.ObjectCache.LoadElement(ObjectLoadInfo oli)
   at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetServerVersion(Urn urn, Object ci)
   at Microsoft.SqlServer.Management.Sdk.Sfc.Environment.GetObjectInfo(Object ci, RequestObjectInfo req)
   at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.GetObjectInfo(Object connectionInfo, RequestObjectInfo requestObjectInfo)
   at Microsoft.SqlServer.Management.Sdk.Sfc.Enumerator.Process(Object connectionInfo, RequestObjectInfo requestObjectInfo)
   at Microsoft.SqlServer.Management.Sdk.Sfc.SfcInstance.GetSupportedResultTypes(ISfcConnection connection, Urn urn)
   at Microsoft.SqlServer.Management.Sdk.Sfc.SfcInstance.Initialize()
   at Microsoft.SqlServer.Management.Sdk.Sfc.SfcInstance.MarkRootAsConnected()
   at Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices..ctor(SqlConnection sqlConnection)
   at Microsoft.SqlServer.IntegrationServices.Wizard.Common.Model.ServerHelper.ConnectToServer(String server, String username, String password, AuthenticationType authenticationType)
   at Microsoft.SqlServer.IntegrationServices.Wizard.Common.ServerProjectBrowser.validateServerWorker_DoWork(Object sender, DoWorkEventArgs e)

===================================

The parameter is incorrect. (Exception from HRESULT: 0x80070057 (E_INVALIDARG)) (mscorlib)

------------------------------
Program Location:

   at System.AppDomain.nApplyPolicy(AssemblyName an)
   at System.AppDomain.ApplyPolicy(String assemblyName)
   at Microsoft.VisualStudio.Platform.VsAppDomainManager.MatchAssemblyName(AssemblyName reference, AssemblyName definition)
   at Microsoft.VisualStudio.Platform.VsAppDomainManager.FindAssembly(String name, IEnumerable`1 probingPaths)
   at Microsoft.VisualStudio.Platform.VsAppDomainManager.InnerResolveHandler(String name)
   at Microsoft.VisualStudio.Platform.VsAppDomainManager.ResolveHandler(Object sender, ResolveEventArgs args)
   at System.AppDomain.OnAssemblyResolveEvent(RuntimeAssembly assembly, String assemblyFullName)



SSIS Build/Deploy Fails with new Visual Studio Version 15.8.0


After upgrading to 15.8.1, Visual Studio now crashes when attempting to build, rebuild, or deploy an SSIS package.

I attempted to reinstall Data Tools for Visual Studio 2017, but it did not resolve the issue.

The issue occurs on more than one computer.

The build Output yields nothing, even in verbose mode (it literally does not post any message before the crash), but this is found in the event log:

Application: devenv.exe Framework Version: v4.0.30319 Description: The process was terminated due to an internal error in the .NET Runtime at IP 728B7A88 (72800000) with exit code 80131506.


Append two streams


Hi

Is there any way to append (not merge and not union all) two streams in SSIS? I know that both streams consist of the same number of rows, so the first row of the first stream should be appended to the first row of the second stream, and so on. Hopefully it is very straightforward to do and I am just blind :-)

Jozsef
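One pattern that might fit here (a sketch and an assumption on my part, not something from the thread): add an identical synchronous Script Component to each stream that emits a positional RowIndex column, then pair the rows side by side with a Merge Join on RowIndex. Assuming an Int32 output column named RowIndex has been added in the component editor, the per-row code would be:

    // Inside the Script Component's ScriptMain class: runs once per row.
    // Both streams get the same component, so equal RowIndex values identify
    // the two rows that belong together.
    private int rowIndex = 0;

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        Row.RowIndex = rowIndex++;
    }

Both data paths would then need to be presented as sorted on RowIndex (via IsSorted/SortKeyPosition, or a Sort transform) before they reach the Merge Join.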

Executing batch file with variable in Foreach loop in SSIS


I have a problem with executing my batch file, which looks like this:

certutil -hashfile "THIS_IS_WHERE_I_WANT_VARIABLE_FROM_SSIS" SHA256 > checksum.txt

As you can see, I don't know how to put an SSIS variable called FileName in here so the loop will calculate a checksum for each file.
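One way this is often wired up (a sketch and an assumption on my part, not something from the thread): let the batch file take the path as an argument, and pass the SSIS variable through an Execute Process Task inside the Foreach Loop.

rem checksum.bat - %1 is the full file path handed in by the Execute Process Task
certutil -hashfile "%~1" SHA256 >> checksum.txt

The Execute Process Task's Arguments property can then be set with an expression such as "\"" + @[User::FileName] + "\"", so each loop iteration passes the current file. Note that >> appends to checksum.txt so the hashes accumulate across files; a single > would overwrite the file on every iteration.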

Validate if a SQL table has records from SSIS 2016


Hi guys,

After transferring tables from one server to another, I want to validate that the tables exist on the destination server and have records in them. I don't have any problem validating that the tables are there. However, I have tried everything to validate whether the tables have records in them (or even just get their size, which would also do), and I can't get either working.

Here is my script:

      public void Main()
        {
            dttables = (Dts.Variables["myDataTables"].Value as DataTable).Copy();
            ValidateTransferredTableExistance(dttables);
            Dts.Variables["BadCount"].Value = badcount;
            Dts.Variables["EmailMessage"].Value = message;

            Dts.TaskResult = (int)ScriptResults.Success;
        }

        public void ValidateTransferredTableExistance(DataTable dt)
        {
            // Opens a connection to the database 
            Server DestinationSrv = new Server(DestinationServer);
            Database db = DestinationSrv.Databases[DestinationDB];
            Boolean tableExists = false;
            Boolean rowsExists = true;

            message = "We may have encountered problems with the transfer of the table(s) listed below." + Environment.NewLine ;
            message += "Please investigate." + Environment.NewLine + Environment.NewLine;


            // Validate that all tables got transferred
            foreach (DataRow row in dt.Rows)
            {
                string sqlTable = row["transferSchema"].ToString() + '.' + row["transferobject"].ToString();
                tableExists = db.Tables.Contains(row["transferobject"].ToString(), row["transferSchema"].ToString());

                if (tableExists == true)
                {
                    DataTable table = new DataTable();
                    table = db.Tables[sqlTable];
                    rowsExists = (table.Rows.Count == 0);

                    if (rowsExists == true)
                    {
                        message += "- " + table + " is empty." + Environment.NewLine + Environment.NewLine;
                        badcount++;
                    }
                }

                if (tableExists == false)
                {
                    message += "- " + sqlTable + " has not been transferred. Please validate that the table exists on the source server." + Environment.NewLine + Environment.NewLine;
                    badcount++;
                }
            }

        }

 

Do I have to fill the table variable using an OLE DB adapter? I don't want to do that, because some of the tables have over 1 million records. Can I instead somehow get the size/data space used for them?

Thanks a lot in advance for your help :-)

Mylene
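A minimal sketch of one way around this, reusing the SMO objects already in the script above (an assumption on my part, not something confirmed in the thread): the SMO Table class exposes a RowCount property that is reported from catalog metadata, so the rows never have to be loaded into the package.

    // Sketch only: db is the SMO Database object from the script above, and
    // row, sqlTable, message and badcount are the same variables used there.
    // Table.RowCount comes from catalog metadata, so no table data is read.
    Microsoft.SqlServer.Management.Smo.Table smoTable =
        db.Tables[row["transferobject"].ToString(), row["transferSchema"].ToString()];

    if (smoTable == null)
    {
        message += "- " + sqlTable + " has not been transferred. Please validate that the table exists on the source server." + Environment.NewLine + Environment.NewLine;
        badcount++;
    }
    else if (smoTable.RowCount == 0)
    {
        message += "- " + sqlTable + " is empty." + Environment.NewLine + Environment.NewLine;
        badcount++;
    }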

SSIS connect to Kafka using kerberos to get secured data

Hi, I am developing an SSIS 2015 package that needs to connect to Kafka using Kerberos to get secured data. How do I do this? I'm new to this...


Deployed Project not importing CSV file Data

I have a project with an Execute SQL Task (which truncates the temp table when the project is run) and a Foreach Loop Container containing a Data Flow Task that imports the results of CSV files into the temp table. Outside of the Foreach Loop Container I then have a Script Task that deletes the CSV files from a specified folder directory. When I run the project in debug mode, it works perfectly: the temp table is truncated, the results of multiple CSV files are imported into the temp table, and the CSV files are deleted. However, when I deploy the project and create a SQL Agent Job, the truncate table task runs, as does the Script Task that deletes the files, but no data is imported. It's as if the Data Flow Task inside the Foreach Loop Container does nothing. Any ideas? I'm using 2012.

How to pass Date values to SSIS OLEDB Source /query


Hello,

I have a package running, and now I am trying to pass StartDate and EndDate to the query in an OLE DB Source.

I want to pass the values from the Agent job, but I can only see system-type variables?

Please Help.



Unzip files and load to different tables


Hi 

I have a bunch of files coming in through a zip archive. I have to unzip the files and load them to different target tables (one file to one table). I have around 200 files when unzipped, and they have to be loaded to 200 different tables.

Can someone help me load this with one SSIS package that loops?

Thanks

Raj


Error loading flatfile data to table using SSIS


Can someone help me with loading the attached CSV file into the database?

The issue here is that the CSV file uses double quotes as the text qualifier and a comma as the delimiter, but in some instances one of the columns has double quotes within the data as well (see the NORTH DAKOTA STATE "A" #1 value in the record below).

---FYI, this is a single record in my CSV file:
"33043000030000","24","MAGNOLIA PETROLEUM CO.","NORTH DAKOTA STATE A  1","NORTH DAKOTA STATE A","1","MAGNOLIA PETROLEUM CO.","NORTH DAKOTA STATE "A" #1","10/21/1950","5609","KIDDER","141 N","73 W","36","SENE","1980 FNL 810 FEL","WILDCAT","","","","VERTICAL","46.990158000000001","-99.864013","OG","DRY","","12/24/1950"

I read online that people are suggesting I write VB code for this.

Below is how the data shows in the preview tab of the Flat File connection manager.


Thank you.
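On the embedded-quote problem, here is a minimal sketch of the kind of pre-processing people usually suggest (an assumption on my part, not from the thread; paths are hypothetical): split each line on the full "," boundary so stray quotes inside a field (e.g. NORTH DAKOTA STATE "A" #1) are left alone, then re-emit them doubled so a text qualifier of " parses cleanly. The same logic could sit in a Script Task that runs before the Flat File source.

    using System;
    using System.IO;
    using System.Linq;

    class QuotedCsvPreprocessor
    {
        static void Main()
        {
            // Hypothetical paths, for illustration only.
            string inputPath = @"C:\Input\wells.csv";
            string outputPath = @"C:\Input\wells_fixed.csv";

            var fixedLines = File.ReadLines(inputPath).Select(line =>
            {
                // Drop the outer quotes, then split only where "," separates two fields.
                string inner = line.Trim().TrimStart('"').TrimEnd('"');
                string[] fields = inner.Split(new[] { "\",\"" }, StringSplitOptions.None);

                // Double any quote that is genuinely part of the data and re-qualify each field.
                return string.Join(",", fields.Select(f => "\"" + f.Replace("\"", "\"\"") + "\""));
            });

            File.WriteAllLines(outputPath, fixedLines);
        }
    }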


script to add/remove AD members


Hi,

Following is my code, and I am not sure why userName is treated as a char. Also, the section of code below does not seem right: I want to enumerate the table, select just the username column from it, and put that into a list, then first remove the members from the group and add them back from the table. I am seeing the error converting char to string for userName below:

foreach (var userName in result)
                    {
                        AddUserToGroup(context,group,userName);
                    }

Actual code:

public void Main()
            {
            
            try
            {
                var table = new DataTable();
                var adapter = new OleDbDataAdapter();
                adapter.Fill(table, Dts.Variables["User::UserList"].Value);

                var results = from r in table.AsEnumerable()

                              select r.Field<string>("user_name").ToList();

                var context = new PrincipalContext(ContextType.Domain, "xxx.xx.xx.xx");
                foreach (var res in results)
                {
                    var group = GroupPrincipal.FindByIdentity(context, "ABCD123");
                    group.Members.Clear();

                    foreach (var userName in result)
                    {
                        AddUserToGroup(context,group,userName);
                    }
                    group.Save();
                }
            }
            catch (Exception exp)
            {
                //MessageBox.Show(ex.ToString());
                var msg = String.Format("{0} - {1}", exp.Message, Environment.NewLine);
                System.IO.File.AppendAllText(@"C:\Users\Tdam\Documents\ADError.txt", "Date :" +
                    DateTime.Now.ToString() + "\t" +userName + "\t" + "Message:" + msg);
                Dts.TaskResult = (int)ScriptResults.Failure;
            }
                Dts.TaskResult = (int)ScriptResults.Success;
            }
       
      
            public void AddUserToGroup(PrincipalContext context,GroupPrincipal group, string userName)
            {

            var user = UserPrincipal.FindByIdentity(context, IdentityType.SamAccountName, userName);

            if (!group.Members.Contains(user))
                group.Members.Add(user);
             }
      
    }

Also, please let me know if I am proceeding in the right direction in terms of searching for users in the context and then adding/removing users from a specific group.
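On the char question specifically, here is a minimal sketch of how the projection could be written instead (assuming the column really is named user_name, as in the code above): calling .ToList() inside the select turns each string into a List<char>, which is why the inner loop variable comes out as char. Projecting the column first and materializing the query afterwards keeps the elements as strings, and it also means the group only has to be looked up and cleared once rather than once per row:

    // Project the string column, then turn the whole query into a List<string>.
    List<string> userNames = table.AsEnumerable()
                                  .Select(r => r.Field<string>("user_name"))
                                  .ToList();

    var group = GroupPrincipal.FindByIdentity(context, "ABCD123");
    group.Members.Clear();

    foreach (string userName in userNames)
    {
        AddUserToGroup(context, group, userName);
    }
    group.Save();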

Delete-Insert data using SSIS component

Hi friends,

Is there any existing component within SSIS that can be used to delete the existing data from a table and then freshly load it with the required set of data?

More of a delete-insert type....

Thanks a bunch ! Sweta

Success is reported but fails to bring entire data set.


I have a simple Oracle OLE DB data source from a view that produces 4.4 million rows. When running this simple data flow task, the SSIS package reports success at 900k rows, which do hit the target SQL Server table.

There are no parameters in this Data Flow task; it goes straight from the view in Oracle to a SQL Server table with all system defaults.

If there were a space issue on the receiving end, I would expect an error, but the 900k rows are queryable, as is the source of 4.4 million rows.

This is not making any sense to anyone here.

Thanks in advance for any suggestions on where to look for something in SSIS that could possibly limit the rows.


Les Draper
