Channel: SQL Server Integration Services forum
Viewing all 24688 articles

Character in CAST argument not valid after migration from Sql Server 2008


Hi,

I migrated a job from SQL Server 2008 to SQL Server 2016 that transfers data from a table in SQL Server to a file on an AS/400 server. I am using the Microsoft OLE DB Provider for DB2.

The problem is that after the migration it is no longer possible to use the "Ignore" option on columns with a numeric type in the destination table.

The process sends a blank value for the ignored columns and throws a "Character in CAST argument not valid" error.

It worked fine in SQL Server 2008.

Thank you!


ERROR: To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher

I'm running the latest SSDT:
Microsoft Visual Studio 2015 Shell (Integrated)
Version 14.0.23107.0 D14REL
Microsoft .NET Framework
Version 4.6.01055
Installed Version: IDE Standard
Microsoft Visual Studio Tools for Applications 2015 00322-10000-00000-AA722
Microsoft Visual Studio Tools for Applications 2015
Visual Basic 2015 00322-10000-00000-AA722
Microsoft Visual Basic 2015
Visual C# 2015 00322-10000-00000-AA722
Microsoft Visual C# 2015
SQL Server Analysis Services 13.0.1605.88
Microsoft SQL Server Analysis Services Designer
Version 13.0.1605.88
SQL Server Data Tools 14.0.60629.0
Microsoft SQL Server Data Tools
SQL Server Integration Services
Microsoft SQL Server Integration Services Designer
Version 13.0.1601.5
SQL Server Reporting Services 13.0.1605.88
Microsoft SQL Server Reporting Services Designers
Version 13.0.1605.88

I created an SSIS project with TargetServerVersion set to SQL Server 2016.
I created a bat file that targets: C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\DTExec.exe

When I run the bat file, the package runs the first six control flow items, which are Execute SQL Tasks. The seventh control flow item is a script task, which throws the following error:

To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher

I'm not sure why it is failing with this error.

SQL job never finishes


I have an SSIS package that writes to a log table as the SQL Agent job is finishing up. It's basically:

INSERT INTO tablename VALUES ('The job is done', GETDATE())

The package inserts the data, but every day or so the package/job fails to finish. If I look in Activity Monitor, it shows the job is still running, and I can't stop the job; I have to restart the SQL Agent to proceed. I know the job succeeded because the data is in the table.

SQL Server 2014 SP2 Enterprise Edition, running on the same server as Integration Services. The connection manager uses SQL Server Native Client 11.0.


--Burt King

How to style a header of Excel file in Destination


Hi,

I have a data flow whose source is SQL Server and whose destination is an Excel file.

What I want is to style the first row of the file: give it a background color and also add some margin in the cells.

How can I do that using SSIS?

Thanks

Text truncated at 255 while loading data from Excel to SQL using SSIS


Hi,

I am facing an issue in SSIS where the value from a cell in Excel is getting truncated at 255 characters.

I have tried the following solutions:

1. Adding IMEX=1 to the connection string.

2. Modifying the first row and inserting a sample row with 4,000 characters in the cell.

3. Changing the registry setting for the Excel driver, TypeGuessRows = 0, etc.

4. Extracting the data from Excel using a table, view, SQL query, etc.

None of the above solutions works.

Could you please suggest any other solution that I can try?

Thank you 

Restartability of SSIS process


I have an SSIS process that is fairly big and very critical. It moves data from table to table, transforms the data in many ways, applies a transaction number in the process, and pushes the results to a downstream process. I want to ensure that, in the event of a package failure or a network failure, best practices are used and recovery is done in order. My goal is to either restart the process exactly where it left off or roll back every single thing the package has done. I can't use checkpoints, as restarting at the task level would cause duplicate records. Below are some of the scenarios I am concerned about. Can someone advise what the best practice would be?

About the current package: incoming data arrives in a flat file, is loaded into staging, and then into 20 tables, where it is massaged. Every table has a Batch_ID, one per run. Data is kept in the 20 tables for historical reference for when the same process runs again. The process refers to the data already in the historical tables to generate transaction numbers and to negate transactions.

Scenario 1: data is loaded into the 1st and 2nd tables and halfway into the 3rd. How do I restart from the same spot, at the data level? For example: 100 transactions are loaded from the file, all 100 are loaded into the 1st and 2nd tables, 50 are loaded into the 3rd, and then the network goes out.

Scenario 2: the file is loaded halfway and the network goes out.

Scenario 3: all data is loaded successfully, but the failure occurs right before the batch is updated as successful.

I am concerned about possible failures because all the data has transaction numbers generated, and the downstream system does not accept anything out of sync; once transactions are out of sync, it would be a disaster for future runs.
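One common pattern for the scenarios above (a sketch, not a prescription; the table names below are placeholders, not the real 20 tables) is to make each run idempotent at the Batch_ID level: on restart, first delete everything the failed batch wrote, then reload the whole batch from staging. That avoids checkpoint-style task restarts and the duplicate rows they cause, because a partially loaded batch is wiped before any transaction numbers are regenerated. Illustrated in Python:

```python
# Sketch of Batch_ID-scoped recovery. Before rerunning a failed batch, delete
# every row that the failed run already wrote, so the reload starts clean and
# no duplicate transaction numbers are produced.
TARGET_TABLES = ["dbo.Table01", "dbo.Table02", "dbo.Table03"]  # ...the 20 tables

def cleanup_statements(batch_id: int) -> list[str]:
    """One DELETE per target table, scoped to the failed batch only."""
    return [
        f"DELETE FROM {table} WHERE Batch_ID = {int(batch_id)};"
        for table in TARGET_TABLES
    ]

for stmt in cleanup_statements(42):
    print(stmt)
```

These statements would run in a single Execute SQL Task (inside one transaction) at the start of the package, before the flat file is reloaded.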



Importing .DTA file


I have a .DTA file that I need to convert to Excel. I thought using SSIS, or importing the .dta into a database, would make it easier, but I am not having any luck. I have both SQL Server 2012 and 2016, and I tried the SQL Server 2016 Import and Export Wizard.

1. For my data source I selected "Flat File Source"; for my destination I chose Excel... see Error 1.

2. Instead of Excel, I connected to my local database as the destination. Error 2 shows the result.

Error 1. TITLE: SQL Server Import and Export Wizard
------------------------------

Column information for the source and the destination data could not be retrieved, or the data types of source columns were not mapped correctly to those available on the destination provider.

     - The data type could not be assigned to the column "s

Error 2: 

- Executing (Error)
Messages
Error: Preparation SQL Task 1: Unclosed quotation mark after the character string 's'. (SQL Server Import and Export Wizard)
 
Error 0xc002f210: Preparation SQL Task 1: Executing the query "CREATE TABLE [dbo].[test-data.dta] (..." failed with the following error: "Incorrect syntax near 's'.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
 (SQL Server Import and Export Wizard)
 

Strange behavior from Oracle OLE DB Source in a dataflow


I have a pretty simple data flow that uses an OLE DB Source with the Oracle OLE DB provider. In the OLE DB Source I can parse the query and it succeeds, and I can also preview the data. But when I run the package, it fails on that component with the following error:

Error: 0xC0202009 at Instance Properties, OLE DB Source [1412]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "OraOLEDB" Hresult: 0x80040E14 Description: "ORA-00903: invalid table name".

One thing to note: I am running on Windows 7 x64, and I know there are designer vs. runtime issues. But this seems to be different, since this is all from within the designer.

 


SSIS Packages failing to execute after the November Quality Preview Rollups


SQL Server 2012 (11.0.6248)

This started happening when a SQL Agent job executes an SSIS package, after applying the November preview rollups KB3197875 and KB3196684 (seen as KB3195387 in installed updates). Trying to remove them now to see if that resolves the issue. Happy Turkey Day!

The SSIS Execution Process could not write to the IS catalog: <SERVER>\<INSTANCE>:<SSISCatalogDB>   Error details: Connection open and login was successful, but then an error occurred while enabling MARS for this connection. (provider: Shared Memory Provider, error: 15 - Function not supported);   at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, SqlCredential credential, Object providerInfo, String newPassword, SecureString newSecurePassword, Boolean redirectedUserInstance, SqlConnectionString userConnectionOptions, SessionData reconnectSessionData, DbConnectionPool pool, String accessToken, Boolean applyTransientFaultHandling)
   at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnectionPool pool, DbConnection owningObject, DbConnectionOptions options, DbConnectionPoolKey poolKey, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
   at System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
   at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)
   at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)
   at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.ProviderBase.DbConnectionClosed.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.SqlClient.SqlConnection.TryOpenInner(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.Open()
   at Microsoft.SqlServer.IntegrationServices.Server.Shared.ExecutionSpecifier.CheckParameter(ServerOperationStatus status)
   at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ProjectOperator.PerformOperation()







SQL 2016 Tabular Model processing fails when running from Integration Services package


When I run an SSIS package to do a full process of a SQL Server 2016 Tabular model deployed on SQL Server 2016 with compatibility level set to 1200, it fails and gives me the following error:

"[Analysis Services Execute DDL Task] Error: This command cannot be executed on database 'TabularDatabaseName' because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database."

How do I work with the Tabular APIs (given that StorageEngineUsed is set to TabularMetadata) so that I can process the model correctly?

When I process this model manually within SSAS, it works 100% correctly.
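For what it's worth, a 1200-compatibility-level model is administered with TMSL (JSON) commands rather than legacy XMLA process scripts, which is what the StorageEngineUsed message is pointing at. A hedged sketch of the TMSL full-refresh command, built in Python here only for illustration (the resulting JSON is what would be submitted to the server, e.g. from the DDL task's script body):

```python
import json

def tmsl_full_refresh(database_name: str) -> str:
    """Build a TMSL 'refresh' command (type: full) for a tabular 1200 model."""
    command = {
        "refresh": {
            "type": "full",
            "objects": [{"database": database_name}],
        }
    }
    return json.dumps(command, indent=2)

# Database name taken from the error message above.
print(tmsl_full_refresh("TabularDatabaseName"))
```

The same JSON can also be generated by right-clicking the database in SSMS and scripting the Process operation.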

How to run a select statement and send the result in tabular format in a Send mail task?


Hi,

I have a log table with 10 columns. I would like to execute a SQL statement against the table to pull the top 25 records and send the result in an email in HTML format. What are the possible solutions? Your help is highly appreciated.
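One point worth noting: the stock Send Mail Task sends plain text, so HTML results are usually sent either with Database Mail (sp_send_dbmail with @body_format = 'HTML') or from a Script Task. Either way, the core step is rendering the rows as an HTML table. A minimal sketch of that step (Python for illustration; the column names and rows below are hypothetical, and the same string-building translates to C# or T-SQL FOR XML PATH):

```python
from html import escape

def rows_to_html_table(headers, rows):
    """Render query results (e.g. the top 25 log rows) as a simple HTML table."""
    head = "".join(f"<th>{escape(str(h))}</th>" for h in headers)
    body = "".join(
        "<tr>" + "".join(f"<td>{escape(str(col))}</td>" for col in row) + "</tr>"
        for row in rows
    )
    return f"<table border='1'><tr>{head}</tr>{body}</table>"

# Hypothetical column names and rows, just to show the shape of the output.
html_body = rows_to_html_table(
    ["LogId", "Message"],
    [(1, "The job is done"), (2, "Started <batch>")],
)
```

Escaping each cell matters: without it, any `<` or `&` in the log messages would corrupt the HTML body of the mail.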

Parent child call between different projects of the same solution


Hello!

In SSIS 2014 I have a solution comprising two projects, 'ParentProject' and 'ChildProject'.

ParentPackage.dtsx calls both child packages, ChildPackage1.dtsx and ChildPackage2.dtsx. I have used file system references, as the child packages are in another project.

ChildPackage1.dtsx has a call to ChildPackage2.dtsx via a project reference:

ChildPackage2.dtsx has a script task that shows a message box:

When ParentPackage calls ChildPackage2 it shows the message box and works fine, but when ParentPackage calls ChildPackage1 (which in turn calls ChildPackage2) it throws the error below:

Why does it throw this error when both child packages are in the same project? Please share your thoughts.

SSIS solution in source control repository


Hi, I am checking my Visual Studio 2013 SSIS project into an SVN source control repository from my local path, and when I try to open the solution from SVN I get a ".dtproj file not found" error. I tried deleting the .suo and other .user files, but still no luck. How would I put my SSIS solution in source control so that other developers can access it without issues? If I create a new solution in the SVN path and add all the packages, I would have to manually reconfigure the connection managers inside each task in the SSIS packages. I have used TFS in the past and never ran into any issues sharing a solution with other developers on the team.

Please let me know.

Thank you in advance.


SQLEnthusiast

Truncation Issue: Copying Data from Excel to Database via SSIS


Hi team,

I am facing a truncation issue while copying data from Excel to a database via SSIS.

The Excel cell displays (0.85), while the actual value present is -0.85111345747; when the data is staged to the DB, only (0.85) gets stored.

How can I stage the full value -0.85111345747 to the DB without any truncation issues?
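If the cell genuinely stores -0.85111345747, then "(0.85)" is just the cell's display format (negatives in parentheses, two decimals), and "(0.85)" landing in the database suggests the displayed text, rather than the stored value, is being read somewhere in the chain. A tiny sketch of that distinction (Python, for illustration only; this mimics an accounting-style number format, it is not part of any SSIS API):

```python
def accounting_display(value: float, decimals: int = 2) -> str:
    """Mimic an accounting-style cell format: negatives shown in parentheses."""
    text = f"{abs(round(value, decimals)):.{decimals}f}"
    return f"({text})" if value < 0 else text

stored = -0.85111345747             # what the cell actually contains
shown = accounting_display(stored)  # what the grid displays: "(0.85)"
```

The practical implication: check whether the Excel source column is arriving as text (DT_WSTR) instead of a float (DT_R8), since a text read captures only what is displayed.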

Please help

SourceInputColumnLineageID custom property type


Hello, I have a problem with package execution. When I try to run a Data Conversion task I get the following message and error:
"SourceInputColumnLineageID custom property must be of type VT_I4", and the Visual Studio error VS_ISCORRUPT.
It sounds like a problem with metadata.

The package should extract data from an Access file, convert it, and insert it into a table on a SQL Server 2012 Standard instance. I have the following installation setup:
SQL Server 2012 Standard
Visual Studio 2013 Professional with Business Intelligence for VS 2013 (which in turn also installs files for SQL Server 2014)

I tried uninstalling SQL Server and Visual Studio and cleaned the registry as well, but after installing everything again the problem persists. My thought is a lack of compatibility between Business Intelligence for VS 2013 and SQL Server 2012, but that would be weird.

Has anyone had a similar problem?


Working example/script task on consuming data from REST API


Hi all

I have googled my way through the internet searching for one working example of an SSIS script task that loads data into a table from a REST API. So far no luck; everyone seems to agree that it can be done using a script task, but it's apparently one of the best-kept secrets.

So if someone knows and has actually done it themselves, please give me an example of how to make the following work.

From the app company I got the following information:

"The partnerkey is used for authentication. The response returned from the service contains the SessionToken, use this token in subsequent calls to the service to authenticate yourself." 

So I have the partner key. With the partner key I can get a session key. The following info is provided by the app company:

Request (JSON)
POST /login/application
Content-Type: application/json
{"Name":"micros", "Key":"<partnerkey goes here>"}

And once that works, departments can be retrieved with this:

Request (JSON)
GET /departments/application/?securityToken=<token>
Content-Type: application/json

So how would this fit into an SSIS flow? First use the partner key to get a session key, then use that session key to get the department data.
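In the absence of a canonical sample, here is a hedged sketch of just the call sequence (Python for illustration; inside SSIS this would typically live in a C# Script Task, with the returned rows pushed to an output buffer or inserted via ADO.NET). The base URL is a placeholder, since the vendor's host isn't given above; "micros" and the payload shape come from the vendor's example:

```python
import json
from urllib import request

def build_login_request(base_url: str, name: str, partner_key: str) -> request.Request:
    """Step 1: POST /login/application with the partner key to obtain a token."""
    body = json.dumps({"Name": name, "Key": partner_key}).encode("utf-8")
    return request.Request(
        base_url + "/login/application",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def extract_token(login_response_body: str) -> str:
    """The login response contains the SessionToken used on subsequent calls."""
    return json.loads(login_response_body)["SessionToken"]

def departments_url(base_url: str, token: str) -> str:
    """Step 2: GET /departments/application, authenticated via the token."""
    return f"{base_url}/departments/application/?securityToken={token}"

# The actual network calls would look like:
#   with request.urlopen(build_login_request(BASE, "micros", KEY)) as resp:
#       token = extract_token(resp.read().decode("utf-8"))
#   ...then fetch departments_url(BASE, token) and load the JSON into the table.
```

The same two-step shape (authenticate, then call with the token) is what the C# Script Task version would implement with HttpWebRequest or HttpClient.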

IS instance does not work after fresh installation


Hi there,

I've just installed the database engine and Integration Services in this new VM, but it does not work when I try to log in:

(SQL Server 2014)

- The service is up

conditional split in ssis


Hello

With a Conditional Split in SSIS, if I have only one conditional output from a set of data, where does the data that doesn't meet the criteria go? Does it just get dumped, or is it held in memory, which will slow down the server at some point? Is it a good idea to do this, or should I add a default output for the data that doesn't meet the criteria?

I don't need the data that doesn't meet my criteria, so it's OK for it to just be discarded.

Kind regards

Danny
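For what it's worth on the mechanics: rows that match no condition go to the Conditional Split's implicit default output, and if nothing is attached to that output they are simply dropped from the pipeline at that point rather than accumulated in memory. A small sketch of that routing behaviour (Python, for illustration only):

```python
def conditional_split(rows, conditions):
    """Route each row to the first matching output, like an SSIS Conditional Split.

    conditions: list of (output_name, predicate) pairs. Rows matching no
    predicate go to the implicit default output; if nothing consumes that
    output, those rows simply leave the pipeline, they are not buffered.
    """
    outputs = {name: [] for name, _ in conditions}
    default = []  # the implicit "Conditional Split Default Output"
    for row in rows:
        for name, pred in conditions:
            if pred(row):
                outputs[name].append(row)
                break
        else:
            default.append(row)
    return outputs, default

outs, leftover = conditional_split(
    [1, 2, 3, 4, 5],
    [("evens", lambda r: r % 2 == 0)],
)
# outs["evens"] == [2, 4]; leftover == [1, 3, 5]
```

So leaving the default output unconnected is fine when the non-matching rows are genuinely unwanted.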

Anyone experience "retroactive" changes to DT_BYTES data in the data flow?


Good day,

I've filed a Connect bug on this issue, but wanted to know if anyone else has had any "unexplained" or odd behaviour when dealing with DT_BYTES columns, Conditional Splits, and/or Multicast operations.

The short story is that I have a flow with four DT_BYTES columns (the business scenario is these are hashes of multiple columns, used to detect differences in rows over time). This flow goes through a Multicast and then on to two Conditional Splits. One conditional split tests for differences in two of the hash columns, the other tests for differences in the other two. If there are differences, I set the "old" hash to the "new" value using a Derived Column (and other things).

The issue is that performing this Derived Column "replace value" operation causes the column to be BLANK in the preceding Conditional Split. That's right: a change to a column made further "down" the flow is affecting the operation of a component that precedes it. And the really odd thing is that if I remove the "sibling" Conditional Split (which is only related to the problem by a Multicast connection higher up the flow), the issue goes away.

Simple code attached to the case in Connect...


Todd McDermid's Blog

Discussion: How to best load data from Excel Sheet saved in Office 365 Sharepoint and into SQL DB


Hi all

I have a question that I would like some input on. I thought it would be straightforward, but it currently seems rather tricky to get data from an Excel sheet in a document folder in Office 365 SharePoint into a SQL database using SSIS.

The setup is as follows.

I have an Office 365 SharePoint setup with a set of subsites, one for each customer area.

So Customer1@company.com has access to https://company.sharepoint.com/customer1

On the site there is a document folder where the user can upload their timesheets, e.g. Timesheet_Jan2016.xls ... Timesheet_Nov2016.xls, stored in the document library called

https://company.sharepoint.com/customer1/DocumentsForUpload/Forms/AllItems.aspx

So the question now is: how can I, from the server running SSIS and SQL Server, create a package that is allowed to access the folder and extract the data? I know how to design the SSIS flow to cycle through the files and import them; it's more about actually getting to the folder in SharePoint. I have seen examples where, instead of a document folder, the user is directed to a OneDrive location; on the SSIS/SQL server the SQL Agent account is then granted access to the OneDrive folder, and after synchronizing it to the local OneDrive folder, it can be accessed through SSIS. This, however, seems very cumbersome, and the OneDrive sync only seems to be initiated once the agent account is logged on to the desktop.

Alternatively, I'm thinking of just booting up an FTP server, joining it to Azure AD, granting the users access to their respective folders on the FTP, and then redirecting the users to the FTP instead of using the document folder.

\Christian

 


