Channel: SQL Server Integration Services forum

SSIS 2012: Cannot open the datafile : All permissions set, shared folder


Hi, 

Server 1: SSIS 2012

Server 2: Shared Folder (Everyone Full Control)

Client 1: A Desktop with Management Studio

A simple package deployed in the SSIS Catalog with a Dataflow.

The Source is a simple flat file which is stored at \\Server 2\Test folder.

The Destination is a SQL table (it doesn't matter which this time).

If we run the package from Client 1 (the desktop) using SSMS, via Server 1 -> Integration Services Catalogs, the package fails, logging "Access Denied" and "Cannot open the datafile".

If we run the same package using SSMS on Server 1 itself, the package runs without any problem.

We don´t use SQL Agent to run packages.

From the Desktop, the logged user can open and write in the shared folder. Everyone is Full Control.

Could it be server delegation? What could be happening?

Thanks in advance,

Alex Berenguer


C# Script to copy an Excel File working locally but not on the server when I schedule a job


So I have a couple of C# commands to copy an Excel file. They work fine and dandy on my local client when I test and run this SSIS package, but when I tried scheduling the job on the server, it failed.

These are the two C# commands...

            //  This Opens the Source .xlsx File from Emdeon ePaySmart
            Workbook workbook = excelApplication.Workbooks.Open(StringSourceFile, XlUpdateLinks.xlUpdateLinksNever, true, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing);

            //  This will Save the Source .xlsx Emdeon ePaySmart File as a .xls File...note XlFileFormat.xlExcel8
            workbook.SaveAs(StringDestinationFile, XlFileFormat.xlExcel8, Type.Missing, Type.Missing, Type.Missing, Type.Missing, XlSaveAsAccessMode.xlExclusive, Type.Missing, Type.Missing, Type.Missing, Type.Missing, Type.Missing);
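For reference, here is a fuller sketch of the same Interop pattern with explicit COM cleanup. This is illustration only (the wrapping method, variable handling and cleanup style are my assumptions, not the poster's code). One thing worth noting: Excel Interop requires Excel to be installed on whatever machine actually executes the package, so an unattended SQL Server Agent job on a server without Office will typically fail at Workbooks.Open regardless of the file format chosen.

using System;
using System.Runtime.InteropServices;
using Microsoft.Office.Interop.Excel;

// Sketch only - e.g. a helper method on the Script Task's ScriptMain class.
public static void ConvertXlsxToXls(string stringSourceFile, string stringDestinationFile)
{
    // Starting Excel only works if Excel itself is installed on the executing machine.
    Application excelApplication = new Application();
    Workbook workbook = null;
    try
    {
        // Open the source .xlsx read-only, without updating links.
        workbook = excelApplication.Workbooks.Open(
            stringSourceFile, XlUpdateLinks.xlUpdateLinksNever, true);

        // Save it as an Excel 97-2003 .xls file (XlFileFormat.xlExcel8).
        workbook.SaveAs(stringDestinationFile, XlFileFormat.xlExcel8,
            Type.Missing, Type.Missing, Type.Missing, Type.Missing,
            XlSaveAsAccessMode.xlExclusive);
    }
    finally
    {
        // Close and release the COM objects so no orphaned EXCEL.EXE is left behind.
        if (workbook != null)
        {
            workbook.Close(false);
            Marshal.ReleaseComObject(workbook);
        }
        excelApplication.Quit();
        Marshal.ReleaseComObject(excelApplication);
    }
}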

Is this because I'm using XlFileFormat.xlExcel8? Locally it runs fine, but when I schedule it via SQL Server Management Studio as a SQL Server Agent job, it seems to be failing.

I sure hope I don't have to change this so it will run as a SQL Server Agent Job and/or run this locally every blessed week.

Can someone clear this up for me and provide a potential solution?

Thanks for your review and am hopeful for a reply.

ITBobbyP85

Using reflection only once on an assembly that will be invoked repeatedly by sub-packages


Hi. We run SQL Server 2012 Enterprise and unfortunately need to use reflection (long story) to call various methods from our C# scripts. I've shown a sample of that pattern below.

Someone asked me today whether the results of reflection can be saved in such a way that sub-packages can avoid using it over and over to reload the same assembly, properties, etc., and instead reuse what was "saved" by doing it once, perhaps in the master package.

System.Reflection.Assembly asmX = System.Reflection.Assembly.LoadFrom(Dts.Variables["asm Path"].Value.ToString());
Type y = asmX.GetType("a.b.c.d");
Type z = asmX.GetType("e");
var methodinfo = z.GetMethod("f", new Type[] { y, typeof(string) });
object anobj = Activator.CreateInstance(y);
var connectionString = Dts.Variables["connstringvar"].Value.ToString();
y.GetProperty("some property").SetValue(anobj, "some string", null);
methodinfo.Invoke(null, new object[] { anobj, connectionString });
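As an illustration of the "do it once" idea, below is a minimal sketch that caches the reflected members in static fields so the lookup only runs the first time the helper is hit in a given AppDomain. The class, type names and member names are placeholders mirroring the snippet above, not real ones. The caveat: separate script tasks compile into separate assemblies, so whether anything is actually shared between the master and sub-packages depends on them running in the same process/AppDomain; at minimum, Assembly.LoadFrom returns the already-loaded copy rather than reloading the file in that case.

using System;
using System.Reflection;

// Sketch only: resolve the reflection members once per AppDomain instead of once per call.
// Not thread-safe; add locking if tasks run in parallel.
internal static class ReflectionCache
{
    private static Assembly _asm;
    private static Type _argType;
    private static MethodInfo _method;

    public static void Invoke(string asmPath, string connectionString)
    {
        if (_method == null)
        {
            // LoadFrom returns the already-loaded assembly if this path
            // was loaded before in the current AppDomain.
            _asm = Assembly.LoadFrom(asmPath);
            _argType = _asm.GetType("a.b.c.d");
            Type target = _asm.GetType("e");
            _method = target.GetMethod("f", new Type[] { _argType, typeof(string) });
        }

        object anObj = Activator.CreateInstance(_argType);
        _argType.GetProperty("some property").SetValue(anObj, "some string", null);
        _method.Invoke(null, new object[] { anObj, connectionString });
    }
}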

 

Dynamic Destination Server & DB


Hi All,

I am migrating a DTS package to SSIS 2008 R2, and I have a situation where I have to iterate through a table and load data from a single source table into multiple server-database combinations.
The Server-DB combination is picked up from a table.
The table looks something like this:


Name of Table: TABLE_A
____________________________

ID      Server-DB Combination
_______________________________

1       SERVER1.DB1
2       SERVER2.DB2
3       SERVER3.DB3
4       SERVER4.DB4
5       SERVER5.DB5
6       SERVER6.DB6
7       SERVER7.DB7
8       SERVER8.DB8
9       SERVER9.DB9


Here is my insert statement in the SQL 2000 database:

set @dbname = (SELECT [Server-DB Combination] FROM dbo.TABLE_A)
set @sql = 
'INSERT INTO '+@dbname+'.dbo.DESTINATION_TABLE ' 
+'Select *  '
       +'from SOURCE_SERVER.SOURCE_DB.DBO.SOURCETABLE '
+'where COLUMN1 > GETDATE() -15 '
+'and COLUMN2 = ''INR'' '
+'and COLUMN3+COLUMN4+cast(COLUMN1 as varchar(10)) NOT IN 
                     (SELECT COLUMN3+COLUMN4+cast(COLUMN1 as varchar(10)) FROM '+@dbname+'.DBO.SOURCETABLE)'
 EXEC(@sql)

Note: The table dbo.DESTINATION_TABLE is available in all the Server-DB combinations and the METADATA of source and destination are exactly the same. 

I am assuming a For Loop container configured to iterate a Data Flow Task that has an OLE DB Source in SQL-command mode, using the SELECT from the query above.
But my concern is the destination, because it changes dynamically based on the Server-DB combinations, which also makes mapping source to destination more complex.
Any suggestions please?

Thank you.

Multiple runs of same stored procedure


I am an SSIS newbie so please excuse how little I know.

We have a manual job that I would like to automate in SSIS.

We run the same stored procedure six times with different parameters. Each run creates a CSV file that we load into Excel and then mail the attachments to the user. Can SSIS run the SP and, in a Send Mail Task, send the six outputs?
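SSIS can run a stored procedure and the Send Mail Task does support file attachments. For what it's worth, here is a minimal C# sketch of the per-run step described above (run the procedure once with a parameter and write its result set to a CSV); the procedure name, parameter name and paths are placeholders, not details from the post.

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

// Sketch only: execute one stored procedure call and write the result set to a CSV file.
public static void RunProcedureToCsv(string connectionString, string parameterValue, string csvPath)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.usp_MyReport", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@Region", parameterValue);

        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        using (var writer = new StreamWriter(csvPath))
        {
            // Header row from the column names.
            var names = new string[reader.FieldCount];
            for (int i = 0; i < reader.FieldCount; i++)
            {
                names[i] = reader.GetName(i);
            }
            writer.WriteLine(string.Join(",", names));

            // One line per data row (no quoting/escaping in this sketch).
            while (reader.Read())
            {
                var values = new object[reader.FieldCount];
                reader.GetValues(values);
                writer.WriteLine(string.Join(",", values));
            }
        }
    }
}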

How to strip out first row as a field and fill down?


I've been importing all kinds of data over the past 2 days, and I've got one more data set to get through: 200+ files with a date in the very first row. I want to, somehow, use that date as the very first field (filling it down through the rest of the data set) and import the remaining fields normally. So, the date is in row 1; I want to strip it out, make it field #1 called 'thedate' (or whatever), and then import field #2, field #3, etc. I'm pretty sure I need some kind of script to do this, but I'm hoping there is a tool built into SSIS that can help me get through this challenge. I'm using SSIS 2013, flat files and SQL Server Developer 2014.
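For illustration, here is the kind of preprocessing script the post describes, as a minimal C# sketch: read the date off row 1, then rewrite the file with that date prepended to every remaining row so the flat file connection can import it as an ordinary first column. The delimiter, paths, and the assumption that row 1 contains only the date are mine, not taken from the post.

using System;
using System.IO;

// Sketch only: prepend the date found on row 1 to every remaining row of a
// comma-delimited file, writing the result to a new file for normal import.
public static void PrependHeaderDate(string inputPath, string outputPath)
{
    string[] lines = File.ReadAllLines(inputPath);
    if (lines.Length < 2)
    {
        return; // nothing to fill down
    }

    // Row 1 holds the date; keep it as text (or DateTime.Parse it if validation is wanted).
    string theDate = lines[0].Trim();

    using (var writer = new StreamWriter(outputPath))
    {
        // Rows 2..n are the real data; emit them with the date as the new first field.
        for (int i = 1; i < lines.Length; i++)
        {
            writer.WriteLine(theDate + "," + lines[i]);
        }
    }
}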

I’d greatly appreciate some help with this.

Thanks everyone.


Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.


SSIS SQL Server 2012 - The requested OLE DB Provider Microsoft.Jet.OLEDB.4.0 is not registered.


The SSIS package is part of a project deployed to a server with the SQL Server 2012 SSISDB. The project uses environments and is set to Run64BitRuntime = False. This particular package uses a Data Flow with an Excel Source (Microsoft Excel 97-2003 Worksheet) and loads it into a SQL Server database table through an OLE DB Destination.

The package runs fine locally on Windows 7 in MS Visual Studio Ultimate 2012 v 11.0.61030.00 Update 4. But if I run the package from the command line on the server, it fails and returns the following error:

MyPackage:Error: The requested OLE Provider Microsoft.Jet.OLEDB.4.0 is not registered. If the 64-bit driver is not installed, run the package in 32-bit mode. Error code: 0x00000000. An OLE DB record is available. Source "Microsoft OLE DB Service Components" Hresult: 0x80040154 Description: "Class not registered"

The SSISDB log returns the following message:

SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "SQLImport Step_XLS" failed with error code 0xC0209303.  There may be error messages posted before this with more information on why the AcquireConnection method call failed.

All other packages within this project run and complete successfully; this is the only one that uses an Excel spreadsheet as input. If anyone has experienced this before or has any ideas on what to do next, I would appreciate it. Thanks!

Flat File connection to Linux box

How can I access a flat file on a Linux box using the Flat File Connection Manager?

FTP only the unprocessed/new files

Basically I FTP flat files to a local disk and load the data. My question is very simple: what is the best way to transfer only the new/unprocessed files from the remote server to the local disk? Somebody suggested a Script Task, but I wasn't sure. Please help. Thanks in advance.
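A Script Task is one reasonable way to do it. For illustration only, a minimal sketch of the "only new files" check; the folder path and the idea that processed files (or at least their names) are kept locally are assumptions, not details from the post:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Sketch only: given the remote file names (e.g. from an FTP directory listing),
// keep only those whose names have not already been processed locally.
public static List<string> FilterNewFiles(IEnumerable<string> remoteFileNames, string processedFolder)
{
    var alreadyProcessed = new HashSet<string>(
        Directory.GetFiles(processedFolder).Select(Path.GetFileName),
        StringComparer.OrdinalIgnoreCase);

    return remoteFileNames
        .Where(name => !alreadyProcessed.Contains(Path.GetFileName(name)))
        .ToList();
}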

svk

Weird Error Running SSIS


I'm trying to load data from a flat file. I thought everything was set up correctly, but now I'm getting these errors.

SSIS package "c:\users\rshuell\documents\visual studio 2013\Projects\Integration Services Project6\Integration Services Project6\Package.dtsx" starting.
Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
Warning: 0x80047076 at Data Flow Task, SSIS.Pipeline: The output column "Column 32" (114) on output "Flat File Source Output" (17) and component "Flat File Source" (13) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Flat File Destination [2]: The processing of file "C:\Users\rshuell\Desktop\errors.txt" has started.
Information: 0x402090DC at Data Flow Task, Flat File Source [13]: The processing of file "C:\Users\rshuell\Desktop\archive\GEM3L0909.txt" has started.
Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
Error: 0xC0202009 at Data Flow Task, OLE DB Destination [154]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020901C at Data Flow Task, OLE DB Destination [154]: There was an error with OLE DB Destination.Inputs[OLE DB Destination Input].Columns["INDNAME"] on OLE DB Destination.Inputs[OLE DB Destination Input]. The column status returned was: "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Data Flow Task, OLE DB Destination [154]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "OLE DB Destination.Inputs[OLE DB Destination Input]" failed because error code 0xC0209077 occurred, and the error row disposition on "OLE DB Destination.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (154) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (167). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at Data Flow Task, Flat File Source [13]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on Flat File Source returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Flat File Destination [2]: The processing of file "C:\Users\rshuell\Desktop\errors.txt" has ended.
Information: 0x402090DD at Data Flow Task, Flat File Source [13]: The processing of file "C:\Users\rshuell\Desktop\archive\GEM3L0909.txt" has ended.
Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "Flat File Destination" wrote 0 rows.
Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: Data Flow Task
Warning: 0x80019002 at Package: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (6) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "c:\users\rshuell\documents\visual studio 2013\Projects\Integration Services Project6\Integration Services Project6\Package.dtsx" finished: Failure.
The program '[1884] DtsDebugHost.exe: DTS' has exited with code 0 (0x0).

The weird thing is, even when I redirect the errors to a flat file on the desktop, I still get errors!  I thought the error rows were supposed to be written to the flat file, or some other destination if you choose, to facilitate debugging.

Does anyone here have any idea what is wrong?


Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

File system task - Move file issue, skip the current log file that's open by another process - IIS logs

Hi,

I have created a package to move IIS log files from their source directory to another directory called "processing". However, when the package tries to move the current log file, it errors because the file is in use by another process (IIS, as it's still writing to it). I would like the package to just ignore the in-use file, process the rest, and show the package completing successfully. The file in use will be rolled by IIS at midnight and picked up by the next package run.
However, when that file is rolled and becomes free, a new log file is created which will in turn be in use... and should be ignored at the next run.

Is there some way I can tell the File System Task to ignore in-use files, or add an event handler to do the same sort of thing?

Any assistance is appreciated
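For illustration, here is a minimal Script Task-style sketch of the usual workaround (folder paths and the *.log pattern are assumptions): attempt the move file by file and simply skip whichever file is still locked by IIS instead of failing the package.

using System;
using System.IO;

// Sketch only: move every log file that can be moved; a file still held open
// by IIS raises an IOException and is skipped until the next run.
public static void MoveFreeLogFiles(string sourceFolder, string processingFolder)
{
    foreach (string path in Directory.GetFiles(sourceFolder, "*.log"))
    {
        string target = Path.Combine(processingFolder, Path.GetFileName(path));
        try
        {
            File.Move(path, target);
        }
        catch (IOException)
        {
            // File is in use (the current IIS log) - ignore it this run.
        }
    }
}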

How to read data from VSAM files of IBM mainframe with SSIS and store it as relational data in SQL Server?


Hi Everyone, 

I am working on a task to extract data from a bunch of VSAM files on an IBM mainframe and store it in SQL Server as relational data. Many of the discussions I found talk about creating an ADO.NET connection with the ADO.NET Data Provider for Host Files and then using a DataReader source to access the data, but I am not able to find that provider (ADO.NET Data Provider for Host Files) in the drop-down list when creating the ADO.NET connection.

I also can't find the provider in the SQL Server 2012 SP1 Feature Pack. Where can I download the provider without installing HIS 2013?


Johnny Wang

Location of SSIS Packages


Hello all,

I have started working for a company and they have some SSIS packages in place. I need to find them, and I don't know where to look. I don't want to ask others all the time.

I tried to find Microsoft Visual Studio, but they don't have that application on their server.

Where should I look for the SSIS packages without asking others?

Thanks,

GGGGGNNNNN


Using multiple OLE DB Source data flow components vs. having one OLE DB Source data flow component

I need to pull multiple segments of data from one data source.

I have a Data Flow Task with an OLE DB Source whose data access mode is SQL command. I have included all the required segments of data to be pulled from the source in a WHERE clause, like below:

Select column1,column2,........from Table1 a JOIN Table2 b on a.column1 = b.column1 where b.column3 IN ('a','b','c')

I would like to get SSIS experts' views on the performance/best-practice implications of having multiple OLE DB sources with the SQL commands split like below:

source1:- Select column1,column2,........from Table1 a JOIN Table2 b on a.column1 = b.column1 where b.column3 IN ('a')

source2:- Select column1,column2,........from Table1 a JOIN Table2 b on a.column1 = b.column1 where b.column3 IN ('b')

source3:- Select column1,column2,........from Table1 a JOIN Table2 b on a.column1 = b.column1 where b.column3 IN ('c')

to pull data from source and write to a SQL Server table

Will the OLE DB sources execute in parallel, reducing the time to pull data from the source system?

Escaping the ODBC Escape Sequence


For the love of god and all that is holy, how on earth do you ESCAPE the ODBC Escape characters {} (curly braces) within an Execute SQL Task in a DTS package ??!??!

I'm using the PostgreSQL ANSI ODBC driver 9.03.02.10, connecting to Redshift to perform an UNLOAD. The caveat is that my query has a regex in it, wrapped in the UNLOAD. I've created the example below to show what is happening.

UNLOAD statements in Redshift have to be wrapped in single quotes ( ' ), of course escaping any quotes that need to go along for the ride. When it gets to the { it thinks I want to do an ODBC escape, something like this: http://msdn.microsoft.com/en-us/library/ms711838(v=vs.85).aspx, but I don't want it to. I've tried setting { escape '#' }; I've tried all the combinations of escapes I can. Is there a way to escape this, or to disable ODBC escape sequences in the package, or anything of that nature?

This query works as it should (within the Execute SQL Task) when it is not wrapped in the single quotes. For example, running the script without it being wrapped in the UNLOAD(' ') command executes successfully.

QUERY:

UNLOAD('

SELECT * FROM testing_regex WHERE 1=1 AND somestring ~* \'[a-z]{7}$\'

') TO 's3://path'

CREDENTIALS 'aws_access_key_id=xxx;aws_secret_access_key=xxx'  DELIMITER '|' ALLOWOVERWRITE ADDQUOTES


PARALLEL OFF;

ERROR:

[Execute SQL Task] Error: Executing the query "UNLOAD(' SELECT * FROM testing_regex WHERE 1=1 AND..." failed with the following error: "ODBC escape convert error". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.


SSIS 2014 Flat File Source - Redirect Row - Crashes


Hi, I have an SSIS package written in Visual Studio 2013, to be run on SSIS 2014. I am attempting to parse a flat text file. If any of the columns are truncated or cannot be cast to the corresponding data type, I want to redirect those rows to a Script Component and then dump the data into a SQL Server table.

When I debug this package, the redirect-rows step hangs. I see the yellow spinning icon and nothing happens. Down in the status bar it says "Package execution completed", but it didn't complete. There aren't any errors in the Progress tab.

Do you know why the "Redirect Row" is freezing? I have placed breakpoints in the "Set Error Description" script component and they are not being hit. So, it looks like the "Redirect Row" has crashed/raised an exception.

When I load this package into SSIS 2014 and run it from the server, it crashes.

Thanks for your help.

Call Function from SSIS package


Hello. I have a field pulled from an Access database that I need to manipulate before processing: I need to strip out illegal characters. What's the best way to use a function to process the data in the table? I was hoping to do this within the package if possible.

From within Access, a query passed values to a function defined within a code module and returned the new value.  The SSIS package needs to be able to do this as well.
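For illustration, this usually ends up as a Script Component (transformation) in the data flow, playing the role the Access code module used to play. Below is a minimal sketch; the column name and the definition of "illegal characters" are assumptions, not taken from the post.

using System.Text.RegularExpressions;

// Sketch only - this override lives inside the Script Component's generated class,
// assuming an input column named SourceText marked ReadWrite on the Input Columns page.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    if (!Row.SourceText_IsNull)
    {
        // Keep letters, digits, spaces and basic punctuation; drop everything else.
        Row.SourceText = Regex.Replace(Row.SourceText, @"[^A-Za-z0-9 .,\-]", string.Empty);
    }
}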





Pass a variable value to SQL Command in OLEDB Source


Hi,

I have an OLE DB Source with SQL Command as the data access mode. Below is the sample query I have in it.

DECLARE @MonthOffSet int = 24
DECLARE @PaidDate_SK_Low datetime = dateadd(mm, MONTH(getdate()) - @MonthOffSet - 1, dateadd(year, datediff(year, 0, dateadd(YY, 0, getdate())), 0))
DECLARE @PaidDate_SK_High datetime = dateadd(dd, -1, dateadd(MM, @MonthOffSet, @PaidDate_SK_Low))

followed by a SELECT statement that has a WHERE clause.

Instead of hard-coding the value 24, I am trying to get it from a variable. I know there is a limitation that parameters can only be added in the WHERE clause. Is there any workaround or solution?

EVENT HANDLER FOR BASIC SSIS PACKAGE


As a developer, I want to have a template for SSIS that has event handlers driven by parameters rather than the expression builder.

Acceptance Criteria:

-  Event handlers are dynamic using parameters 

Please share your ideas and assistance.

Thanks

SSIS package runs without any issue, but when I configure it through a job it throws an error


On the production environment (BNEPRDSQL102), the SSIS package runs without any issue, but when I configure it through a job, it throws the error below.

On the UAT environment (BNEDEVSQL201), the job is already running successfully and got UAT sign-off from the client.

On the dev environment (on my 9R VM, VPC_o876031VC9R), the job runs successfully.

PRODUCTION:

Source: 1. DB2 2. Oracle(FMMS)

Destination: SQL Server  2005(Fleet Availability)

Dev Environment VM (VP_o876031VS9R)

Source: 1. DB2, 2. Oracle

Destination: Sql Server 2005

On Prod, when I try to run the job configured with the same SSIS package (which runs successfully on Dev), it fails with the exception below:

Message
Executed as user: INTERNAL\_SQLService1. ...0.5000.00 for 64-bit  Copyright (C) Microsoft Corp 1984-2005. All rights reserved.    Started:  5:18:09 PM  Info: 2014-10-15 17:18:15.72     Code: 0x4004300A     Source: Get Location Data DTS.Pipeline     Description: Validation phase is beginning.  End Info  Progress: 2014-10-15 17:18:15.72     Source: Get Location Data      Validating: 0% complete  End Progress  Error: 2014-10-15 17:18:17.30     Code: 0xC0047062     Source: Get Location Data DataReader Source [1]     Description: System.Data.Odbc.OdbcException: ERROR [IM014] [Microsoft][ODBC Driver Manager] The specified DSN contains an architecture mismatch between the Driver and Application     at System.Data.Odbc.OdbcConnection.HandleError(OdbcHandle hrHandle, RetCode retcode)     at System.Data.Odbc.OdbcConnectionHandle..ctor(OdbcConnection connection, OdbcConnectionString constr, OdbcEnvironmentHandle environmentHandle)     at System.Data.Odbc.OdbcConnectionOpen..c...  Process Exit Code 1.  The step failed.

The package is called as follows:

C:\Progra~1\Microsoft SQL Server\90\\DTS\Binn\dtexec /sql  FleetAvailabilityDataImport /server localhost /decrypt password /MAXCONCURRENT " -1 " /CHECKPOINTING OFF  /REPORTING V
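One thing worth checking (my assumption, not something stated in the post): the error "The specified DSN contains an architecture mismatch between the Driver and Application" usually means the ODBC DSN and the calling process are different bitness, and the path above is the 64-bit dtexec on an x64 server. If the DSN was created as a 32-bit DSN, running the same command through the 32-bit dtexec (assuming the default install location) would look like:

C:\Program Files (x86)\Microsoft SQL Server\90\DTS\Binn\dtexec /sql  FleetAvailabilityDataImport /server localhost /decrypt password /MAXCONCURRENT " -1 " /CHECKPOINTING OFF  /REPORTING V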

I have already verified below

Source and destination drivers should be the same architecture, i.e., both 32-bit or both 64-bit. This particularly needs to be taken care of when the package is built on a local system. All our local systems are 32-bit and the servers are 64-bit.

Please Advise

Regards

Rajesh

email id - itsrajesh2002@hotmail.com


