Channel: SQL Server Integration Services forum
Viewing all 24688 articles
Browse latest View live

Running both 32 bit and 64 bit of SQL Server Express on 64 bit Windows 7 Professional


Hi,

I am fairly new to SQL and not an IT professional. I'm trying to import data from MS Excel into an existing table in SQL Server using the Import/Export wizard, but I'm running into a bunch of issues. I think my first problem is that my bit versions are all over the place: my Windows 7 Professional is 64-bit, MS Excel 2007 is 32-bit and SQL Server Express is 64-bit.

I have also tried importing the data using the OPENROWSET command, but I got an error that said "The OLE DB provider "Microsoft.ACE.OLEDB.12.0" has not been registered." Looking in the Server Objects folder in SSMS, I see that Microsoft.ACE.OLEDB is not listed as a linked server provider. I tried to rectify the problem by downloading the Access Database Engine 2007, but that didn't do anything.
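For reference, the kind of OPENROWSET call being attempted looks roughly like this (the file path and sheet name are placeholders); it needs an ACE OLE DB provider whose bitness matches the SQL Server instance:

```sql
-- A 64-bit SQL Server instance needs the 64-bit Access Database Engine;
-- the 32-bit engine that ships with 32-bit Office is not visible to it.
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\MyWorkbook.xlsx;HDR=YES',  -- placeholder path
    'SELECT * FROM [Sheet1$]');                             -- placeholder sheet name
```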

I am not sure what my next step should be. Should I download the 32-bit version of SQL Server Express? Can I run it in parallel with the 64-bit version? Will it create a whole new set of problems, given that my Windows is 64-bit and I would now be running the 32-bit version of SQL Server?


How to set up a job to get data and put it on a shared drive?


I have one table that contains sales tax data, and I need to send the previous month's data to an accounting person at the end of every month. For example: today is July 1, 2015, so I need to send the data for June 2015.

Is there any way I can set up a job to send the data? Also, I am not able to send large files using Outlook, so is there any other tool you would suggest for sending the data?
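One option sometimes used for exactly this kind of monthly hand-off is Database Mail from an Agent job step. This is only a sketch: it assumes Database Mail is configured, and the profile name, recipient, table and column names are all placeholders:

```sql
-- Sketch: e-mail last month's rows as an attached CSV via Database Mail.
-- Profile, recipient, table and column names are placeholder assumptions.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'AccountingProfile',
    @recipients   = 'accounting@example.com',
    @subject      = 'Sales tax data for last month',
    @query        = N'SELECT * FROM dbo.SalesTax
                      WHERE TaxDate >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 1, 0)
                        AND TaxDate <  DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0)',
    @attach_query_result_as_file = 1,
    @query_attachment_filename   = 'sales_tax.csv',
    @query_result_separator      = ',';
```

If the attachment is too large to e-mail, the same job could instead write the file to a shared folder with a BCP or SSIS step.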

Error in SQL Server Agent : The step failed.


Hi,

I have some SQL Server Agent jobs that call SSIS packages. The jobs run every 10 minutes looking for an Excel file as a source. Quite often, a job fails with the error "The Step Failed".

Can someone please let me know how I can get at the details of the error?
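For context, the detailed message usually lives in the job step history rather than in the job outcome row. A query along these lines (against msdb) shows the per-step messages:

```sql
-- The full error text is in the step rows of the job history;
-- step_id = 0 is only the overall job outcome.
SELECT j.name, h.step_id, h.step_name, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE h.step_id > 0
ORDER BY h.instance_id DESC;
```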

Regards,
Mayank Jain

SSIS 2012 (11) Flat file import with footer hangs (never completes)


Hi all,

We have many SSIS packages built in Visual Studio 2008 and previously deployed to SQL 2008 (SSIS 10) and then migrated to SQL 2012 (SSIS 11).

Most packages work OK, but a couple hang and never complete. These import text files with a footer which has a different number of columns to the data rows. We have found that SSIS 10 allowed this, and the footer row was filtered out once in SSIS.

Every bit of documentation states that in SSIS 11 (SQL 2012) the flat file import has been improved, but as far as we can see this means that some of our imports no longer work. The data looks like this:

10;  234234;SMITH;2015-01-07;  3123123123;VE65          
10;  234234;SMITH;2015-01-07;  3123123123;VE65
10;  234234;SMITH;2015-01-07;  3123123123;VE65
99;000277;000000                                                                                                   

You can see the footer has a different number of columns. In SSIS 11 (SQL 2012) it starts to process the file and never completes; the SSIS package runs continuously, and I have to stop it.

I can find no other references to this on the internet. I really do not want to pre-process the file or use a custom script as a data source for the import, because there should be a way to handle this; surely it must be a bug if it hangs?
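One workaround that avoids external pre-processing (an assumption on my part, not something we have tried in this package) would be to read each line as a single wide column, drop the footer with a Conditional Split, and only then split the remaining rows into columns with a Derived Column. The split expression would look something like:

```
SUBSTRING([LineText], 1, 2) == "99"
```

Here [LineText] is a placeholder name for the single input column; rows matching the expression are the footer and can be routed to an unused output.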

Is there an update I need to apply to fix this?

Regards,

Gary.

Data extraction from Hyperion Financial Management using SSIS

I want to build a data import process with SSIS, sourcing Hyperion Financial Management. According to my knowledge, there was a Star Integration Server (from Star Analytics, acquired by IBM in Feb 2013) that did the extraction job and could be used in SSIS. As this product is no longer available, does anyone have another suggestion for how to do this?

srpatel0

Copy MySQL fields to MSSQL database with insert trigger


I have a MySQL and an MSSQL database. When the MySQL table is updated, I'm trying to update the MSSQL table with selected fields from the MySQL table. I'm able to query and update from the MSSQL side, but I want this to be done with an insert trigger on the MySQL side. Please help in any way you can.

INSERT INTO tablelast (ids, post_dates, post_modifieds, post_titles, post_contents, guids, meta_ids)
SELECT *
FROM OPENQUERY(connection,
    'SELECT id, post_date, post_modified, post_title, post_content, guid, meta_id
     FROM umkhanyakude.wp_posts
     INNER JOIN umkhanyakude.wp_postmeta ON wp_posts.id = wp_postmeta.meta_id');
-- WHERE id = new_@id
GO

Oracle connectivity


Hi ,

I have made a connection to an Oracle database from SSIS using the Oracle Source task. The connection has been established, but I am not able to get the data. It does not show any error; it just shows as running for a long time. I have tested with several tables. Do I need to change any setting on the Oracle side or in SSIS?

Please advise.

thanks


Raja

NuGet, SSIS, and the GAC


Ran into an issue today: it appears that when trying to write an SSIS Script component, you cannot easily use a NuGet package/package manager to reference APIs, not without registering them in the GAC first.

If you reference a NuGet package and then close the script and reopen it, it will lose the reference if the DLLs are not in the GAC. I guess this is OK if you are only referencing one or two DLLs, but it will cause issues when trying to run from another computer. The other problem in my case is that I am referencing an entire API library: QuickBooks. There are a dozen or two DLLs that I am trying to reference via NuGet, and it is going to be very cumbersome to put them all in the GAC.

This makes me want to rethink using SSIS at all for this solution, but since I am trying to build an ETL process between an internal accounting system and QuickBooks Online, SSIS seemed a good choice initially. My intent was to write a small REST call in the SSIS Script component that would act as a data source for the web API and then load a Chart of Accounts based on information from QuickBooks Online. But as soon as I close the script, I lose all my NuGet references.

It is kind of hard to believe that SSIS does not really "play friendly" with NuGet at this point. Does anyone know if there are any improvements planned in this area?



SAP BI connector 1.1 - Getting [SAP BW Source] Information: ArgumentOutOfRangeException is met: row 9 total length 750, declared lines per record 3, actual lines 3, column 31, offset 713, length 40


Hi,

I am connecting to a SAP BW server through SSIS 2014 (SAP BI connector 1.1). While connecting to one of the OHS sources in the BW server to pull the data, I get the information below, and the process runs infinitely in debugging mode.

[SAP BW Source] Information: ArgumentOutOfRangeException is met: row 9 total length 750, declared lines per record 3, actual lines 3, column 31, offset 713, length 40

[SAP BW Source] Information: Append blank line: row 0 total length 1000, declared lines per record 3, actual lines 4, column 31, offset 713, length 40

With the same SAP BW server, if I connect and pull the data using SSIS 2008 (SAP BI connector 1.0), it runs without any issues.

My production server has SQL 2014, so I need to know why this issue occurs with SAP BI connector 1.1 alone, and how to fix it.

Thanks & Regards,

Gopal



Derived Column failed because truncation occurred...?


[Derived Column[2]] Error: The "Derived Column" failed because truncation occurred, and the truncation row disposition on "Derived Column.Inputs[Derived Column Input].Columns[WeightUnitMeasureCode]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Check Nulls" (2) failed with error code 0xC020902A while processing input "Derived Column Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

Expression:

ISNULL(WeightUnitMeasureCode) ? "UNKNOWN" : WeightUnitMeasureCode

Data type and length:

Unicode string [DT_WSTR], length 3
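For what it's worth, the literal "UNKNOWN" is seven characters, so it can never fit in a DT_WSTR column of length 3, which would explain the truncation. A variant that fits (the shortened literal "UNK" is just an example) would be:

```
ISNULL(WeightUnitMeasureCode) ? "UNK" : WeightUnitMeasureCode
```

Alternatively, the output column length could be increased to at least 7.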

Please help me get rid of this error.



How to fix the error CPackage::LoadFromXML ?


Hi friends, 

I made a test package on my developer machine (using SQL 2008 R2 and BIDS 2008).

After deploying it to the job scheduler, it worked well.

But after I added an XML configuration to it, it throws the following error:

TITLE: Import Package
------------------------------

The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.


------------------------------
ADDITIONAL INFORMATION:

The package failed to load due to error 0xC0010014 "One or more error occurred. There should be more specific errors preceding this one that explains the details of the errors. This message is used as a return value from functions that encounter errors.". This occurs when CPackage::LoadFromXML fails.


------------------------------
BUTTONS:

OK
------------------------------

So does that mean I am receiving the error from the XML itself?

I created and deployed the package on the same machine.

Kindly help; I am new to deployment and package configuration.

Regular failures during integration and processing the cube.


Hi all

I am completely new to DW, with maybe 6 months' experience. I had a DW dumped on me, and it had been running fine until recently.

The integration from the data sources to the staging area and on to the DW is fine. The container that processes the cube has been somewhat problematic lately.

First the dimensions and measures are processed, with the exception of the sales details dimension and the sales fact measure. These are processed with the Process Add function to add only the sales from the sales fact table that have a flag of 1 in the [add this] column, i.e. yesterday's data.

We have been experiencing random failures lately, meaning it runs fine some days and fails on others, and it seems to be getting worse. I am not sure what is wrong or where to look.

These are the only errors I get.

Task Type(SSIS Component) – Data Flow Task

Error Description -Parser: An error occurred during pipeline processing.

Error Code – -1055719414

Task Type(SSIS Component) – Data Flow Task

Error Description -Errors in the OLAP storage engine: An error occurred while the 'Sales Key' attribute of the 'Sales Details' dimension from the 'Nandos' database was being processed.

Error Code – -1055719414

Task Type(SSIS Component) – Data Flow Task

Error Description -XML for Analysis parser: The XML for Analysis request timed out before it was completed.

Error Code – -1055719414

Task Type(SSIS Component) – Data Flow Task

Error Description -SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Dimension Processing" (2) failed with error code 0x80004005 while processing input "AnalysisServicesServerInput" (15). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

Task Type(SSIS Component) – Data Flow Task

Error Description -XML for Analysis parser: The commit or rollback request cannot be completed because there is no active transaction in this session.

Error Code – -1055719414

Task Type(SSIS Component) – Data Flow Task

Error Description -Dimension Processing failed the post-execute phase and returned error code 0x80004005.


Error Code – -1073450984

Could this be due to partition size?

How to do this?


DIM tables are lookup tables.

 

  • Every day at 11 AM, files for all the lookup tables will be uploaded into a folder, D:\Processing.
  • E.g.: for cabin class data, the file name will be DIM_CABIN_CLASS_<Date & time>.csv
  • For DIM Airports, the file name would be DIM_AIRPORTS_07022015071223.csv
  • The file name will change every day
  • All the lookup data will be loaded into a specific folder
  • If there is no file corresponding to a lookup, assume that there is no data to be loaded

 

  • If the file exists, then while loading check for the data in the main tables; if the row exists, update, or else insert
  • The job should be scheduled to run every day at 11 AM
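The "update if exists, else insert" step above can be sketched as a single MERGE statement; the table and column names (DIM_AIRPORTS, AirportCode, AirportName, a staging schema) are placeholder assumptions for one of the lookups:

```sql
-- Sketch: upsert one lookup table from its freshly loaded staging copy.
-- All object and column names here are placeholders.
MERGE dbo.DIM_AIRPORTS AS tgt
USING staging.DIM_AIRPORTS AS src
    ON tgt.AirportCode = src.AirportCode
WHEN MATCHED THEN
    UPDATE SET tgt.AirportName = src.AirportName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (AirportCode, AirportName)
    VALUES (src.AirportCode, src.AirportName);
```

In an SSIS package this would typically sit in an Execute SQL Task after a Foreach Loop that picks up whichever DIM_*.csv files exist that day.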

build version vs code version

Hi,

Some people I worked with do a build of the dtsx in BIDS, then take the dtsx from bin\release and use that copy to import/schedule in SQL Server. So there is a code version and a runtime version of the dtsx, with the code version maintained in TFS.

Others use just one version of the dtsx. They do not take the bin\release copy; they do a build only to make sure everything compiles with no errors. In this case there is only one version of the dtsx, used both to maintain the code and to execute.

Is there any difference between the built version of the dtsx and the code version? I need your input on which dtsx to use and why. Thanks

While generating CSV dt_dbtimestamp is ignoring millisecond part when it's zero


Hi,

I have one data flow task that reads data from a table and generates a CSV file. I have one timestamp column that holds a datetime value. At the destination, under a derived column, I have used DT_DBTIMESTAMP for that field.

When the table contains a value whose milliseconds are not zero, e.g. 2015-07-16 12:25:27.010, the CSV contains 2015-07-16 12:25:27.010000000, which is correct as per my requirement. But when the value in the table is 2015-07-16 12:25:27.000, the output in the CSV becomes 2015-07-16 12:25:27, which is an issue for me. I want the output to be 2015-07-16 12:25:27.000000000.
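For comparison, on the SQL side a cast to datetime2 always keeps its fractional digits, even when they are zero, so one way around the issue is to format the value in the source query rather than relying on DT_DBTIMESTAMP's default formatting (the column and table names here are placeholders):

```sql
-- datetime2(7) with style 121 always emits 7 fractional digits,
-- e.g. '2015-07-16 12:25:27.0000000' even for a zero fraction.
SELECT CONVERT(varchar(27), CAST(ts AS datetime2(7)), 121) AS ts_text
FROM dbo.MyTable;
```

If exactly nine fractional digits are required, two extra zeros could be appended to the converted string.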

Please help with this.

Thanks,

Biswanath


SSIS SAP BW Source component not working in SQL 2012


We have 7 packages developed and deployed in 2008R2.

All these packages extract data from SAP systems using SAP BW 1.1 as the source component and then load the data into a SQL Server 2008 R2 database. These packages were working fine until recently. The packages were developed using VS 2008.

Recently, the database was upgraded to 2012, and all these packages were migrated to VS 2010.

Out of these 7 packages, only one package is having issues while reading data from the SAP source component. All the other packages were successfully migrated and deployed, and they are executing as expected.

Note: we have downloaded and installed the SAP BI installable and placed librfc32.dll in the required folders.

Issue in detail: when we try to preview the data in the SAP BW source component, there seems to be an overlap if any of the records have null values in one of the columns (basically, the data column positions are scattered).

Ex: refer to the screenshot attached.

The same package, if built using VS 2008 on a machine with SQL 2008 R2, works.

If anyone has experienced this situation, could you please share the reason behind this behaviour of the SAP BW Source component? Also let me know whether this is an installation issue, or if there are any fixes available for this component.

Regards,

Siva

SSIS Catalog slower than package deployment


We are having a problem with our Data Warehouse load process taking well over an hour longer than expected. Taking a look at our logging, every package is running just a second or two slower, which translates into an increase of the load time from 4:30 to over 6 hours.

Background
SQL Server 2014 Enterprise
Tools are all version 12

The WH load is made up of 61 systems that are required to be loaded, consisting of 12 different system types. The most common system type has 31 systems that all execute the same packages. There is a control system that manages the executions.
This equates to 2237 package executions over the course of the night. We have systems all over the world and have to wait for each one to pass midnight before we load, so we can only start to load systems that are at GMT-5 very late (we are at GMT+1).

All packages were upgraded to version 12, and converted to project deployment at the same time.  We also reviewed each package and removed some old code we no longer need, so there is not much we can do to make them quicker.

Taking a look through our historical logs, each package is running slightly slower. Some of these packages used to run for a total of only 1-2 minutes over 31 executes, but the same executes now take 2-3 minutes.
I chose one package that loads a fact. The package-deployment execution took 191 seconds for 32 executes; project deployment took 377 seconds for 31 executes. I checked catalog.executions, and that shows a total of 364 seconds for the 31 executes. I am using a custom SP to execute packages, so I do expect a small additional overhead there.
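For anyone wanting to reproduce the comparison, the per-execution durations can be pulled straight from the catalog; the package name below is a placeholder:

```sql
-- Per-execution duration, in seconds, for one package in the SSIS catalog.
SELECT e.execution_id,
       e.package_name,
       DATEDIFF(SECOND, e.start_time, e.end_time) AS duration_s
FROM SSISDB.catalog.executions AS e
WHERE e.package_name = 'LoadFact.dtsx'   -- placeholder package name
ORDER BY e.start_time DESC;
```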

We have turned off all logging for the executes to see whether that was causing the problem, but there was no change.

Has anyone seen anything similar, or can anyone point me in the direction of where I can find the cause of this issue?

Thanks

Error installing SSDTBI_VS2012_x86_ENU.msi


Hello,

Please can you help me solve the error below, which occurs when installing Microsoft SQL Server Data Tools - Business Intelligence for Visual Studio 2012?

Below is the exact error:

---------------------------
Microsoft SQL Server Data Tools - Business Intelligence for Visual Studio 2012
---------------------------
An error was encountered.

Unspecified error
---------------------------
OK   
---------------------------

Software I have installed:

1. OS : Windows 8 Pro 64 Bit

2. Microsoft SQL Server 2012 (SP1) - 11.0.3128.0 (X64)
    Dec 28 2012 20:23:12
    Copyright (c) Microsoft Corporation
    Enterprise Edition (64-bit) on Windows NT 6.2 <X64> (Build 9200: )

3. MS Visual Studio 2012 v11.0.50727.1 RTMREL

Note: the same setup works on Windows 7, meaning the msi is NOT corrupt.

Thanks,

Shirish

Managing an SSIS pkg when the data structure of the source changes - SSIS 2012


Hi,

I have an SSIS 2012 pkg with several data flow tasks that get data from an Oracle database and then write to a SQL Server database. The related job runs each night.

It happens 1-2 times a month that the data structure of the Oracle source changes (e.g. a new column is added, an existing column is deleted, or the length of an existing column increases). In such a scenario I'd like as much as possible to avoid the import job going into error and accomplishing nothing during the night. Moreover, the pkg has to get all Oracle records, not only the rows that avoid errors caused by structure changes in the Oracle source.

Any suggestions for me, please? Thanks

filter out non-numeric data


Hello,

I have a package that I am building right now, and I need to filter out rows where my employeeid field is not an integer. How would I proceed with this? I currently have a conditional split filtering out employee ids that contain a dash.
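One way to catch every non-integer value, not just those with a dash, is to do the check in the source query instead of the conditional split (the table and column names are placeholders; TRY_CAST needs SQL Server 2012 or later):

```sql
-- Rows whose employeeid is not a valid integer: TRY_CAST returns NULL
-- instead of raising a conversion error. Names are placeholders.
SELECT *
FROM dbo.Employees
WHERE TRY_CAST(employeeid AS int) IS NULL;
```

Inverting the predicate (IS NOT NULL) gives the clean rows to load, so the bad rows can be routed to an error table in a second query or data flow path.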
