Channel: SQL Server Integration Services forum

Maintenance plan for backup executes endlessly.


Hi All,

I have a maintenance plan for a full backup which is scheduled to run once every night. It has been running quite peacefully without any error and usually takes about 50 minutes on average. But now (for the last three days) it starts executing but never completes. I ran sp_who2 and saw a block when the execution reaches a specific database. I stopped the execution, killed the SPID shown in the 'BlkBy' column and then restarted the execution, but the same problem shows up again. What's going on? Any ideas?

Thanks.
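(For reference, a quick way to see which session is blocking the backup while it is stuck - sp_who2 shows similar information, this just narrows it down - is the DMV query below.)

SELECT session_id, blocking_session_id, command, wait_type, wait_time, database_id
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;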


Export Column Transformation - No selections in Extract Column


First time trying to export a column, and it looked straightforward... yet when I go to select the Extract Column there is nothing in the drop-down list to choose from (there are fields listed under the File Path Column). I don't get it; what am I missing? I even went so far as to Select Into a new table rather than use a text query. Thanks.


Loading Data issue


I am loading data from a text file into a SQL Server table.

Now say that after loading 400 rows it stops for some reason (a file format issue). If I run the package again, I don't want to load the first 400 rows from the beginning; it should start from row 401. How can I do this?

Can I implement this logic with a checkpoint?



SSIS 2012 database connection - Work Offline


Hi,
For some reason, the database connection shows Work Offline each time I open the SSIS project, even after I uncheck Work Offline for that connection.
Do you know why the database connection in the Connection Managers section keeps showing the red icon and the Work Offline tick next to it?
Thanks

How can I install SQL Server 2008 Integration Services without a CD?


How can I install SQL Server 2008 R2 Integration Services without a CD?

The PC, running 32-bit Windows 7, is far away and has poor bandwidth, so I want to install SQL Server 2008 Integration Services without a CD.

Is there an installer for SSIS only?

Forcing a string truncation warning to be an error


Hi,

Let's say my source table has a column of varchar(20) and my destination table has a column of varchar(5). I want SSIS to raise an error because of the truncation detected. Is that possible?

Currently, a warning is shown at design time that there is truncation, but when the package is run the truncated data still gets inserted into the destination.



cherriesh

SSIS Connectors for Oracle by Attunity vs Oracle Provider for OLE DB performance - SSIS 2012


Hi,

Does anyone know if the SSIS Connectors for Oracle by Attunity are more performant than the Oracle Provider for OLE DB?

I've run some tests and noticed that the Attunity connectors take about 35-40% less execution time than the Oracle Provider for OLE DB.

Thanks

Calling Oracle stored procedure from SSIS with input and output parameter


Hi,

I have a stored procedure which has an input and an output parameter (both are NUMBER in Oracle, Int32 variables in SSIS). I need to call the SP in an Execute SQL Task with the input parameter, and I should get one row back from it.

I was running the query below successfully in TOAD, but I'm not sure how to call it from an SSIS package.


How to handle a null value for an input column passed to a Script Component - SSIS 2012


Hi,

For a Script Component, I've passed in an input column derived from a data source, and inside the Visual Basic script code I've written an If to check the column value using IsNothing(Row.MyColumn) or Row.MyColumn = Nothing, but I cannot avoid this error:

[Control data [222]] Error: Microsoft.SqlServer.Dts.Pipeline.ColumnIsNullException: The column has a null value.
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
   at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)

I've also tried the IsDBNull function, but without any result.

Any suggestions, please?

Thanks
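(For reference, a minimal VB sketch of the null-check pattern that normally avoids this exception: the script component generates a <ColumnName>_IsNull property for every input column, and testing it before reading the column is the supported way to detect nulls. MyColumn and the Integer default below are just illustrative.)

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Test the generated _IsNull flag instead of IsNothing(...) or = Nothing.
    Dim myValue As Integer = 0   ' illustrative default when the column is null
    If Not Row.MyColumn_IsNull Then
        myValue = Row.MyColumn
    End If
    ' ... use myValue ...
End Sub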

What right do I need to run package via job on separate SSIS machine?


SQL/SSIS 2012

I have a database server with SQL Server Agent, and a separate SSIS server with packages in the SSIS catalog (same domain). The SSIS machine has a local database engine just for storing the SSISDB catalog.

I want to create a job on my database server that runs an SSIS package from that separate SSIS server. I'm using a domain user (credential and proxy). What rights do I need to give that user on the SSIS machine to run that package?

The package is reading a local file and dropping the data in a database (via a SQL Server account).

SSIS - How to run a command line to copy files with user variables within an Execute Process Task


Hello,

I'm having syntax issues with the Arguments property when trying to copy a file with cmd.exe using user variables.

It works when I hard-code the arguments: /c copy /b "\\folder1\file.txt" "\\folder2\file.txt"

However, it fails when I try using user variables to supply the directory and file.

User::FILES = \\folder1\file.txt

User::FILE_NAME = file.txt

"/c copy /b + @[User::FILES] + \" \\\\folder2\\" + @[User::FILE_NAME]"

Does anybody know what's wrong with my syntax?

Thanks!
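(For comparison, one way the Arguments expression might be written so that it evaluates to the hard-coded version above - a sketch; in the SSIS expression language a literal backslash is written as \\ and a literal double quote as \".)

"/c copy /b \"" + @[User::FILES] + "\" \"\\\\folder2\\" + @[User::FILE_NAME] + "\""

With User::FILES = \\folder1\file.txt and User::FILE_NAME = file.txt, this evaluates to /c copy /b "\\folder1\file.txt" "\\folder2\file.txt".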

"dtutil", how to tell which configuration file for packages using after deployment?


Hello All, 

I'm trying to achieve the following.

I'm using a DOS command to automate my deployment process -- glad I found dtutil. However, it doesn't give you any way to specify which configuration file the SSIS packages should use after deployment.

Deployment Method: file deployment

Configurations used: an XML file and a SQL table. The XML file tells the packages which DB connection to use to look up the configuration table.

Can anyone share some thoughts on this?


Derek
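(For what it's worth, dtutil only copies, moves, deletes, verifies or signs packages; with file deployment the configuration file is normally chosen at execution time. A sketch of pointing a file-deployed package at a specific XML configuration with dtexec - the paths are illustrative:)

dtexec /File "D:\Packages\MyPackage.dtsx" /ConfigFile "D:\Configs\Prod.dtsConfig"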

Flat Files - SSIS Handy Hints


After working with flat files (especially text and CSV) for some time in my current role, I have decided to put together this little discussion, which I believe may benefit peers who have not yet come across some of the issues raised here. This piece is simply a summary of hints I have found useful when working with CSV or text files. Everything in this write-up is purely from my own experience, and I am discussing it at the SSIS level; I believe most of the actions can also be applied in SSMS when dealing with flat files there.

Flat files as destination file

Exporting data to a flat file destination is relatively easy and straightforward. There aren't as many issues here as there are when importing data from flat files. However, whatever you do here hugely impacts how easy and straightforward it will be to import data from the flat file you create at this stage. There are a few things to note. When you open your Flat File Destination component you will see a number of options; below is a discussion of some of them, along with the actions I have found it useful to take.

  1. Column names in the first row – if you want the column names to be in the first row of your file, make sure that you always check the box with that option. SSIS does not do that for you by default.
  2. Column delimiter – I always prefer to use Tab {t}. The default delimiter is Comma {,}. The problem that I have found with comma delimiters is that if the values have commas within them, the system does not always get it right in determining where the column ends. As a result you may end up with rows whose values are shifted and misplaced from their original columns, owing to the wrong column allocation by the system.
  3. AlwaysCheckForRowDelimiters – this is a Flat File Connection Manager property. The default setting of this property is True. I found myself having to use it after I faced a huge problem with breaking and misplacement of rows in the dataset that I was dealing with. The problem emanated from the values in one of the columns. The offending column had values of varchar datatype which were presented in the form of paragraphs with all sorts of special characters within them, e.g.

This is ++, an example of what – I mean… the characters ;

in the dataset which gave me: nearly 100% ++ headaches – looked like {well}; this piece

OF: example??

You can see from the above italicised dummy value what I mean. Values such as that make the system prematurely break the rows. I don't know why, but the somewhat painful experience I had with this led me to the conclusion that I should not leave the system to auto-decide where a row ends. When I changed the AlwaysCheckForRowDelimiters property from True to False, along with the recommendations in items 1 and 2 above, the breaking and misplacement of rows was solved. By breaking I mean that you will find one row in a table being broken into two or three separate rows in the flat file. This is then carried over to the new table where that flat file is loaded.

Text or CSV file? – In my experience, going with a text file is always more efficient. Besides, some of the things recommended above only work with text files (I suppose so; I stand to be corrected on this). An example is column delimiters: item 2 above recommends the Tab {t} column delimiter, whereas in CSV, as the name suggests, the delimiters are commas.

Flat files as source file

In my experience, most of the headaches of working with flat files appear when importing data from them. A few examples of the headaches I'm talking about are:

  1. Datatypes and datatype lengths, if using strings
  2. Shifting and misplacement of column values
  3. Broken rows, with some pseudo-rows appearing in your import file
  4. Double quotation marks in your values

Below I will address some of the common issues which I have personally experienced and which I hope will be useful to other people. When you open your Flat File Source component you will see a number of options; below is a discussion of some of them, along with the actions I have found it useful to take.

  1. Retain null values from the source as null values in the data flow – this option comes unchecked by default. Since noticing its importance I always make sure that I check it. I started doing so after some of my rows in the destination table came up with shifted and misplaced column values, i.e. values appearing under columns where you do not expect them, showing that the value has been moved from its original column to another column where it does not belong.
  2. Text qualifier – the default entry here is <none>. I have found that it is always handy to insert double quotes here ("). This will eliminate any double quotes which the system may have included at the time when the flat file was created, which happens when the values in question have commas as part of their characters.
  3. Column delimiter – this depends solely on the column delimiter which was specified at the time when the flat file was created. The system default is Comma {,}. Please note that if the delimiter specified here is different from the one in your flat file, the system will throw an error with a message like "An error occurred while skipping data rows".
  4. Column names in the first data row – if you want the first row to be treated as column names, put a check mark on this option.

Datatypes and datatype lengths

By default, when you import a flat file, the datatypes for all the columns come up as varchar(50) in SSIS. More often than not, if you leave this default setup your package will fail when you run it, because some of the values in some of your columns will be longer than 50 characters, the default length. The resulting error will be a truncation error. I have found two ways of dealing with this.

  1. Advanced – this is an option found on the Flat File Source Editor. Once this option is selected you will be presented with a list of columns from your flat file. To determine your datatypes and lengths there are two possible things that you can do at this stage.
    1. Go column by column – going column by column, you can manually input your desired datatypes and lengths on the Flat File Source Editor through the Advanced option.
    2. Suggest types – this is another option under the Advanced selection. What this option does is suggest datatypes and lengths for you based on the amount of sample data that you specify in the pop-up dialog box. I have noticed that while this is a handy functionality, the problem with it is that if some of the values in the non-sampled data are longer than what the system suggested, the package will fail with a truncation error.
  2. View code – this means viewing the XML code. If, for example, you want all your columns to be 255 characters long in your landing/staging table:
    1. Go to your package name, right-click on it and select the option View code from the list presented to you. The XML code will then come up.
    2. Hit Ctrl + F to get a "Find and Replace" window. In "Find what" type DTS:MaximumWidth="50" and in "Replace with" type DTS:MaximumWidth="255". Make sure that under "Look in" the selection is Current Document. (See the XML sketch after this list for where this attribute lives.)
    3. Click "Replace All" and all your default column lengths of 50 characters will be changed to 255 characters.
    4. Once done, save the changes and close the XML code page. Go to your package GUI designer. You will find that the Flat File Source component is now highlighted with a yellow warning triangle, because the metadata definition has changed. Double-click the Flat File Source component and then click OK. The warning will disappear and you will be set to pull and load your data into your staging database with all columns being varchar(255). If you need to change any columns to specific data types you can use either a Data Conversion component or a Derived Column component for that purpose, or both components, depending on the data types that you will be converting to.
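For orientation, a column definition inside the package XML looks roughly like the sketch below (most attributes are omitted here and names can vary slightly between SSIS versions); DTS:MaximumWidth is the attribute that the find-and-replace above changes.

<DTS:FlatFileColumn
    DTS:ColumnType="Delimited"
    DTS:MaximumWidth="50"
    DTS:DataType="129"
    DTS:ObjectName="Column 0" />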

Dynamic Flat File Name and Date

Please see this blog http://www.bidn.com/blogs/mikedavis/ssis/153/using-expression-ssis-to-save-a-file-with-file-name-and-date
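As a quick illustration of the idea in that blog, an expression along these lines on the flat file connection manager's ConnectionString property (the folder and file prefix are just placeholders) produces a file name stamped with the current date:

"C:\\Output\\Extract_"
  + (DT_WSTR, 4) YEAR(GETDATE())
  + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
  + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
  + ".txt"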

There is far more to flat files than can be discussed in one piece.

Any comments plus additions (and subtractions too) to this piece are welcome.


Mpumelelo

Derived Column Transformation Editor error


Hello,

I'm trying to calculate age in the Derived Column Transformation Editor:

 convert(int,DATEDIFF(d, BIRTH_DATE, getdate())/365.25) as age

but I get an error (I think I need to change the expression).

Please advise.

Thanks
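(For reference, CONVERT and the bare d datepart are T-SQL syntax and are not valid in the SSIS expression language; a roughly equivalent Derived Column expression - a sketch, assuming BIRTH_DATE is a date column in the data flow - would be:)

(DT_I4) FLOOR(DATEDIFF("dd", BIRTH_DATE, GETDATE()) / 365.25)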

Naming confusion around BIDS, SSDT, SSDT-BI, Visual Studio and Visual Studio Shell


Hi guys, a very confused BI developer here looking for help.

Our production server has recently been updated to SQL Server 2012, while in our development environment we are still using BIDS for SQL Server 2008 R2. So far I've developed an SSIS solution in the dev environment and deployed it to production, and it works fine.

However, I want to update my dev environment to be in line with SQL Server 2012, and that's where the fun begins...

So, could anyone please tell me which tool I need to install? I understand BIDS is gone for good; is it SSDT I need to install, or SSDT-BI? Should I install Visual Studio, and if so, which version: VS 2010, VS 2012, VS 2013, or maybe VS 2014?

Thanks

Hui


--Currently using Reporting Service 2000; Visual Studio .NET 2003; Visual Source Safe SSIS 2008 SSAS 2008, SVN --


How to load lakhs of records into Excel files


Hi,

I am getting many records (20 lakh, 30 lakh) and need to load them into Excel.

A single Excel file can only hold about 10 lakh records (the sheet row limit), and I am receiving more records than that limit.

How do I handle that case?

Please help me.

Regards

MalliS

SQL Server timeout issue in data insertion


We are loading data from a DB2 database to SQL Server using a Data Flow Task. During data insertion into SQL Server, we receive the error below.

"An exception has occurred during data insertion, the message returned from the provider is: Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding."


Kindly help to resolve this issue.

Thanks!!!

SSIS: Unflatten a hierarchy


I have an excel file that has the following columns and data

ItemNumber   Description   Attribute1   Attribute2   Attribute3
1000         A             1            2            1
1001         B             1            0            2
1002         C             2            1            1
1003         D             0            0            1

I need to output this into two tables and in the following way

ItemId   ItemNumber   Description
1        1000         A
2        1001         B
3        1002         C
4        1003         D

ItemId   AttributeId   Value
1        1             1
1        2             2
1        3             1
2        1             1
2        3             2
3        1             2
3        2             1
3        3             1
4        3             1

Essentially two questions:

1. After I load the first table, how can I provide the ItemId generated there to the attributes table?

2. How do I unflatten the attributes into multiple rows? There can only ever be three attributes: Attribute1, Attribute2 and Attribute3. These need to be associated with AttributeIds 1, 2 and 3 respectively.

Thanks in advance
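(One possible approach, sketched in T-SQL and assuming the Excel data has been staged into a table named dbo.ItemStaging, with targets dbo.Item (ItemId as an IDENTITY column) and dbo.ItemAttribute - all of these names are hypothetical.)

-- 1. Load the items; the IDENTITY column generates ItemId.
INSERT INTO dbo.Item (ItemNumber, Description)
SELECT ItemNumber, Description
FROM dbo.ItemStaging;

-- 2. Unpivot the three attribute columns and join back on ItemNumber
--    to pick up the generated ItemId.
INSERT INTO dbo.ItemAttribute (ItemId, AttributeId, Value)
SELECT i.ItemId, a.AttributeId, a.Value
FROM dbo.ItemStaging AS s
JOIN dbo.Item AS i
    ON i.ItemNumber = s.ItemNumber
CROSS APPLY (VALUES (1, s.Attribute1),
                    (2, s.Attribute2),
                    (3, s.Attribute3)) AS a (AttributeId, Value)
WHERE a.Value <> 0;   -- the sample output above omits zero values; drop this filter to keep them

Inside a data flow the unpivot step can also be done with the Unpivot transformation, but joining back to pick up the generated ItemId is usually easier in T-SQL.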

SSIS From SQL To Oracle


Hello,

I just need to move data from one table in SQL Server to Oracle on demand. Is there an easy way to use an SSIS package to achieve this without using a linked server?

File existence in a Share Drive, for SQL

My requirement is to check whether a particular file exists in a share folder before I start importing the data into SQL Server. If it does exist, the package should load the file into SQL Server. Any suggestions on how to do this using SQL or an SSIS package?
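(A minimal sketch of the usual Script Task approach - the variable names and wiring are illustrative: the task sets a Boolean package variable, and the precedence constraint into the data flow then uses an expression such as @[User::FileExists] == TRUE. User::FilePath would go in ReadOnlyVariables and User::FileExists in ReadWriteVariables.)

Public Sub Main()
    ' Check the share for the expected file and expose the result in a package variable.
    Dim filePath As String = Dts.Variables("User::FilePath").Value.ToString()
    Dts.Variables("User::FileExists").Value = System.IO.File.Exists(filePath)
    Dts.TaskResult = ScriptResults.Success
End Sub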