Hi,
We are upgrading our SQL Server environment. I would like to know the standard upgrade methods for:
a) packages deployed to MSDB;
b) packages deployed to the file system.
Any help is appreciated.
Thanks,
Vishal.
Hey you!
Want to be a real-life virtual guru? (pun intended)
Do you want to win the love and admiration of the community you work in?
You can win REAL virtual medals and prove your skills, in a competition that is judged by real Microsoft judges!
Gurus who continue to shine soon get noticed!
Oh yes! We're talking inner circles! Nominations! New horizons!
Step up and stand out!
All you have to do is add an article to TechNet Wiki from your own specialist field. Something that fits into one of the categories listed on the submissions page. Copy in your own blog posts, a forum solution, a white paper, or just something you had to solve for your own day's work today.
Drop us some nifty knowledge, or superb snippets, and become MICROSOFT TECHNOLOGY GURU OF THE MONTH!
This is an official Microsoft TechNet recognition, where people such as yourselves can truly get noticed!
HOW TO WIN
1) Please copy your Microsoft technical solutions and revelations over to TechNet Wiki.
2) Add a link to it on THIS WIKI COMPETITION PAGE (so we know you've contributed)
3) Every month, we will highlight your contributions, and select a "Guru of the Month" in each technology.
If you win, we will sing your praises in blogs and forums, similar to the weekly contributor awards. Once "on our radar" and making your mark, you will probably be interviewed for your greatness, and maybe eventually even invited into other inner TechNet/MSDN circles!
Winning this award in your favoured technology will help us identify the active members in each community.
More about TechNet Guru Awards
Thanks in advance!
Pete Laker
TechNet Wiki Community Council Member, Azure MVP, Wiki Ninja & TechNet Guru!
#PEJL
Got any nice code? If you invest time in coding an elegant, novel or impressive answer on MSDN forums, why not copy it over to TechNet Wiki, for future generations to benefit from! You'll never get archived again, and you could win weekly awards!
Have you got what it takes to become this month's TechNet Technical Guru? Join a long list of well-known community big hitters, and show your knowledge and prowess in your favoured technologies!
I have a table containing a couple of dozen columns, and one of those columns holds some 30 distinct values across the records. I want to read through the table and number each record with an integer counter, placed in one of that record's columns, indicating that record's position among the records sharing the same distinct value.
Each time I read a record, I want to find the highest counter value among records with that same distinct value, add one to it, and put the new counter value in the "counter" field for that record.
Could someone kindly show me how to do this? I'm guessing it's going to use an Execute SQL Task in the control flow, but if there is a better way I'd be happy to use that. I've tried a couple of things but haven't got it right yet.
Thanks a ton for any help, Roscoe
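A set-based numbering in a single Execute SQL Task is usually simpler than reading the table row by row. Assuming a hypothetical table dbo.MyTable with a key column RecordID, the distinct-value column GroupValue, and the target Counter column (all names are placeholders for the real schema), a sketch in T-SQL:

```sql
-- Number every record within its group of matching GroupValue.
-- dbo.MyTable, RecordID, GroupValue and Counter are placeholder names.
;WITH Numbered AS
(
    SELECT Counter,
           ROW_NUMBER() OVER (PARTITION BY GroupValue
                              ORDER BY RecordID) AS rn
    FROM dbo.MyTable
)
UPDATE Numbered
SET Counter = rn;
```

ROW_NUMBER() restarts at 1 for each distinct GroupValue, which matches "highest existing counter plus one" when the whole table is numbered in one pass; updating through the CTE writes the value back to the base table.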
Hi,
I'm Eellen, from Standard Chartered Malaysia.
Hi, I am using SQL Server 2014 and, for the SSIS designer, SSDT-BI with Visual Studio 2013. I have the following query in an Execute SQL Task with a single result set assigning the value to the variable "role" of String data type.
SELECT DISTINCT
    STUFF((SELECT CAST(',' + SubTableUser.UserRole AS VARCHAR(MAX))
           FROM TableUser AS SubTableUser
           WHERE SubTableUser.UserID = TableUser.UserID
           FOR XML PATH('')), 1, 1, '') AS UserRole
FROM TableUser
Execute SQL Task Error:
The value type (__ComObject) can only be converted to variables of type Object.
An error occurred while assigning a value to variable "role": "The type of the value (DBNull) being assigned to variable
"User::role" differs from the current variable type (String). Variables may not change type during execution.
Variable types are strict, except for variables of type Object.".
I tried to assign the Object variable's value to a String variable in a Script Task with String.Format("{0}", Dts.Variables["User::role"]) and got the value "System.__ComObject" instead of the actual value.
Thank you in advance.
SQLEnthusiast
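Both errors suggest the query is handing SSIS something other than a plain string: a FOR XML result can surface as a COM/XML object, and a NULL concatenation surfaces as DBNull, neither of which fits a String variable. A common workaround (a sketch against the same TableUser columns from the post) is to extract the XML as VARCHAR and fold NULL to an empty string:

```sql
-- TYPE + .value() forces a plain VARCHAR out of FOR XML;
-- ISNULL guards against rows with no roles to concatenate.
SELECT DISTINCT
    ISNULL(
        STUFF((SELECT ',' + CAST(SubTableUser.UserRole AS VARCHAR(MAX))
               FROM TableUser AS SubTableUser
               WHERE SubTableUser.UserID = TableUser.UserID
               FOR XML PATH(''), TYPE).value('.', 'VARCHAR(MAX)'),
              1, 1, ''),
        '') AS UserRole
FROM TableUser;
```

With the result guaranteed to be a non-NULL VARCHAR, the single-row result set should map cleanly onto the String variable User::role.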
I'm using this: https://mikedavissql.com/2013/09/16/loop-through-excel-files-in-ssis/. This is the error I get when I run it:
Error at Data Flow Task 1 [Excel Source [173]]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Water Chemistry Test 2" failed with error code 0xC0202009. There
may be error messages posted before this with more information on why the AcquireConnection method call failed.
Error at Data Flow Task 1 [SSIS.Pipeline]: Excel Source failed validation and returned error code 0xC020801C.
Error at Data Flow Task 1 [SSIS.Pipeline]: One or more component failed validation.
Error at Data Flow Task 1: There were errors during task validation.
Error at SPOReport [Connection manager "Water Chemistry Test 2"]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft Access Database Engine" Hresult: 0x80004005 Description: "Invalid argument.".
The date column doesn't have NULLs. It looks like this:
CHANGED_DATE
2014-11-04 03:01:19.000
2014-06-04 18:15:41.000
2014-05-04 18:45:17.000
2014-06-04 18:14:53.000
2014-05-04 19:02:58.000
2014-06-04 18:14:35.000
2014-06-04 18:13:57.000
2014-04-22 12:00:55.000
2014-06-04 18:16:06.000
2014-05-04 18:45:41.000
2014-04-05 01:29:21.000
2014-06-04 18:23:51.000
2014-06-04 18:24:39.000
2014-05-04 19:06:20.000
2014-11-04 03:01:19.000
I'm doing insert and update checking using a Lookup and a Conditional Split. In the Conditional Split, for the update check, I'm using expressions like:
([Changed Date] != L_Last_Updated_Date)
(ISNULL([Changed Date]) ? NULL(DT_DBTIMESTAMP) : [Changed Date]) != (ISNULL([L_Last_Updated_Date]) ? NULL(DT_DBTIMESTAMP) : [L_Last_Updated_Date]) &
(ISNULL([Changed Date])==TRUE ? @[DateVariable]: [Changed Date])!=(ISNULL(L_Last_Updated_Date)==TRUE ? @[DateVariable]: (L_Last_Updated_Date))
I tried all of these expressions for every case, but the data always flows down the update side:
Conditional Split → OLE DB Command.
Please give me a solution so that data flows down the update side only when it has actually been updated.
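For reference, the null-safe "has it really changed?" test can be written out in plain T-SQL; the Conditional Split expression needs to implement the same truth table (table and column names below are placeholders):

```sql
-- Rows qualify as "updated" only when the two dates genuinely differ,
-- treating NULL on both sides as equal.
SELECT s.*
FROM dbo.Staging AS s
JOIN dbo.Target  AS t ON t.BusinessKey = s.BusinessKey
WHERE  (s.ChangedDate <> t.LastUpdatedDate)
   OR  (s.ChangedDate IS NULL     AND t.LastUpdatedDate IS NOT NULL)
   OR  (s.ChangedDate IS NOT NULL AND t.LastUpdatedDate IS NULL);
```

Also worth checking: if the two columns have different date/time precisions (e.g. datetime vs. datetime2), the values can differ by fractions of a second even when they look equal, which would push every row down the update path; casting both sides to a common type before comparing rules that out.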
Hello,
I've run into a weird problem where some data being copied to a SQL 2014 database is getting overwritten along the way by a single space... in each row (13,769 rows)... and it only occurs when the package is run in a SQL job. It works fine if I run it using the package execution utility or within the SSIS designer on another machine.
There are two other columns being copied across in this package but they are INT and are coming across fine.
The connection to Sybase is via the v15.7 ASE OLE-DB driver.
This package was running OK in a similar test environment for weeks without any issues and it was only when it went to Prod that we ran into this issue... grrr.
Thanks,
Martin V.
Hi there,
My primary platform is SQL Server 2014.
I am creating a flat file from a temporary table into which I have inserted "LCRUS-ООО "СТК"".
Nevertheless, once my data flow ends, I see "LCRUS-??? "???"" in my file.
Why doesn't SSIS take the value directly from my temp table?
How can I work this out?
Thanks a lot,
I am facing an issue with data loading where we have a conditional expression in a Data Flow Task. Data is loaded from .csv and .txt files into a SQL Server table. The conditional expression used is:
(@[System::UserName] == "domain\\jisca") ? (DT_NUMERIC,38,18)WSTR_M_VALUE : (DT_NUMERIC,38,18)REPLACE(WSTR_M_VALUE,",",".")
The account "domain\\jisca" is a service account that has admin access over the server. This expression is written in a Derived Column transformation for the new column 'M_Value_Num'.
The same configuration is working in our Test Environment but not in our Production.
Hello,
I am fairly new to using SSIS, so I don't really know how to deal with this problem.
I have a Blob Storage account containing files of different sizes. I want to use the Azure Blob Download task to download the files to my local computer, but here is what happens.
I use a pattern to search for my files (e.g. file*.txt). If there are zero or one matches, the task ends well with the Success status.
But if there is more than one match, the first file in the list is properly downloaded, while the rest of the files are created with a size of 0 KB each, and the task doesn't stop.
In the Output window I can read: package "mypackage.dtsx" finished: Canceled.
Do you have any idea what is happening?
Thanks in advance.
Hi All,
I am getting NULLs for a mixed data type column while loading data from an Excel source into a SQL Server 2012 table.
I would also like to load a remotely located CSV file (something like "http://productdata-download.affili.net/affilinet_products_xxx_yyyy.csv?auth=moreOrLessSecretCode&type=CSV&file=0") into SQL Server using SSIS. Since I am new to SSIS, I have no idea how to proceed.
Unfortunately, I cannot use the Flat File Source, since it only accepts local files.
Can anybody point me in the right direction?
Hi,
I am connecting to an SFTP server in my SSIS package using WinSCP.
The issue I am facing is that I need to check whether a folder already exists on the SFTP server.
If it does not exist, create the folder on the SFTP server.
But if the directory already exists, then just move the files.
I tried googling it but could not find a relevant answer.
I am creating a script in VB and then executing that script through WinSCP.
Can someone help on an urgent basis?
The generated script looks like this and is working fine as of now:
ECHO OFF
Thanks,
I am new to SSIS and struggling with the Script Task, trying to figure out how to build a script for the following scenario.
Here is the scenario: I have an XML file with the following data:
</environments>
I want the Script Task to read 'Name' from the XML file, match it against the system variable 'MachineName', and write the output to the user variable 'ServerNumber'.
If the machine name is DEV3WEB, the output will be '1'; if it is DEV3APP, the output will be '2'.
Thanks in advance.
-Mehta
I have an Excel destination in my DFT, and I am loading the data from SQL Server. I do not have any unique identity columns in my source system, so I added a column called "Insert date" that takes its value from GETDATE(). My problem is that if I execute my SSIS package more than once, my Excel destination gets loaded with duplicate records. How can I mitigate this? I am unable to use a Lookup here, as Lookup works only against database tables. Kindly suggest.
Ramasubramanian S
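Since an Excel destination cannot easily be deduplicated with a Lookup, one common pattern is to empty the worksheet before each load: an Execute SQL Task pointed at the Excel connection manager can drop and recreate the sheet. A sketch, where the sheet name and column list are placeholders for the real layout:

```sql
-- Run against the Excel connection manager, before the data flow.
-- [Sheet1] and the columns below are placeholders for the real layout.
DROP TABLE [Sheet1];

CREATE TABLE [Sheet1] (
    [SomeColumn]  NVARCHAR(255),
    [SomeAmount]  DOUBLE,
    [Insert date] DATETIME
);
```

The Excel driver generally executes one statement at a time, so the DROP and CREATE are safest as two separate Execute SQL Tasks; after that, every package run starts from an empty sheet and duplicates cannot accumulate.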
Hi folks,
I am looking for an advanced "Derived Column" component that would take a metadata-like config file describing the column transforms, and use that to do all the transforms (rather than doing all transforms at "design time", through developer intervention).
We currently have a SQL Server based ETL engine that takes various files, of different formats and with different sets of fields, applies various transforms, be they LTRIM, RTRIM, CAST, ISNULL, REPLACE, etc., and outputs a (very long) SQL statement. The problem is that all of these transforms are *very* slow... this is a task that is much better suited to SSIS and Derived Column components. However, we cannot entertain developer intervention for each file; we'd like a metadata-like file that can be built automatically and passed to the component for processing.
Are there any such tools on the market?
Thanks much,
Cos
I am occasionally getting the following errors. My searches have not yielded anything helpful yet.
I am using SQL Server 2014 Integration Services with the SSISDB catalog, loading data to another SQL Server.
The SSIS package is executed in a loop from the parent package 10,000+ times during a typical load.
The error typically occurs early on, usually within the 3rd to 6th iteration through the loop. I have not seen it happen on the first or second iteration.
Rerunning the package once or twice seems to resolve the issue for a while.
Error : -1073676264 : Error loading value "<DTS:ConnectionManagers xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:ConnectionManager DTS:refId="Package.ConnectionManagers[ABC Flat File]" DTS:CreationName="FLATFILE"
DTS:DelayValidation="True" DTS:DTSID="{AC6FBA97-8F7F-410A-9A58-61A7BDF93566}" DTS" from node "DTS:ConnectionManagers".
Error : -1073659899 : The connection type "FLATFILE" specified for connection manager "ABC Flat File" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager
for an unknown connection type. Check the spelling in the connection type name.
The odd thing is that it works 99% of the time. When it does fail, it worked in the previous iteration just a few seconds earlier.
At the same level there are 3 other FLATFILE connections defined the same way. They have not encountered this error, but I am assuming that may be due to this connection coming first alphabetically.
The ConnectionString property is determined by an expression containing the variable @[User::ABCFlatFile].
Any thoughts on troubleshooting this issue?
Thanks much,
I've created an SSIS package which runs smoothly when launched from SQL Server Data Tools (I use SSDT 2015 with SQL Server 2005 Developer Edition on my PC), but it fails with only the following line in the logs when I run it from a .NET app:
Fields:event,computer,operator,source,sourceid,executionid,starttime,endtime,datacode,databytes,messageOnPreValidate,<my_computer>,<my_operator>,Test,{E7D40776-05B7-4D1D-8D78-8C87E722E596},{755AD039-B5B4-42B0-9ECA-E396054DEB2F},28.10.201614:44:06,28.10.201614:44:06,0,0x,
I use the following code to call the package from my .NET app (I copied the SSIS package into the .NET project from the SSIS project and marked it to be copied to the output directory, so I could call it from the file system):
public void Execute(string filePath, DateTime period)
{
    var pkg = app.LoadPackage(filePath, null);
    var variables = pkg.Variables;
    variables["Period"].Value = period;
    var pkgResults = pkg.Execute(null, variables, null, null, null);
}
I've tried switching the package protection level to DontSaveSensitiveData, but it didn't help.
What am I doing wrong? Is there a way to at least get some proper data about why the package is failing?