Channel: SQL Server Integration Services forum

SSIS with Excel Source - Generically specify first worksheet?


I have an SSIS package that is reading an Excel source to pipe it to a table in my SQL Server database.

In the Data Flow task, I was able to rename the Name field generically, but in the Custom Properties I have to specify the rowset name, otherwise the task blows up.

How can I generically specify that the rowset should be opened on the first sheet, whatever its name happens to be?
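For what it's worth, a minimal Script Task sketch of one way to do this, assuming package variables User::ExcelFilePath and User::SheetName exist and the ACE OLE DB provider is installed (both are assumptions); the Excel Source's OpenRowset custom property could then be driven from User::SheetName via a property expression on the Data Flow Task. Note the caveat in the comments: the schema rowset lists sheets alphabetically, not in tab order.

// Minimal sketch: read a worksheet name from the workbook's OLE DB schema
// and store it in a variable for the Excel Source's OpenRowset expression.
using System;
using System.Data;
using System.Data.OleDb;

public void Main()
{
    string excelFile = Dts.Variables["User::ExcelFilePath"].Value.ToString();
    string connStr = String.Format(
        "Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties=\"Excel 12.0;HDR=YES\"",
        excelFile);

    using (OleDbConnection conn = new OleDbConnection(connStr))
    {
        conn.Open();
        // Worksheets show up as tables whose names end with "$" (named ranges do not).
        // Caveat: this schema table comes back in alphabetical order, not tab order.
        DataTable sheets = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
        foreach (DataRow row in sheets.Rows)
        {
            string name = row["TABLE_NAME"].ToString();
            if (name.EndsWith("$") || name.EndsWith("$'"))
            {
                Dts.Variables["User::SheetName"].Value = name;
                break;
            }
        }
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}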


ForEach Loop - Parsing XML subgroup into Variables


Hi all, I am designing a new ETL process using SSIS.

We have a controller package which receives an XML string at run time into a variable called BatchRequest :-

<Batch><Request><Name>Filemon</Name><Params><filepaths><Sourcepath>d:\temp</Sourcepath><Destpath>D:\temp2</Destpath><Archivename>\Archive</Archivename></filepaths></Params></Request></Batch>

The Batch can contain a number of Requests and a ForEach Loop is used to parse the text for the Name node and the text for the Params Node into variables RequestName and RequestParams respectively.

However, what I want to be able to do is to take the text for <Name> into RequestName but take the entire XML sub-string for <Params> into RequestParams.

At the moment I get the text only (e.g. d:\tempd:\temp2\Archive).

The Requestname is used by an XML task within the ForEach loop to retrieve full package path details from an external config file, and hence it only needs to be a keyword.

I played with the idea of including <Params> as a subnode of <Name>, only populating RequestName in the loop parser, and then having an XML task which would parse the related parameter string, but I don't know how to tell the XML task that I want the parameter string where the <Name> is Filemon (in this example).

Can anyone give me some suggestions or pointers?

Thanks

Iain
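For what it's worth, a minimal Script Task sketch of the parsing described above, assuming variables User::BatchRequest, User::RequestName and User::RequestParams exist (the variable names are assumptions); the XPath keys off the <Name> value, "Filemon" in this example, and OuterXml keeps the whole <Params> fragment intact.

using System.Xml;

public void Main()
{
    XmlDocument doc = new XmlDocument();
    doc.LoadXml(Dts.Variables["User::BatchRequest"].Value.ToString());

    // Select the <Request> whose <Name> matches the keyword we are after
    XmlNode request = doc.SelectSingleNode("/Batch/Request[Name='Filemon']");
    if (request != null)
    {
        Dts.Variables["User::RequestName"].Value = request.SelectSingleNode("Name").InnerText;
        // OuterXml returns the <Params> element and everything inside it as one XML string
        Dts.Variables["User::RequestParams"].Value = request.SelectSingleNode("Params").OuterXml;
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}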



Using SQL Server 2012 SSIS to Extract Data From SAP


Hi

What is the current best practice for using SQL Server 2012 SSIS to extract data from SAP R3? Please note we are looking for a solution that does not use SAP BW or SAP OHS.

Ideally we would like to build our ETL SSIS process to make a .NET call to an SAP RFC procedure and avoid using web services.

With SS2012 can we use any of these without using SAP BW:

- SAP .NET Connector

- MS ADO .NET

- BizTalk .NET 3.0 Adapter

Thanks and take care,

Shayne
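For the .NET RFC route, a rough sketch of what a Script Task call through the SAP .NET Connector 3.0 (NCo) might look like, assuming the NCo assemblies (sapnco.dll, sapnco_utils.dll) can be referenced and that an RFC such as RFC_READ_TABLE is authorized; every connection value below is a placeholder, and this says nothing about SAP licensing or which connector is officially supported with SQL Server 2012.

using SAP.Middleware.Connector;

RfcConfigParameters parms = new RfcConfigParameters();
parms.Add(RfcConfigParameters.Name, "DEV");                 // placeholder destination name
parms.Add(RfcConfigParameters.AppServerHost, "sapserver");  // placeholder host
parms.Add(RfcConfigParameters.SystemNumber, "00");
parms.Add(RfcConfigParameters.Client, "100");
parms.Add(RfcConfigParameters.User, "rfc_user");            // placeholder credentials
parms.Add(RfcConfigParameters.Password, "secret");

RfcDestination dest = RfcDestinationManager.GetDestination(parms);
IRfcFunction fn = dest.Repository.CreateFunction("RFC_READ_TABLE");
fn.SetValue("QUERY_TABLE", "KNA1");   // example table name
fn.Invoke(dest);

IRfcTable data = fn.GetTable("DATA");
foreach (IRfcStructure row in data)
{
    string line = row.GetString("WA");   // each row comes back as one delimited string
    // parse the line and hand it to the data flow / staging table here
}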

generating the flatfiles getting error


Hi,

While I am generating the flat file from the server, I am not able to get the data for some of the fields; I can retrieve it only for a few of them. All fields are configured properly. May I know the reason for that? Is there any way to get the data for all the fields properly?

Thanks.

Read Excel Column Name and insert to SQL

Launch VSTA Editor Requires Administrator Privs?


This is a follow-up to another posting I had about problems with launching the VSTA editor from various scripts.  I have been experimenting with this further, and while available RAM seems to have something to do with it, it appears more clearly that launching VSTA requires me to be logged on as an administrator.  IT security only allows us developers to get admin rights for one hour at a time -- we have to request it, log off, then log back on.  After an hour it expires and we would have to request it again, log off, and then log back on yet again for another hour.  Needless to say, this would be rather frustrating.   Any ideas about how/why launching VSTA from SSDT script tasks or components would require administrator privs, and how I could get around that?

Thanks kindly,

Mike

Download all excel files from sharepoint location to local folder


Hi,

How do I copy all Excel files stored at a SharePoint location to a local folder?

Regards,

Nidhi
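A minimal sketch of one way to do this from a Script Task with the SharePoint Client Object Model, assuming the client assemblies (Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.Runtime) are available on the machine running the package; the site URL, library title, credentials and local folder below are placeholders.

using System.IO;
using System.Net;
using Microsoft.SharePoint.Client;

ClientContext ctx = new ClientContext("http://sharepoint/sites/mysite");   // placeholder site
ctx.Credentials = new NetworkCredential("user", "pass", "DOMAIN");          // placeholder credentials

List library = ctx.Web.Lists.GetByTitle("Shared Documents");                // placeholder library
ctx.Load(library.RootFolder.Files);
ctx.ExecuteQuery();

foreach (Microsoft.SharePoint.Client.File spFile in library.RootFolder.Files)
{
    // only copy Excel files
    if (!spFile.Name.EndsWith(".xlsx") && !spFile.Name.EndsWith(".xls"))
        continue;

    FileInformation info =
        Microsoft.SharePoint.Client.File.OpenBinaryDirect(ctx, spFile.ServerRelativeUrl);
    using (FileStream local = System.IO.File.Create(Path.Combine(@"C:\Local\Excel", spFile.Name)))
    {
        info.Stream.CopyTo(local);
    }
}

This only walks the library's root folder; subfolders would need to be enumerated separately.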

foreach loop only reads one file


I have a Foreach loop set as Foreach File Enumerator.  In the Expressions I have the Directory set to a variable path (User::strSourceFilePath) which is read from SQL Server.  In the variables window I have this defaulted to "C:\Source" to make the File System Task happy.  From what I read this value will be replaced (from SQL as "C:\Source" so it stays the same).  The FileSpec is set to "*.*" but I have tried "*.txt" since it'll only read txt files. 

First I have a Execute SQL Task which returns the strSourceFilePath as the Single Row Result Set.  After all processing I end with a File System Task to move the file.

This all works with one file but when I try two it only reads and backs up the first file. 

I think this is the error message that would help:  Error: The expression "@[User::strSourceFilePath]" on property "\Package.Connections[ConDynamicPath].Properties[ConnectionString]" cannot be evaluated. Modify the expression to be valid.

I have a dynamic Connection Manager also equal to the strSourceFilePath to read each Flat File source.  BTW, each flat file has the same columns.


SSIS Script Task errors when trying to connect to SharePoint using Client Object Model

// Requires: using System.Net; using System.Windows.Forms; using Microsoft.SharePoint.Client;
string client = Dts.Variables["SPclient"].Value.ToString();
string user = Dts.Variables["SPuser"].Value.ToString();
string pass = Dts.Variables["SPpass"].Value.ToString();

MessageBox.Show(client + " " + user + " " + pass);   // debug output

NetworkCredential credentials = new NetworkCredential(user, pass);
ClientContext clientContext = new ClientContext(client);


/*
NetworkCredential credentials = new NetworkCredential(user, pass);
ClientContext clientContext = new ClientContext(client);
Web site = clientContext.Web;
List list = site.Lists.GetByTitle("Projects");

try
{
    // Load credentials for authentication
    clientContext.Credentials = credentials;
    // Load the site
    clientContext.Load(site);
    // Load the site list
    clientContext.Load(list);
    // Execute the queries
    clientContext.ExecuteQuery();
    MessageBox.Show(string.Format("Title: {0}", site.Title));

    // Load the folders
    var folders = list.RootFolder.Folders;
    clientContext.Load(folders);
    // Execute the queries
    clientContext.ExecuteQuery();

    // Create a new folder and execute the query
    var newFolder = folders.Add("TESTSSISFOLDER");
    clientContext.ExecuteQuery();
}
catch (Exception e)
{
    MessageBox.Show(e.ToString());
}
*/

I am trying to run this script in SSIS and I get this error http://img96.imageshack.us/img96/3585/helphj.png

after only these 2 lines of code.

If I try to run the whole script (the commented block) in a standalone console application, it works, but it errors in the Script Task when I set the ClientContext.

Any idea as to why this is happening?

Thanks in advance!


Importing Excel Source does not have all the columns.


Hi,

I'm trying to import an Excel sheet to my DB. I created an Excel source but it is missing the last column of the sheet. The Excel sheet has a freeze pane that I think is causing the issue.  When I remove the freeze pane, the Excel source has the missing column.  Is there a way around this issue?

Thank you in advance.

JohnnyVL

Attunity SSIS connector support for Sql Server 2014?

Hello,

We've been successfully using Attunity SSIS connector 2.0 for a couple of years with SQL SERVER 2012.
We developed some Visual Studio projects and deployed them to our SSISDB, then launch them with scheduled jobs. Everything worked fine.

After we upgraded the server to Sql Server 2014 the connector failed with the following error:
The connection type "MSORA" specified for connection manager "XXX" is not recognized as a valid connection manager type.
This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.

We tried uninstalling and reinstalling, but it was no good. We also tried 32-bit/64-bit combinations.
We tried the DTEXEC utility: it worked fine from the SQL Server 2012 folder (\110\DTS\Binn) but did not work from the 2014 folder (\120\DTS\Binn).
We also tried a fresh 2014 installation but keep getting the same error.

Does Attunity connector 2.0 support Sql Server 2014? If not, should I wait for a new version? Any workaround?
For now, we are using the DTEXEC utility on 2012's folder, but since we have to use the /sql option (/ISSERVER is not working) we can't pass parameters to the .dtsx package.

Any help would be much appreciated.

Regards,
Daniel

query data from excel cells


I'm trying to query the DepartmentID from HumanResources.Department.

Below is the package:

Declared variable: @DepartmentID

In the OLE DB Source, I am querying in SQL command mode, with @DepartmentID as an input variable:

(select * from HumanResources.Department where departmentid in (?))

But my Excel source values keep changing, i.e. today the Excel sheet may have 2 department IDs, tomorrow there may be 10.

Can anyone help me in creating a package with such a requirement?

Can I make use of a ForEach Loop? If yes, please tell me how.
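One possible approach is a Script Task that reads the IDs from the Excel sheet and builds the full SQL statement into a string variable, which the OLE DB Source then runs via "SQL command from variable". A rough sketch, assuming a String variable User::DepartmentQuery and placeholder file, sheet and column names:

using System;
using System.Collections.Generic;
using System.Data.OleDb;

public void Main()
{
    string connStr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\\Input\\Departments.xlsx;" +
                     "Extended Properties=\"Excel 12.0;HDR=YES\"";
    List<string> ids = new List<string>();

    using (OleDbConnection conn = new OleDbConnection(connStr))
    using (OleDbCommand cmd = new OleDbCommand("SELECT DepartmentID FROM [Sheet1$]", conn))
    {
        conn.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                ids.Add(Convert.ToInt32(reader[0]).ToString());   // force numeric values only
        }
    }

    // Builds: SELECT * FROM HumanResources.Department WHERE DepartmentID IN (1,2,...)
    Dts.Variables["User::DepartmentQuery"].Value =
        "SELECT * FROM HumanResources.Department WHERE DepartmentID IN (" + string.Join(",", ids) + ")";

    Dts.TaskResult = (int)ScriptResults.Success;
}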

automate email notification for new report entry


Basically I want to use SSIS to send email notifications to groups of people whenever there is a new entry in a certain SSRS report...

I could have used the SSRS email subscriptions but I was told we are not allowed to use it for some reason...

So I want to create an SSIS package that scans the report data each hour and sends an email with a link to the report in case there is a new entry in the report output...

I have no idea where to start or how to get this done; it's just that I have never used SSIS before for email/automated tasks...


Dhananjay Rele
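As a starting point, a minimal Script Task sketch of the "send an email when something new is found" half, assuming a Boolean variable User::HasNewRows has already been set by an earlier task that compares the report query against the previous run; the SMTP server, addresses and report URL are placeholders, and an SSIS Send Mail Task would work just as well.

using System.Net.Mail;

if ((bool)Dts.Variables["User::HasNewRows"].Value)
{
    SmtpClient smtp = new SmtpClient("smtp.mycompany.local");   // placeholder SMTP server
    using (MailMessage msg = new MailMessage())
    {
        msg.From = new MailAddress("ssis@mycompany.local");           // placeholder sender
        msg.To.Add("report-subscribers@mycompany.local");             // placeholder group
        msg.Subject = "New entry in the report";
        msg.Body = "A new entry was found. View the report: http://reportserver/Reports/MyReport";
        smtp.Send(msg);
    }
}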

Retrieve a specific cell value of an .csv and pass it to a variable so that it can be used for other purposes.


I have three .csv files coming from different departments, with a general name_date_department pattern as the file name. Example: MP_20140215_SS.CSV; MP_20140319_EE.CSV; MP_20140125_FF.CSV. For both the EE and FF departments I also have a text file for each, which has a description in it. Both the CSV file and the text file (for example: MP_20140319_EE.CSV, MP_20140319_EE_DES.TXT) have to be sent as attachments via email to a specific group for each department. I have to get specific data from these CSV files, pass it to a variable, and write it to a SQL table.

Below is how the CSV file looks (all three files have the same headers). I want to capture the highlighted value (b) from all three files, along with the file name, and write it to a SQL table.

Note: The value of 'b' will be different in all the three files and will keep changing with new files coming in daily.

 

Expected Results in sql table:

Please help me with this. 

Thank you.
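A minimal Script Task sketch of pulling one value out of the CSV and capturing the file name, assuming variables User::CsvPath, User::CellValue and User::FileName exist; the row and column indexes are placeholders, since the sample layout did not come through in the post.

using System.IO;

public void Main()
{
    string path = Dts.Variables["User::CsvPath"].Value.ToString();

    string[] lines = File.ReadAllLines(path);
    // e.g. second line, second column - adjust to wherever the value "b" actually sits
    string[] fields = lines[1].Split(',');
    Dts.Variables["User::CellValue"].Value = fields[1];

    // capture the file name itself for the SQL table
    Dts.Variables["User::FileName"].Value = Path.GetFileName(path);

    Dts.TaskResult = (int)ScriptResults.Success;
}

From there, an Execute SQL Task can insert the two variables into the target table, and a Send Mail Task can attach the matching CSV/TXT pair.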

Excel destination - cannot convert between unicode and non-unicode


Hello,

I am having an issue when trying to pull data from a table to Excel. I am using a Data Conversion transformation between the OLE DB source and the Excel destination. The error I am getting is that it cannot convert between Unicode and non-Unicode strings. The columns it is erroring out on are of data types varchar(30) NULL, varchar(900) NULL, and text NULL. In the Data Conversion transform I am using DT_WSTR, DT_WSTR, and DT_TEXT, and it still throws the same error.

Can someone please tell me what I am doing wrong? Thanks a ton in advance.


SSMA for migrating table from oracle to Sql server


Hi All,

I want to replicate a huge Oracle table to SQL Server and I am using SSMA. It is helpful and fast, but can we replicate the table to a different name using SSMA? For example, I have a table TEST and I want to replicate it to SQL_TEST. Is that possible using SSMA?

Kindly help me out 

Master package with child packages executed in parallel.


Hi,

I have a master package, which executes a few child packages in parallel. All packages are SQL Server 2008 R2 and have two configurations: Indirect XML configuration file and SQL Server. The name of the XML file is Master_Config.xml

Then I have 3 customer DW databases.

So, I created 3 XML files each pointing to appropriate DW. XML file names are MC1.xml, MC2.xml, and MC3.xml

In order to run the master package against 3 customer databases I execute following batch file:

copy MC1.xml  Master_Config.xml
DTExec.exe /F MASTER.dtsx /Reporting "N"

copy MC2.xml  Master_Config.xml 
DTExec.exe /F MASTER.dtsx /Reporting "N"

copy MC3.xml  Master_Config.xml 
DTExec.exe /F MASTER.dtsx /Reporting "N"

The problem is that the processing of, say, DW #2 starts before the processing of the previous DW finishes. In other words, Master_Config.xml gets replaced with the XML file for the next step before the previous step completely finishes. I suspect that it has something to do with the parallel execution of the child packages.

Any ideas how this can be resolved?


Remember to mark as an answer if this post has helped you.
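If the suspicion is that the next copy of Master_Config.xml happens before the previous DTExec run has completely finished, one way to check is to force strictly sequential runs from a small driver that waits on each process explicitly. A rough sketch, reusing the file names from the batch above; this only illustrates the sequencing and is not a confirmed fix for the parallel child-package behaviour.

using System.Diagnostics;
using System.IO;

string[] configs = { "MC1.xml", "MC2.xml", "MC3.xml" };

foreach (string config in configs)
{
    File.Copy(config, "Master_Config.xml", true);   // overwrite the active config

    ProcessStartInfo psi = new ProcessStartInfo("DTExec.exe", "/F MASTER.dtsx /Reporting N");
    psi.UseShellExecute = false;

    using (Process p = Process.Start(psi))
    {
        p.WaitForExit();   // do not touch Master_Config.xml until this run has ended
    }
}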

Any Quick Way to Clone a Package and Change Data Source Mapping ?


Hi! I have an ETL project for different department users. Conceptually their table schemas are the same; only the detailed column names differ. The ETL spec/rules are the same for those departments. I have already developed one package and think maybe I can clone it and change the data source to reduce the development effort. But the metadata is hard to control; it seems I need to re-create the whole package?! Does anyone have a better idea on this?

Thank you!

 

ftp get file without extension


Hi


I'm trying to use the sample code from http://blog.dbandbi.com/tag/ssis-script-task-check-if-file-exists-c/

public void Main()
{
    // requires: using System.Net;
    string userName = Dts.Variables["User::userName"].Value.ToString();
    string password = Dts.Variables["User::password"].Value.ToString();
    string fileName = Dts.Variables["User::fileName"].Value.ToString();
    string ftpURL = String.Format("ftp://ftp.dbandbi.com/public_ftp/incoming/{0}", fileName);

    try
    {
        FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create(ftpURL);
        ftpRequest.Method = WebRequestMethods.Ftp.DownloadFile;
        ftpRequest.Credentials = new NetworkCredential(userName, password);
        using (FtpWebResponse ftpResponse = (FtpWebResponse)ftpRequest.GetResponse())
        {
            Dts.Variables["User::isFileExists"].Value = true;
        }
    }
    catch
    {
        Dts.Variables["User::isFileExists"].Value = false;
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}

but I'm having problems because the files are in Unix format and they don't have an extension. I tried using filename*.* and filename*, but I always get "file does not exist".  What I'm trying to do is find out whether the file created by another process has been released, so I can download it.

thanks in advance


cognosoft
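Since FtpWebRequest's DownloadFile method does not accept wildcards, one workaround is to list the directory and match the file by name prefix. A rough sketch, reusing the userName, password and fileName variables from the snippet above (the folder URL is the same placeholder path):

using System.IO;
using System.Net;

string folderUrl = "ftp://ftp.dbandbi.com/public_ftp/incoming/";
FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(folderUrl);
listRequest.Method = WebRequestMethods.Ftp.ListDirectory;
listRequest.Credentials = new NetworkCredential(userName, password);

bool found = false;
using (FtpWebResponse response = (FtpWebResponse)listRequest.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string entry;
    while ((entry = reader.ReadLine()) != null)
    {
        // match on a prefix instead of a wildcard, since the files have no extension
        if (entry.StartsWith(fileName))
        {
            found = true;
            break;
        }
    }
}
Dts.Variables["User::isFileExists"].Value = found;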


migrating data from MSSQL to ClustrixDB (drop in replacement for MySQL)


Hi folks,

I'm trying to move data from MSSQL 2012 SP1 with CU9 to ClustrixDB. I have scripted out the table on the MSSQL side and am using the script to create the table on the Clustrix side.

I'm using MySQL ODBC 5.2.6. I have both the 32-bit and 64-bit drivers installed on my laptop.

In SSDT, I have a Data Flow Task containing an ODBC source (MSSQL Server) and an ODBC destination (Clustrix).

Everything is mapped correctly, but when I execute it, I get the following error:

[ODBC Destination [44]] Error: An error occurred with the following error message: "'' default is invalid for column type TYPE_TIMESTAMP".

So I changed the ODBC source to an OLE DB source (MSSQL Server) and wrapped every datetime column in isnull(column_DTM, '1900-01-01 00:00:00') column_DTM, yet I continue to get the above error message. I have also tried putting a Data Conversion transform in between, and it still failed.

I'm a bit at a loss and was wondering whether anyone has run into this issue.

Thanks!
