SQL Server Integration Services forum

OData: Unable to expand attachments from SharePoint Online

My objective is to use SSIS with an OData source to connect to a SharePoint Online list and extract the ServerRelativeUrl of the attachments for every item in the list.

I've tested the SharePoint Online URL directly, supplying the $expand parameter to get the attachment details.

https://<sitename>.sharepoint.com/sites/FieldManagement/_vti_bin/listdata.svc/MerchandiserInspection?/_api/lists/getbytitle(%27Merchandiser%20Inspection%27)/items?$select=AttachmentsItem/ServerRequestUrl&$expand=Attachments

It's worth noting that no matter what values I enter in the $select parameter (even obvious errors), it has no bearing on what is returned. I also don't see any field or parameter called ServerRequestURL, so I don't believe the $select is having any effect.

An excerpt of one of the items returned (ID 41) is as follows:

<entry m:etag="W/&quot;12&quot;"><id>https://<sitename>.sharepoint.com/sites/FieldManagement/_vti_bin/listdata.svc/MerchandiserInspection(41)</id><title type="text"></title><updated>2019-04-03T20:24:34-07:00</updated><author><name /></author><link rel="edit" title="MerchandiserInspectionItem" href="MerchandiserInspection(41)" /><link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Representative" type="application/atom+xml;type=entry" title="Representative" href="MerchandiserInspection(41)/Representative" /><link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/CreatedBy" type="application/atom+xml;type=entry" title="CreatedBy" href="MerchandiserInspection(41)/CreatedBy" /><link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/ModifiedBy" type="application/atom+xml;type=entry" title="ModifiedBy" href="MerchandiserInspection(41)/ModifiedBy" /><link rel="http://schemas.microsoft.com/ado/2007/08/dataservices/related/Attachments" type="application/atom+xml;type=feed" title="Attachments" href="MerchandiserInspection(41)/Attachments"><m:inline><feed><title type="text">Attachments</title><id>https://<sitename>.sharepoint.com/sites/FieldManagement/_vti_bin/listdata.svc/MerchandiserInspection(41)/Attachments</id><updated>2019-04-10T20:41:42Z</updated><link rel="self" title="Attachments" href="MerchandiserInspection(41)/Attachments" /><entry><id>https://<sitename>.sharepoint.com/sites/FieldManagement/_vti_bin/listdata.svc/Attachments(EntitySet='MerchandiserInspection',ItemId=41,Name='D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg')</id><title type="text">D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg</title><updated>2019-04-10T20:41:42Z</updated><author><name /></author><link m:etag="&quot;{B8A3E4EF-5B96-4AE4-B240-08C462661F4E},1&quot;" rel="edit-media" title="AttachmentsItem" href="Attachments(EntitySet='MerchandiserInspection',ItemId=41,Name='D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg')/$value" /><link rel="edit" title="AttachmentsItem" href="Attachments(EntitySet='MerchandiserInspection',ItemId=41,Name='D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg')" /><category term="Microsoft.SharePoint.DataService.AttachmentsItem" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" /><content type="image/jpeg" src="https://<sitename>.sharepoint.com/sites/FieldManagement/Lists/Merchandiser%20Inspection/Attachments/41/D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg" /><m:properties xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices"><d:EntitySet>MerchandiserInspection</d:EntitySet><d:ItemId m:type="Edm.Int32">41</d:ItemId><d:Name>D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg</d:Name></m:properties></entry></feed></m:inline></link><category term="Microsoft.SharePoint.DataService.MerchandiserInspectionItem" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" /><content type="application/xml"><m:properties><d:StoreVisitDate m:type="Edm.DateTime">2018-11-15T10:21:00</d:StoreVisitDate><d:RepresentativeId m:type="Edm.Int32">16</d:RepresentativeId><d:Title m:null="true" /><d:ContentTypeID>0x010007EFE22AD1BAFC4994BA443D963003FB</d:ContentTypeID><d:ComplianceAssetId m:null="true" /><d:Id m:type="Edm.Int32">41</d:Id><d:ContentType>Item</d:ContentType><d:Modified m:type="Edm.DateTime">2019-04-03T20:24:34</d:Modified><d:Created m:type="Edm.DateTime">2018-11-15T10:36:10</d:Created><d:CreatedById m:type="Edm.Int32">16</d:CreatedById><d:ModifiedById 
m:type="Edm.Int32">6</d:ModifiedById><d:Owshiddenversion m:type="Edm.Int32">12</d:Owshiddenversion><d:Version>12.0</d:Version><d:Path>/sites/FieldManagement/Lists/Merchandiser Inspection</d:Path></m:properties></content></entry>

From the above extract, the value that I am looking to retrieve is in the element

<content type="image/jpeg" src="https://<sitename>.sharepoint.com/sites/FieldManagement/Lists/Merchandiser%20Inspection/Attachments/41/D2DDAC5A-304C-4594-9E3F-FF270DEFD5C0.jpg"

where you can see the full URL of the image for the item with ID 41.

However, when I use this URL in my SSIS OData source, the attachment URL is not included in the result set, so I assume I'm not expanding the correct object or selecting the correct fields.
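For reference, one query shape that may be worth testing is shown below. This is only a sketch: it uses the newer /_api REST endpoint rather than listdata.svc, where attachments are exposed as an AttachmentFiles collection with a ServerRelativeUrl property, and it assumes the list title is Merchandiser Inspection.

    https://<sitename>.sharepoint.com/sites/FieldManagement/_api/web/lists/getbytitle('Merchandiser Inspection')/items?$select=Id,AttachmentFiles/ServerRelativeUrl&$expand=AttachmentFiles

In the listdata.svc feed quoted above, the attachment URL only appears as the src attribute of the expanded <content> element; the AttachmentsItem properties themselves are just EntitySet, ItemId and Name, which would explain why the URL never shows up as a column in the OData source.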


SSIS 2017 Scale-out Master and SQL Server Clustering

Dear all,

As part of re-designing our BI architecture, I was planning to cluster SQL Server in an Active/Passive mode. I would also like to deploy a scale-out-friendly data integration architecture, so I decided to put the SSIS Scale Out Master on the SQL Server machine and spread a few Workers across separate machines, so that everything related to SQL Agent jobs and distributed ETL is orchestrated from the SQL Server database engine server.

Basically this would look like:

[SQLSRV Node 1] ------> Listener <------ [SQLSRV Node 2]
                           ^
                           |
                           |
                     [SSIS Worker]


Now this got me thinking: what if there is a failure and SQL Server fails over to the other node (which, let's say, would be read-only since we are talking about a data warehouse)?

  • Failure occurs outside the data integration window: no real impact, unless the next scheduled ETL is about to start
  • Failure occurs during data integration: the subsequent flows are impacted because the replica is read-only (mind the SSISDB logging as well)

Now my question is: does this seem like a fair approach to segmenting Master vs Worker in a clustered deployment?

Since this is a data warehousing solution, I am really not keen on enabling a read/write replica just to minimize the gap, and I am also trying to reduce the number of machines (licensing cost), which is why I wouldn't put the SSIS Master on its own machine.

I have also considered having the first SSIS Worker machine host the SSIS Master service as well.

Any thoughts on this matter are welcome. Thank you in advance.

Upgrade SSIS Package from VS13/SQL2014 to VS17/SQL16 - Upgrade Package has nothing to select

Good day,

I am upgrading an SSIS package from VS13/SQL2014 to VS17/SQL16, and the Upgrade Package wizard has nothing to select.

When I select "change target server", the package hangs and then fails.

What is the correct method, and how do I upgrade?

I am getting an error when I try to move a file to an Archive folder or an Error folder

Hi,

Can anyone please help me with this error? I am trying to move a file to an Archive folder or an Error folder.
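For context, the kind of move being attempted usually boils down to something like the Script Task snippet below. This is only a sketch, since the post does not show the actual task or error text: the variable names and folders are hypothetical, and a File System Task configured for a "Move file" operation would do the same job without code.

    // Sketch of a file move inside an SSIS Script Task (C#), placed in the
    // generated ScriptMain class. User::SourcePath and User::ArchiveFolder are
    // hypothetical package variables listed in ReadOnlyVariables.
    using System.IO;

    public void Main()
    {
        string sourcePath   = Dts.Variables["User::SourcePath"].Value.ToString();
        string targetFolder = Dts.Variables["User::ArchiveFolder"].Value.ToString();
        string targetPath   = Path.Combine(targetFolder, Path.GetFileName(sourcePath));

        // File.Move throws if the target already exists, so remove any older copy first.
        if (File.Exists(targetPath))
        {
            File.Delete(targetPath);
        }
        File.Move(sourcePath, targetPath);

        Dts.TaskResult = (int)ScriptResults.Success;
    }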

Weird package execution

I have a package with only a flat file source. When I execute the package, I get an error because the file doesn't exist.

But when I execute the same package from a SQL Agent job, it succeeds. That seems weird; is there any explanation, please?

Joining the data on the source system vs joining in SSIS

Hi,

I have a use case where I have to read data from MySQL, extract and transform it using SSIS, and store it in SQL Server.

I'm reading data from dimension tables joined to the fact table; the source data is in MySQL. There is one fact table and one dimension table.

Now my worry is whether joining the data (fact and dimension) at the source would be the better option, or whether reading them through separate sources and joining in SSIS with a Merge Join component would be the wiser idea.

I'm expecting about 3,000 rows in the fact table every 15 minutes and about 4,000 rows in the dimension table. This ETL is going to run continuously, every 5 minutes.


SSIS master package cannot finish

I'm exporting 3 SQL tables to 3 Excel files. Run on their own in SSIS, the child packages work fine. I have a master package with 3 Execute Package Tasks that runs these 3 packages. The child packages finish successfully when run from the master package, but if I look into the data flow of each child package I can see that it is still shown as running, and it keeps running indefinitely.

The result of the package is correct and there is no error in the log output, but this endless running of the data flows is confusing.

Any ideas why this is happening?

(Screenshots, not reproduced here: master package; child package control flow; child package data flow.)

User connects to SQL Server: SSPI handshake failed with error code 0x8009030c, state 14

SSPI handshake failed with error code 0x8009030c, state 14 while establishing a connection with integrated security; the connection has been closed. Reason: AcceptSecurityContext failed. The Windows error code indicates the cause of failure. The logon attempt failed  [CLIENT: x.x.x.x]

This user does not exist in our company, so I cannot use this solution:

Go to: Local Security Policy -> Security Settings -> Local Policies -> User Rights Assignment -> Access this computer from the network -> add the user

What other alternative is there to fix this issue?


Change format of destination CSV file

I have a simple SSIS package that extracts from SQL and puts the contents into an existing CSV file.

It all works, which is great; however, I need each column to end up in a separate cell. Currently each row of data sits in a single cell; for example, B1 might contain field1, field2, field3, etc.
I'd like those fields to span B1, C1, D1, just as if I had pasted the results grid into Excel.
I'm not sure how to achieve this.

Substring and Findstring

Hi there,

I am trying to retrieve the date part from my file name using an SSIS Derived Column. I have a FileName variable that stores a value such as abcd_efgh_20190412_210010773_115029807, and I am trying to write a derived column expression that picks out the date part after abcd_efgh_, as

SUBSTRING(@[User::FileName],1,FINDSTRING(@[User::FileName],"abcd_efgh_",1))

but the output is coming back as NULLs. I need the output to be 2019-04-12.

Can you please tell me if I am doing anything wrong? I also want to cast the result to DT_DATE.

Thanks

Raj
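A possible corrected form of the expression is sketched below. It assumes the 8-digit date always follows the literal abcd_efgh_ in @[User::FileName]: FINDSTRING returns the 1-based position where that prefix starts, so the year begins 10 characters further on, and the pieces are stitched into yyyy-MM-dd before casting to DT_DBDATE.

    (DT_DBDATE)(
          SUBSTRING(@[User::FileName], FINDSTRING(@[User::FileName], "abcd_efgh_", 1) + 10, 4)
        + "-"
        + SUBSTRING(@[User::FileName], FINDSTRING(@[User::FileName], "abcd_efgh_", 1) + 14, 2)
        + "-"
        + SUBSTRING(@[User::FileName], FINDSTRING(@[User::FileName], "abcd_efgh_", 1) + 16, 2))

For the example file name this yields 2019-04-12 as a DT_DBDATE value; the original expression passed the FINDSTRING result as the length argument of SUBSTRING, which returns only the first character of the name.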


Got "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." when execute a SSIS 2008 package

I have an SSIS package that was migrated from SQL 2000 DTS to SSIS 2008.

The package is designed to perform the following jobs:

  1. Data cleansing, by truncating the tables in SQL Server
  2. Transforming data from an Oracle database into SQL Server tables

The package was working fine until the server (Windows Server 2008 R2) on which SQL Server 2008 is installed had TLS 1.2 enabled and TLS 1.0 and TLS 1.1 disabled.

The error "System.Runtime.InteropServices.COMException (0x80040427): Execution was canceled by user." is thrown when executing job #2 (the Oracle to SQL Server data transformation). The data cleansing part completes successfully.

Provider for Oracle used = OraOLEDB

Provider for SQL Server = MSOLEDBSQL

Any help will be appreciated.


SSIS programmatically add Aggregate Transform - Count Distinct

I am programmatically creating an Aggregate transform with the aggregation type set to count distinct. I am able to create the other aggregations such as min, max and count, but when it comes to count distinct I get the error below:

  • The component has detected potential metadata corruption during validation.
    Error at Data Flow Task - Load Count Dist [Aggregate - All [2]]: The "Aggregate - All.Outputs[Aggregate Output 1].Columns[col1]" is missing the required property "CountDistinctScale". The object is required to have the specified custom property.

I am unable to find the "CountDistinctScale" custom property; it doesn't exist for the other aggregations and magically appears only when count distinct is selected. Is there a method I need to call to create the new custom property?

I understand there are not a lot of people who know how to create packages programmatically; please help me find someone with that knowledge, or suggest how I can get some help.

				IDTSComponentMetaData100 Aggregate = pipeline.ComponentMetaDataCollection.New();
                Aggregate.ComponentClassID = app.PipelineComponentInfos["Aggregate"].CreationName;
                // Get the design-time instance of the Aggregate component
                var DesignAggregate = Aggregate.Instantiate();
                DesignAggregate.ProvideComponentProperties();        //design time

                Aggregate.Name = "AggregateComponent";               

                IDTSPath100 AggregatePath = pipeline.PathCollection.New();
                AggregatePath.AttachPathAndPropagateNotifications(pipeline.ComponentMetaDataCollection[Prev_Transform.Transformation_Name].OutputCollection[Prev_Transform.Output_Number], Aggregate.InputCollection[0]);


                // Update the metadata of the Aggregate component after attaching the input path
                DesignAggregate.AcquireConnections(null);
                DesignAggregate.ReinitializeMetaData();
                DesignAggregate.ReleaseConnections();

               
                // Mark the input columns used by the aggregation
                IDTSInput100 AggregateInput = Aggregate.InputCollection[0];
                IDTSInputColumnCollection100 AggregateInputColumns = AggregateInput.InputColumnCollection;
                IDTSVirtualInput100 AggregateVirtualInput = AggregateInput.GetVirtualInput();
                IDTSVirtualInputColumnCollection100 AggregateVirtualInputColumns = AggregateVirtualInput.VirtualInputColumnCollection;

                IDTSOutput100 AggregateoutputCollection = Aggregate.OutputCollection[0];

                // Note: input columns should be marked as READONLY
                foreach (IDTSVirtualInputColumn100 vColumn in AggregateVirtualInputColumns)
                {
					int sourceColumnLineageId = AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].LineageID;
					DesignAggregate.SetUsageType(AggregateInput.ID, AggregateVirtualInput, sourceColumnLineageId, DTSUsageType.UT_READONLY);

					// create a new output column
					IDTSOutputColumn100 newOutputColumn = DesignAggregate.InsertOutputColumnAt(AggregateoutputCollection.ID, 0,  vColumn.Name, string.Empty);
					
					// set the data type properties to the same values as those of the input column
					newOutputColumn.SetDataTypeProperties(AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].DataType, AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].Length, 0, 0, AggregateVirtualInput.VirtualInputColumnCollection[vColumn.Name].CodePage);
					
					newOutputColumn.MappedColumnID = 0;
					for (int i = 0; i < newOutputColumn.CustomPropertyCollection.Count; i++)
					{
						IDTSCustomProperty100 property = newOutputColumn.CustomPropertyCollection[i];
						switch (property.Name)
						{
							case "AggregationColumnId":
								property.Value = sourceColumnLineageId;
								break;
							case "AggregationType":
								property.Value = 3;
								break;
							case "IsBig":
								property.Value = 1;
								break;
							case "AggregationComparisonFlags":
								property.Value = 0;
								break;
						}
					}
                }


Sam

SQL script Job duration unusual

Hi,

If a SQL job runs for one hour on the first day and then runs for more than one hour on the second day, I need to know that the job duration is unusual. I need a SQL script for this.



Regards,

Niranjan B
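One possible starting point is sketched below. It is only a sketch under assumptions: it reads the job outcome rows from msdb.dbo.sysjobhistory, converts run_duration from its HHMMSS integer format to seconds, and flags any run that took more than 1.5 times that job's own average duration; the 1.5 threshold is an arbitrary choice.

    -- Compare each job run's duration with the job's historical average.
    -- run_duration is stored as an integer in HHMMSS format, so convert it to seconds first.
    ;WITH runs AS
    (
        SELECT  j.name,
                h.run_date,
                (h.run_duration / 10000) * 3600
              + ((h.run_duration / 100) % 100) * 60
              + (h.run_duration % 100)            AS duration_seconds
        FROM    msdb.dbo.sysjobhistory AS h
        JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
        WHERE   h.step_id = 0                     -- job outcome rows only
    )
    SELECT  r.name,
            r.run_date,
            r.duration_seconds,
            a.avg_seconds
    FROM    runs AS r
    JOIN   (SELECT name, AVG(duration_seconds) AS avg_seconds
            FROM   runs
            GROUP BY name) AS a ON a.name = r.name
    WHERE   r.duration_seconds > a.avg_seconds * 1.5   -- flag runs 50% longer than average
    ORDER BY r.name, r.run_date DESC;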

Need To Send Email In HTML Format Using SSIS Script Task

Hello Everyone,

As I have very little knowledge of C#, I am having a lot of trouble generating the email in HTML format.
My requirement is to get the latest row from the AUDIT LOG table and send it in an HTML-formatted email using SSIS and a Script Task.
Using [msdb.dbo.sp_send_dbmail] within an Execute SQL Task I was able to produce the desired email, but the requirement has to be completed using an SSIS Script Task.

(The result described above was achieved using [msdb.dbo.sp_send_dbmail] within an Execute SQL Task.)

Any help would be appreciated.
Thank you,
Rahul
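A minimal Script Task sketch of the send itself is shown below. It is an illustration only: the SMTP server, addresses and the hard-coded htmlBody are placeholders, and in practice htmlBody would be built from the latest audit-log row, for example read into SSIS variables beforehand and concatenated into the HTML string.

    // Sketch: send an HTML e-mail from an SSIS Script Task (C#), placed in the
    // generated ScriptMain class. Server name, addresses and body are placeholders.
    using System.Net.Mail;

    public void Main()
    {
        string htmlBody = "<h3>Latest audit log entry</h3>"
                        + "<table border='1'><tr><td>PackageName</td><td>LoadCustomer</td></tr>"
                        + "<tr><td>Status</td><td>Succeeded</td></tr></table>";

        using (var message = new MailMessage("etl@mycompany.com", "team@mycompany.com"))
        {
            message.Subject = "Audit log - latest run";
            message.Body = htmlBody;
            message.IsBodyHtml = true;          // this is what makes the mail render as HTML

            using (var client = new SmtpClient("smtp.mycompany.local"))
            {
                client.Send(message);
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }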

[Excel Destination [60]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.

I have a Foreach Loop Container that obtains data from a SQL database and outputs it to different Excel files based on the condition in the loop. Every time I run the package it randomly produces some number of Excel files and then fails with the error in the subject (cannot acquire connection from the connection manager); i.e. sometimes it produces 10 Excel files, sometimes 20, sometimes 30, but never the full amount.

I have set the Run64BitRuntime property to False and DelayValidation to True.

My version of Excel is 2010. 

Any help is appreciated. 



shaykh


Change the database name

Greetings,

I want to change a database name in SQL Server, but since I'm using that name in my SSIS packages I'm afraid I will get errors. Is there a way I can change the database of that connection manager across all the packages dynamically? (I have around 30 packages.)
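One commonly used pattern, sketched below under assumptions (project deployment model, a shared OLE DB connection manager, and a project parameter named DatabaseName; "MyServer" is a placeholder), is to put a property expression on the connection manager's ConnectionString so that a rename only means changing the parameter value:

    "Data Source=MyServer;Initial Catalog=" + @[$Project::DatabaseName]
        + ";Provider=SQLNCLI11;Integrated Security=SSPI;Auto Translate=False;"

With the packages deployed to SSISDB, the parameter value can then be changed in one place (the project or an environment) rather than in each of the 30 packages.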

 

No column information was returned by the SQL command ( Excel Source in SSIS)

Hi All,

I am reading data from an Excel Source using a SQL command like the one below:

Select * from [Tabname$B10:E25]

But I am getting the following error:

TITLE: Microsoft Visual Studio
------------------------------

The component reported the following warnings:

Error at Data Flow Task [OLE DB Source [1]]: No column information was returned by the SQL command.

I have seen a couple of posts suggesting SET FMTONLY OFF, but how do I use that within an Excel Source?


Convert epoch timestamp to datetime field when importing into SQL Server using SSIS... how?

all:


I am importing an XML file from a third-party vendor, and their datetime fields are epoch timestamps.
I need to convert these into datetime fields as I import into my SQL Server table.

I've played around with this in SSIS, but I am new to SSIS and cannot find a resolution. The best I have done is to make the .xsd data types for those epoch datetime fields xs:string and to make my corresponding SQL Server columns nvarchar.

I can get the data to import fine that way, but then when I'm writing report queries the SQL becomes a real pain.

I tried using a derived column, but cannot figure out how to do that properly.

Can someone direct me to the proper way to handle this conversion please?

If you need more information, I will be happy to supply it.

Again: it's an epoch timestamp field in an XML file that needs to be converted to a datetime value and loaded into a column of my SQL Server table that has a datetime data type.

Thanks!
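One way to handle this in a Derived Column is an expression along these lines (a sketch that assumes the source column, here called EpochCol, holds whole seconds since 1970-01-01 UTC; if the vendor supplies milliseconds, divide by 1000 first):

    DATEADD("ss", (DT_I4)[EpochCol], (DT_DBTIMESTAMP)"1970-01-01 00:00:00")

The result is a DT_DBTIMESTAMP, so it maps directly to a datetime column in the SQL Server destination, which also avoids having to store the values as nvarchar.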

Excel Column Headings and values getting truncated by two letters in SSIS while loading

Hi Everyone,

I am loading an Excel file into the database. Strangely, the last two letters of each column heading (and of the values) are getting truncated for no reason. Can you please guide me on this?

For example: if the column name is 'MyDate' and it has the value '2018-10-14', in the 'Columns' tab it comes through as 'MyDa' and the value is '2018-10-'.


