Channel: SQL Server Integration Services forum

Getting error "Cannot execute script because the script entry point is invalid" using SSIS Script Task trying to get Google Analytics data using C#


Hello,

I have followed the guide at https://kumarvikram.com/google-analytics-report-v4-csharp-console-app/ to create a C# script to get data using Google Analytics Reporting V4 API. The guide is written as a C# console program, but I am using SSIS to try to accomplish the same results using a Script Task.


I have done the prerequisites needed to fetch data from Google Analytics Reporting, such as creating a service account and granting it Read & Analyze permission in Google Analytics.

When I try to execute the package I get the error "Cannot execute script because the script entry point is invalid".

What I have tried

Since I copied C# console code into an SSIS Script Task, I gathered that some of the auto-generated C# code is necessary. I have therefore deleted the old Script Task and made a new one with the auto-generated C# code intact. I have read that this is a common cause of the issue.

I have looked into the settings of the Script Task and I have tried the entry point values Main and ScriptMain, both to no avail.
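For reference, the Script Task's entry point is the public, parameterless instance method named in the task's EntryPoint property (Main by default) on the class decorated with SSISScriptTaskEntryPointAttribute. A rough sketch of what the auto-generated template looks like, from memory (exact namespaces and regions vary by SSDT version):

[Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
{
    // The runtime invokes this public, parameterless instance method,
    // not a console-style "static void Main(string[] args)".
    public void Main()
    {
        // TODO: user code goes here.
        // ScriptResults and Dts come from the auto-generated part of the VSTA project.
        Dts.TaskResult = (int)ScriptResults.Success;
    }
}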

I have seen that using System.Configuration requires you to add a reference to System.Configuration, even though it does not give you errors when it is not referenced in the project. I have added a reference to System.Configuration.

I can confirm that the target SQL Server version under the project properties is SQL Server 2016, which is the SQL Server version in use.

When I edit the Script Task, the code is distributed among three files in the project. I have added a new XML file to the project, which I named App.config, consisting of the following code:


<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <appSettings>
    <add key="KeyFileName" value="Your key file name here"/>
    <add key="ViewId" value="Your ViewId here"/>
  </appSettings>
  <startup>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2"/>
  </startup>
</configuration>

I have a new Code File named ReportManager.cs looking like this:


using Google.Apis.AnalyticsReporting.v4;
using Google.Apis.AnalyticsReporting.v4.Data;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using System.Configuration;
using System.IO;

namespace GAReportExtractor.App
{
    public class ReportManager
    {
        /// <summary>
        /// Initializes and returns an Analytics Reporting service instance using the parameters stored in the key file
        /// </summary>
        /// <param name="keyFileName"></param>
        /// <returns></returns>
        private static AnalyticsReportingService GetAnalyticsReportingServiceInstance(string keyFileName)
        {
            string[] scopes = { AnalyticsReportingService.Scope.AnalyticsReadonly }; // Read-only access to Google Analytics
            GoogleCredential credential;
            using (var stream = new FileStream(keyFileName, FileMode.Open, FileAccess.Read))
            {
                credential = GoogleCredential.FromStream(stream).CreateScoped(scopes);
            }
            // Create the Analytics service.
            return new AnalyticsReportingService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential,
                ApplicationName = "GA Reporting data extraction example",
            });
        }

        /// <summary>
        /// Fetches all required reports from Google Analytics
        /// </summary>
        /// <param name="getReportsRequest"></param>
        /// <returns></returns>
        public static GetReportsResponse GetReport(GetReportsRequest getReportsRequest)
        {
            var analyticsService = GetAnalyticsReportingServiceInstance(ConfigurationManager.AppSettings["KeyFileName"]);
            return analyticsService.Reports.BatchGet(getReportsRequest).Execute();
        }
    }
}


and lastly I have the ScriptMain.cs file:


using Google.Apis.AnalyticsReporting.v4.Data;
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Linq;

namespace GAReportExtractor.App
{
    [Microsoft.SqlServer.Dts.Tasks.ScriptTask.SSISScriptTaskEntryPointAttribute]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };

        static void Main(string[] args)
        {
            try
            {
                #region Prepare Report Request object
                // Create the DateRange object. Here we want data from last week.
                var dateRange = new DateRange
                {
                    StartDate = DateTime.UtcNow.AddDays(-7).ToString("yyyy-MM-dd"),
                    EndDate = DateTime.UtcNow.ToString("yyyy-MM-dd")
                };
                // Create the Metrics and Dimensions objects.
                var metrics = new List<Metric> { new Metric { Expression = "ga:sessions", Alias = "Sessions" } };
                var dimensions = new List<Dimension> { new Dimension { Name = "ga:pageTitle" } };

                // Get the required View Id from configuration
                var ViewId = ConfigurationManager.AppSettings["ViewId"];

                // Create the Request object.
                var reportRequest = new ReportRequest
                {
                    DateRanges = new List<DateRange> { dateRange },
                    Metrics = metrics,
                    Dimensions = dimensions,
                    ViewId = ViewId
                };
                var getReportsRequest = new GetReportsRequest();
                getReportsRequest.ReportRequests = new List<ReportRequest> { reportRequest };
                #endregion

                // Invoke the Google Analytics API call and get the report
                var response = ReportManager.GetReport(getReportsRequest);

                // Print report data to console
                PrintReport(response);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
            finally
            {
                Console.ReadLine();
            }

            Dts.TaskResult = (int)ScriptResults.Success;
        }

        private static void PrintReport(GetReportsResponse response)
        {
            foreach (var report in response.Reports)
            {
                var rows = report.Data.Rows;
                ColumnHeader header = report.ColumnHeader;
                var dimensionHeaders = header.Dimensions;
                var metricHeaders = header.MetricHeader.MetricHeaderEntries;
                if (!rows.Any())
                {
                    Console.WriteLine("No data found!");
                    return;
                }
                else
                {
                    foreach (var row in rows)
                    {
                        var dimensions = row.Dimensions;
                        var metrics = row.Metrics;
                        for (int i = 0; i < dimensionHeaders.Count && i < dimensions.Count; i++)
                        {
                            Console.WriteLine(dimensionHeaders[i] + ": " + dimensions[i]);
                        }
                        for (int j = 0; j < metrics.Count; j++)
                        {
                            DateRangeValues values = metrics[j];
                            for (int k = 0; k < values.Values.Count && k < metricHeaders.Count; k++)
                            {
                                Console.WriteLine(metricHeaders[k].Name + ": " + values.Values[k]);
                            }
                        }
                    }
                }
            }
        }
    }
}













My SSIS package is scheduled to run every 20 seconds, but the job's end time differs on each run even though there is no new data in the source tables.


Hi,

I have scheduled the SSIS job to run every 20 seconds throughout the day. While running, the job shows a different end time on each run even though there is no new data in the SQL tables.

There are no network issues either. Could you please suggest a reason?

Thanks

AVS

SSMS to SSIS, can't connect


Hi, I am trying to connect to Integration Services from Management Studio and it fails. If I use a version on the server it works, but I want to connect from the local PC. I have seen some posts about a bug connecting from SSMS to SSIS. I just upgraded to the latest version but it doesn't work: SSMS 2016 (version 13.0.16) works, but SSMS 2017 (version 14.0.17) doesn't.

"Connecting to the intergration services on computer xxxxxx failed with the following error class not registered

Thanks

SSIS 2014 package running ultra slowly in Visual Studio 2013


Hello,

I am trying to run my SSIS 2014 packages in VS 2013 and although they run and I can debug them, they run very very slowly.

I noticed a number of people saying to set the package to work offline. Whilst this works fine for a package that connects to SQL Server, it breaks packages that connect to Oracle (we are using the Oracle Provider for OLE DB).

Anyone know how I can make my Oracle-dependent packages run fast?

Thanks in advance!

Working example/script task on consuming data from REST API


Hi all

I have googled my way through the internet searching for one working example of a Script Task in SSIS that loads data into a table from a REST API. So far no luck - it seems that everyone agrees it can be done using a Script Task, but it's apparently one of the best-kept secrets.

So if someone knows and has actually done it themselves, please give me an example of how to make the following work.

From the app company I got the following information:

"The partnerkey is used for authentication. The response returned from the service contains the SessionToken, use this token in subsequent calls to the service to authenticate yourself." 

So I got the partner key. With the partner key I can get a session key. The following info is provided by the app company:

Request (JSON)
POST /login/application
Content-Type: application/json
{"Name":"micros", "Key":"<partnerkey goes here>"}

And once that works, departments can be retrieved with this:

Request (JSON)
GET /departments/application/?securityToken=<token>
Content-Type: application/json

So how would this fit into an SSIS flow? First use the partner key to get a session key, then use that session key to get the department data.
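Not a full answer, but a minimal sketch of how the two calls described above could be wired up inside a Script Task or Script Component using HttpClient. The base address, the exact JSON shape of the login response, and the regex-based token extraction are assumptions for illustration; a real implementation would normally use a JSON library such as Json.NET to parse the responses.

using System;
using System.Net.Http;
using System.Text;
using System.Text.RegularExpressions;

public static class RestApiSketch
{
    // Hypothetical base address - replace with the one supplied by the app company.
    private const string BaseUrl = "https://api.example.com";

    public static string GetDepartmentsJson(string partnerKey)
    {
        using (var client = new HttpClient { BaseAddress = new Uri(BaseUrl) })
        {
            // 1) POST the partner key to /login/application to obtain a session token.
            var loginBody = "{\"Name\":\"micros\", \"Key\":\"" + partnerKey + "\"}";
            var loginResponse = client.PostAsync("/login/application",
                new StringContent(loginBody, Encoding.UTF8, "application/json")).Result;
            loginResponse.EnsureSuccessStatusCode();
            var loginJson = loginResponse.Content.ReadAsStringAsync().Result;

            // Crude token extraction; assumes the response contains "SessionToken":"...".
            var token = Regex.Match(loginJson, "\"SessionToken\"\\s*:\\s*\"([^\"]+)\"").Groups[1].Value;

            // 2) GET the departments, passing the token as the securityToken query parameter.
            var deptResponse = client.GetAsync(
                "/departments/application/?securityToken=" + Uri.EscapeDataString(token)).Result;
            deptResponse.EnsureSuccessStatusCode();
            return deptResponse.Content.ReadAsStringAsync().Result;
        }
    }
}

In a package this would typically sit in a Script Component configured as a source, which parses the departments JSON and pushes one row per department into the output buffer; the sketch above only covers the authentication and HTTP part.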

Parse body of email and send report based on parsed parameters


OK Gurus,

I have an assignment to check for emails to a certain Exchange email address and pull parameters from either the body or the subject (I can decide how the sender will compose it). The email will contain a company number (xxxxxx) and I need to use that as a parameter for a report that I want to send them. I have already built the report; I just need the mechanics of how to extract the company number from an email and plug it into my report.

How would I build an SSIS package to complete this task? Of course, they want it yesterday.
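Once the message body (or subject) is available as a string, extracting the company number could be as simple as the sketch below. The six-digit format is an assumption based on the (xxxxxx) placeholder above; adjust the pattern to the real format.

using System.Text.RegularExpressions;

public static class CompanyNumberParser
{
    // Returns the first six-digit number found in the email text, or null if none is found.
    public static string Extract(string emailText)
    {
        var match = Regex.Match(emailText ?? string.Empty, @"\b\d{6}\b");
        return match.Success ? match.Value : null;
    }
}

The extracted value could then be stored in an SSIS string variable and passed as the report parameter.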

Thanks

Column overflowed the disk I/O buffer


Hi everyone,

I am using SSIS and I got the following error. I am loading several CSV files into an OLE DB destination. The problem is that the file ends abnormally and the task does not detect the abnormal termination, which causes the overflow.

So basically what I want is to detect and handle the abnormal ending of the CSV file (a rough idea is sketched after the error messages below).

Please, can anyone help me?

 

[DTS.Pipeline] Error: Column Data for Column "Client" overflowed the disk I/O buffer

[DTS.Pipeline] Error: The PrimeOutput method on component "Client Source" (1) returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

 

[DTS.Pipeline] Error: Thread "SourceThread0" has exited with error code 0xC0047038.

 

[DTS.Pipeline] Error: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.

 

[DTS.Pipeline] Error: Thread "WorkThread0" has exited with error code 0xC0047039.

 

[DTS.Pipeline] Information: Post Execute phase is beginning.

 

Thanks a lot

J
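Regarding the request above to detect the abnormal ending of the CSV file: a minimal sketch of a pre-load check a Script Task could run before the Data Flow Task. The delimiter and the expected column count are assumptions; adjust them to the actual file layout.

using System;
using System.IO;
using System.Linq;

public static class CsvCompletenessCheck
{
    // Returns true when the last non-empty line of the file contains the expected
    // number of column delimiters, i.e. the final record was not cut off mid-row.
    public static bool LooksComplete(string path, int expectedColumns, char delimiter = ',')
    {
        var lastLine = File.ReadLines(path).Where(l => l.Length > 0).LastOrDefault();
        if (lastLine == null) return false;
        return lastLine.Count(c => c == delimiter) == expectedColumns - 1;
    }
}

The Data Flow Task can then be gated on the result via an expression-based precedence constraint, so the load only starts when the last record looks complete; files that fail the check could be moved aside or retried.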

Can a 2016 pkg call a 2012 sub pkg?

We will have some packages in both 2016 and 2012 Enterprise soon. Can a 2016 package call a 2012 sub-package?

SSIS 2016 SMO Transfer error: An error occurred while transferring data. See inner exception for details


Good day,

We are using SMO to transfer tables over from one server to the other on SQL Server 2016.

For a while now, the transfer succeeds only one time out of three, and only on one server, and we always get the same error message, which doesn't say much: An error occurred while transferring data. See inner exception for details.

That same package has been deployed to our four environments (Dev, Test, QA and Prod) and runs every day. However, we only get that problem in Test.

Our DBAs are not doing much to help us find the reason why that happens. Can you please give me a hint on where to look? The SSIS package hasn't been updated for a long time, and the build version is the same in all environments.

Here is our .vb script in the package:

Public Sub Main()
    '
    ' Source and Destination Servers
    '
    ' MsgBox(Dts.Variables("User::SourceServer").Value.ToString)
    Dim srvSourceName As String = Dts.Variables("SourceServer").Value.ToString  '"brdsqldev"
    Dim srvDestinationName As String = Dts.Variables("DestinationServer").Value.ToString  '"brdsqldev"
    Dim bt(0) As Byte
    '
    ' Source and Destination Databases
    '
    Dim dbSourceName As String = Dts.Variables("SourceDB").Value.ToString
    Dim dbDestinationName As String = Dts.Variables("DestinationDB").Value.ToString

    Try
        Dts.Log("The Transfer starts", 0, bt)
        Dts.Log("Transfer starts with tables", 0, bt)
        Dts.Log("Transfer starts with tables" & srvSourceName, 0, bt)
        Dts.Log("Transfer starts with tables" & srvDestinationName, 0, bt)
        Dts.Log("Transfer starts with tables" & dbSourceName, 0, bt)
        Dts.Log("Transfer starts with tables" & dbDestinationName, 0, bt)
        ' TableExists()
        Transfer(srvSourceName, srvDestinationName, dbSourceName, dbDestinationName, CType(Dts.Variables.Item("myDataTable").Value, System.Data.DataTable), TransferObjectType.Table)
        ' Transfer(srvSourceName, srvDestinationName, dbSourceName, dbDestinationName, CType(Dts.Variables.Item("myDataViews").Value, System.Data.DataTable), TransferObjectType.View)
        Dts.Log("Transfer done with tables", 0, bt)
        Dts.Log("The Transfer ran successfully", 0, bt)
        Dts.TaskResult = ScriptResults.Success
        'MessageBox.Show("The transfer ran successfully.")

    Catch ex As Exception
        Dts.Events.FireError(99, "", ex.Message.ToString(), "", -1)
        OutPutError(ex)
        Dts.TaskResult = ScriptResults.Failure
        'MessageBox.Show("The transfer was aborted.")
    End Try

End Sub


Private Sub Transfer(ByVal srvSourceName As String, ByVal srvDestinationName As String, ByVal dbSourceName As String, ByVal dbDestinationName As String, ByVal dataTableToTransfer As DataTable, ByVal type As TransferObjectType)
    Try
        Dim srcsrv As Server
        srcsrv = New Server(srvSourceName)

        Dim dbSource As Database
        dbSource = srcsrv.Databases(dbSourceName)

        Dim dessrv As Server
        dessrv = New Server(srvDestinationName)

        Dim dbDestination As Database
        dbDestination = dessrv.Databases(dbDestinationName)

        Dim oleDA As New OleDbDataAdapter
        Dim dt As New DataTable
        Dim row As DataRow
        Dim sMsg As String

        Dim xfr As Transfer

        Dim bt(0) As Byte

        Dts.Log("Transfer Subroutine starts ...", 0, bt)
        xfr = New Transfer(dbSource)

        xfr.CopyAllTables = False
        xfr.Options.WithDependencies = False
        xfr.Options.ContinueScriptingOnError = False

        xfr.CopyAllRoles = False
        xfr.CopyAllLogins = False
        xfr.CopyAllDatabaseTriggers = False
        xfr.CopyAllDefaults = False
        xfr.CopyAllPartitionFunctions = False
        xfr.CopyAllObjects = False
        xfr.CopyAllPartitionFunctions = False
        xfr.CopyAllPartitionSchemes = False
        xfr.CopyAllRules = False
        xfr.CopyAllSchemas = False
        xfr.CopyAllSqlAssemblies = False
        xfr.CopyAllStoredProcedures = False
        xfr.CopyAllSynonyms = False
        xfr.CopyAllUserDefinedAggregates = False
        xfr.CopyAllUserDefinedDataTypes = False
        xfr.CopyAllUserDefinedFunctions = False
        xfr.CopyAllUserDefinedTypes = False
        xfr.CopyAllUsers = False
        xfr.CopyAllViews = False
        xfr.CopyAllXmlSchemaCollections = False
        xfr.CopySchema = True

        xfr.DestinationDatabase = dbDestination.Name
        xfr.DestinationServer = dessrv.Name
        xfr.DestinationLoginSecure = True
        xfr.DropDestinationObjectsFirst = False

        Select Case type
            Case TransferObjectType.Table
                xfr.CopyData = True
            Case TransferObjectType.View
                xfr.CopyData = False
            Case Else
                Dts.Log("Unknown object transfer type. (-99)", -99, bt)
                Dts.TaskResult = ScriptResults.Failure
                'Throw New Exception("Unknown object transfer type.")
        End Select

        dt = dataTableToTransfer
        Dts.Log("Transfer Subroutine just before Row Count...", 0, bt)
        If dt.Rows.Count > 0 Then
            Dts.Log("Transfer Row Count > 0...", 0, bt)
            For Each row In dt.Rows
                Dim tblSourceName As String
                Dim tblSourceSchema As String

                tblSourceSchema = row("transferschema").ToString()
                tblSourceName = row("transferobject").ToString

                Select Case type
                    Case TransferObjectType.Table
                        xfr.ObjectList.Add(dbSource.Tables(tblSourceName.ToString, tblSourceSchema.ToString))
                    Case TransferObjectType.View
                        xfr.ObjectList.Add(dbSource.Views(tblSourceName.ToString, tblSourceSchema.ToString))
                    Case Else
                        Dts.Events.FireError(99, "Transfer Object", "Unknows object type", "", -1)
                        Dts.TaskResult = ScriptResults.Failure
                        'Throw New Exception("Unknown object transfer type.")
                End Select

            Next
            Dts.Events.FireInformation(1, "", "Just before transfering data", "", -1, False)

            xfr.TransferData()

        End If

    Catch ex As InternalSmoErrorException
        Dim bt(0) As Byte
        Dts.Events.FireError(99, "Transfer Objects", ex.Message.ToString(), "", -1)
        OutPutError(ex)
        Dts.TaskResult = ScriptResults.Failure

    End Try

End Sub

Private Sub OutPutError(ex As Exception)
    Dim ErrorLogDir As String
    Dim OutPutFileName As String
    ErrorLogDir = System.IO.Path.GetDirectoryName(Dts.Connections("ExportNightlyTransfer.xml").ConnectionString.ToString())
    OutPutFileName = ErrorLogDir + "\\" + Dts.Variables("PackageName").Value.ToString() + "ErrorReport" + Now.ToString("yyyyMMddHHmmss") + ".txt"
    Using sw As StreamWriter = New StreamWriter(OutPutFileName)
        sw.WriteLine(Now.ToString())
        sw.WriteLine("Package Name: " + Dts.Variables("PackageName").Value.ToString())
        sw.WriteLine("Task Name: " + Dts.Variables("TaskName").Value.ToString())
        sw.WriteLine(ex.Message.ToString())
        sw.WriteLine(ex.GetBaseException.ToString())
        sw.WriteLine(ex.StackTrace.ToString())
    End Using
End Sub
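Since the outer SMO message only says "See inner exception for details", it may help to log the entire InnerException chain when the failure occurs, rather than just the top-level message. The existing OutPutError routine could be extended with the same idea; below is a small C# sketch of it (C# rather than VB only for illustration):

using System;
using System.Text;

public static class ExceptionDumper
{
    // Walks the InnerException chain and returns every message and stack trace,
    // so the real SMO error hiding behind "see inner exception" becomes visible.
    public static string Dump(Exception ex)
    {
        var sb = new StringBuilder();
        int level = 0;
        for (var current = ex; current != null; current = current.InnerException)
        {
            sb.AppendLine(string.Format("[{0}] {1}: {2}", level++, current.GetType().FullName, current.Message));
            sb.AppendLine(current.StackTrace ?? "(no stack trace)");
        }
        return sb.ToString();
    }
}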

Any idea?

Thanks in advance for your help.

Mylene


Script Task in the newest version of SSDT 2015 broken. KB4336999 needs a patch, I believe.


In trying to create an SSIS package, I keep getting the error: "The binary code for the script is not found." This is with a completely unmodified Script Task template. I've tried it with working tasks as well. It is broken, and after hours and hours of searching I started uninstalling recent KBs and testing. That's when I uninstalled KB4336999, which seemed to fix my issue.

Why is this KB causing an issue, and what should I do? Currently it will probably keep getting pushed via KACE until my IT team can have it removed from the push list. Will there be a patch?

SSIS Error Code: 0xC0047062, 0xC0047038, 0x80131620


Hi All,

We are facing an issue with an SSIS job on the server (2016). Let me describe the issue in detail:

1. The package runs every 1 hour.

2. Most of the time the package completes WITHOUT the error below, but at times it fails with the error provided below.

3. Min server memory: 1024 MB and max server memory: 3072 MB.

4. We used "AutoAdjustBufferSize" and played with DefaultBufferMaxRows and DefaultBufferSize, but no luck.

Any help will be appreciated !!!

Error:

-----------------------------------

Executed as user: AAAAAAAAAAAA. ...3.0.4522.0 for 64-bit Copyright (C) 2016 Microsoft. All rights reserved. Started: 5:45:02 PM Error: 2018-11-06 17:47:21.57 Code:0xC0047062 Source: Data Flow from OData to RESOURCE OData Source 1 [187] Description: System.IO.IOException: Received an unexpected EOF or 0 bytes from the transport stream. at System.Net.ConnectStream.Read(Byte[] buffer, Int32 offset, Int32 size) at Microsoft.Data.DataFeedClient.StreamWrapperWithQuota.Read(Byte[] buffer, Int32 offset, Int32 count) at System.IO.Compression.DeflateStream.Read(Byte[] array, Int32 offset, Int32 count) at Microsoft.OData.Core.MessageStreamWrapper.MessageStreamWrappingStream.Read(Byte[] buffer, Int32 offset, Int32 count) at Microsoft.OData.Core.BufferingReadStream.Read(Byte[] userBuffer, Int32 offset, Int32 count) at System.IO.StreamReader.ReadBuffer(Char[] userBuffer, Int32 userOffset, Int32 desiredChars, Boolean& readToUserBuffer) at System.IO.StreamReader.Read(Char[] buffer, Int32 index, Int32 count) at Microsoft.OData.Core.Json.JsonReader.ReadInput() at Microsoft.OData.Core.Json.JsonReader.ParseStringPrimitiveValue(Boolean& hasLeadingBackslash) at Microsoft.OData.Core.Json.JsonReader.ParseValue() at Microsoft.OData.Core.Json.JsonReader.Read() at Microsoft.OData.Core.Json.BufferingJsonReader.ReadInternal() at Microsoft.OData.Core.JsonLight.ReorderingJsonReader.ReadPropertyName(String& propertyName, String& annotationName) at Microsoft.OData.Core.JsonLight.ReorderingJsonReader.ProcessObjectValue() at Microsoft.OData.Core.Json.BufferingJsonReader.ReadNextAndCheckForInStreamError() at Microsoft.OData.Core.Json.BufferingJsonReader.ReadInternal() at Microsoft.OData.Core.JsonLight.ODataJsonLightDeserializer.ReadPayloadStartImplementation(ODataPayloadKind payloadKind, DuplicatePropertyNamesChecker duplicatePropertyNamesChecker, Boolean isReadingNestedPayload, Boolean allowEmptyPayload) at Microsoft.OData.Core.JsonLight.ODataJsonLightPayloadKindDetectionDeserializer.DetectPayloadKind(ODataPayloadKindDetectionInfo detectionInfo) at Microsoft.OData.Core.Json.ODataJsonFormat.DetectPayloadKindImplementation(Stream messageStream, Boolean readingResponse, ODataPayloadKindDetectionInfo detectionInfo) at Microsoft.OData.Core.ODataMessageReader.DetectPayloadKind() at Microsoft.Data.DataFeedClient.DataFeedODataReader.DetectPayloadKind(ODataMessageReader odataMessageReader) at Microsoft.Data.DataFeedClient.DataFeedODataReader.<>c__DisplayClass2.<.ctor>b__1() at Microsoft.Data.DataFeedClient.DataFeedODataReader.InterceptODataException(Action action) at Microsoft.Data.DataFeedClient.DataFeedODataReader..ctor(IODataResponseMessage responseMessage, DataFeedConnectionParameters connectionParameters, DataFeedODataEdmWrapper odataEdmWrapper) at Microsoft.Data.DataFeedClient.ODataHelper.CreateODataV3OrV4Reader(IWebResponse webResponse, Boolean applySlashUpdatingBaseUriResolver, DataFeedConnection connection, Uri commandRequestUri) at Microsoft.Data.DataFeedClient.DataFeedCommand.ExecuteDbDataReader(CommandBehavior behavior) at Microsoft.SqlServer.IntegrationServices.OData.ODataSource.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket) End Error Error: 2018-11-06 17:47:21.81 Code:0xC0047038 Source: Data Flow from OData to RESOURCE SSIS.Pipeline Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. 
The PrimeOutput method on OData Source 1 returned error code0x80131620. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error i... The package execution fa... The step failed.

SSIS and OData


All,

We are facing an issue with SSIS (2017 version) and OData (as Source). The error is:

SSIS package starting.
Information: 0x4004300A at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Execute phase is beginning.
Error: 0xC02090F5 at DWH_CMN_MV_LOOKUP, OData Source [20]: The OData Source was unable to process the data. The remote server returned an error: (401) Unauthorized.
Error: 0xC0047038 at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on OData Source returned error code 0xC02090F5.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x4004300B at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: "OLE DB Destination" wrote 1933 rows.
Information: 0x40043009 at DWH_CMN_MV_LOOKUP, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: DWH_CMN_MV_LOOKUP

Any help to resolve the issue will be appreciated.

- Raghu

 

C# Substring issue with unknown length


I am working with C# code and having an issue with Substring. My text file is as below (image not included):

Gr = line.Substring(0,5);
SGR = line.Substring(7,3);
subscriber = line.Substring(13,11);
INDV = line.Substring(27,11);
P&C = line.Substring(41,4);
MKR = line.Substring(47,2);
Zip_Code = line.Substring(52,10);
Zip_City = line.Substring(64,21);
RPTF = line.Substring(87,2);

I am getting an error on the Zip_City string. I count a maximum of 21 characters, but in the file above it only has 19, so I am not sure how to adjust for the varying length of this column. The length of each column is fixed as shown above, and I am counting the spaces between the columns as well. I have code written that picks up only lines where line.StartsWith("60000").
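A minimal sketch of one way to handle the shorter-than-expected last column, using the fixed start positions from the code above: clamp the requested length to what is actually left in the line before calling Substring.

using System;

public static class FixedWidthHelper
{
    // Returns the field starting at 'start' with at most 'length' characters,
    // clamping to the end of the line so a short final column never throws.
    public static string SafeField(string line, int start, int length)
    {
        if (line == null || start >= line.Length) return string.Empty;
        return line.Substring(start, Math.Min(length, line.Length - start)).Trim();
    }
}

Zip_City would then be read as FixedWidthHelper.SafeField(line, 64, 21); the other fields can stay as they are or use the same helper.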

T.I.A

SSIS: How to overcome the scenario when a package runs even if the file is not completely copied in the source path?


Hi All,

I came across this question while attending an interview. There is a scheduled package which runs every 10 minutes, expecting that the file will be available in the source path. But for some reason the file wasn't copied completely within the expected time, and the package started running even though the file copy was still in progress. It seems the package will not fail, as there is a file with some amount of data, but it will be loaded incompletely. I didn't have a solution at the time. What can be done to resolve this scenario?
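One common approach (a sketch, under the assumption that the copy process holds the file open while writing): have a Script Task try to open the file with an exclusive lock before the load; if the copy is still in progress the open fails, and the package waits and retries. The attempt count and delay below are arbitrary illustration values.

using System;
using System.IO;
using System.Threading;

public static class FileReadyCheck
{
    // Returns true once the file can be opened exclusively, i.e. nothing is still writing to it.
    public static bool WaitUntilReady(string path, int maxAttempts = 30, int delayMs = 10000)
    {
        for (int attempt = 0; attempt < maxAttempts; attempt++)
        {
            try
            {
                using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    return true; // exclusive open succeeded; the copy has finished
                }
            }
            catch (IOException)
            {
                Thread.Sleep(delayMs); // still locked by the copy process; wait and retry
            }
        }
        return false;
    }
}

Alternatives include having the producer write to a temporary name and rename on completion, or drop a small marker file that the package checks for before loading.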

Thank You

Julie




Did migration to 2016 occur when I opened a 2012 pkg in VS 2017?

Hi, I just moved a 2012 SSIS package (developed in VS 2010) to my 2016 SSIS solution in VS 2017. I saw a lot of warnings about migration, but I wonder: did a migration occur when I opened it in the designer, or is there another step? We use the file system. Is there an easy way for me to verify that a migration occurred? I'm looking in the top 20 or so lines of the XML and don't see anything telltale one way or the other.

Memory increases and VS freezes - SSIS


Hi all,

I'm trying to run a stored procedure for each row of a table. 
To achieve this, I have the following control flow:
1) [Execute SQL Task] Extract rows from database. Retrieves data (full result set) into an object variable.
2) [ForEach] Loops over the rows. For each row:
2.1) [Execute SQL Task] Executes procedure and extract a single row in another database
2.2) [Execute SQL Task] Performs an update in the database

My problem is that memory keeps increasing until, at a certain point, Visual Studio becomes slow and freezes.
I believe this may be a memory problem related to the first Execute SQL Task, which retrieves a full result set from the database. However, I'm a beginner at SSIS and I need help understanding what is causing this problem. Any ideas?

Loading Tables with Dynamic SQL within a Stored Procedure called in Execute SQL task


Hello, 

I have an SSIS package that loads data from excel files into a database. Once the data is loaded, it uses an Execute SQL Task to call a Stored Procedure. 

The stored procedure takes the data from the newly loaded (stage) tables and loads it into new tables that will be used for reporting. Some of the tables are loaded using dynamic SQL inside the stored procedure.

Problem: 

When the procedure is run from SSMS, all the tables load correctly. This also works as expected when the Execute SQL Task in my SSIS package is executed manually (right-click - Execute Task). However, when the entire package is executed, the task finishes successfully, but the tables loaded through dynamic SQL don't get populated while the others do.

It appears the Stored Procedure does indeed execute, but the Dynamic SQL within it does not. 

Any suggestions are appreciated.  

Thanks

Visual Studio 2017 (Version 15.7.4)

Microsoft SQL Server 2008 R2 (SP3-GDR) (KB4057113) - 10.50.6560.0 (X64) 

Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1) (Hypervisor) 

How to insert multiple select statements into a table using SSIS


For example, I have two tables, Employee and Department, with queries like:

Select EmpID,Employee From dbo.Employee
Select DepID,DeptName From dbo.Department

which I have put into two Execute SQL Tasks,

and I want to insert the text of those select statements into some destination table.

Output :

ID  SQL_Stmt
1   Select EmpID,Employee From dbo.Employee
2   Select DepID,DeptName From dbo.Department

whereas my package structure looks like this (screenshot not included):

Please suggest how I can proceed with this.

Thank you

passing a value whose target is a sub package's variable

Hi, in SSIS 2012 Enterprise I'm faced with the challenge of passing a value to a sub-package's variable without changing the sub-package. The sub-package doesn't accept parameters. Is this possible? From SSDT?