Channel: SQL Server Integration Services forum

How can I fix this? It works fine in Visual Studio


I am having issues executing my package from SSMS.

The package runs fine in Visual Studio. I have an Oracle source and a SQL Server destination as connection managers.





| Message Type | Message Time | Message | Message Source Name | Subcomponent Name | Execution Path |
| OnError | 7/3/2018 1:05:03 PM | apurvatest:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "OraOLEDB"  Hresult: 0x80004005 Description: "ORA-12154: TNS:could not resolve the connect identifier specified". | apurvatest | | \apurvatest |
| OnError | 7/3/2018 1:05:03 PM | Data Flow Task:Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DWFEI.AAO8290" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. | Data Flow Task | OLE DB Source [61] | \apurvatest\Data Flow Task |
| OnError | 7/3/2018 1:05:03 PM | Data Flow Task:Error: OLE DB Source failed validation and returned error code 0xC020801C. | Data Flow Task | SSIS.Pipeline | \apurvatest\Data Flow Task |
| OnError | 7/3/2018 1:05:03 PM | Data Flow Task:Error: One or more component failed validation. | Data Flow Task | SSIS.Pipeline | \apurvatest\Data Flow Task |
| OnError | 7/3/2018 1:05:03 PM | Data Flow Task:Error: There were errors during task validation. | Data Flow Task | | \apurvatest\Data Flow Task |





ReinitializeMetaData() method gives an error while iterating through data flow components in a dynamic package


Hi,

I am creating a package programmatically in a Script Task inside a package. This package has one Data Flow Task, created programmatically, which is supposed to have two sources and two destinations running in parallel. The source and destination tables are identical.

I have written the code below. The FOR loop works fine on the first iteration, but on the second iteration it gives me an error at "oledbDestinationInstance[i].ReinitializeMetaData();", which is just after setting the AccessMode of the destination. The error says:

Error: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Runtime.InteropServices.COMException (0xC0202040): Exception from HRESULT: 0xC0202040
   at Microsoft.SqlServer.Dts.Pipeline.Wrapper.CManagedComponentWrapperClass.ReinitializeMetaData()

I can't find any other information related to the error, so I am unable to understand its cause. I have searched many sites but haven't found a solution yet.

I would appreciate it if anyone could help.

Code:

#region actual data flow task code
                string SrcConnection = Dts.Connections["Source"].Name;
                string DestConnection = Dts.Connections["Destination"].Name;
                Connections connections = Dts.Connections;
                string SrcTables = "dbo.DimAccount, dbo.DimCurrency";
                string DestTables = "dbo.DimAccount_Test, dbo.DimCurrency_Test";
                int countOfTables = 0;
                string[] srcTablesArray;
                string[] destTablesArray;
                string sourceQuery = string.Empty;
                IDTSComponentMetaData100[] oledbSource;
                IDTSComponentMetaData100[] oledbDestination;
                CManagedComponentWrapper[] oledbSourceInstance;
                CManagedComponentWrapper[] oledbDestinationInstance;
                IDTSPath100[] pathSourceToDestination;
                IDTSInput100[] destInput;
                IDTSVirtualInput100[] destVirtualInput;
                //To access the package we need an Application object
                Microsoft.SqlServer.Dts.Runtime.Application app = new Microsoft.SqlServer.Dts.Runtime.Application();
                Package pkg = new Package();
                //Adding the first executable (a pipeline task) to the package
                TaskHost thMainTask = (Microsoft.SqlServer.Dts.Runtime.TaskHost)pkg.Executables.Add("DTS.Pipeline");
                //Naming the executable
                thMainTask.Name = "DynamicDataFlowTask";
                //attaching a Main Pipeline (DataFlowTask) as an inner object of main executable
                MainPipe dataFlowTask = (MainPipe)thMainTask.InnerObject;
                //making sure there is no metadata available in Main Pipeline (Data Flow Task)
                dataFlowTask.ComponentMetaDataCollection.RemoveAll();
                #region Create source connection
                //Create source connection
                ConnectionManager sourceConnection = pkg.Connections.Add("OLEDB");
                //setting up the selected connection in source drop down as source
                sourceConnection.Name = SrcConnection;
                //assigning connection string of selected connection manager as source 
                for (int i = 0; i < connections.Count; i++)
                {
                    if (connections[i].Name == SrcConnection)
                    {
                        sourceConnection.ConnectionString = connections[i].ConnectionString;
                        break;
                    }
                }
                #endregion
                #region Create destination connection
                //Create destination connection
                ConnectionManager destinationConnection = pkg.Connections.Add("OLEDB");
                //setting up the selected connection in destination drop down as destination
                destinationConnection.Name = DestConnection;
                //assigning connection string of selected connection manager as destination 
                for (int i = 0; i < connections.Count; i++)
                {
                    if (connections[i].Name == DestConnection)
                    {
                        destinationConnection.ConnectionString = connections[i].ConnectionString;
                        break;
                    }
                }
                #endregion
                #region code that can be repeated for many sources and destinations 
                //Split the table name lists and get their count
                srcTablesArray = this.SplitTables(SrcTables);
                countOfTables = srcTablesArray.Length;
                destTablesArray = this.SplitTables(DestTables);
                //Defining length of each array object with sourceTable count 
                oledbSource = new IDTSComponentMetaData100[countOfTables];
                oledbDestination = new IDTSComponentMetaData100[countOfTables];
                oledbSourceInstance = new CManagedComponentWrapper[countOfTables];
                oledbDestinationInstance = new CManagedComponentWrapper[countOfTables];
                pathSourceToDestination = new IDTSPath100[countOfTables];
                destInput = new IDTSInput100[countOfTables];
                destVirtualInput = new IDTSVirtualInput100[countOfTables];
                for (int i = 0; i < countOfTables; i++)
                {
                    #region setup source component/s
                    //Setup source component in DataFlowTask
                    oledbSource[i] = dataFlowTask.ComponentMetaDataCollection.New();
                    oledbSource[i].ComponentClassID = "DTSAdapter.OLEDBSource";     //The type of Transformation
                    oledbSource[i].ValidateExternalMetadata = true;
                    //Get the design time instance of source component
                    oledbSourceInstance[i] = oledbSource[i].Instantiate();
                    //Initialize the source component
                    oledbSourceInstance[i].ProvideComponentProperties();
                    oledbSource[i].Name = "OLEDB Source " + i;   //Adding the counter to create a distinct component name
                    //now setup the source connection properties of source component in DataFlowTask
                    if (oledbSource[i].RuntimeConnectionCollection.Count > 0)
                    {
                        oledbSource[i].RuntimeConnectionCollection[0].ConnectionManagerID = sourceConnection.ID;
                        oledbSource[i].RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(sourceConnection);
                    }
                    //Set the custom properties of the Source like Query/Table Names 
                    sourceQuery = "SELECT * FROM " + srcTablesArray[i].ToString();
                    oledbSourceInstance[i].SetComponentProperty("AccessMode", 2);
                    oledbSourceInstance[i].SetComponentProperty("SqlCommand", sourceQuery);
                    //Acquire Connections, reinitialize the component and then release the connection
                    oledbSourceInstance[i].AcquireConnections(null);
                    oledbSourceInstance[i].ReinitializeMetaData();
                    oledbSourceInstance[i].ReleaseConnections();
                    #endregion
                    #region setup destination component/s
                    //setup destination component in DataFlowTask
                    oledbDestination[i] = dataFlowTask.ComponentMetaDataCollection.New();
                    oledbDestination[i].ComponentClassID = "DTSAdapter.OLEDBDestination";   //The type of Transformation
                    oledbDestination[i].ValidateExternalMetadata = true;
                    //get the design time instance of destination component
                    oledbDestinationInstance[i] = oledbDestination[i].Instantiate();
                    //Initialize the destination component
                    oledbDestinationInstance[i].ProvideComponentProperties();
                    oledbDestination[i].Name = "OLEDB Destination " + i;    //Adding the counter to create a distinct component name
                    //now setup the destination connection properties of destination component in DataFlowTask
                    if (oledbDestination[i].RuntimeConnectionCollection.Count > 0)
                    {
                        oledbDestination[i].RuntimeConnectionCollection[0].ConnectionManagerID = destinationConnection.ID;
                        oledbDestination[i].RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(destinationConnection);
                    }
                    //Set the custom properties of the Source like Query/Table Names 
                    oledbDestinationInstance[i].SetComponentProperty("AccessMode", 3);
                    oledbDestinationInstance[i].SetComponentProperty("OpenRowset", destTablesArray[i]);
                    //oledbDestinationInstance[i].SetComponentProperty("FastLoadKeepIdentity", true);
                    //Acquire Connections, reinitialize the component 
                    oledbDestinationInstance[i].AcquireConnections(null);
                    oledbDestinationInstance[i].ReinitializeMetaData();
                    oledbDestinationInstance[i].ReleaseConnections();
                    #endregion
                    #region TUNNEL setup between source & destination
                    //Creating tunnel/path between source and destination i.e. connecting source & destination
                    pathSourceToDestination[i] = dataFlowTask.PathCollection.New();
                    pathSourceToDestination[i].AttachPathAndPropagateNotifications(oledbSource[i].OutputCollection[0], oledbDestination[i].InputCollection[0]);
                    //Iterate through the inputs of the component
                    destInput[i] = oledbDestination[i].InputCollection[0];
                    //Get the virtual input column collection for the input
                    destVirtualInput[i] = destInput[i].GetVirtualInput();
                    //oledbDestinationInstance[i].ReinitializeMetaData();
                    //Iterate through the virtual column collection
                    foreach (IDTSVirtualInputColumn100 vColumn in destVirtualInput[i].VirtualInputColumnCollection)
                    {
                        //Call the SetUsageType method of the design time instance of the component
                        IDTSInputColumn100 vCol = oledbDestinationInstance[i].SetUsageType(destInput[i].ID, destVirtualInput[i], vColumn.LineageID, DTSUsageType.UT_READONLY);
                        //Creating the mapping between columns
                        oledbDestinationInstance[i].MapInputColumn(destInput[i].ID, vCol.ID, destInput[i].ExternalMetadataColumnCollection[vColumn.Name].ID);
                    }
                    //After mapping the columns re-initialize the metadata for destination and release the connection                    
                    oledbDestinationInstance[i].AcquireConnections(null);
                    oledbDestinationInstance[i].ReinitializeMetaData();
                    oledbDestinationInstance[i].ReleaseConnections();
                    #endregion
                }
                //Execute the package
                DTSExecResult pkgResult = pkg.Execute();
                //Return the execution result
                executionResult = pkgResult;
                if (executionResult == DTSExecResult.Success)
                {
                    Dts.TaskResult = (int)ScriptResults.Success;
                }
                else
                {
                    Dts.TaskResult = (int)ScriptResults.Failure;
                }
                #endregion
                #endregion

Thank you.

Mr myName




Create dynamic File Name using Destination Table Name in SSIS


Hi,

I have a source OLE DB staging table. I multicast the data from staging to a load-ready table as well as to a flat file.

I have close to 1,000 tables and hence 1,000 flat files, so I am creating a new Flat File connection for each table.

Instead, is there a way to dynamically create the path and name of the flat files? The file names should match the names of my load-ready tables.
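One way to avoid a thousand hand-built connection managers is a single reusable Flat File connection manager whose ConnectionString is overwritten on every loop iteration, either with a property expression on the connection manager or from a Script Task. A minimal Script Task sketch follows; the variable and connection manager names are placeholders, not anything from the original package:

    // Assumes package variables User::OutputFolder and User::LoadReadyTable
    // (listed in the task's ReadOnlyVariables) and one shared Flat File
    // connection manager named "DynamicFlatFile".
    string folder = Dts.Variables["User::OutputFolder"].Value.ToString();
    string table  = Dts.Variables["User::LoadReadyTable"].Value.ToString();

    // Point the shared connection manager at <folder>\<load-ready table>.txt
    Dts.Connections["DynamicFlatFile"].ConnectionString =
        System.IO.Path.Combine(folder, table + ".txt");

The same kind of expression can also be set directly on the connection manager's ConnectionString property, which removes the need for any code at all.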

Remove Intermittent Quotation Marks From Flat File


I need to import a text file we receive from a vendor. Unfortunately, the vendor sends a somewhat messed up file. The file is comma delimited, which is no big deal. But, the file sprinkles quotation marks throughout the file, and it's not consistent. Not every column has quotation mark qualifiers. Here's an example:

Canada,abc-123, "VP, Technical operations", 123-XYZ, "some text, more text", last column

So in this example there should be six columns, but when using the SSIS flat file connector, no matter what you do, it will actually see eight columns - splitting "VP, Technical operations" and "some text, more text" into two columns each, for a total of eight.

I would like to remove the double quotes and change the column delimiter to a pipe instead of a comma.

I found the following blog post, but it doesn't really fill in some gaps - like the first step. How do you get SSIS to "see" an input file as one gigantic row?

http://geekswithblogs.net/Compudicted/archive/2011/09/19/ssis-how-to-remove-occasional-quotes-and-replace-the-column.aspx

Is anyone aware of another way to accomplish this? If not a different way, could you provide a little more detail on how to get this to work?

Thanks in advance!


A. M. Robinson
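One preprocessing approach (a sketch only, not the method from the linked blog) is a Script Task that runs before the data flow: read the raw file with Microsoft.VisualBasic.FileIO.TextFieldParser, which understands fields enclosed in quotes, and rewrite the rows pipe-delimited so the Flat File Source sees exactly six columns. The file paths below are placeholders, and the stray spaces before the opening quotes are assumed to be tolerated by the parser's whitespace trimming:

    // Requires a reference to Microsoft.VisualBasic in the script project.
    string inFile  = @"C:\import\vendor.txt";         // placeholder paths
    string outFile = @"C:\import\vendor_clean.txt";

    using (var parser = new Microsoft.VisualBasic.FileIO.TextFieldParser(inFile))
    using (var writer = new System.IO.StreamWriter(outFile))
    {
        parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited;
        parser.SetDelimiters(",");
        parser.HasFieldsEnclosedInQuotes = true;   // keeps "VP, Technical operations" as one field
        parser.TrimWhiteSpace = true;

        while (!parser.EndOfData)
        {
            string[] fields = parser.ReadFields();      // quotes are already stripped here
            writer.WriteLine(string.Join("|", fields)); // re-emit pipe-delimited
        }
    }

The data flow then points its Flat File connection at the cleaned file, with | as the column delimiter and no text qualifier.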

SSIS Failed to Start after Protection Level Changed


I had an SSIS project which by default used EncryptSensitiveWithUserKey as its protection level. For security reasons, and because the protection is now being handled by my new co-workers, I need it to use EncryptSensitiveWithPassword. But after I changed the setting, in both the Project and Package properties, the next time I tried to debug the project within Visual Studio it gave this error message as soon as I clicked Start:

Exception deserializing the package "The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails.
". (Microsoft.DataTransformationServices.VsIntegration)

Help!

SSIS reusable transformations


Hi,

I am using SSIS 2015.   

I have a table which has 30 DATE columns and I want to apply the same transformation logic to all of them. Instead of writing the same code 30 times in a Derived Column transformation expression, is there a way to write a kind of function that, when called with my date field as a parameter, returns the expected result?

Similarly, I have transformations for multiple data types. So, is there a way to write reusable transformation expressions in SSIS?
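There is no user-defined function in the SSIS expression language, so the usual workarounds are either keeping the expression in one place (a variable, or a template that generates the Derived Column), or a Script Component where the logic lives in a single C# method that every column calls. A rough sketch of the Script Component option; the column names and the cleanup rule are invented purely for illustration:

    // One helper holds the shared rule; every date column just calls it.
    private DateTime CleanDate(DateTime value)
    {
        // placeholder rule: floor anything before 1900 to 1900-01-01
        return value < new DateTime(1900, 1, 1) ? new DateTime(1900, 1, 1) : value;
    }

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // hypothetical column names - one line per date column
        if (!Row.OrderDate_IsNull) Row.OrderDate = CleanDate(Row.OrderDate);
        if (!Row.ShipDate_IsNull)  Row.ShipDate  = CleanDate(Row.ShipDate);
        // ... repeat for the remaining date columns
    }

Changing the rule then means editing one method instead of 30 expressions.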

Thanks 

Connection info disappears every time I close and reopen the solution


Hi. I'm running VS 2008. I cloned a package, added two connections, and then tried to add them to configurations that already had a few entries, some using a SQL table of connection strings.

While the package was open, everything was OK. The two new connections showed the familiar "configured" stripe and Test Connection worked. But every time I close and then reopen the package, the connection string in the two new connections goes blank and most of the components show the familiar little red circle.

When I edit one OLE DB data source in a data flow task with a red circle in the Advanced Editor, I see two error sections on the Connection Managers tab. The first references error code 0x80004005 and says [ODBC Driver Manager] "Data source name not found and no default driver specified".

The second section says the AcquireConnection method call to the connection manager "my conn name" failed with error code 0xC0202009.

I remember some confusing rules from way back about adding configurations after the fact, but I don't know if that is related. I looked at the XML and don't see any difference in the connection and configuration tags (when everything is good), at least nothing that would indicate they are being treated differently. The only way I've found to make things look right while the package is open in the designer is to delete and re-add the connections. I've tried deleting and re-adding the configurations as well, but nothing seems to help. I can't show too much here due to proprietary issues.

 

Question on setting a TLS 1.2 value within an SSIS package when connecting to a vendor's URL using a Web Service Task


We've been connecting to a vendor, Heartland, using a Web Service Task in SSIS to consume our data, bring it down, and insert it into a SQL Server table.

This last Wednesday, they upgraded their environment to only accept TLS 1.2 protocols and gave us no notice.  Unfortunately, we can no longer send a request as our connection is immediately closed.

We've been working to find a solution to help us use a TLS 1.2 value within our SSIS package as we cannot make a change on the SQL Server or OS to use the TLS 1.2 protocol because other apps are using TLS 1.0 and TLS 1.1.  

We're running SQL Server 2014 and SSIS 2013. Our understanding is that TLS 1.2 is enabled in these environments by default, but I'm not finding that to be the case. When I look at the value of System.Net.ServicePointManager.SecurityProtocol in my C# script, where I try to set a TLS value for the security protocol, I see Ssl3 | Tls in the Autos window. After I change the value of the protocol I see the Tls12 value. Then I execute my Web Service Task and get this error:

[Web Service Task] Error: An error occurred with the following error message: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: The Web Service threw an error during method execution. The error is: The underlying connection was closed: An unexpected error occurred on a send..
   at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebMethodInvokerProxy.InvokeMethod(DTSWebMethodInfo methodInfo, String serviceName, Object connection)
   at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTaskUtil.Invoke(DTSWebMethodInfo methodInfo, String serviceName, Object connection, VariableDispenser taskVariableDispenser)
   at Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTask.executeThread()".

Any suggestions for a workaround? Any help or direction would be appreciated. Thanks.
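For what it's worth, one in-process workaround (assuming the packages run under .NET 4.5 or later, where SecurityProtocolType.Tls12 exists) is to OR Tls12 into ServicePointManager.SecurityProtocol from a Script Task placed ahead of the Web Service Task; the setting is process-wide, so it can take effect without touching the machine-wide SChannel registry keys the other applications rely on. Whether the Web Service Task then honors it depends on it running in the same process and framework version, so treat this as a sketch:

    // Script Task placed before the Web Service Task in the control flow.
    System.Net.ServicePointManager.SecurityProtocol |=
        System.Net.SecurityProtocolType.Tls12;

    Dts.TaskResult = (int)ScriptResults.Success;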


Does Azure-SSIS Integration Runtime support FTP connection?


Hi,

I have configured an integration runtime in Azure Data Factory (Azure-SSIS type) and deployed a package to the SSISDB catalog. The package tries to connect to FTP, retrieve a file, and put it into a temporary folder (https://docs.microsoft.com/en-us/sql/integration-services/lift-shift/ssis-azure-files-file-shares).

When I start an execution, I get the error:

The connection type "FTP, {A39B653A-5D92-44A7-8E22-45F19C4B0088}" specified for connection manager "FTP" is not recognized as a valid connection manager type. This error is returned when an attempt is made to create a connection manager for an unknown connection type. Check the spelling in the connection type name.  

Does this mean that the Azure-SSIS Integration Runtime doesn't support FTP connections at all?

Any suggestions?

Thanks in advance!

SSIS


hello

I have a table 'Test' in SQL Server with the following columns:

| path     | Name  | Table name |
| C:\Test1 | x.txt | Test1      |
| C:\Test2 | y.txt | Test2      |
| C:\Test3 | a.txt | Test3      |
| C:\Test4 | b.txt | Test4      |

My task is to read this table from SQL Server, loop through each row, build the complete path of the file, and load the data from each file into the corresponding table. All the tables have the same structure. Please help!
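The usual control-flow shape for this is an Execute SQL Task that reads dbo.Test into an object variable, a Foreach Loop with the ADO enumerator mapping each row to variables, and a data flow whose flat file connection string is driven by an expression. If a code-only route is acceptable, here is a rough Script Task sketch instead; the connection manager name and the FIELDTERMINATOR are assumptions, and BULK INSERT requires the files to be readable from the SQL Server machine itself:

    // Add at the top of ScriptMain.cs: using System.Data; using System.Data.SqlClient;
    string connStr = Dts.Connections["DestinationDb"].ConnectionString;  // hypothetical ADO.NET connection manager

    using (var conn = new SqlConnection(connStr))
    {
        conn.Open();

        // Read the driver table.
        var rows = new DataTable();
        using (var da = new SqlDataAdapter("SELECT [path], [Name], [Table name] FROM dbo.Test", conn))
            da.Fill(rows);

        // Load each file into its target table.
        foreach (DataRow r in rows.Rows)
        {
            string fullPath = System.IO.Path.Combine(r["path"].ToString().Trim(), r["Name"].ToString().Trim());
            string target   = r["Table name"].ToString().Trim();
            string sql = "BULK INSERT dbo." + target + " FROM '" + fullPath +
                         "' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n')";
            using (var cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }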

Dynamic configuration file name in select configuration type


I have created a package and set up a .dtsConfig file under:

1. Package configuration wizard

2. Select Configuration type = XML configuration file

3. Specify configuration settings directly

3a. Configuration file name = "C:\a\b\a.dtsConfig" 

This file contains the server name. When I move this package to QA, the config file there has a different path, such as "D:\a\a.dtsConfig". Because of this I get an error saying the path C:\a\b\a.dtsConfig is not valid. How do I get/set the path dynamically in the "Configuration file name" text box?

Which variables return warning number and description in OnWarning Event Handler?


Hi All,

When there's an execution error in our SSIS 2012 packages, we use the OnError event handler to run a stored procedure that logs the error using the variables System::ErrorCode and System::ErrorDescription to a user-defined table.  Now we want to also log SSIS warnings to that same table.  Which variables do I use in the OnWarning event handler?  I don't see any system variables with "warning" in the name.

We don't want to use SSIS built-in logging because those records wouldn't be written to our user-defined table.

Thanks,
Eric B.

The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009


I have deployed my packages to SQL Server and I am using a configuration file. As my data source is Excel, I changed the connection string during deployment to the server path. But I am getting the following errors, even though the file exists at that path. What is the cause of the issue? Do I need to grant any permissions to execute the package?

 

SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER.  The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.  There may be error messages posted before this with more information on why the AcquireConnection method call failed. 

 

component "Excel Source Service Contract Upload" (1) failed validation and returned error code 0xC020801C. 

One or more component failed validation. 

 

There were errors during task validation. 

DTS_E_OLEDBERROR, Error Code: 0x80004005 Source: "MS JET DB Engine" Description : Path is not valid


 

How to run deployed packages in 64-bit


Hello, I'm on a single SQL Server instance and box where both dev and prod occur. I have VS 2015 installed, with the Teradata drivers and the Attunity driver to reach the data source; all of that is 32-bit for the VS/SSDT needs. The packages get deployed, a job is created, steps are added, and then it's scheduled and runs. But it runs the connection in 32-bit, since that's what was used to design the package.

I have also installed the 64-bit Teradata driver and the 64-bit Attunity driver on the same server. Is the easiest way to switch the job that runs the steps in the connection manager, or in SSISDB? Both places look similar, with a connection string like: {Teradata};CHARSET=UTF8;LOGINTIMEOUT=20;DBCNAME=<sysnamehere>;UID=<usernamehere>;DATABASE=;ACCOUNT=;AUTHENTICATION=;WINAUTH=0. I'm not really seeing a good place to switch the job's connection to use the 64-bit drivers and providers. Thanks.

Multiple Inputs to Script Component


I have an SSIS flow performing the following:

Step 1: Extract records from the source table (ODBC source).

Step 2: Derived column transformation; the error output of step 2 is connected to step 2a.

Step 2a: Script Component transformation, to capture the error column, error description, and other error-related fields.

My question is, how do I connect the error output of step 1 to this script as well? When I try to connect it, I get "All available inputs on target component are connected to outputs".

The following is the code in my script:

 public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        /*
         * Add your code here
         */

        Row.ErrorDescription = this.ComponentMetaData.GetErrorDescription(Row.ErrorCode);
        Row.TableName = "TXXXXXX";
        Row.Key = Row.CLIENT.ToString() +"|"+ Row.LSTUPDTIMESTAMP.ToString();
        var componentMetaData130 = this.ComponentMetaData as IDTSComponentMetaData130;
        if (componentMetaData130 != null)
        {
            Row.ColumnName = componentMetaData130.GetIdentificationStringByID(Row.ErrorColumn);
        }
    }

}


Error at the time of Package Execution on AlwaysOn Environment

Hi Guys,


We are facing an issue when executing an SSIS package in an AlwaysOn environment. We have two nodes on Windows 2012 R2 Datacenter and SQL Server 2016. SSISDB is part of the availability databases, and in SSIS we use the AG name for the connection. We have also enabled the "Enable AlwaysOn Support" feature.


We have also created a job for the package, and while executing the package it throws the error message below:


Package path: \SSISDB\SSISPackages\Interface2018Primary\FORT_Events.dtsx, Environment reference Id: NULL.
Description: Please create a master key in the database or open the master key in the session before performing this operation.
Source: .Net SqlClient Data Provider
Started:  11:48:49 AM
Finished: 11:48:49 AM
Elapsed:  0.171 seconds


Will you please help with this issue?


Thanks
D

Import .csv file with date time stamp standard using SSIS


I tried to create a .csv file with a date timestamp like "XXX_2018-07-11-09.12.33.731000.csv", but I am only able to build the file name down to seconds using SSIS.

How can I get the .csv file name shown below using SSIS?

Ex: XXX_2018-07-11-09.12.33.731000.csv
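The SSIS expression GETDATE() only resolves down to milliseconds, so the simplest route to that exact pattern is usually a Script Task that builds the name with a .NET format string and writes it into a package variable, which the Flat File connection manager then picks up through an expression. The variable name here is an assumption:

    // "ffffff" emits six fractional-second digits, matching
    // XXX_2018-07-11-09.12.33.731000.csv (subject to the clock's actual resolution).
    string fileName = "XXX_" + DateTime.Now.ToString("yyyy-MM-dd-HH'.'mm'.'ss'.'ffffff") + ".csv";
    Dts.Variables["User::OutputFileName"].Value = fileName;   // hypothetical variable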

raw file to stored procedure


1. On server1, stored procedure (SP1) produces a result set: exec sp1;
2. On server2, stored procedure (SP2) accepts a TableType: exec sp2 @TableType

I am trying to create an SSIS package so that SP2 is executed with a TableType input built from the result set of SP1.

I do not want to use linked servers, so I am thinking of something like the following: start the SSIS package with a data flow task with an OLE DB command to execute SP1. In the same data flow, have a Raw File destination to hold the result of SP1. In the control flow, have an Execute SQL Task to execute SP2.

My question: how do I pass the raw file data into SP2?

Thank you
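If the raw file turns out to be awkward, one alternative sketch skips it entirely: a Script Task pulls the SP1 result set from server1 into a DataTable and passes that DataTable to SP2 on server2 as a table-valued parameter (SqlDbType.Structured). This assumes both connection managers are ADO.NET connection managers, and the table type name is hypothetical:

    // Add at the top of ScriptMain.cs: using System.Data; using System.Data.SqlClient;
    var results = new DataTable();

    // 1. Run SP1 on server1 and capture its result set.
    using (var src = new SqlConnection(Dts.Connections["Server1"].ConnectionString))
    using (var getCmd = new SqlCommand("dbo.sp1", src) { CommandType = CommandType.StoredProcedure })
    {
        src.Open();
        new SqlDataAdapter(getCmd).Fill(results);
    }

    // 2. Hand the rows to SP2 on server2 as a table-valued parameter.
    //    The DataTable's columns must line up with the table type definition.
    using (var dest = new SqlConnection(Dts.Connections["Server2"].ConnectionString))
    using (var putCmd = new SqlCommand("dbo.sp2", dest) { CommandType = CommandType.StoredProcedure })
    {
        var p = putCmd.Parameters.AddWithValue("@TableType", results);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName  = "dbo.MyTableType";     // hypothetical table type name
        dest.Open();
        putCmd.ExecuteNonQuery();
    }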

SSIS


hello

I have a table 'Test' in SQL Server with the following columns:

| path     | Name  | Table name |
| C:\Test1 | x.txt | Test1      |
| C:\Test2 | y.txt | Test2      |
| C:\Test3 | a.txt | Test3      |
| C:\Test4 | b.txt | Test4      |

My task is to read this table from SQL Server, loop through each row, build the complete path of the file, and load the data from each file into the corresponding table. All the tables have the same structure.

This is solved now.

Regarding this task, I now have another table which contains package name, task name, start_datetime, end_datetime, number of rows downloaded from the source, number of rows inserted, and number of rows updated.

I need to insert the corresponding values into this table after each iteration of the Foreach Loop. How can I do this?
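The common pattern is an Execute SQL Task as the last step inside the Foreach Loop, running a parameterized INSERT whose parameters are mapped to package variables (the row counts usually come from Row Count transformations in the data flow, or from whatever variables your own logic maintains). If you are already in a Script Task, the same insert can be done in code; every table, connection manager, and user variable name below is hypothetical:

    // Assumes an audit table dbo.PackageRunLog and package variables populated
    // earlier in the loop iteration. Add 'using System.Data.SqlClient;' at the top.
    string sql = @"INSERT INTO dbo.PackageRunLog
                   (PackageName, TaskName, StartTime, EndTime, RowsRead, RowsInserted, RowsUpdated)
                   VALUES (@pkg, @task, @start, @end, @read, @ins, @upd)";

    using (var conn = new SqlConnection(Dts.Connections["AuditDb"].ConnectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@pkg",   Dts.Variables["System::PackageName"].Value);
        cmd.Parameters.AddWithValue("@task",  Dts.Variables["System::TaskName"].Value);
        cmd.Parameters.AddWithValue("@start", Dts.Variables["User::IterationStart"].Value);
        cmd.Parameters.AddWithValue("@end",   DateTime.Now);
        cmd.Parameters.AddWithValue("@read",  Dts.Variables["User::RowsRead"].Value);
        cmd.Parameters.AddWithValue("@ins",   Dts.Variables["User::RowsInserted"].Value);
        cmd.Parameters.AddWithValue("@upd",   Dts.Variables["User::RowsUpdated"].Value);
        conn.Open();
        cmd.ExecuteNonQuery();
    }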

SSIS

How do I get the package name, task name, start_datetime, end_datetime, number of rows inserted, number of rows updated, and number of rows downloaded from the source after every iteration of a Foreach Loop, and insert these values into a table in SQL Server after each iteration?

