Hi
I have been trying to resolve an error ("The value is too large to fit in the column data area of the buffer") from a Script Component (C#) that decrypts a GPG-encrypted column, reading from a VARBINARY(MAX) source and writing to a VARCHAR(MAX) destination. It had been working fine until the past few days. Here are the Data Flow Task steps:
- OLE DB Source - Gets the data from a SQL Server table; the source column is VARBINARY(MAX).
- Script Component - Decrypts the GPG-encrypted column using C#. The output column is set to DT_STR(8000); the component fails if I try to increase the length above 8000.
- OLE DB Destination - The destination table's column is VARCHAR(MAX).
Here is the C# script for step #2:
int length = (Int32)Row.MyColumnName.Length;
if (length == 0)
{
    Row.DecryptMyColumnName = null;
}
else
{
    byte[] bytes = Row.MyColumnName.GetBlobData(0, length);
    String keyString = System.Text.Encoding.ASCII.GetString(bytes);

    // The stored text sometimes contains literal "\r"/"\n" escape
    // sequences instead of real line breaks; normalize them.
    if (keyString.Contains("\\r\\n"))
    {
        keyString = keyString.Replace("\\r", "\r");
        keyString = keyString.Replace("\\n", "\n");
    }
    else
    {
        keyString = keyString.Replace("\\n", "\r\n");
    }

    Stream stream = new MemoryStream(System.Text.Encoding.ASCII.GetBytes(keyString));
    Row.DecryptMyColumnName = getDecryptedData(stream, keyString);
}
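In case the newline handling in that block is unclear, here is the same normalization restated in isolation (a Python sketch of the logic only; `normalize_newlines` is just an illustrative name, not part of the package):

```python
def normalize_newlines(key_string: str) -> str:
    """Mirror the C# Replace() calls: turn literal backslash-escape
    sequences in the stored text into real line-break characters."""
    if "\\r\\n" in key_string:
        # Literal "\r\n" pairs: convert each escape to its real character.
        key_string = key_string.replace("\\r", "\r")
        key_string = key_string.replace("\\n", "\n")
    else:
        # Only literal "\n" present: expand each one to CRLF.
        key_string = key_string.replace("\\n", "\r\n")
    return key_string
```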
I have tried the suggestions in the following links, but to no avail:
- SSIS - The value is too large to fit in the column data area of the buffer
- The value is too large to fit in the column data area of the buffer?
- Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException
Any help is greatly appreciated!
IN~