My goal is to load JSON documents, produced by a FOR JSON query in SQL Server, into MongoDB.
I've discovered that when my JSON result exceeds 2000 characters, the OLE DB Source component in SSIS splits it into 2 rows. This produces 2 malformed JSON strings, which then fail to insert into MongoDB. I see the same behavior in Management Studio: when the line of JSON results wraps, the results tab shows 2 rows affected rather than just 1. Any ideas on how to fix this? The data type is implicitly understood as DT_NTEXT in SSIS, which should support far more than 2000 characters.
Steps to reproduce: Author a SQL query using FOR JSON whose result is larger than 2000 characters in a single document. Note that in SQL Server Management Studio, when there are fewer than 2000 characters, only one row is affected; when over 2000, 2 rows are affected. Also note that the result is truly a single JSON document with one root node - this is confirmed by copying and pasting the entire JSON result into a JSON validator.
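For reference, a minimal query of the shape described above looks like the following (the table, column, and root names here are hypothetical - any FOR JSON query whose serialized output exceeds roughly 2000 characters should reproduce the behavior):

```sql
-- Hypothetical table and columns for illustration only.
-- Any FOR JSON result longer than ~2000 characters shows the split.
SELECT o.OrderId,
       o.CustomerName,
       o.Limit_Amount
FROM dbo.Orders AS o
FOR JSON PATH, ROOT('orders');
```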
Using the OLE DB Source component in SSIS, include the query for a JSON document that is larger than 2000 characters; using a data viewer, notice that there are 2 rows.
Expected results: one row is affected.
The actual error message is as follows:
"[MongoDB Destination [11]] Error: System.IO.FileFormatException: End of file in JSON string. '"Limit_'.
at CozyRoc.SqlServer.SSIS.MongoDbDestination.PostExecute()
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPostExecute(IDTSManagedComponentWrapper100 wrapper)"
where "Limit_" is the point in the JSON document at which it was split into 2 documents.
Ideas on how to fix?