We are developing a process that sources a CLOB field from Db2, and inserts it into a SQL Server database.
The CLOB can be up to 50 MB in size and contains data stored in XML format. The aim of the process is to:
1. Get the data from Db2 into SQL Server.
2. "Unpack" the values in the XML, using XQuery.
3. Load the "unpacked" values into separate table columns.
To give some background: the CLOB field contains Importer / Exporter information, grouped as Commodity items. There can be anywhere from 1 to 10,000 Commodity items and their associated values inside the CLOB field.
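To illustrate the kind of "unpacking" we are attempting in step 2, here is a minimal sketch. The table, column, and element names (StagingTable, ClobData, Declaration, Commodity, etc.) are illustrative only, not our real schema, which I cannot share:

```sql
-- Illustrative only: real schema and element names are sensitive and differ.
-- Assumes the CLOB value has first landed intact in an NVARCHAR(MAX) column.
;WITH src AS (
    SELECT CAST(ClobData AS XML) AS x
    FROM dbo.StagingTable
)
SELECT
    c.value('(Importer/Name)[1]', 'NVARCHAR(200)') AS ImporterName,
    c.value('(Exporter/Name)[1]', 'NVARCHAR(200)') AS ExporterName,
    c.value('(ItemNumber)[1]',    'INT')           AS ItemNumber
FROM src
CROSS APPLY src.x.nodes('/Declaration/Commodity') AS t(c);
```

This shreds one row per Commodity element into separate columns, which is what step 3 needs, but of course it only works once the full CLOB value arrives in SQL Server without being nulled out.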
For development purposes, the data we received from the source system was limited to 5 items. Everything looked fine until they started to increase the number of items sent through, which obviously increases the size of the CLOB field.
The result was that all the CLOB values where the number of items exceeded 5 were loaded into SQL Server as NULL. Just to emphasise: the larger CLOB values are not written to SQL Server at all; the column is NULL in SQL Server.
I am therefore convinced that the size of the CLOB field has something to do with this. I was under the impression that VARCHAR(MAX) could handle up to 2 GB.
We have tried the following things, without success:
1. Played around with data type mappings: CLOB to NTEXT, CLOB to NVARCHAR(MAX), CLOB to TEXT, CLOB to VARCHAR(MAX). No luck.
2. We even set up a linked server and tried doing this with an OPENQUERY statement. Also no luck.
Is there anything else we should look at, such as sp_tableoption settings?
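For reference, this is the sp_tableoption setting I had in mind (the table name is illustrative); it controls whether MAX-type values are stored out of row, though I don't know whether it is relevant to our NULL problem:

```sql
-- Illustrative table name; pushes large VARCHAR(MAX)/NVARCHAR(MAX)/XML
-- values into out-of-row storage instead of the data page.
EXEC sp_tableoption 'dbo.StagingTable', 'large value types out of row', 1;

-- Check the current setting for the table
SELECT name, large_value_types_out_of_row
FROM sys.tables
WHERE name = 'StagingTable';
```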
We are not very experienced with SQL Server and SSIS; the client forced us to replatform. Any help would be appreciated. Unfortunately I cannot include more info or code snippets, as it is all considered sensitive.