In SQL Server 2008 R2, the maximum row size of 8,060 bytes does not apply to varchar(max), nvarchar(max), varbinary(max), text, image, or xml columns.
I thought that this was a statement of fact, but I'm running into difficulties upgrading a backup of one of our customer databases to the latest version of our schema.
The table in question has 23 columns and we're adding 2 new ones. One column is an XML column; all the rest are fixed-width columns (no TEXT, no IMAGE, no MAX sizes). The upgrade script for that table essentially drops the msrepl_tran_version column, adds two new columns (a CHAR(1) and a BIGINT), then adds the msrepl_tran_version (UNIQUEIDENTIFIER) column back with a default constraint of NEWID(). I know none of this is strictly necessary; it's something we have to do for our OEM version of the software, because the partner we OEM to insists that the ordinal positions of columns stay the same, with msrepl_tran_version always being the last column.
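In outline, that part of the upgrade script looks something like the sketch below. NEW_FLAG, NEW_REF, and the default constraint name are hypothetical; only the column types, the NEWID() default, and the drop/re-add order come from the actual script.

```sql
-- Simplified sketch of the upgrade step for this table.
-- NEW_FLAG, NEW_REF and DF_PKEY_MAP_TRAN_VERSION are hypothetical names.
ALTER TABLE TB_DIM_ITEM_PKEY_MAP DROP CONSTRAINT DF_PKEY_MAP_TRAN_VERSION;
ALTER TABLE TB_DIM_ITEM_PKEY_MAP DROP COLUMN msrepl_tran_version;

ALTER TABLE TB_DIM_ITEM_PKEY_MAP ADD NEW_FLAG CHAR(1) NULL;
ALTER TABLE TB_DIM_ITEM_PKEY_MAP ADD NEW_REF BIGINT NULL;

-- Re-add msrepl_tran_version so it is the last column again.
ALTER TABLE TB_DIM_ITEM_PKEY_MAP
    ADD msrepl_tran_version UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_PKEY_MAP_TRAN_VERSION DEFAULT NEWID();
```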
Anyway, whilst adding the column back I get the following error:
Msg 511, Level 16, State 1, Line 4
Cannot create a row of size 8063 which is greater than the allowable maximum row size of 8060.
I wrote a script to take a look at the data, since I assume this is a data issue (upgrading a blank copy of this schema to the latest version has no problems), but I can't see anything wrong. The script looks like this:
DECLARE @table VARCHAR(30) = 'TB_DIM_ITEM_PKEY_MAP', @idcol VARCHAR(20) = 'PKEY_MAP_URN', @sql NVARCHAR(MAX);
SET @sql = 'SELECT ' + @idcol + ' , ROW_SIZE = (0';
SELECT @sql = @sql + ' + ISNULL(DATALENGTH(' + name + '), 1)'
FROM syscolumns
WHERE id = OBJECT_ID(@table) --AND name <> 'XML_DATA'
SET @sql = @sql + ') FROM ' + @table + ' ORDER BY ROW_SIZE DESC';
EXEC sp_executesql @sql;
The biggest row size is 130273; however, if I uncomment the XML_DATA part to exclude the XML column, the biggest row size is 190.
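As a cross-check on what DATALENGTH reports, I can also read the physical record size straight from sys.dm_db_index_physical_stats in DETAILED mode (index_level = 0 restricts it to the leaf level). A sketch, assuming I have permission to run the DMV against the restored customer database:

```sql
-- Largest physical record size for the table's leaf-level rows.
-- Note: this counts in-row bytes only; data pushed off-row (e.g. XML
-- LOB storage) is not included in max_record_size_in_bytes.
SELECT MAX(ps.max_record_size_in_bytes) AS max_record_bytes
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('TB_DIM_ITEM_PKEY_MAP'),
         NULL, NULL, 'DETAILED') AS ps
WHERE ps.index_level = 0;
```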
So, I honestly can't see the issue. I've set up a test that inserts the data from this table into a copy of the table with the new definition, one row at a time, to see if I can find the "bad" data. But since there is a lot of data in the table, this will take some time. Has anyone else come across the same issue, or does anyone know more about this than I do? 😉
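The row-at-a-time test is along these lines. TB_DIM_ITEM_PKEY_MAP_NEW is a hypothetical name for my copy of the table with the new definition, and I'm assuming here that PKEY_MAP_URN is a BIGINT key; the real script lists every column explicitly.

```sql
-- Copy rows one at a time so that the first failing PKEY_MAP_URN
-- identifies the "bad" row.
DECLARE @urn BIGINT;  -- assuming PKEY_MAP_URN is a BIGINT

DECLARE row_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT PKEY_MAP_URN FROM TB_DIM_ITEM_PKEY_MAP ORDER BY PKEY_MAP_URN;

OPEN row_cur;
FETCH NEXT FROM row_cur INTO @urn;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        INSERT INTO TB_DIM_ITEM_PKEY_MAP_NEW (PKEY_MAP_URN /*, remaining columns */)
        SELECT PKEY_MAP_URN /*, remaining columns */
        FROM TB_DIM_ITEM_PKEY_MAP
        WHERE PKEY_MAP_URN = @urn;
    END TRY
    BEGIN CATCH
        PRINT 'Row failed: ' + CAST(@urn AS VARCHAR(20))
            + ' - ' + ERROR_MESSAGE();
    END CATCH;
    FETCH NEXT FROM row_cur INTO @urn;
END;

CLOSE row_cur;
DEALLOCATE row_cur;
```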