Job failing with "maximum row size exceeds the allowed maximum of 8060 bytes"

  • Hi All,
    Noticed that our job is failing with the error message below. Not sure why this is appearing now; this SQL job was working fine earlier.

    Cannot create a row of size 11355 which is greater than the allowable maximum row size of 8060. [SQLSTATE 42000] (Error 511)
    Warning: The table "Test_tbl" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit. [SQLSTATE 01000] (Error 1708)
    The statement has been terminated. [SQLSTATE 01000] (Error 3621)
    Cannot create a row of size 21381 which is greater than the allowable maximum row size of 8060. [SQLSTATE 42000] (Error 511)
    Warning: The table "Test_tbl" has been created, but its maximum row size exceeds the allowed maximum of 8060 bytes. INSERT or UPDATE to this table will fail if the resulting row exceeds the size limit. [SQLSTATE 01000] (Error 1708)

    Please suggest how I should proceed on this.

    Thanks in advance!!

    SQL server DBA

  • Well, just a guess, but I would say you have a table with quite a few variable-length columns, and if you added up the maximum size of each column it would exceed 8060 bytes.  SQL Server will create this table, and you can successfully enter data into it as long as the actual data going in doesn't exceed 8060 bytes.

    Without more information, such as the DDL for the table, there's not much more we can do to help.
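
    If you want a quick sanity check, you can sum the declared maximum column widths from the catalog views to see whether the table can even theoretically exceed the limit. Just a sketch; substitute your own table name:


    -- Sum the declared maximum in-row widths for one table.
    -- max_length is already in bytes (nvarchar(64) reports 128);
    -- -1 means a MAX type, which is stored off-row and excluded here.
    SELECT SUM(c.max_length) AS declared_max_row_bytes
    FROM sys.columns AS c
    WHERE c.object_id = OBJECT_ID('dbo.Test_tbl')
      AND c.max_length <> -1;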

  • I can create this table:


    CREATE TABLE MyBigTable2
    ( myint INT,
      mychar CHAR(8000),
      mydate DATETIME,
      mychar2 VARCHAR(8000),
      mychar3 VARCHAR(8000)
    )
    GO

    If I try to fill every column to its maximum, however, I get an error telling me that a row of those sizes would exceed the page size. This code


    INSERT dbo.MyBigTable2 (myint, mychar, mydate, mychar2, mychar3)
    VALUES (1, REPLICATE('A',8000), GETDATE(), REPLICATE('A',8000), REPLICATE('A',8000))

    Generates the error.

    You cannot put more than 8060 bytes on a page. That's a SQL Server limit.

  • Lynn Pettis - Tuesday, September 19, 2017 10:32 AM


    Thank you, Lynn, for your quick response. All the columns are declared as nvarchar(64), and some more columns as varchar, but how can we know which column is the one with the issue? The error only reports it at the table level ("Test_tbl").
    Should we change the existing columns to nvarchar(max)?

    Please advise.

    Best Regards,

    SQL server DBA

  • Database admin(DBA) - Tuesday, September 19, 2017 12:58 PM


    It could be one or more columns causing the issue. The only way to know is to look at the data that is causing the failures.

  • Lynn Pettis - Tuesday, September 19, 2017 1:12 PM


    Understood, but the data is coming from a txt file that is bulk inserted into the table with a batch size of more than 12000. Could that be the issue? Please suggest.

    Thanks in advance!!

    SQL server DBA

  • No, that's not the issue.
    Batch size is the number of rows that are loaded at a time. The problem that you have is that one or more of your rows are too large, and hence failing to insert.
    You need to look at the data, see why the row is too large, and then decide from there what to do. It could even be an error in the file that's resulting in way too much going into one row. You have to look at the data to see.
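
    For whatever rows did load (or for a staging copy of the data), a rough check like this can surface the widest ones. The column names here are invented; substitute the real columns of Test_tbl:


    -- Hypothetical column names: replace col1..col3 with the real columns.
    SELECT TOP (10) *,
           ISNULL(DATALENGTH(col1), 0)
         + ISNULL(DATALENGTH(col2), 0)
         + ISNULL(DATALENGTH(col3), 0) AS approx_row_bytes
    FROM dbo.Test_tbl
    ORDER BY approx_row_bytes DESC;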

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster - Tuesday, September 19, 2017 2:22 PM


    Thank you, Gail, for your response. I will check on that, but in case a row's size is more than 8060, how can we handle the issue?
    Please suggest.

    SQL server DBA

  • Redesign your table(s). That's the best suggestion I can give, especially since we know nothing about your system.

  • There's nothing you can do in your existing process if there are rows with a size > 8060. You need to truncate them before they load to ensure the total row size is < 8060, load to multiple tables, or take some other approach (see the sketch below).

    I could have lots of columns that are nvarchar(64), which is 128 bytes each. If I had 63 of these, I'd exceed 8060 bytes in total (63 × 128 = 8064), if every column used the full 64 Unicode characters.
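
    If truncation is acceptable, the trim can happen as the data moves from a wide staging table into the real one. A sketch, with invented table and column names:


    -- Hypothetical names: trim each value to its target width on the way in.
    INSERT INTO dbo.Test_tbl (col1, col2)
    SELECT LEFT(col1, 64), LEFT(col2, 64)
    FROM dbo.Staging_tbl;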

  • Database admin(DBA) - Tuesday, September 19, 2017 2:28 PM

     in case the row size is more than 8060, how can we handle the issue?

    Redesign the table so that the row size is not over 8060, or change the column that has the large size to a MAX data type and accept the performance hit.
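
    For the MAX route, the change itself is a one-line ALTER (the column name here is hypothetical; match the NULL/NOT NULL setting to the existing definition). Values too large to fit in-row are then pushed to off-row storage:


    -- Hypothetical column name: widen the oversized column to nvarchar(max).
    ALTER TABLE dbo.Test_tbl
    ALTER COLUMN SomeWideColumn nvarchar(max) NULL;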

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Database admin(DBA) - Tuesday, September 19, 2017 2:10 PM


    Import each line of the file as a "blob" into a table that uses a MAX datatype for the receiving column.
    Write a select to list all of the rows > 8060.  Yeah, that counts delimiters, and some rows with <= 8060 bytes of actual data will squeak by, but we're just looking for things and need some place to start.
    Then either use a splitter to find out what the content of each column is on the long rows, or export to a file and open that file in Excel to start your research on which row(s) is/are causing the problem. A sketch of the first two steps follows.
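
    A minimal sketch of the first two steps. The file path, terminators, and staging table name are assumptions; adjust them to match your actual feed:


    -- One wide column to receive each whole line of the file.
    CREATE TABLE dbo.Staging_Raw (RawLine varchar(max));

    -- Load the file one line per row. FIELDTERMINATOR = '\0' (null character)
    -- keeps each line in a single column, assuming no nulls appear in the data.
    BULK INSERT dbo.Staging_Raw
    FROM 'C:\feeds\test_file.txt'
    WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\0');

    -- Lines longer than 8060 bytes can never fit in a single in-row record.
    SELECT RawLine, DATALENGTH(RawLine) AS line_bytes
    FROM dbo.Staging_Raw
    WHERE DATALENGTH(RawLine) > 8060;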

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
