Running Out Of Foreign Keys

  • Hugo Kornelis (5/20/2014)


    Oh wow - that is so awesome! 😉

    What error message do you get - directly related to the foreign keys, or just the optimizer crashing or running out of resources?

    Please forgive me, as I'm doing this from memory (and the memory of the coworker who originally encountered the problem, whom I asked). I don't believe the error specifically mentioned foreign keys; I believe it said the plan was too complex to create.

    It appears, however, that it may be version dependent. I just tried it locally on my 64-bit SQL Server 2008 R2 Express and it didn't error. It also didn't error on our new production servers, which are 64-bit (the previous servers were 32-bit).

    Hugo Kornelis (5/20/2014)


    When you re-create the FKs, do you use WITH CHECK or WITH NOCHECK, and how long does that take? Have you considered/tried to not remove but just disable the FKs, then enable them again after the removal? (If you do this, make sure to use ALTER TABLE xxx WITH CHECK CHECK CONSTRAINT yyy - the double CHECK is not an error, you actually need this to prevent getting non-trusted constraints).

    I was not the one working on the customer ticket that required the dropping and re-creating so I don't know how it was done or how long it took.
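    For anyone following along, the disable/re-enable pattern Hugo describes looks roughly like this (the table and constraint names here are placeholders, not from the actual system):

    -- Disable the FK; existing data is not validated while it is off
    ALTER TABLE dbo.SomeTable NOCHECK CONSTRAINT FK_SomeTable_Parent;
    -- ... perform the mass delete here ...
    -- Re-enable AND re-validate; the double CHECK keeps the constraint trusted
    ALTER TABLE dbo.SomeTable WITH CHECK CHECK CONSTRAINT FK_SomeTable_Parent;
    -- Verify: is_not_trusted should be 0
    SELECT name, is_disabled, is_not_trusted
    FROM sys.foreign_keys
    WHERE name = N'FK_SomeTable_Parent';

    Note that a plain CHECK CONSTRAINT (without WITH CHECK) re-enables the constraint but leaves it untrusted, which can hurt plan quality.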

    Hugo Kornelis (5/20/2014)


    Are there any plans to redesign this mess into something more manageable?

    I believe there are plans, but this involves some significant structural changes, so it's not happening soon. We actually have another table with the same problem but fewer foreign keys; that one will be getting fixed first, as it is not as difficult.

  • I think the most I've seen is 12 FKs on a table.

    Tom

  • Nice, inventive way to form a QotD. Thanks, Andy!

  • Nice question, clear, unambiguous.

    But it's kind of interesting .... you know that Chinese proverb (or maybe urban myth?) about interesting times?

    The interesting thing here is that 253 is a suggested limit on the breadth immediately below a node in a tree all of whose members have the same key; it limits neither the number of nodes in the whole tree nor the number of nodes at any level not adjacent to the root (the root is above? the terminology is loony, but that's pretty normal). So for any practical purpose it seems rather meaningless: consider 127 tables referencing table A attribute X, and 127 tables referencing table B attribute X, with attribute X in table A referencing table C attribute X and attribute X in table B referencing table C attribute X; that delivers, in effect, 254 tables referencing (indirectly) table C attribute X, so the limit as expressed is pretty meaningless. Perhaps there is some real problem with going beyond 253 direct references, but there's no indication of how that problem arises, or of whether the same limit applies to indirect references as in the example above; and there would presumably have to be some pretty bizarre logic for 253 to be a performance turning point for direct but not for indirect references.
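    The scenario can be sketched in miniature; with two referencing tables per branch instead of 127, the shape (all names are illustrative) is:

    CREATE TABLE dbo.C (X int NOT NULL PRIMARY KEY);
    CREATE TABLE dbo.A (X int NOT NULL PRIMARY KEY REFERENCES dbo.C);
    CREATE TABLE dbo.B (X int NOT NULL PRIMARY KEY REFERENCES dbo.C);
    -- 127 tables like these reference dbo.A ...
    CREATE TABLE dbo.A1 (X int NOT NULL REFERENCES dbo.A);
    CREATE TABLE dbo.A2 (X int NOT NULL REFERENCES dbo.A);
    -- ... and 127 more like these reference dbo.B
    CREATE TABLE dbo.B1 (X int NOT NULL REFERENCES dbo.B);
    CREATE TABLE dbo.B2 (X int NOT NULL REFERENCES dbo.B);
    -- dbo.C has only two direct incoming references, yet at full scale
    -- 254 tables would depend on C.X indirectly, past the documented 253.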

    Tom

  • TomThomson (5/20/2014)


    Nice question, clear, unambiguous. ... The interesting thing here is that 253 is a suggested limit on the breadth immediately below a node in a tree...

    Interesting: 253 is (2^8) - 3 and 127 is (2^7) - 1, so in this arithmetic, where has the extra 1 gone?

    Good question, Andy; it made me ponder the reasons for the suggested limits. I have seen this, and a lot worse, when the schema has been derived from (read: created by) the likes of Entity Framework.

    😎

  • I decided to run a quick experiment to see if I could find any limits. Here is the code I used to create a table with lots of foreign keys:

    USE tempdb;
    -- Drop and re-create the test database
    IF EXISTS (SELECT * FROM sys.databases WHERE name = 'QotD')
    BEGIN;
        ALTER DATABASE QotD SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE QotD;
    END;
    CREATE DATABASE QotD;
    GO
    USE QotD;
    DECLARE @Base nvarchar(MAX) = N'',
            @Ref  nvarchar(MAX) = N'CREATE TABLE dbo.Ref(PKCol int NOT NULL, ';
    DECLARE @max int = 1000,
            @i   int = 1;
    WHILE @i < @max
    BEGIN;
        SET @Base += N'CREATE TABLE dbo.T' + CAST(@i AS nvarchar(20))
                   + N'(Col' + CAST(@i AS nvarchar(20)) + N' int NOT NULL PRIMARY KEY); ';
        SET @Ref  += N'Col' + CAST(@i AS nvarchar(20))
                   + N' int NOT NULL DEFAULT(1) REFERENCES dbo.T' + CAST(@i AS nvarchar(20)) + N', ';
        SET @i += 1;
    END;
    SET @Ref += N'PRIMARY KEY(PKCol));';
    EXECUTE (@Base);
    EXECUTE (@Ref);
    /*
    -- Execute the above, then select the below and execute to check time, or get the estimated plan for fun
    SET STATISTICS TIME ON;
    INSERT INTO dbo.Ref DEFAULT VALUES;
    GO
    SET STATISTICS TIME OFF;
    */

    By changing the value of @max, I could specify the number of foreign keys. It will create lots of tables, but you can simply drop the QotD database afterwards.

    I didn't see a noticeable performance difference between 250 and 260 foreign keys. I noticed a slight slowdown at 500, and a severe slowdown at 750. At 1000, the query failed with error 701: insufficient system memory in resource pool 'default' to run the query. I got the same error when requesting the estimated plan, so in spite of the error message, I think it was the compilation that failed. More memory (I have 8GB installed, 6GB available to SQL Server) might have helped.

    I was impressed to see that Management Studio was able to render the graphical plan for up to 250 foreign keys - SQL Sentry Plan Explorer already crashed at "only" 200 foreign keys.


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Hugo Kornelis (5/20/2014)


    I decided to run a quick experiment to see if I could find any limits. Here is the code I used to create a table with lots of foreign keys...

    +1

    😎

  • Good shooting, Hugo1 🙂 I was too lazy to experiment 😊, so I'm glad to see your results.

    edit: "Hugo1"! My left little finger has failed me - that "1" should be a "!".

    Tom

  • TomThomson (5/20/2014)


    Hugo1

    So you finally discovered the existence of my evil twin? (Hugo2)


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • Hugo Kornelis (5/20/2014)


    TomThomson (5/20/2014)


    Hugo1

    So you finally discovered the existence of my evil twin? (Hugo2)

    (a) See edit to above post.

    (b) I'm tempted, but won't.

    (c) Well, you know about my evil twin already, so it's only fair.

    Tom

  • TomThomson (5/20/2014)


    Good shooting, Hugo1 🙂 I was too lazy to experiment :blush:, so I'm glad to see your results.

    edit: "Hugo1"! My left little finger has failed me - that "1" should be a "!".

    As Master Oogway says, "there are no accidents"... It was meant to be Hugo1 (as in the Chosen One) for this QotD. 😀

    ww; Raghu
    --
    The first and hardest SQL statement I ever wrote - "select * from customers" - and I was happy and felt smart.

  • Hugo Kornelis (5/20/2014)


    I decided to run a quick experiment to see if I could find any limits. Here is the code I used to create a table with lots of foreign keys...

    Thank you Hugo, this is excellent. 🙂

    (I was midway through creating just such a script late at night, and then a yawn came... then thud.)

    ww; Raghu
    --
    The first and hardest SQL statement I ever wrote - "select * from customers" - and I was happy and felt smart.

  • Thanks Andy for the question...

    I am working on a table with 30 foreign key references attached to it, with on average 5 lakh (500,000) records being updated daily...

    Bhaskar Shetty

  • Good question and great followup. Thanks for a good one, Andy.

  • Hugo, that's good stuff! Should send a note to the Sentry guys too.
