GUID vs BIGINT

  • GilaMonster (5/21/2014)


    Are you explicitly inserting NEWID() into the table, or are you allowing the default to be used?

    I am explicitly inserting into the table using the UuidCreateSequential method:

    http://msdn.microsoft.com/en-us/library/aa379322(VS.85).aspx

    http://msdn.microsoft.com/en-us/library/ms189786.aspx

    Thanks
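
    For comparison, here is a minimal sketch (table and column names are hypothetical) of letting SQL Server generate the sequential GUIDs itself via a column default, instead of the client supplying UuidCreateSequential values:

    ```sql
    -- Hypothetical table: the NEWSEQUENTIALID() default fills the key,
    -- so the client does not need to call UuidCreateSequential.
    CREATE TABLE dbo.Orders
    (
        OrderID uniqueidentifier NOT NULL
            CONSTRAINT DF_Orders_OrderID DEFAULT NEWSEQUENTIALID()
            CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED,
        OrderDate datetime2 NOT NULL
    );

    -- Omit OrderID so the default generates a sequential GUID.
    INSERT INTO dbo.Orders (OrderDate) VALUES (SYSDATETIME());
    ```

    Note that NEWSEQUENTIALID() can only be used as a column default, not called directly in an INSERT.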

  • With NEWSEQUENTIALID(), below is the table's fragmentation level:

    - Scan Density [Best Count:Actual Count].......: 92.33% [17644:19110]

    - Logical Scan Fragmentation ..................: 38.06%

    - Extent Scan Fragmentation ...................: 7.83%

    Is this OK?

    This one was with NEWID():

    - Scan Density [Best Count:Actual Count].......: 59.42% [17612:29642]

    - Logical Scan Fragmentation ..................: 94.77%

    So can I say that using NEWSEQUENTIALID() the fragmentation has been reduced?
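
    As a side note, DBCC SHOWCONTIG (the source of the output above) is deprecated; a sketch of the replacement DMV query, with a placeholder table name, might look like:

    ```sql
    -- Replace dbo.Orders with your table; the NULL index_id returns all indexes.
    -- avg_fragmentation_in_percent corresponds to Logical Scan Fragmentation.
    SELECT i.name AS index_name,
           ips.avg_fragmentation_in_percent,
           ips.page_count
    FROM sys.dm_db_index_physical_stats(
             DB_ID(), OBJECT_ID(N'dbo.Orders'), NULL, NULL, 'LIMITED') AS ips
    JOIN sys.indexes AS i
      ON i.object_id = ips.object_id AND i.index_id = ips.index_id;
    ```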

  • Is this a homework exercise or something? Do you understand how indexes work? Please start with these articles by Gail Shaw:

    http://www.sqlservercentral.com/articles/Indexing/68439/

    http://www.sqlservercentral.com/articles/Indexing/68563/

    http://www.sqlservercentral.com/articles/Indexing/68636/

    Then please read posts #2 and #3 of this thread.

  • If I wish to use BIGINT for storing a huge number of records in a table, I know BIGINT can go up to 9223372036854775807.

    What if, for the next record, I reach the max value (9223372036854775807)?

    Any suggestions would be appreciated.

  • msu.shahidullah wrote:

    If I wish to use BIGINT for storing a huge number of records in a table, I know BIGINT can go up to 9223372036854775807.

    What if, for the next record, I reach the max value (9223372036854775807)?

    Any suggestions would be appreciated.

     

    I am guessing that you used 1 to 9223372036854775807.

    If so, then you can always use -1 to -9223372036854775808.

     

    Although, I would probably add another field to make a composite key, allowing you to re-use the numbers, but with a separate uniquifier.  Not knowing your data, but you are also not likely to use up a full set of BIGINT values in a year, so the 2nd column could be populated using YEAR(GETDATE()).
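
    A rough sketch of that composite-key idea (table and column names are made up):

    ```sql
    -- Re-use the BIGINT range each year by pairing it with a year column.
    -- The (EventYear, EventID) pair stays unique even if EventID repeats.
    CREATE TABLE dbo.Events
    (
        EventYear smallint NOT NULL
            CONSTRAINT DF_Events_Year DEFAULT YEAR(GETDATE()),
        EventID   bigint   NOT NULL,
        CONSTRAINT PK_Events PRIMARY KEY CLUSTERED (EventYear, EventID)
    );
    ```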

  • DesNorton wrote:

    msu.shahidullah wrote:

    If I wish to use BIGINT for storing a huge number of records in a table, I know BIGINT can go up to 9223372036854775807.

    What if, for the next record, I reach the max value (9223372036854775807)?

    Any suggestions would be appreciated.

    I am guessing that you used 1 to 9223372036854775807.

    If so, then you can always use -1 to -9223372036854775808.

    Although, I would probably add another field to make a composite key, allowing you to re-use the numbers, but with a separate uniquifier.  Not knowing your data, but you are also not likely to use up a full set of BIGINT values in a year, so the 2nd column could be populated using YEAR(GETDATE()).

     

    Regardless of the above, what is the chance of you ever hitting the BIGINT max value in your lifetime?

    If you burned through 1 million numbers per second, it would take about 292,471 years to reach the max number that BIGINT can handle.
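
    The arithmetic above can be reproduced directly in T-SQL:

    ```sql
    -- Seconds to exhaust BIGINT at 1,000,000 values per second,
    -- then converted to 365-day years: roughly 292,471.
    SELECT 9223372036854775807 / 1000000            AS seconds_needed,
           9223372036854775807 / 1000000 / 31536000 AS years_needed;
    ```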
