Storage Size

  • alok-500895

    SSC Rookie

    Points: 37

    Comments posted to this topic are about the item Storage Size

  • Vivien Xing

    SSChampion

    Points: 12297

    Name + profession can take max 154?

    [Name] NVARCHAR(50) --> 50 X 2 + 2 = 102

    Ref BOL: "nchar and nvarchar (Transact-SQL)"

    The storage size, in bytes, is two times the number of characters entered + 2 bytes.

    sp_help tbPersonalDetails shows a total length of 316 bytes.
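The BOL formula quoted above can be checked with a quick calculation (Python used here purely as a calculator; the column sizes are the ones from the question):

```python
# Storage for an nvarchar(n) column, per BOL "nchar and nvarchar (Transact-SQL)":
# two bytes per character entered, plus 2 bytes of variable-length overhead.
def nvarchar_storage(chars_entered):
    return 2 * chars_entered + 2

name_max = nvarchar_storage(50)         # NVARCHAR(50) fully filled -> 102
profession_max = nvarchar_storage(100)  # NVARCHAR(100) fully filled -> 202
print(name_max, profession_max)
```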

  • Anders Pedersen

    SSChampion

    Points: 11410

    Without counting them all up.... there are 150 nvarchar characters, so the answer has to be somewhere above 300 bytes....

  • Manish_

    Ten Centuries

    Points: 1070

    ColumnName       Type           Length
    ---------------------------------------
    PK_ID            int            4
    Name             nvarchar       100
    Married          bit            1
    DOB              smalldatetime  4
    BachelorsDegree  bit            1
    MastersDegree    bit            1
    Profession       nvarchar       200
    Retired          bit            1
    Age              int            4
    ---------------------------------------
                                    316
    ---------------------------------------

    - Manish

  • gbn

    SSCertifiable

    Points: 6379

    Age is not persisted, so it is 312.

    But there is the "+2" per variable length column.

    So 316... but for the wrong reasons

  • MarkusB

    SSC-Dedicated

    Points: 37369

    What's going on with these questions?

    Again one with no correct answer to choose from.

    I know Steve Jones explained the process of choosing and testing QOTD in his article yesterday and I really don't wanna blame him.

    But come on, this one today is so obviously wrong that I didn't even bother to pick an answer.

    Markus Bohse

  • Yogeesh

    SSC-Addicted

    Points: 489

    I don't see the source this answer was taken from. A new way of computing!

  • skra

    SSCommitted

    Points: 1614

    By the way, all bit fields are kept in one byte.

    BOL:

    "Microsoft® SQL Server™ optimizes the storage used for bit columns. If there are 8 or fewer bit columns in a table, the columns are stored as 1 byte. If there are from 9 through 16 bit columns, they are stored as 2 bytes, and so on."
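That BOL rule boils down to one byte per group of up to eight bit columns. A quick sketch of the arithmetic (Python as a calculator):

```python
import math

# Per BOL: 8 or fewer bit columns share 1 byte, 9 through 16 share 2 bytes, etc.
def bit_columns_storage(n_bit_columns):
    if n_bit_columns == 0:
        return 0
    return math.ceil(n_bit_columns / 8)

print(bit_columns_storage(4))  # the four bit columns in the question share 1 byte
print(bit_columns_storage(9))  # a ninth bit column would push this to 2 bytes
```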

  • Mighty

    SSCrazy Eights

    Points: 8597

    I randomly chose an answer, since all were wrong (a 25% chance I would pick the "right" one).

    Based on BOL "Estimate the size of a heap" (Row_Size = Fixed_Data_Size + Variable_Data_Size + Null_Bitmap + 4 = 17 + 306 + 4 + 4) the answer should be 331, but I'm most likely also making a calculation mistake 😉

  • Hugo Kornelis

    SSC Guru

    Points: 64685

    Hi all,

    I think this is a great question, though thwarted by an incorrect answer, unfortunately.

    The question is great because hardly anyone knows exactly how this works. That is shown by the fact that of the "correct" answers posted thus far in this thread, none is actually correct. (And boy, do I expose myself to ridicule now should my answer prove to be wrong as well :D)

    But the incorrect answer options result in this QotD being more of a missed opportunity than a learning experience. Sadly.

    So, here are the things that are wrong in either the answer provided as "correct", or in the replies I have seen so far.

    * NVarchar takes two bytes per character. The QotD misses this, the replies here don't.

    * Up to 8 BIT columns are combined in a single byte. Calculated correctly by QotD, missed by all forum replies.

    * Calculated columns are not stored, unless marked as PERSISTED or included in an index (and in the latter case, they are stored only in the index). Calculated correctly by QotD, missed by most forum replies.

    * All varying length columns have a two-byte overhead. Missed by the QotD and some of the replies here.

    * Another two bytes overhead per row are added if a table has at least one varying length column. Missed by all.

    * The row also stores a NULL bitmap, one byte for the first 8 columns, one for the next 8, etc. I think that calculated columns don't count (but couldn't find confirmation), so that would be 1 byte for the 8 "normal" columns. Missed by all.

    * Each row has 4 bytes row header overhead; again missed by all.

    * And finally, 2 bytes per row are needed for the row's entry in the slot array, though Books Online does not include this in the calculation for Row_Size.

    So we have a maximum data size of 300 bytes for the two nvarchar columns, 4 for the integer, 4 for the smalldatetime, 1 for the 4 bits, and 0 for the (incorrectly) calculated age - max. 309 data size.

    We also have some in-row overhead: null bitmap (1), varying length columns (2 because they exist, plus 2 for each varying length column - adding up to 6), and row header overhead (4) - a total of 11 bytes for in-row overhead.

    According to the definition of Row_Size in Books Online, that would make for 309 + 11 = 320 bytes per row.

    For the real storage requirement per row, we should also include the 2 bytes for the row slot array, so this table will really take a maximum of 322 bytes of storage per row.

    Ref: http://technet.microsoft.com/en-us/library/ms178085.aspx
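    The arithmetic above can be sketched as a quick check (Python as a calculator; assuming max-length data in both nvarchar columns and a non-persisted computed Age column, as in the question):

```python
import math

# Maximum data size: fixed-length columns plus fully-filled nvarchar columns.
fixed = 4 + 4 + 1                 # int + smalldatetime + 4 bit columns in 1 byte
variable_data = 2 * 50 + 2 * 100  # nvarchar: 2 bytes per character -> 300
max_data = fixed + variable_data  # 309

# In-row overhead per the Books Online Row_Size definition.
null_bitmap = math.ceil(8 / 8)    # 8 stored columns -> 1 byte
var_overhead = 2 + 2 * 2          # 2 bytes if any var-length cols exist, +2 each
row_header = 4
row_size = max_data + null_bitmap + var_overhead + row_header  # 320

# Real storage also needs 2 bytes in the page's row slot array.
total = row_size + 2              # 322
print(max_data, row_size, total)
```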


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/

  • majorbloodnock

    SSCrazy Eights

    Points: 9384

    I agree with Hugo; I thought this was a good question.

    It being before my second morning cuppa, I misread nvarchar, neglected to factor Unicode into my calculations, and so got the question "right". However, I most certainly didn't think about the other "overhead" costs, and so I've learnt something new. Is it an important something? Probably not, given that my table size estimates are based on far more approximate figures anyway (Microsoft don't have a 10% "For Luck" overhead included in their calculations ;)), but it's added to my understanding, which I don't think can ever be a waste.

    So thank you, Alok, for a good question, and commiserations that your answer choices were off the mark.

    And a reminder to everyone having a moan that Steve did pose the far bigger challenge (if you're up for it) that you can submit QOTDs of your own. Basically, if you think you can do better, prove it.

    Semper in excretia, sumus solum profundum variat

  • Evgeny G. Shymshyt

    Old Hand

    Points: 313

    So, the correct variant of the question would be:

    What is the maximum data size of a row in the following table

    (ignoring the additional overhead bytes, and for the NVARCHAR type counting only characters, not bytes)?

    CREATE TABLE tbPersonalDetails(
        PK_ID INT NOT NULL PRIMARY KEY,
        [Name] NVARCHAR(50) NOT NULL,
        Married BIT NOT NULL,
        DOB SMALLDATETIME NOT NULL,
        BachelorsDegree BIT NOT NULL,
        MastersDegree BIT NOT NULL,
        Profession NVARCHAR(100) NOT NULL,
        Retired BIT NOT NULL,
        Age AS (DATEDIFF(yy, DOB, GETDATE()))
    )
    GO

  • Maarten Takens

    SSC Eights!

    Points: 814

    I would have suggested 316 bytes max storage per row, but I have read in this forum (special thanks to Hugo Kornelis) that this should be 322 bytes per row.

    Although it is a pity that the QOTD does not provide the correct answer to this question, the discussion in this forum is great and you really can learn from it!

    See yaa

    Maarten

  • shamas saeed

    SSC-Addicted

    Points: 458

    Check your question's complete context before posting, because this will misguide many people who don't know the details.

    In your question, nvarchar uses double the bytes of the length defined, and the computed column also uses the int data type's storage bytes. To check, create the mentioned table and press Alt+F1 to see its help.

  • Bill Mell

    SSCommitted

    Points: 1621

    I got it right only because I pretended that an nvarchar(50) only takes 50 bytes instead of the 100 it actually takes and that an nvarchar(100) only takes 100 bytes instead of the 200 that it actually takes.

    You would think that you would have had the correct answer in there at least once.

Viewing 15 posts - 1 through 15 (of 58 total)
