SQLServerCentral is supported by Red Gate Software Ltd.
Table space usage 2
Posted Tuesday, November 15, 2011 9:37 PM


Comments posted to this topic are about the item Table space usage 2


Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1206403
Posted Wednesday, November 16, 2011 12:09 AM


Quick message to those who arrive here: I am aware that this question involves a lot of calculation - far more work than a normal question. So my compliments on all your hard work, and I hope it paid off by earning you a point.

My question is: would you like a few more questions of this kind, or would submitting the rest of the questions I was planning expose me to death threats, stale fruit, or lots of booing and hissing?
This is a serious question. If the majority feels that the QotD should not involve this much work, I will not submit the other similar questions I was planning.

(For full disclosure, my plan was to have one or two more questions that simply calculate disk space, then take it to the next level and use this knowledge as a foundation to calculate the number of reads required for certain queries and execution plans, which should help you understand the optimizer's choices and the usefulness (or lack thereof) of indexes.)



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1206434
Posted Wednesday, November 16, 2011 12:51 AM
I really like this kind of question

...but the question has a minor issue. When I created the QOD database with the DemoTable table and tried to rebuild the index, I got the following error message:
The statement has been terminated.
Msg 1101, Level 17, State 12, Line 1
Could not allocate a new page for database 'QOD' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.


PS. When I was calculating the size of the upper levels of the index, I stubbornly divided the number of rows by 426, not the number of leaf pages.
1,000,000 / 426 = 2348
2348 / 426 = 6
6 / 426 = 1
2348 + 6 + 1 = 2355
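Side by side, the two variants of that calculation look like this - a quick Python sketch (the 426 fanout and the 200,000 leaf pages are the figures from the question; the helper function is just for illustration):

```python
import math

FANOUT = 426            # index entries per non-leaf page (from the question)
ROWS = 1_000_000        # rows in the table
LEAF_PAGES = 200_000    # leaf pages of the index (from the question)

def upper_levels(entries, fanout):
    """Pages in each non-leaf level of a B-tree, bottom level up to the root."""
    levels = []
    while entries > 1:
        entries = math.ceil(entries / fanout)
        levels.append(entries)
    return levels

wrong = upper_levels(ROWS, FANOUT)        # dividing the row count: [2348, 6, 1] = 2355 pages
right = upper_levels(LEAF_PAGES, FANOUT)  # dividing the leaf pages: [470, 2, 1] = 473 pages
```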
"Why there's no such answer as 202356?" – that was a sore subject for me for at least 20 minutes
Post #1206452
Posted Wednesday, November 16, 2011 1:06 AM


vk-kirov (11/16/2011)
...but the question has a minor issue. When I created the QOD database with the DemoTable table and tried to rebuild the index, I got the following error message:
The statement has been terminated.
Msg 1101, Level 17, State 12, Line 1
Could not allocate a new page for database 'QOD' because of insufficient disk space in filegroup 'PRIMARY'. Create the necessary space by dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

#$%#$^& !!!

You won't believe the time I spent going over my calculations again and again, to make sure I didn't mess this one up - and then I forget that rebuilding the index will temporarily require double the size. In a few hours, my mailbox will flood with new post notifications of the "I want my point back" type. (kicks self)

BTW, I deliberately chose to use a very high number of rows because I didn't want people to run the code, but to do the maths. I did run the code myself, but only with 100,000 rows; I used that to check the formulas in my spreadsheet, then changed the number of rows in the sheet to 1,000,000. So I guess I should have tested it with the correct number of rows. Hmmm, maybe I should just bump the row count up to a billion for the next question?



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1206455
Posted Wednesday, November 16, 2011 2:58 AM
Once the table is created, you add exactly one million rows to it, all with ShortDescription equal to 'None', LongDescription and AddDate set to NULL, all numeric columns equal to 1, and AddDate left at its default value. You then execute the statement below to rebuild the indexes and eliminate any potential fragmentation resulting from adding the one million rows.

Quick point - surely that should be with ShortDescription equal to 'None', LongDescription and ChgDate set to NULL ?


Maybe I'm just sore from getting it wrong :)

I liked the question - it's useful because it helps marry the real world space calculation to the way SQL stores it.
If all QotDs were easy or quick I wouldn't bother with them :)

Oh yes - and I got it wrong because I thought the rebuild would push the size over 2GB. My estimate for the number of rows was wrong in any case though!
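As a rough check of that instinct, taking the corrected total of 200,474 eight-KB pages from the explanation and assuming the rebuild briefly needs a second copy of the index (a back-of-the-envelope Python sketch):

```python
PAGE_KB = 8                 # SQL Server page size in KB
TABLE_PAGES = 200_474       # total pages for the table (corrected figure from the explanation)

table_gb = TABLE_PAGES * PAGE_KB / (1024 * 1024)   # ~1.53 GB for the table itself
rebuild_peak_gb = 2 * table_gb                     # old and new copies coexist during the rebuild
# peak is ~3.06 GB, so the rebuild does indeed blow past a 2 GB file
```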
Post #1206491
Posted Wednesday, November 16, 2011 3:19 AM


Gazareth (11/16/2011)
Quick point - surely that should be with ShortDescription equal to 'None', LongDescription and ChgDate set to NULL ?

Yes, that is right. *curses self for not proofreading better*
Ah well. At least this mistake should not cause people to answer wrong. (My oversight with the rebuild will, though)



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1206506
Posted Wednesday, November 16, 2011 3:34 AM
I also like this type of question. It tests several levels of understanding.

The explanation is also well constructed, however I think it also contains a couple of typos where the 200,000 pages becomes 20,000 part way through.

For level 1 (the lowest intermediate level), one entry is needed for each of the 200,000 pages; these 20,000 entries need 470 pages total.

and
That works out to a total of 20,000 leaf pages, 472 intermediate pages, 1 root page, and 1 IAM page = 20,474 pages.
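With 200,000 substituted for the typo'd 20,000, the per-level arithmetic in the explanation checks out - a quick Python sketch (the 426 entries-per-page fanout is the figure used elsewhere in this thread):

```python
import math

LEAF_PAGES = 200_000    # leaf pages; one level-1 entry is needed per leaf page
FANOUT = 426            # index entries that fit on one non-leaf page

level1 = math.ceil(LEAF_PAGES / FANOUT)   # 470 pages
level2 = math.ceil(level1 / FANOUT)       # 2 pages
intermediate = level1 + level2            # the 472 intermediate pages
root = 1
iam = 1
total = LEAF_PAGES + intermediate + root + iam   # 200,474 pages
```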


Dave
Post #1206516
Posted Wednesday, November 16, 2011 3:54 AM


Hugo, there is another issue. DemoTableKey is the PRIMARY KEY of that table, so it must be unique. But what does one put in there (you are not specifying that)? I ended up splitting it into an INT (IDENTITY) and a CHAR(6) when I ran the script.

And despite having to run the script instead of doing the calculations I learned something from the question. I don't see why this type of question should be considered too much work by anyone. After all, when I don't know an answer to a QOtD off-hand, I need to go and consult BOL or some other source, which also takes time.


--------------------------------------------------------------------------
A little knowledge is a dangerous thing (Alexander Pope)

In order for us to help you as efficiently as possible, please read this before posting (courtesy of Jeff Moden)
Post #1206522
Posted Wednesday, November 16, 2011 4:12 AM


I was so caught up in responding to (and feeling bad about) the mistakes that I forgot to thank everyone for the kind words. All respondents so far have indicated that they like this (type of) question. Thanks! ;)


Dave Brooking (11/16/2011)
The explanation is also well constructed, however I think it also contains a couple of typos where the 200,000 pages becomes 20,000 part way through.

They just keep coming, don't they?
You are right, all the 20,000's should be 200,000's. The numbers derived from them are correct, though - I made the mistake while typing the explanation, but not while doing the math.


Jan Van der Eecken (11/16/2011)
Hugo, there is another issue. DemoTableKey is the PRIMARY KEY of that table, so it must be unique. But what does one put in there (you are not specifying that)?

Ah, but this was deliberate! I deliberately did NOT create an integer primary key, again in an attempt to make it a bit harder for those who might be tempted to simply "run the code". I hope the question made it clear that no specific values were required (although with hindsight, I could have added "and with unique values for the DemoTableKey column" to the text). Since char(10) allows for a theoretical maximum of 255 ^ 10 = 1162523670191533212890625 values, this is definitely not impossible. At only a million rows in the table, you could even create a string of 10 randomly chosen characters and have less than a 0.00001% chance of getting a duplicate key violation.
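That duplicate-risk figure is easy to sanity-check with the standard birthday-bound approximation (a quick Python sketch; the 255 ^ 10 keyspace is the figure from the post above, and the approximation formula is an addition, not something from the question):

```python
N = 1_000_000          # random 10-character keys generated
KEYSPACE = 255 ** 10   # distinct char(10) values (figure from the post)

# Birthday-bound approximation: P(at least one duplicate) ~ n(n-1) / (2K)
p_dup = N * (N - 1) / (2 * KEYSPACE)
p_dup_percent = p_dup * 100   # comes out far below the 0.00001% mentioned above
```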



Hugo Kornelis, SQL Server MVP
Visit my SQL Server blog: http://sqlblog.com/blogs/hugo_kornelis
Post #1206533
Posted Wednesday, November 16, 2011 5:27 AM
Admittedly, getting it right may have biased my opinion, but I have to say that this is possibly the best QotD I've tried so far, so please, more like this, Hugo!

I would also mention that, having done the calculations by hand, I double-checked them by running the code (I used the numbers 1 to 1000000 from a tally table in another DB I keep on hand for the PK column values), then used this query:
SELECT SUM(page_count)
FROM sys.dm_db_index_physical_stats(
    DB_ID(),
    OBJECT_ID('DemoTable'),
    NULL,
    NULL,
    'DETAILED'
);

Since you specified SQL Server 2005 and up in the question, I figured it was a good way of highlighting how the Dynamic Management Views can help us work these things out.
Post #1206583