• Iulian -207023 (11/16/2011)


    I am not worried about the lost point, but is there anything I have missed?

    No, that was my mistake. I added the index rebuild to the question to make sure there was no fragmentation, regardless of how the data was filled. But I forgot that rebuilding an index requires SQL Server to temporarily duplicate it - so during the rebuild, you need twice as much space for the table. That's a little over 3 GB.

    I should have tested with the given amount of data. But I wanted to save time, so I tested with one tenth and checked that all the calculations and formulas I had in my spreadsheet exactly predicted the number of pages used. I was then confident that my data size calculations would be correct for the full million rows as well. Which in fact they were - if I had specified the size of the data file as 3.5 GB or so, there would not have been any problem.

    (Funny side story - just to be sure, I just now ran the tests for the full million rows. First with a 2 GB data file, then with a 3.5 GB data file. To my utter surprise, the index rebuild did NOT produce an error when I tested with the 2 GB data file. It took me some time before I found the cause - a missing USE statement. I now have a very bloated master database... :Whistling:)
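    For anyone wondering how a single missing line causes that: a minimal sketch of the pitfall, with hypothetical names (TestDB and dbo.TestTable are my inventions, not from the actual test):

    ```sql
    -- Without this USE, the session stays in its default database (often master),
    -- so the table - and all the pages the rebuild allocates - are created there.
    USE TestDB;  -- the forgotten line; TestDB / dbo.TestTable are hypothetical names
    GO

    CREATE TABLE dbo.TestTable
        (id      int        NOT NULL,
         payload char(3000) NOT NULL,
         CONSTRAINT PK_TestTable PRIMARY KEY CLUSTERED (id));
    GO

    -- An offline rebuild needs room for a second copy of the index while the
    -- old one is still in place - hence the "twice the space" requirement.
    ALTER INDEX ALL ON dbo.TestTable REBUILD;
    ```

    Run against a fixed-size 2 GB data file this would fail for lack of space; run accidentally in master (which can autogrow), it succeeds and just bloats master instead.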


    Hugo Kornelis, SQL Server/Data Platform MVP (2006-2016)
    Visit my SQL Server blog: https://sqlserverfast.com/blog/
    SQL Server Execution Plan Reference: https://sqlserverfast.com/epr/