TempDB when processing a cube

  • Hi

We are new to Analysis Services. We created two cubes and processed them with no problem, but we cannot process our third cube because the tempdb file runs out of space. Initially we were limited to 24 GB for the tempdb file; after a reformat, the last tempdb file filled 94 GB before it ran out of space.

    Any ideas on what the issue could be or on calculating the size of the tempdb file?

There are 16 million rows in the fact table and five dimensions. One difference from the other cubes is that this one has an extra dimension, and instead of using technical keys to link the tables we are using text strings.
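One way to see how much the text-string joins cost is to compare the average width of a text key against a 4-byte integer surrogate key, since every join and sort during processing carries that key through tempdb. A minimal sketch, using hypothetical table and column names (`FactSales`, `ProductName`):

```sql
-- Hypothetical names: measure the average byte width of a text join key
-- in the fact table and compare it with a 4-byte integer surrogate key.
SELECT
    AVG(DATALENGTH(ProductName)) AS avg_text_key_bytes,
    4                            AS int_surrogate_key_bytes
FROM FactSales;
```

If the text key averages, say, 40 bytes, every intermediate row carried through a join or sort is that much wider than it would be with integer keys, which multiplies out across 16 million rows.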

    Thanks

    Colin

After a long search I found the following document, which began to answer some of my questions and may prove useful to others.

    http://www.microsoft.com/technet/treeview/default.asp?url=/TechNet/ProdTechnol/SQL/Maintain/Optimize/ANSvcsPG.asp

Any other insight would be useful.

It may help if you create indexes on the text fields used in the joins.
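For example, the suggestion above would mean indexing the text column on both sides of each fact-to-dimension join. A sketch with hypothetical table and column names:

```sql
-- Hypothetical names: index the text join columns on both the
-- fact table and the matching dimension table, so the join during
-- cube processing can use an index instead of scanning and hashing.
CREATE INDEX IX_FactSales_ProductName
    ON FactSales (ProductName);

CREATE INDEX IX_DimProduct_ProductName
    ON DimProduct (ProductName);
```

Whether this reduces tempdb usage depends on the join plan the optimizer picks; hash and merge joins over wide text keys are a common source of tempdb spills.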

  • Hi Colin,

Have you optimised the model? You may or may not be able to do this with your current join strategy.

By optimising the model (where possible), the actual T-SQL query fired against the SQL database is usually just a table scan of the fact table, rather than the usual join of all the dimension tables with the fact table (which can require a lot of temp space, depending on the query).
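To illustrate the difference Steve describes, here is a hedged sketch (hypothetical table and column names) of the two shapes of processing query:

```sql
-- Unoptimised schema: processing joins every dimension table to the
-- fact table, and those joins can spill heavily into tempdb.
SELECT d1.ProductName, d2.RegionName, f.SalesAmount
FROM   FactSales f
JOIN   DimProduct d1 ON d1.ProductName = f.ProductName
JOIN   DimRegion  d2 ON d2.RegionName  = f.RegionName;

-- Optimised schema: the member keys are read straight from the fact
-- table, so processing is a single table scan with no joins.
SELECT f.ProductName, f.RegionName, f.SalesAmount
FROM   FactSales f;
```

The optimised form only works when the fact table already carries the dimension keys the cube needs, which is why the join strategy matters.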

    HTH,

    Steve


