July 10, 2003 at 2:00 am
Hi
We are new to Analysis Services. We created two cubes and processed them with no problem. However, we cannot process our third cube because the tempdb file runs out of space. Initially we were limited to 24 GB for the tempdb file; after a change of format, the tempdb file filled 94 GB before it ran out of space.
Any ideas on what the issue could be, or on how to calculate the required size of the tempdb file?
There are 16 million rows in the fact table and five dimensions. One difference from the other cubes is that this one has an extra dimension, and instead of using technical (surrogate) keys to link the tables we are using text strings.
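For context, the difference can be sketched as below. The table and column names are hypothetical, not the actual schema. Joining on wide text strings inflates the intermediate rows SQL Server has to sort and hash (often in tempdb), whereas narrow integer surrogate keys keep those results small:

```sql
-- Hypothetical schema: joining on a text string (wide join key)
SELECT f.sales_amount, d.product_name
FROM fact_sales AS f
JOIN dim_product AS d
  ON d.product_name = f.product_name;   -- e.g. a VARCHAR(100) key

-- The same join on an integer surrogate (technical) key
SELECT f.sales_amount, d.product_name
FROM fact_sales AS f
JOIN dim_product AS d
  ON d.product_key = f.product_key;     -- e.g. an INT key, 4 bytes
```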
Thanks
Colin
July 10, 2003 at 10:25 am
After a long search I found the following document, which began to answer some of my questions and may prove useful to others.
Any other insight would be welcome.
July 10, 2003 at 1:31 pm
It may help if you create indexes on the text fields used in the joins.
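A sketch of what that might look like; the table and column names are placeholders, assuming the text strings are the join keys:

```sql
-- Index the text columns used as join keys (names are placeholders)
CREATE INDEX ix_fact_sales_product_name
    ON fact_sales (product_name);

CREATE INDEX ix_dim_product_product_name
    ON dim_product (product_name);
```

With indexes on both sides of the join, the optimiser can use an index-based join strategy instead of hashing or sorting the full tables, which reduces pressure on tempdb.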
July 27, 2003 at 1:44 am
Hi Colin,
Have you optimised the model? You may or may not be able to do this, depending on your current join strategy.
When the model is optimised (where possible), the T-SQL query fired against the SQL database during processing is usually just a scan of the fact table, rather than the usual join of all the dimension tables with the fact table (which can require a great deal of temp space, depending on the query).
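To illustrate the point above, here is roughly the shape of the two queries that could be generated during processing; the schema and column names are hypothetical:

```sql
-- Unoptimised: every dimension table is joined to the fact table,
-- which can spill large intermediate results into tempdb
SELECT d1.customer_name, d2.product_name, SUM(f.sales_amount)
FROM fact_sales AS f
JOIN dim_customer AS d1 ON d1.customer_key = f.customer_key
JOIN dim_product  AS d2 ON d2.product_key  = f.product_key
GROUP BY d1.customer_name, d2.product_name;

-- Optimised model: the dimension member keys are read straight
-- from the fact table, so the query is a single table scan
SELECT f.customer_key, f.product_key, SUM(f.sales_amount)
FROM fact_sales AS f
GROUP BY f.customer_key, f.product_key;
```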
HTH,
Steve