SSAS tabular model view

  • Hello, I have a SQL database where we have created some views based on dim and fact tables. I need to build an SSAS tabular model on top of my tables and views, but one of the views takes 1.5 hours to run as a SQL query (in SSMS). I need to use this same view to build my SSAS tabular model, and 1.5 hours is not acceptable. The view is made up of a lot of table joins. So: 1) if I bring all the tables used in this view into my SSAS tabular model, I am not sure how to join them all, apply the WHERE clauses inside SSAS, and build something similar to my view. Can you help me with that? Or 2) is there a better, easier way to achieve this? Thanks.

  • IMO, the tabular model should be kept simple. You want it to be easily consumed by BI users without a ton of modeling transformations. I would look hard at the execution plan to see why the view is so slow. Start there. What do you find? After you check that, let me know; there are options to make it faster.
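
    A minimal way to capture that in SSMS (the view name dbo.vw_FactSales below is just a stand-in for yours):

        -- In SSMS, turn on "Include Actual Execution Plan" (Ctrl+M) before running.
        SET STATISTICS IO ON;
        SET STATISTICS TIME ON;

        -- Forces a full pass over the view so the plan and stats reflect the real cost.
        SELECT COUNT(*) FROM dbo.vw_FactSales;

        SET STATISTICS IO OFF;
        SET STATISTICS TIME OFF;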

  • Thanks a lot @heb1014. When I look at the view, it joins around 10 tables, and one of those tables has 3 billion rows and is partitioned. It has only one clustered index. The other tables are not large. The view returns about 1.9 million rows. I am not sure about best practices around partitioned tables, how to improve performance, etc.

  • That's a decent size table 😉

    I would set aside any intimidation about such a large table and press forward with tuning the query. I'd probably start by getting a restored copy of the database on a server where I could do anything I wanted. I don't want my work to interfere with other workloads, and I want to be able to change things without any concern.

    Having a single clustered index is probably an issue.  Maybe the main application doesn't need more than that, but other queries (like those loading Tabular models) likely do.
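
    A quick way to see what is on the big table today is a catalog query like the one below (dbo.FactBig is a stand-in for your table name):

        -- Lists every index on the table and the row count per partition.
        SELECT  i.name AS index_name,
                i.type_desc AS index_type,
                p.partition_number,
                p.rows
        FROM    sys.indexes AS i
        JOIN    sys.partitions AS p
          ON    p.object_id = i.object_id
         AND    p.index_id  = i.index_id
        WHERE   i.object_id = OBJECT_ID(N'dbo.FactBig')
        ORDER BY i.index_id, p.partition_number;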

    After you're sure your work won't impact others, you should run the query and look at the execution plan and query stats.  Just off the top of my head, I'd start small:

    • Maybe get the estimated execution plan and investigate that.
    • Run a TOP 1000 query and look for it to finish fast (sketched after this list).
    • Take the big table out of the query and see whether it then runs fast, to confirm the big table is the problem.
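
    A rough sketch of the second and third bullets (again, dbo.vw_FactSales stands in for your view):

        -- Second bullet: a small slice should come back quickly if the plan is sane.
        SELECT TOP (1000) *
        FROM   dbo.vw_FactSales;

        -- Third bullet: paste the view's own SELECT statement here, then comment out
        -- the join to the 3-billion-row table (and the columns it supplies) to see
        -- whether that table is what drags the whole query down.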

    Main point: keep working at it until you make it faster. I've seen very small adjustments (like adding a filtered non-clustered index) yield MASSIVE improvements.
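
    As an example of that last point, a filtered non-clustered index is just a regular index with a WHERE clause; all names below are hypothetical:

        -- Hypothetical: index only the slice of rows the view actually reads, and
        -- INCLUDE the returned columns so the query can be answered from the index alone.
        CREATE NONCLUSTERED INDEX IX_FactBig_Recent
        ON dbo.FactBig (DateKey, CustomerKey)
        INCLUDE (SalesAmount, Quantity)
        WHERE DateKey >= 20230101;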
