Today I’ll be continuing a blog series designed to help you decide which presentation tool is best for your reporting needs. So far in previous posts I’ve discussed:
You learned how these two tools are almost opposites of each other. Reporting Services is generally thought of as a static reporting tool, while Excel is usually used for ad-hoc reporting. The products still left to cover are:
Part three of the series will focus on PowerPivot.
PowerPivot is much more powerful than any of the other reporting tools that will be discussed in this series because it does much more than just produce reports. While it does have the ability to create some impressive reports, it really cannot create any additional visualizations beyond what regular Excel PivotTables can already do.
The really impressive part of PowerPivot is its modeling layer. With PowerPivot you actually design a model that brings in whatever objects you choose. I purposely use the word objects generically here because you can import just about any data source you can think of into PowerPivot. Whether you need to bring in a table from SQL Server, a flat file, or even something from DB2, it can all be done very easily with PowerPivot. In fact, even if I needed to bring all three of those objects into the same PowerPivot document and relate them to each other, that is possible too. Within the tool you have the ability to create logical relationships that may not exist in the source systems. The view below shows how the new PowerPivot 2012 has improved the way relationships are designed. Previously, designing these relationships was done through a single dialog box instead of the new graphical diagram view shown here.
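To make the idea of a "logical relationship" concrete, here is a minimal sketch in plain Python (the source names and sample rows are entirely made up for illustration, not PowerPivot internals): rows arriving from three different systems are related on shared columns even though no foreign key exists in any of the sources.

```python
# Hypothetical rows as they might arrive from three different sources.
sql_customers = [                      # e.g. a SQL Server table
    {"customer_id": 1, "name": "Contoso"},
    {"customer_id": 2, "name": "Fabrikam"},
]
flat_file_orders = [                   # e.g. a CSV flat file
    {"order_id": 100, "customer_id": 1, "region_code": "E"},
    {"order_id": 101, "customer_id": 2, "region_code": "W"},
]
db2_regions = [                        # e.g. a DB2 table
    {"region_code": "E", "region": "East"},
    {"region_code": "W", "region": "West"},
]

# Build lookups on the shared columns -- these play the role of the
# logical relationships you would draw in the PowerPivot diagram view.
customers_by_id = {c["customer_id"]: c for c in sql_customers}
regions_by_code = {r["region_code"]: r for r in db2_regions}

# Resolve both relationships for each order row to produce report rows.
report_rows = [
    {
        "order_id": o["order_id"],
        "customer": customers_by_id[o["customer_id"]]["name"],
        "region": regions_by_code[o["region_code"]]["region"],
    }
    for o in flat_file_orders
]

print(report_rows[0])  # {'order_id': 100, 'customer': 'Contoso', 'region': 'East'}
```

The point is simply that the relationships live in the model you build, not in the source systems themselves.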
PowerPivot uses a fairly new in-memory technology called xVelocity (formerly called VertiPaq) to handle all of the report processing requests. With xVelocity, Excel is able to process hundreds of millions of rows with amazing response times from a desktop machine. The xVelocity engine uses in-memory column-oriented storage and highly compressed data to produce the results you see today.
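A rough sketch of why column-oriented storage compresses so well (this is plain Python for illustration, not the actual xVelocity engine): when a column has few distinct values, dictionary encoding turns a million repeated strings into a tiny dictionary plus small integer ids.

```python
def dictionary_encode(column):
    """Map each distinct value to a small integer id and encode the column."""
    ids = {}
    encoded = []
    for value in column:
        if value not in ids:
            ids[value] = len(ids)      # assign the next unused id
        encoded.append(ids[value])
    return ids, encoded

# A low-cardinality column, as region or product columns usually are.
region_column = ["East", "West", "East", "East", "West"] * 200_000

ids, encoded = dictionary_encode(region_column)
print(len(region_column), len(ids))  # 1,000,000 values -> a 2-entry dictionary
```

Storing values column by column is what makes runs of repeated values like this common in the first place, which is why the engine can hold so many rows in memory.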
Another key thing to understand about PowerPivot is that the data pulled into PowerPivot is actually stored in the document when an import is performed. That is what allows it to use the storage engine described earlier. The one problem with this is that the data is static until a manual update is kicked off. When the update runs, all of the data in the PowerPivot document is reloaded back into the file. Unfortunately, there is no incremental update yet, so if you have a significantly large data source, it may take a while to update. Later, when I discuss how PowerPivot can be consumed, I’ll talk about how SharePoint can assist in automating this data refresh process. SharePoint is really the true way to scale out PowerPivot so other users can utilize the reports you build.
One worry that many IT staffers have about PowerPivot is that power users will begin creating these documents and making decisions off of them, while other users have created simpler but different documents and get different results. If you see something like this happening in your environment, which is completely possible, you could create a full Analysis Services solution (either tabular or multidimensional) to replace the PowerPivot documents and ensure all these users are consuming a single source for their reports.
It is also important to note that PowerPivot is not a replacement for any ETL that performs data cleansing or applies business rules. Most data warehouses have a set of ETL processes that perform these tasks and that way all users are looking at the same data set. PowerPivot does not eliminate the need for this.
The goal is to have power users be the driving force behind the creation of PowerPivot documents. This would be a person who understands the source database(s), the business needs, the concept of database relationships, and Excel, all fairly well.
While that may be the goal for who should be using PowerPivot, that is not what I actually see happening in the field. I am still seeing most PowerPivot implementations being completely controlled by IT.
Over the last year I have seen this start to change so that it does become more user driven, but I think the problem preventing more users from getting their hands on the tool is education. A lot of the clients I visit either don’t know what PowerPivot is, or if they do, they’ve never been taught (even in a simple demo) how it works. So until this changes we may see a lot of PowerPivot solutions started in IT.
PowerPivot can be consumed in two ways: either directly through Excel or through SharePoint. PowerPivot documents that are used directly through Excel rely completely on the resources of the machine they are viewed from. So, for example, if I create and use a PowerPivot document on my laptop, then it uses all the resources of my laptop for importing data and processing results. If I wanted others to see this report using this method, I would have to either place the file on a shared drive or email it to those I want to view it. That sounds terrible!
The best way to scale PowerPivot so that it is usable by a larger number of users is to set up SharePoint integration. When PowerPivot for SharePoint is installed, any reports viewed from a PowerPivot Gallery (a SharePoint library for PowerPivot) run using a special Analysis Services instance to do all report processing rather than your laptop's resources as previously described. You may have noticed going through the SQL Server 2008 R2 install that there is an option to install SQL Server PowerPivot for SharePoint. Using this SharePoint integration not only allows you to use Analysis Services for report processing but also allows you to schedule data refreshes, which is a huge help because data refreshes are normally a manual process without SharePoint.
The major known limitations with PowerPivot are experienced when SharePoint is not part of the solution. All of these have been detailed previously, but as a reminder:
PowerPivot for SharePoint of course addresses each of these limitations. One other limitation that I have not yet detailed is working with 64-bit vs. 32-bit PowerPivot. I highly recommend that if you use PowerPivot, you only do it with the 64-bit version. You will find out very quickly when you begin importing data sources into your document that without 64-bit the process can be slow and painful. You might even run into limitations on the amount of data it will import on a 32-bit instance. With 64-bit PowerPivot the sky is the limit, though!
As we go through this series remember these high level characteristics about PowerPivot:
I hope you’ve found this helpful, and stay tuned for the next part in this series on PerformancePoint. To read any of the other parts of this series, follow the links below.