
TechEd 2010 – BI Keynote Part 2


In the customer demo, we saw a few reports built with PowerPivot and Report Builder 3.0 that are helpful to business people in their hospital. The mapping control in RB3 is nice, and the customer said they built the report in minutes.

There are tremendous features and capabilities in Reporting Services for SQL Server, and you can have end users handle a lot of their own needs. However, you will need to provide them guidance so that they aren't just playing with reports. I can see teaching people how to use reports, helping them schedule data refreshes, and helping them find data becoming important skills for data pros.

The Alpha Geek Challenge, with Donald Farmer. It's always a highlight of keynotes for me to see Donald. He ran this challenge, and they collected cool PowerPivot workbooks from various geeks.

The most interesting analysis award went to Dan Comingore, who used AdventureWorks for an employee morale analysis.

Dan English won for the most interesting data set, analyzing flight information from attendees of the last BI conference.

The third winner was Brian Fosse, for most interesting visualization, showing a circular graph: a radar chart.

The winners show some very cool capabilities of Excel, but I wouldn’t show them to end users as they might spend too much time actually reformatting their data, much like some people do with Word.

The overall winner was Brian Fosse, who is actually a business person. That’s the point of these technologies according to Donald and Ted.

Looking to the Future

No commitments here, but these are ideas on their minds. Real, or vaporware to gauge response? Who knows.

The cloud is something they're thinking about, and it's a good analogy for what is being done here. End users are using data and building applications, and someone else manages the data. That's what happens in cloud computing. Not a bad analogy, but is it something that we want? I wonder.

The idea is to provide all the capabilities of SQL Server in SQL Azure, including reporting and analytics. Makes sense. There's no timeline, but I suspect we'll see something here over the next year.

The consumerization of IT. Things that used to be consumer-only (search, social media features, etc.) are making their way into IT, at least in MS products. I wonder whether these are good or bad for business, as some of them can end up being time sinks of their own.

Compliance is a big deal. In terms of BI, this means (to some extent) that data quality needs to be high. But what is the correct data? At least for reference data, Master Data Management (MDM) is designed to fill this need. It's not a bad solution, and I saw an interesting session on it yesterday.

Do BI people think about lineage? I haven't, but it can be important.

Dependencies are also an issue, especially as we start to share reports and build on other reports. I can see this being very important, and potentially a problem in SharePoint 2010.

Data volumes are growing, both in size and in variety: more sources, more types, and more data in absolute terms. Parallel Data Warehouse will come out this year, allowing 100TB+ warehouses. CTP2 is complete in that area, so they are working to finish it as a product, with reference configurations from vendors.

Project Dallas, a data marketplace, looks cool. You can get public and commercial data sets from it. That's worth checking out, and using where you can.

"Stuff in code", "hot off the developer machine", glimpses of what's coming. Amir Netz showed a few things; he's also one of the better speakers I've enjoyed over the years.

Amir showed an application authored by a VP at MS, looking at accounts and sales. It has a waterfall chart for examining how the various units, managers, and salespeople are doing. Names and $$ were changed, but not bad. However, the VP wanted something else, so the BI people went to redesign it.

They changed the report to add the account person's image and to change the background color to indicate the account status. Interesting idea, and then it was extended to show everyone's image. It uses a query, which is something that I think data pros will be writing.

It's a fun way to examine data, but it still seems to require the person using the data to drill in and out of the data. It would be nice to quickly and easily do some comparisons, like developers can do when they check code in and out and compare two versions of the code. It would be great to be able to compare two reports easily, especially at two levels.

This technology, with the tile maps, should be available later this month or next month.

PowerPivot shipped without KPI features, but since it's built on Analysis Services, they are there. So they have been working on adding these, and they will be available soon, without requiring MDX skills.

Amir also showed off a record view for a complex, wide table. Instead of seeing a long row, or a partial row, you see more of a report view: a single row transposed into a multi-row record that fits on one screen, with labels for the values. I can see this being useful in some ways.
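Just to illustrate the idea (this is my own sketch, not how PowerPivot implements it), a wide row gets much easier to read once it's transposed into label/value pairs:

```python
# Sketch of the "record view" idea: turn one wide row into a labeled,
# multi-line record. The column names and values here are made up.
row = {"AccountId": 1042, "AccountName": "Contoso", "Region": "West",
       "Owner": "J. Smith", "Status": "At Risk", "YTD Sales": 1_250_000}

for label, value in row.items():
    print(f"{label:>12}: {value}")
```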

There are some new capabilities to actually edit and program PowerPivot sheets in BIDS. There are some more developer-oriented extensions, perhaps making it easier to build complex calculations or reports for end users.

Sometimes 100 million rows aren't enough. So in BIDS, Amir showed us a connection to a real SSAS instance. We saw 2 billion rows of data being manipulated in PowerPivot (in BIDS). The sorting seems to work just as fast, and the same goes for filtering. Amir said that this "is beyond wicked fast. It's the engine of the devil."

He then showed a larger data set, with refreshes, and an extrapolated scan rate of 2 trillion rows/sec.
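To put 2 trillion rows/sec in perspective, here's a quick back-of-envelope calculation (my arithmetic, not a figure from the demo):

```python
# Back-of-envelope: how long a scan takes at the quoted rate.
scan_rate_rows_per_sec = 2e12   # the extrapolated figure from the keynote
demo_rows = 2e9                 # the 2 billion-row set shown earlier

seconds = demo_rows / scan_rate_rows_per_sec
print(f"{seconds * 1000:.0f} ms to scan 2 billion rows")  # ~1 ms
```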

That is pretty amazing, and I'm sure it is great for most companies, many of which have much smaller data sets.
