
Power BI Meets Programmability – TOM, XMLA, and C#


For anyone who read the title and immediately thought, “Oh no, I can’t do C#! Since when do I need to be an app dev to do Power BI?!”, never fear: I had the same panic when writing it, haha. I recently wrote another blog post that uses TMSL to accomplish a similar goal (TMSL Blog), but there are some added benefits to using TOM and C# instead of TMSL and SQL.

Recently, a client mentioned that they would like to update their Power BI model schema through a pipeline triggered by their application. Their application lets end users create custom UDFs (user defined fields) on the fly and also delete them. Normally, a Power BI developer would have to open the PBIX file in Power BI Desktop and refresh the data model there to pull in the new columns. However, we have another option using the XMLA endpoint, TOM, and C#.

To start, let’s define a couple key terms.

TOM = Tabular Object Model. TOM can be used from numerous scripting and programming languages to manipulate the data model. In this case, we are going to use C# so that the code can be called by a wider variety of applications.

TMSL = Tabular Model Scripting Language. TMSL can be used inside SQL Server Management Studio (SSMS) and is very easy to manipulate, but it does not lend itself well to C#-based applications and automation.

Limitations: You cannot export the PBIX file from the service once the XMLA updates have been made. For adding columns to the model, that’s not a big problem, since those will be picked up again the next time you open the report in Power BI Desktop and refresh. The problem comes if you create or edit visuals in the online service that you don’t want to overwrite in future iterations.

Tools needed:

  • Power BI Desktop, a Power BI Pro license, and access to publish to a Premium workspace
  • Visual Studio with .NET 5.0 and .NET Core installed
  • The Microsoft.AnalysisServices.NetCore.retail.amd64 NuGet packages (installed in step 7 below)

Notes:

  • Ensure you have a data source you can add columns to if you are following the example below
  • Save a copy of your PBIX report so you can make visual edits in the future. Once you edit a data model using the XMLA endpoint, you can no longer export it as a PBIX file from the online PBI service

Process:

  1. First, create and publish a Power BI report to the online service. There’s no need to add any visuals, but make sure you have at least one table whose columns you have access to edit so you can follow along with this demo. You will need a Power BI Pro license and access to publish to a Premium workspace.
  2. Next, add a column to your data source that does not currently exist in your Power BI report. For example, make a column in Excel or SQL called “New Column Test” with the letter “a” filled in for every row. I will make one called “Description” in my example.
  3. Unfortunately, the Power BI service does not refresh the schema, so it will not pull in the new column unless you open the report in Power BI Desktop, refresh there, and then republish. One way around this is to use the XMLA endpoint of the Premium workspace and add the column to the data model using the Tabular Object Model (TOM) in C#. Before we walk through each of those steps, keep in mind that doing this will prevent that Power BI dataset from ever being downloaded as a PBIX file again. So it’s best to keep a local copy of the PBIX file for any visual updates that need to be made, or simply use this dataset as a certified dataset shared across multiple reports.
  4. Open the premium workspace, select settings, and go to the “Premium” tab to copy the workspace connection.

5. Here comes the scary part, but hey, it’s October, the month for tackling our fears! So here we go: time to make a basic C# application. Open Visual Studio (ensure you have .NET 5.0 and .NET Core installed as well) and navigate to File –> New Project. Choose the Console Application template (it should be the top one), pick any name you’d like (e.g. PowerBI_TOM_Testing), select .NET 5.0 for your framework, then hit Create. Phew, you have your app, yay! Under the View tab, go ahead and select Solution Explorer and you should see it pop open on the right side of your screen.

6. Double-click on “Program.cs” to open your project. Now go to the Tools tab, then NuGet Package Manager, then Manage NuGet Packages for Solution. This is where we tell our application which packages of code we want to use.

7. Go to Browse and search for Microsoft.AnalysisServices.NetCore.retail.amd64; two options should pop up. Go ahead and hit “Install” for each of them. Once you’re done, double-check the install by hopping over to the Installed tab and making sure they are both there (be sure to clear your search first).

8. Go ahead and close this window, go back to the Program.cs tab, and let’s try out a script using our XMLA endpoint! Swap out PowerBI_TOM_Testing for whatever you named your project in step 5, and swap out powerbi://api.powerbi.com/v1.0/myorg/POC for the workspace connection you copied in step 4. You should see zero errors show up at the bottom. If not, double-check that you have all of the brackets and semicolons.

using System;
using Microsoft.AnalysisServices.Tabular;
namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main(string[] args)
        {
            // create the connect string
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/POC";
            string connectString = $"DataSource={workspaceConnection};";
            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);
            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine(database.Name);
            }
        }
    }
}

9. To run it and get back the datasets in your workspace, simply hit the green arrow at the top. A sign-in window will pop up, so sign in to your Power BI account and watch it go! Once the debugging window finishes running, you should see a list of all the datasets in your workspace.
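
Side note: since the end goal in this post is a pipeline triggered by an application, you may eventually want to skip the interactive sign-in entirely. Below is a minimal sketch of a standalone variant of the step 8 program that connects with an Azure AD service principal instead. The app ID, tenant ID, and secret values are placeholders for your own app registration, and this assumes the service principal has been added to the workspace and that XMLA endpoint and service principal access are enabled in your tenant settings.

using System;
using Microsoft.AnalysisServices.Tabular;
namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main(string[] args)
        {
            // placeholders - swap in the values from your own Azure AD app registration
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/POC";
            string appId = "<application-id>";
            string tenantId = "<tenant-id>";
            string appSecret = "<application-secret>";
            // service principal credentials are passed as User ID = app:<appId>@<tenantId> and Password = <secret>
            string connectString = $"DataSource={workspaceConnection};User ID=app:{appId}@{tenantId};Password={appSecret};";
            Server server = new Server();
            server.Connect(connectString);
            // same loop as before - list the datasets to confirm the connection works
            foreach (Database database in server.Databases)
            {
                Console.WriteLine(database.Name);
            }
        }
    }
}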

10. Okay time to add a column into the data model!

For this section, I am going to add some conditional logic so that the script knows what to do if the column already exists. Now fair warning, there’s also a bit of script that adds a measure for you as well. You can delete that section of code, or use it as a template for adding measures into your data model. For more example code, please check out the Power BI Development Camp (https://github.com/PowerBiDevCamp/Tabular-Object-Model-Tutorial/blob/main/Demos/Learning-TOM/Learning-TOM/DatasetManager.cs).

Notes are included as comments in the code below.

Important note: the SaveChanges() command has to come AFTER the refresh request. If you put the refresh after SaveChanges(), you will end up with a column that has zero data in it.

Here’s the full script, including the piece that adds a measure. Please feel free to use the additional resources below for more examples and assistance. Paste pieces of the code into Visual Studio and enjoy watching your data magically appear in your data model.

using System;
using Microsoft.AnalysisServices.Tabular;
namespace PowerBI_TOM_Testing
{
    class Program
    {
        static void Main()
        {
            // create the connect string
            string workspaceConnection = "powerbi://api.powerbi.com/v1.0/myorg/POC";
            string connectString = $"DataSource={workspaceConnection};";
            // connect to the Power BI workspace referenced in connect string
            Server server = new Server();
            server.Connect(connectString);
            // enumerate through datasets in workspace to display their names
            foreach (Database database in server.Databases)
            {
                Console.WriteLine($"ID : {database.ID}, Name : {database.Name}, CompatibilityLevel: {database.CompatibilityLevel}");
            }
            // enumerate through tables in one database (use the database ID from previous step)
            Model model = server.Databases["bb44a290-f82c-4ec3-a510-e9c1a9a28af2"].Model; 
            
            //if you don't specify a database, it will only grab models from the first database in the list
            foreach (Table table in model.Tables)
            {
                Console.WriteLine($"Table : {table.Name}");
            }
           
            // Specify a single table in the dataset
            Table table_product = model.Tables["Product"];
            
            // List out the columns in the product table
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
             }
            // Adding our column if it doesn't already exist
            if (table_product.Columns.ContainsName("Testing")) //this looks to see if there is a column already named "Testing"
            {
                Console.WriteLine($"Column Exists");
                table_product.Columns.Remove("Testing"); //if the column exists, this will remove it
                Console.WriteLine($"Column Deleted");
                Column column_testing = new DataColumn() //this will add back the deleted column
                {
                    Name = "Testing",
                    DataType = DataType.String,
                    SourceColumn = "Description"
                };
                table_product.Columns.Add(column_testing);
                Console.WriteLine($"Column Created!");
            }
            else
            {
                Column column_testing = new DataColumn() //this will add the column
                {
                    Name = "Testing",  //name your column for Power BI
                    DataType = DataType.String, //set the data type
                    SourceColumn = "Description" //this must match the name of the column in your source
                };
                table_product.Columns.Add(column_testing);
                Console.WriteLine($"Column Created!");
            }

            // List out the columns in the product table one more time to make sure our column is added
            foreach (Column column in table_product.Columns)
            {
                Console.WriteLine($"Columns: {column.Name}");
            }
            // Add a measure if it doesn't already exist in a specified table called product
            if (table_product.Measures.ContainsName("VS Test Measure"))
            {
                Measure measure = table_product.Measures["VS Test Measure"];
                measure.Expression = "\"Hello Again World\""; //you can update an existing measure using this script
                Console.WriteLine($"Measure Exists");
            }
            else
            {
                Measure measure = new Measure() 
                {
                    Name = "VS Test Measure",
                    Expression = "\"Hello World\"" //you can also use any DAX expression here
                };
                table_product.Measures.Add(measure);
                Console.WriteLine($"Measure Added");
            }
 
            table_product.RequestRefresh(RefreshType.Full);
            model.RequestRefresh(RefreshType.Full);
            model.SaveChanges();
        }
    }
}
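
And that’s the core pattern. If you are wiring this into the application-triggered pipeline from the start of the post, you will probably want the column logic wrapped up in something reusable rather than living in Main. Here is a minimal sketch of what that could look like; the SchemaUpdater class, the EnsureColumn method name, and its parameters are just illustrative, not part of the script above.

using Microsoft.AnalysisServices.Tabular;

namespace PowerBI_TOM_Testing
{
    static class SchemaUpdater
    {
        // adds a string column to the named table if it is not already in the model,
        // then refreshes and saves so the new column comes back populated
        public static void EnsureColumn(Model model, string tableName, string columnName, string sourceColumn)
        {
            Table table = model.Tables[tableName];
            if (!table.Columns.ContainsName(columnName))
            {
                table.Columns.Add(new DataColumn()
                {
                    Name = columnName,
                    DataType = DataType.String,
                    SourceColumn = sourceColumn
                });
            }
            // refresh first, then save - otherwise the column exists but holds no data
            table.RequestRefresh(RefreshType.Full);
            model.SaveChanges();
        }
    }
}

From the Main method above you would then call something like SchemaUpdater.EnsureColumn(model, "Product", "Testing", "Description"); for each UDF the application creates.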

Additional Resources:

  • Power BI Development Camp – Tabular Object Model Tutorial: https://github.com/PowerBiDevCamp/Tabular-Object-Model-Tutorial/blob/main/Demos/Learning-TOM/Learning-TOM/DatasetManager.cs