How To Use Power BI Embedded For Your Customers

Recently I needed to build a Power BI report for a client.  This client has a multi-tenant database and their own custom web app.  They want to show each of their customers only that customer's data, without exposing any other customer's data, and they don't want to require their customers to have Power BI licenses.  Easy enough: just use Power BI Embedded with the App Owns Data model and set up roles (row-level security) in the Power BI Desktop file.  It should take less than five minutes to set up.  Well, we all know it took longer than five minutes or this blog post wouldn't exist.

There are some great tutorials from Microsoft out there; they even provide a sample app on GitHub that you can download to help you get started if, like me, you are so NOT a web developer.  There are some great blog posts about it too, one from Reza Rad and one from Kasper de Jonge.  So why am I writing yet another blog post about it?  Because the devil is in the details, and I completely missed the details, which means someone else probably has as well.

I don't want to repeat what others have already written, so go ahead and read their posts.  It's okay, I'll wait here.

Now that you're familiar with row-level security in Power BI, how do you make it work when you want to pass in your customer's identifier, given that your customers don't have Power BI accounts?  It seems like the only way to get dynamic row-level security is to use the Username() DAX function.  But wait, doesn't that require the user to have a Power BI account?  Sigh, it seems we are going in circles.

The one thing these articles don't talk about is that when you are using Power BI Embedded, you can pass in whatever you like for the EffectiveIdentity via the Power BI API, and that value is what the Username() function will return.  What?!  That's right, Username() doesn't look up a real Power BI account at all; it simply hands back whatever you give it.  WooHoo!

Let’s see an example, so it’s crystal clear.  I have a small sample of data that has some sales.

[Image: Embedded-Sample-Relationships]

Let’s say I want to limit the view of the data by Region.  Here are my available regions.

[Image: Embedded-Sample-Regions]

I would need to create a role based on Region that looks like this:

[Image: Embedded-Sample-ManageRoles]

Notice that I am using the Username() DAX function for Region; the table filter is simply [Region] = USERNAME().  I know, I know, my customers aren't Power BI users, so this makes no sense.  Why would I use this?  Like I said earlier, the value that you pass via the web API as the EffectiveIdentity is what the Username() function will return.

Now that I have my role set up in my Power BI Desktop file, I can publish the file to the Power BI service.  All that's left to do is add members to my role.  When using Power BI Embedded in this scenario, you only need to add one account to the role that was just created.  Just keep in mind that this account must have a Power BI Pro license.

Navigate to the Security settings for the dataset.

[Image: Embedded-Sample-Navigate-to-Security]

Add your user to the role.

[Image: Embedded-Sample-Add-Account-To-Role]

It should look like this now:

[Image: Embedded-Sample-Final-Role]

Now, let's look at the sample web app and how this will work.  In HomeController.cs there is an EmbedReport method that creates the EffectiveIdentity, and this is where the magic happens.

[Image: Embedded-Sample-Effective-Identity]
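To make that concrete, here is a trimmed-down sketch of what that EffectiveIdentity code does.  This is my own illustration, not the sample's exact code: it assumes the Microsoft.PowerBI.Api client library that the sample app uses (exact namespaces and signatures vary by SDK version), and the role name "Region", the method shape, and the variable names are all assumptions.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.PowerBI.Api.V2;          // version-dependent; newer SDK versions drop the .V2 suffix
using Microsoft.PowerBI.Api.V2.Models;

public class EmbedTokenBuilder
{
    // client            : an authenticated PowerBIClient (signed in as the master Pro account)
    // workspaceId       : the workspace (group) that holds the report
    // report            : the report being embedded
    // effectiveUserName : the value you want row-level security to see, e.g. "Midwest"
    public static async Task<EmbedToken> GetEmbedTokenAsync(
        IPowerBIClient client, string workspaceId, Report report, string effectiveUserName)
    {
        // Whatever goes into Username here is exactly what USERNAME() returns inside
        // the role filter -- it does not have to be a real Power BI account.
        var identity = new EffectiveIdentity(
            username: effectiveUserName,
            roles: new List<string> { "Region" },              // the role defined in the .pbix (name assumed)
            datasets: new List<string> { report.DatasetId });  // the dataset the role belongs to

        var tokenRequest = new GenerateTokenRequest(
            accessLevel: "view",
            identities: new List<EffectiveIdentity> { identity });

        // The embed token handed back to the browser is already scoped to the
        // identity above, so the user never sees rows outside their own data.
        return await client.Reports.GenerateTokenInGroupAsync(workspaceId, report.Id, tokenRequest);
    }
}
```

The sample app's View as a different user checkbox, which you'll see in a moment, simply feeds whatever you type into that username value.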

Let’s take a look at how this works from the front end.  Here’s my report with no security.

[Image: Embedded-Sample-No-Security]

Now I just check the View as a different user checkbox, enter my values, and then click Reload.

[Image: Embedded-Sample-Filtered-Result]

Presto change-o, the data is filtered to just the Midwest Region.  But wait, Midwest isn't a Power BI user, so how does that work?  Again, the EffectiveIdentity supplies the value that the Username() function returns, so the role filter applies whatever you have entered.  It's magic, I'm telling you.

One thing that tripped me up was that I wanted to filter by RegionID, not region name.  When we work with data, we want to be as precise as possible, so we typically use a primary key (or other unique identifier) to filter our data.  However, I discovered through trial and error that you cannot pass a number; you have to pass character data in order for the filter to work.  (The value you pass comes through as text, so comparing it against a numeric column doesn't work.)  This means you may need to make sure your data has some unique character field that identifies your customers, other than a numeric ID field.

Oh, and if you want to play around with this, you can use the sample app from GitHub and this Power BI Desktop file.

I hope this was helpful.

Update: 2-Feb-2019

Dave Ruijter asked me an interesting question via Twitter about converting the data to a string so you could, in effect, still use the numeric ID field.

Instead of relying on a separate character field, I added a calculated column to my dataset that uses the FORMAT() function to convert the existing numeric ID field to a string (something along the lines of FORMAT([RegionID], "0")).

[Image: Embedded-Sample-New-Computed-Column]

Then I based my role on that column, still using the Username() function.

[Image: Embedded-Sample-Using-RegionID-Role]

Now I can filter based on the ID field, WooHoo!

[Image: Embedded-Sample-Final-Using-RegionID]
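On the web app side nothing structural changes; the ID just has to go across as text.  Here is a minimal illustration, reusing the hypothetical helper from the earlier sketch (variable names are mine):

```csharp
// RegionID is numeric in the source data, but the effective identity is passed as text,
// so convert it to a string before requesting the embed token.
int regionId = 3;
EmbedToken token = await EmbedTokenBuilder.GetEmbedTokenAsync(
    client, workspaceId, report, regionId.ToString());
```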

Thanks Dave Ruijter for making me think about this a little bit more.

Don't Forget To Take a Break

So many times we get sucked into a rabbit hole when fixing or troubleshooting things that we forget to take a break.  More often than not this leads to "not seeing the trees through the forest" syndrome, as my grandfather used to call it.

Sadly this happens to me more than I like to admit.  I am tenacious when it comes to problem solving and I don't like to "give up".  However, the moment I hit that "I'm not going to let this problem beat me" mentality is when I most need to take a break.

I have found that if I quit focusing on the problem at hand and do something else, often not work related, for a few hours, then when I come back to the issue I solve it within 10-15 minutes.

Case in point, just yesterday I was troubleshooting a Power BI Data Gateway issue.  We had the Gateway installed and configured.  The Power BI Service could see and communicate with it so we were all set to create our first data source.

We entered all the information about the data source, and when we clicked the Add button, it came back with a failure.  It was a very generic error that didn't really tell us anything.  I searched the Power BI Community forums only to find that we had done everything correctly.  I was stumped, but I started down the path of "I'm not going to let this problem beat me!"  Right then is when I should have put it down and walked away.  But no, I kept doing the same things over and over again, expecting a different result (and we all know what that leads to: CRAZY town).

I was forced to take a break and go coach my swim team of 6-to-12-year-old kids.  I completely forgot about the problem for two hours while I coached and talked with my kids.  When I got home and sat down at my desk, I had an epiphany: check the Gateway logs!  Wow, why didn't I think of that before?!  I discovered that the gateway had gotten itself into a bad state, so I restarted the gateway service and was able to successfully create the data source using the data gateway.  That took less than ten minutes, sheesh.  If only I'd been forced to coach swim team earlier.

So the next time you encounter a problem and just can’t seem to figure it out, take a break.  Give your brain something else to focus on and you just may save yourself from a trip to CRAZY town.

Speaking at DataGrillen

After speaking at SQL Saturday Iceland last year, which is a smaller event that I absolutely loved (you can read about that adventure here), I decided I wanted to do another smaller event.  They are much more intimate and offer a better chance of engaging with attendees, sponsors, organizers and other speakers.

I heard about a smaller event called SQLGrillen that took place in Germany.  Their motto is "Databases, Bratwurst and Bier".  How cool is that?!  But alas, I missed the call-for-speakers deadline, so I kept my fingers crossed that the event organizers would put it on again this year, and lo and behold they did.  They've changed the name to DataGrillen, but it's still the same cool event.

I submitted a session and to my surprise and amazement, I was selected to speak.  I will be giving my Profiling Your Data session on Friday, June 21, 2019.  Now my German is very limited (Nein, Ja, Bitte, Bier & Bahnhof) but I’m still excited and I hope to see you there.

The event is already sold out, but they do have a wait list.  If you want to go, get your name on the list as soon as possible because it's FIFO.

Auf Wiedersehen for now.

Speaking at SQLBits

I got an early Christmas present this year: I found out I had been selected to speak at SQLBits!  That's what I call a gift that keeps on giving.

I have always wanted to attend SQLBits, so I decided that 2019 would be the year I would finally attend.  Since I had already decided to attend, I thought, "What the heck, why not submit a session?  I'm going to be there anyway."  But never in my wildest dreams did I expect to be selected.  I will be presenting my Profiling Your Data session.

It's been twenty years since I was in England and I am super excited to be going back.  I have family ties to England, so I added a few extra days for sightseeing.  Last time I was there I visited Malvern Link, home of the Morgan Motor Car Company, my dad's favorite auto manufacturer.  This time I am planning a quick trip over to Liverpool so I can see where my dad's favorite band (and one of mine) got started.  You may have heard of them; they're called The Beatles. ;)

Speaking at SQL Saturday Nashville

I am excited and honored to announce that I have been selected to speak at SQL Saturday Nashville on January 12, 2019.

I've been to Nashville before (in fact, I was just there last June for Music City Tech) and I am super excited to be going back.

I will be presenting my Profiling Your Data session.  If you're in the area and haven't registered yet, there are still seats available; you can register here.

Feel free to stop by and say, “Hi”, I’d love to see you.

Where to Store Index DDL

Recently I was asked my opinion, via Twitter, on where to store the index DDL for a single database that had multiple clients with varied usage patterns.  You can see the whole thread here.

It’s a great question and there were some interesting suggestions made.  My approach to this scenario is kind of a hybrid of all the suggestions and comments.

I've had to deal with this kind of thing in the past, and what I found worked best is to create a single file for each client that contains that client's specific DDL.  I wish I could take credit for this idea, but it wasn't mine; it belonged to a co-worker.  At first I resisted and thought it was a bad idea.  I mean really, mixing DDL for more than one object in a single script just seemed wrong and goes against every fiber of my OCD organizational self.  But in the end, this is what worked best in our environment.

Our initial thought was to include our index DDL with the table, but use variables to name the index objects that were specific to the client.  This way the index names would never collide, but then that kind of defeated the whole purpose of different indexing strategies for different clients.  Thankfully we scrapped that idea before we implemented it.

We tried creating a separate file for each table, with the client-specific DDL for that table in each file.  That became a nightmare when it came time to deploy and maintain; we had to build logic into our pre- and post-deployment scripts to handle it.

Then we tried separating the index DDL files out by client, so we ended up with a bazillion index DDL files for each table.  Okay, maybe not a bazillion, but it was a lot, and it was even more of a nightmare to maintain.

We settled on the approach I mentioned earlier: one DDL file per client that held all of that client's specific DDL, organized by client rather than by object.  We found it was much easier to maintain and deploy.  We defaulted each of our client-specific DDL files to be NOT included in the build.  When it came time to do a build/deploy for a specific client, we would set the option to include that client's file in the build.  We were not using continuous integration, so this may not work if that's what your shop is doing, or it may work with just a few tweaks to your process.  It did work for our situation, and it worked well.

I don’t think there is a single correct answer to this question.  Like almost everything else in our tech world, the answer is going to be, “it depends”.  Figure out what works in your environment and then go with it.  It might take a few trial and error runs to get it right, but you’ll figure out what works best over time with a little persistence.

I’d love to hear your thoughts on this.

Speaking At SQL Saturday DC

I am so excited to announce that I was selected to speak at SQL Saturday DC on December 8, 2018.

I will be presenting two sessions, What is Power BI? and Data Types Do Matter.  My Data Types Do Matter session is the same session that I presented at PASS Summit 2018, so if you couldn’t make it to PASS Summit this year, now’s your chance to see it.

If you’re in the Washington DC area on December 8, 2018, register for SQL Saturday DC and stop by and say, “Hello”.  I’d love to see you.