Sharing Microsoft Fabric Data With External Users

Merry Christmas from That Fabric Guy! I’m wearing my warm sweater because it’s getting cold outside, and it’s Christmas time.

Christmas is all about sharing. And you know what else is all about sharing? Data.

Today we’re going to learn how we can use Microsoft Fabric to share data with external parties. Let’s dive in.

The Scenario

We have a tenant that is the provider tenant. This provider tenant contains a lakehouse, and the lakehouse contains data. The data needs to be shared with one or multiple external parties, and these parties want to consume the data using Power BI or other applications.

Today we’re going to create a link between the two tenants—one containing our lakehouse, the other containing their semantic model, with Entra ID guest user access in between.

Important Things to Be Aware Of

In this post, I’m going to show you how to set this up using import mode functionality. Import mode means there is a copy of the data on the consuming tenant.

So we need to keep in mind: they are copying data out of the provider lakehouse into the consumption semantic model in Power BI.

We need to be very strict about row-level security or maybe having a customer-specific lakehouse that they can read data out of. That is very important.
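To make "being strict about row-level security" a bit more concrete, here is a minimal sketch that generates the T-SQL for a security policy on the provider side. Everything in it is a made-up example: the table `dbo.Sales`, the `CustomerTenant` column, and the idea that the column holds an Entra ID group name checked with `IS_MEMBER`. Adapt the names to your own model before running anything.

```python
# Sketch: generate the T-SQL statements for a row-level security policy
# that filters a hypothetical dbo.Sales table by Entra ID group
# membership. All object names are placeholders for illustration.

def rls_script(table: str, filter_column: str) -> list[str]:
    """Return T-SQL statements (run one by one) that restrict rows in
    `table` to users who are members of the group named in `filter_column`."""
    return [
        "CREATE SCHEMA Security;",
        """CREATE FUNCTION Security.fn_group_filter(@GroupName AS varchar(128))
    RETURNS TABLE
    WITH SCHEMABINDING
AS
    RETURN SELECT 1 AS fn_result
    WHERE IS_MEMBER(@GroupName) = 1;""",
        f"""CREATE SECURITY POLICY Security.GroupFilter
    ADD FILTER PREDICATE Security.fn_group_filter({filter_column})
    ON {table}
    WITH (STATE = ON);""",
    ]

if __name__ == "__main__":
    for stmt in rls_script("dbo.Sales", "CustomerTenant"):
        print(stmt, end="\n\n")
```

The alternative mentioned above—a customer-specific lakehouse per external party—avoids the predicate logic entirely, at the cost of maintaining a copy pipeline per customer.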

Setting Up the Provider Tenant

I have two web browsers side by side. On the left side (in green) is my Kimura tenant—the Fabric environment for my company. On the right side (in dark mode) is the Azure portal of the KDF2.io tenant, which is also a Kimura tenant but an external one we use for testing purposes.

We’ll assume today that KDF2 will be the provider tenant—the one that contains the actual data.

Step 1: Create Guest Users in Entra ID

The first thing we need to configure is a guest user on the provider tenant: a user from the consuming tenant is invited and becomes a guest user in the provider tenant.

What we have on the screen is a guest user called “Bas Land” from Kimura. The user type says “member”, but this is actually an external user.

How can we tell the account is external when the user type is member? The user principal name gives it away: it starts with bas_kimura.nl—the email address bas@kimura.nl with the @ replaced by an underscore—and contains the #EXT# marker (the “pound signs”) that Entra ID appends to invited external accounts. The email attribute still reads bas@kimura.nl.

What you need to do: add all the users who are going to consume data to your own tenant. Invite them as guest users.

What I’ve done is change the user type from guest to member. They’re still an external user, but they get member rights instead of guest rights. This is not something you necessarily need to configure—a guest user is perfectly fine for this scenario.
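If you have many consumers to invite, you don’t have to click through the portal for each one: Microsoft Graph has a B2B invitation endpoint. Below is a sketch of what that call looks like. The access token, email address, and redirect URL are placeholders, and the caller needs the `User.Invite.All` Graph permission.

```python
# Sketch: invite an external user as a guest via Microsoft Graph.
# Token, email, and redirect URL below are placeholders.
import json
from urllib import request

GRAPH_INVITE_URL = "https://graph.microsoft.com/v1.0/invitations"

def build_invitation(email: str, redirect_url: str) -> dict:
    """Request body for Graph's B2B invitation endpoint."""
    return {
        "invitedUserEmailAddress": email,
        "inviteRedirectUrl": redirect_url,
        "sendInvitationMessage": True,
    }

def invite_guest(token: str, email: str, redirect_url: str) -> bytes:
    """POST the invitation; returns Graph's JSON response as bytes."""
    req = request.Request(
        GRAPH_INVITE_URL,
        data=json.dumps(build_invitation(email, redirect_url)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # performs the actual invite
        return resp.read()

if __name__ == "__main__":
    # The real call needs a valid token; here we only show the payload.
    print(build_invitation("bas@kimura.nl", "https://app.fabric.microsoft.com"))
```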

Step 2: Create Security Groups

Going back into Entra ID for KDF2, we have security groups. We have a group called “KDF2 Developers”.

This group contains my external account: “Bas” is an internal user, while bas@kimura.nl is an external (guest) user, and so on.

This is basically all we need to set up. Strictly speaking, setting up the security group is not necessary, but because we want to do things the nice way, we’re not going to put individual users on all kinds of access and allow lists. We’re going to do that through groups.
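Group membership can also be scripted through Microsoft Graph, which is handy once you onboard guests in bulk. The sketch below builds the member-add request; the group and user object IDs and the token are placeholders, and the caller needs the `GroupMember.ReadWrite.All` permission.

```python
# Sketch: add an invited guest to a security group via Microsoft Graph.
# Object IDs and the token are placeholders for illustration.
import json
from urllib import request

def member_add_request(group_id: str, user_id: str,
                       token: str) -> request.Request:
    """Build the POST that adds a directory object to a group."""
    url = f"https://graph.microsoft.com/v1.0/groups/{group_id}/members/$ref"
    body = {"@odata.id":
            f"https://graph.microsoft.com/v1.0/directoryObjects/{user_id}"}
    return request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = member_add_request("<group-object-id>", "<guest-object-id>", "<token>")
    print(req.full_url)  # request.urlopen(req) would perform the add
```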

Step 3: Grant Access in Microsoft Fabric

Now what we have to do in the same provider tenant is go into Microsoft Fabric. We have a lakehouse called “Internal Data Warehouse” and a SQL Analytics Endpoint with the same name. We’re going to use this SQL Analytics Endpoint.

Now what we have to do here on the Fabric portal is give access to the objects themselves.

Mind you, in this scenario, the security group “KDF2 Developers” is actually a workspace member. If you want your users to be able to log into Microsoft Fabric—into the portal—and see the workspace with all its contents, then this is the way to go.

If you do not want to do that, you can give access to just the data, not the portal.

For this scenario, both will work fine. You have to think about: do we want the external user to log into the portal or just connect to the data?

If we want them to go just to the data without actually logging into your Fabric environment, then we find the SQL Analytics Endpoint (called “Internal DW” in this scenario), click on “More options”, and go to “Manage permissions”.

In the manage permissions tab, we can add users or security groups and choose what they may do. Usually what you want is only “Read” and “ReadData”: “Read” lets the principal connect to the endpoint, and “ReadData” lets them query all the tables through it. You need both for this to work.
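If you want tighter control than “read everything on the endpoint”, the SQL analytics endpoint also accepts standard T-SQL grants, so you can scope the group to specific schemas instead of using ReadData. A small sketch that builds such statements (the group and schema names are just the examples from this post):

```python
# Sketch: granular T-SQL grants as an alternative to the item-level
# ReadData permission. Principal and schema names are examples.

def grant_statements(principal: str, schemas: list[str]) -> list[str]:
    """GRANT SELECT on each schema to an Entra ID user or group."""
    return [f"GRANT SELECT ON SCHEMA::{s} TO [{principal}];" for s in schemas]

if __name__ == "__main__":
    for stmt in grant_statements("KDF2 Developers", ["dbo"]):
        print(stmt)
```

Run the generated statements against the endpoint as a workspace admin; the principal still needs the item-level “Read” permission to connect.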

Setting Up the Consumer Tenant

In the end, how this external data share works: we have access to the SQL Analytics Endpoint so the user can query that SQL endpoint.

Step 4: Create a New Semantic Model

We go to “New item”, click “All items”, and go for “Semantic model”. We create a new semantic model by clicking on “Get data”.

We look for SQL. Whether we pick “SQL Server database” or “Azure SQL database”, it doesn’t matter—it’s the same connection.

From here, we’re going to put in the server name. We go to the right side of the screen where we see “Internal Data Warehouse” and copy the SQL connection string.

When doing that, we get this connection string. We can copy it and paste it on the left side of the screen again. You see that this connection string ends at datawarehouse.fabric.microsoft.com, which means we’re looking at a SQL Analytics Endpoint. This is basically the server we’re working with.
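For consumers who want to hit the endpoint from code rather than Power BI, the same server name works in an ordinary ODBC connection string. The sketch below assembles one; the server name shown is a made-up placeholder (copy your own from the “SQL connection string” dialog), and the commented pyodbc part assumes that library is installed.

```python
# Sketch: connection string for a Fabric SQL analytics endpoint using
# interactive Entra ID sign-in. The server name is a placeholder.

def odbc_connection_string(server: str, database: str) -> str:
    """ODBC connection string for the SQL analytics endpoint."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server},1433;"
        f"Database={database};"
        "Encrypt=yes;"
        "Authentication=ActiveDirectoryInteractive;"
    )

if __name__ == "__main__":
    conn_str = odbc_connection_string(
        "xxxxxxxx.datawarehouse.fabric.microsoft.com",  # placeholder server
        "Internal Data Warehouse",
    )
    print(conn_str)
    # With pyodbc installed, the guest user could then query the tables:
    # import pyodbc
    # with pyodbc.connect(conn_str) as conn:
    #     rows = conn.execute("SELECT TOP 5 * FROM dbo.SomeTable").fetchall()
```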

Step 5: Configure the Connection

We have to create a new connection. Let’s call this “External Demo”. We’re not going to use a gateway, and we’re going to put in “Organizational account”.

From here, we see that bas@kimura.nl is indeed the organizational account we want to use. We click “Next”.

We get an invalid credentials error. I don’t know why, but if we now click “Switch account”, select the bas@kimura.nl account, and click “Next” again, it goes through.

Now we’ve created this external connection. Why we have to do that twice, I’m not sure. Probably there’s some kind of cookie or something that has to be set. But anyway, we now have a connection.

Mind you, on the right side of the screen we see that our lakehouse is called “Internal DW” (Internal Data Warehouse). We have not put “Internal Data Warehouse” in the connection string yet, but the semantic model editor on the left side, in the Kimura tenant, already knows that “Internal Data Warehouse” is a database on that SQL endpoint.

If we open this, we can indeed find all the tables that we have in that lakehouse.

Step 6: Select Tables and Create the Model

I’ll put in a simple one that we can use without showing any actual data—our unit of measure table from the ERP system, for example.

We click on “Transform data”. You can select multiple tables, of course.

From here, we have our Power Query in the browser that we know how to work with. We click on “Create a semantic model only”. We name the semantic model “External Demo”.

Click “Create”, and here we are in the semantic model. It took a while to save, but we now have a model called “External Demo” in our workspace.

If we go back, we’ll find that indeed the “External Demo” semantic model is here. We can go to its settings and see it was last refreshed just a couple of minutes ago, there are some data source credentials, there’s a cloud connection mapped to our external demo connection in Power BI, and the refresh is not scheduled yet.
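Instead of waiting for a schedule, the consuming side can also trigger a refresh of the new semantic model through the Power BI REST API. The sketch below builds that request; the workspace and dataset IDs and the token are placeholders, and the caller needs a token with the `Dataset.ReadWrite.All` scope.

```python
# Sketch: trigger a refresh of the semantic model via the Power BI
# REST API. IDs and token below are placeholders.
import json
from urllib import request

def refresh_request(workspace_id: str, dataset_id: str,
                    token: str) -> request.Request:
    """Build the POST that starts a dataset refresh in a workspace."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/refreshes")
    return request.Request(
        url,
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = refresh_request("<workspace-id>", "<dataset-id>", "<token>")
    print(req.full_url)  # request.urlopen(req) would start the refresh
```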

This is now a fully functioning semantic model that we can use on the external tenant. The data can now be copied from the KDF2 tenant into the Kimura tenant—or more conceptually speaking, from the provider tenant on the right side of the screen to the consuming tenant on the left side of the screen.

Wrapping Up

To recap: we have a scenario where a provider has data in their own Fabric environment in their own lakehouse, and they want to share that with one or multiple external tenants (consumption tenants). These can be customers, partners, or suppliers—basically anybody with their own tenant that wants to extract data out of your tenant.

What I’ve shown today is how we can set up an external user through Entra ID, make the user a reader on the lakehouse SQL endpoint, and then in the consumption tenant create a semantic model that can consume data out of a lakehouse.

Mind you, the users in the external tenants are going to copy data from your tenant to their tenant. So we have to make sure that the lakehouse we give them access to is locked down and contains only the data they’re allowed to see.

Also, there are more ways to externally share data in Power BI. This was just one of the things that we see in practice and that we think are interesting to share with you.

Have you set up external data sharing in Fabric? What challenges did you run into? Let me know in the comments below!
