Connecting Azure Event Hubs in Microsoft Fabric

May 26, 2023

In my previous blog, I gave you an introduction to the possibilities of Real-Time Analytics in Microsoft Fabric.

In this blog, we will take a closer look at how we can connect data from one of our existing Azure Event Hubs.

End to End scenario Microsoft Fabric Real-Time Analytics

Looking at the picture above, you see an end-to-end workflow for a Real-Time Analytics scenario, and you can directly see which Fabric artifacts we need to build the solution. Building the complete solution below took me at most 20 minutes.

Loading data from Azure Event Hubs to Lakehouse

Requirements:

  • An existing Azure Event Hub.
  • A new consumer group (never reuse an existing one). If you use an existing consumer group, the Event Hub may stop delivering messages to the application that is already connected to it.
  • A Fabric workspace.

Note:

Adding a consumer group is not available in the Basic tier, only in the Standard tier.
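
If you prefer to script this step, below is a minimal sketch using the azure-mgmt-eventhub Python SDK. The subscription ID, resource group, namespace, Event Hub, and consumer group names are hypothetical placeholders, so replace them with your own.

```python
# pip install azure-identity azure-mgmt-eventhub
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient

# Hypothetical subscription ID and resource names: replace with your own.
client = EventHubManagementClient(
    DefaultAzureCredential(), "00000000-0000-0000-0000-000000000000"
)

# Create a dedicated consumer group for Fabric (requires Standard tier or higher).
client.consumer_groups.create_or_update(
    "rg-streaming", "my-eventhub-namespace", "my-eventhub", "fabric-eventstream", {}
)
```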

Creating a Shared Access Policy on the Event Hub

Create a new Shared Access Policy on the Event Hub, with the Manage option enabled.

Event Hubs Shared Access Policy

Note down the SAS Policy name and the Primary Key; we will need these later to set up the connection in Microsoft Fabric.
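
You can also create the policy and retrieve its key programmatically. Here is a hedged sketch with the same azure-mgmt-eventhub SDK; the policy name fabric-policy and the other resource names are assumptions for illustration.

```python
# pip install azure-identity azure-mgmt-eventhub
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventhub import EventHubManagementClient

client = EventHubManagementClient(
    DefaultAzureCredential(), "00000000-0000-0000-0000-000000000000"  # hypothetical subscription
)

# The Manage right also requires Listen and Send to be listed.
client.event_hubs.create_or_update_authorization_rule(
    "rg-streaming", "my-eventhub-namespace", "my-eventhub", "fabric-policy",
    {"rights": ["Manage", "Listen", "Send"]},
)

# The policy name becomes the Fabric "username", the primary key the "password".
keys = client.event_hubs.list_keys(
    "rg-streaming", "my-eventhub-namespace", "my-eventhub", "fabric-policy"
)
print(keys.primary_key)
```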

Create a Data Connection in Microsoft Fabric

In the menu bar (top right), open the settings menu and select the Manage connections option.

Microsoft Fabric Manage Connection

Search for Event Hub.

Microsoft Fabric add Event Hub connection

  • Connection name: the name of the connection.
  • Event Hub Namespace: https://xxxxxxx.servicebus.windows.net:443/
  • Authentication: Username is the name of the SAS Policy; Password is the Primary Key of the SAS Policy.

Now that we have created a connection to our Azure Event Hub, we're ready to receive our streaming data and set up an Eventstream.
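
If you do not have events flowing yet, you can push a few test messages into the Event Hub first. Below is a minimal sketch using the azure-eventhub Python SDK; the connection string, Event Hub name, and JSON payload are hypothetical placeholders.

```python
# pip install azure-eventhub
import json
import time

from azure.eventhub import EventData, EventHubProducerClient

# Hypothetical connection string built from the namespace and SAS policy above.
CONNECTION_STR = (
    "Endpoint=sb://xxxxxxx.servicebus.windows.net/;"
    "SharedAccessKeyName=fabric-policy;SharedAccessKey=<primary-key>"
)

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name="my-eventhub"
)
with producer:
    batch = producer.create_batch()
    # A simple JSON event; the Eventstream data format should then be set to JSON.
    batch.add(EventData(json.dumps({"sensor": "temp-01", "value": 21.5, "ts": time.time()})))
    producer.send_batch(batch)
```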

So let's start by opening the Synapse Real-Time Analytics experience. This can be found in the bottom left corner of your Microsoft Fabric environment.

Microsoft Fabric

Microsoft Fabric Real-Time Analytics

Fabric Capacity

Make sure you have a Microsoft Fabric or Power BI Premium capacity assigned to this workspace. 

Create an Eventstream in Microsoft Fabric

Within our Fabric workspace, select New in the upper left corner and select Eventstream.

Microsoft Fabric Create Event Stream

Define a name for the Eventstream and click Create.

Microsoft Fabric Name Eventstream

This can take a couple of minutes to set up, but don't worry, there is a lot happening in the background. Microsoft Fabric is a SaaS application, so things need to be deployed for you.

The great advantage for you: things will be much easier to set up.

Once everything is ready, you will see this new screen:

Microsoft Fabric Overview Eventstream

Create the Eventstream Source

The next step is to connect our source, in this case the connection to the Event Hub.

Microsoft Fabric create source Eventstream

Select Azure Event Hubs; a new pane will open.

Microsoft Fabric configure source eventstream

   
  • Source name: define a name for your source; you can use the name of the Event Hub or a custom name.
  • Cloud connection: select the connection you created at the beginning of this blog.
  • Data format: define the correct format based on your event stream.

Microsoft Fabric Configure data format Eventstream

Consumer group

You can select the group you created at the beginning of this blog, or create a new one here.

Microsoft Fabric set consumer group

 

Note: never use an existing consumer group, because the application currently connected to that consumer group will stop receiving data.

Once all the required fields are filled in, click Create. The source of your Eventstream will now be created.

Microsoft Fabric Create Source

After the connection is set up successfully, you can click Data preview to see what kind of data is coming in and whether it is the correct data.

Microsoft Fabric Created Source Successfully

Microsoft Fabric Preview Source Eventstream

If your data is not shown correctly, you can change the data format to CSV or Avro.
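
If you want to inspect the raw events outside of Fabric before picking a format, a small consumer sketch like the one below can help. It uses the azure-eventhub SDK with a separate, hypothetical debug consumer group; per the note above, never point it at the consumer group Fabric is using.

```python
# pip install azure-eventhub
from azure.eventhub import EventHubConsumerClient

# Hypothetical connection details; reuse the SAS policy from earlier.
CONNECTION_STR = (
    "Endpoint=sb://xxxxxxx.servicebus.windows.net/;"
    "SharedAccessKeyName=fabric-policy;SharedAccessKey=<primary-key>"
)

def on_event(partition_context, event):
    # Print the raw body so you can tell whether it is JSON, CSV, or Avro.
    print(event.body_as_str())

client = EventHubConsumerClient.from_connection_string(
    CONNECTION_STR,
    consumer_group="debug",  # a dedicated group, NOT the one Fabric consumes from
    eventhub_name="my-eventhub",
)
with client:
    # "-1" starts from the beginning of the stream; stop with Ctrl+C.
    client.receive(on_event=on_event, starting_position="-1")
```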

Destination

One of the last steps in our configuration is to set up the destination for the Eventstream.

In this blog we will use a Lakehouse (more destinations are available), so that we can store our data and use it at a later stage to build reports on top of it.

Lakehouse

You can choose whether you want to create a new Lakehouse or use an existing one.

If you have not created a Lakehouse yet, you need to create one.

In the bottom left corner, select the Data Engineering option.

Microsoft Fabric Data Engineering

Create a new Lakehouse, define a name, and click Create.

Microsoft Fabric Create Lakehouse

After creating a Lakehouse, you will see that a dataset and a SQL endpoint are automatically created by default. How easy is that!

Microsoft Fabric Lakehouse artifact

Create the Eventstream Destination

Select Lakehouse as the Eventstream destination.

Microsoft Fabric Eventstream Destination

A new window will open where we can configure the Lakehouse connection/destination.

Microsoft Fabric create table in Lakehouse for Eventstream

  • Destination name: the name of the destination.
  • Workspace: the workspace where your Lakehouse is located.
  • Lakehouse: the Lakehouse you want to use (you can have more than one in the same workspace).
  • Delta table: the Delta table where you want to store the data; you can also create a new table from here.
  • Data format: mostly the same format as the data you added in the source.

Event Processing

Before you create the destination, you can transform and preview the data being ingested with the Event Processor. The Event Processor editor is a no-code experience that provides drag-and-drop functionality to design the event data processing logic.

Microsoft Fabric Real-Time Analytics Event Processing detailed

As you can see, there are a lot of operations/transformations available to shape your data correctly; renaming a field, for example, takes a matter of seconds with the no-code experience.

The last step is to create the destination, and it really is that easy: click Create.

Microsoft Fabric Real-Time Analytics Eventstream working

The Eventstream is ready: the source is streaming data and the destination is ingesting it.

Navigate to your Lakehouse to verify the ingested data.

Microsoft Fabric Eventstream Lakehouse

If you prefer to verify with a T-SQL command, you can easily switch to the SQL endpoint mode, which is located in the upper right corner.

Microsoft Fabric switch to sql endpoint

And now you can run any type of query you want.

Microsoft Fabric Warehouse T-SQL query
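
You can run the same verification from outside the portal as well. Here is a hedged sketch using pyodbc against the Lakehouse SQL endpoint; the server, database, and table names are placeholders, so copy the real connection details from your SQL endpoint settings.

```python
# pip install pyodbc  (also requires the "ODBC Driver 18 for SQL Server")
import pyodbc

# Hypothetical connection details for the Lakehouse SQL endpoint.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=xxxxxxx.datawarehouse.fabric.microsoft.com;"
    "Database=MyLakehouse;"
    "Authentication=ActiveDirectoryInteractive;"
)

# Peek at the most recently ingested events.
for row in conn.execute("SELECT TOP 10 * FROM MyEventTable;"):
    print(row)
```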

 

Next Steps

Build a Power BI report with the ingested event data in the Lakehouse. As mentioned before, a default dataset is already created.

In my next blog, I will explain how we can start using the KQL database as a destination, so stay tuned.

Documentation

Click below to read more about Microsoft Fabric and Real-Time Analytics.

Microsoft Fabric Real-Time Analytics documentation

Exploring the Fabric technical documentation

OneLake in Fabric blog


Like always, in case you have any questions left, don't hesitate to contact me.

Feel free to leave a comment
