Connecting Event Hubs in Microsoft Fabric

Connecting Azure Event Hubs with Eventstream in Microsoft Fabric

In my previous blog I gave you an introduction to the possibilities of Real-Time Analytics in Microsoft Fabric.

In this blog we will take a closer look at how to connect data from one of our existing Azure Event Hubs.

End to End scenario Microsoft Fabric Real-Time Analytics

Looking at the picture above, you see an end-to-end workflow for a Real-Time Analytics scenario. You can directly see which Fabric artifacts we need to build the solution. Building the complete solution below took me at most 20 minutes.

Loading data from Azure Event Hubs to Lakehouse

Requirements:

  • An existing Azure Event Hub.
  • A new consumer group (never use an existing one). If you use an existing consumer group, the Event Hub may stop sending messages to the environment that is already connected to it. A scripted way to create one is sketched after the note below.
  • A Fabric workspace.

Note:

Adding a consumer group is not available in the Basic tier, only in the Standard tier.
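If you prefer to script this step rather than click through the portal, here is a minimal sketch using the azure-mgmt-eventhub Python SDK. The subscription, resource group, namespace, hub, and consumer group names are placeholders for your own environment:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.eventhub import EventHubManagementClient
    from azure.mgmt.eventhub.models import ConsumerGroup

    # Placeholder names: replace with your own subscription, resource group,
    # namespace, and Event Hub. Requires the Standard tier (see note above).
    client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")

    client.consumer_groups.create_or_update(
        resource_group_name="my-resource-group",
        namespace_name="my-eventhub-namespace",
        event_hub_name="my-eventhub",
        consumer_group_name="fabric-eventstream",
        parameters=ConsumerGroup(user_metadata="Dedicated to the Fabric Eventstream"),
    )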

Creating a Shared Access Policy on the Event Hub

Create a new Shared Access Policy on the Event Hub, with the Manage option enabled.

Event Hubs Shared Access Policy

Note down the SAS Policy name and the Primary Key. We will need these later to set up the connection in Microsoft Fabric.
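This step can also be scripted. A minimal sketch with the same management client as in the previous snippet (all names are placeholders for illustration); note that Event Hubs requires Listen and Send to be listed alongside Manage:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.eventhub import EventHubManagementClient
    from azure.mgmt.eventhub.models import AuthorizationRule

    client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Create the SAS policy with the Manage claim (Listen and Send must be listed too).
    client.event_hubs.create_or_update_authorization_rule(
        resource_group_name="my-resource-group",
        namespace_name="my-eventhub-namespace",
        event_hub_name="my-eventhub",
        authorization_rule_name="fabric-policy",
        parameters=AuthorizationRule(rights=["Manage", "Listen", "Send"]),
    )

    # Read back the policy name and Primary Key for the Fabric connection.
    keys = client.event_hubs.list_keys(
        "my-resource-group", "my-eventhub-namespace", "my-eventhub", "fabric-policy"
    )
    print(keys.key_name, keys.primary_key)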

Create a Data Connection in Microsoft Fabric

In the menu bar (top right), open the settings menu and select the Manage connections option.

Microsoft Fabric Manage Connection

Search for Event Hub.

Microsoft Fabric add Event Hub connection

  • Connection name: Name of the connection
  • Event Hub Namespace: https://xxxxxxx.servicebus.windows.net:443/
  • Authentication:
      Username: Name of the SAS Policy
      Password: Primary Key of the SAS Policy

Now that we have created a connection to our Azure Event Hub, we're ready to receive our streaming data and to set up an Eventstream.
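Before moving on, you can sanity-check the credentials outside Fabric: the connection's Username and Password map to the SharedAccessKeyName and SharedAccessKey of a regular Event Hubs connection string. A minimal sketch with the azure-eventhub Python package (policy, key, hub, and consumer group names are placeholders):

    from azure.eventhub import EventHubConsumerClient

    # The Fabric connection's Username = SharedAccessKeyName,
    # and its Password = SharedAccessKey.
    conn_str = (
        "Endpoint=sb://xxxxxxx.servicebus.windows.net/;"
        "SharedAccessKeyName=fabric-policy;"   # Username in the Fabric connection
        "SharedAccessKey=<primary-key>;"       # Password in the Fabric connection
        "EntityPath=my-eventhub"
    )

    client = EventHubConsumerClient.from_connection_string(
        conn_str, consumer_group="fabric-eventstream"
    )
    with client:
        # Succeeds only if the credentials are valid.
        print(client.get_partition_ids())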

So let's start by opening the Synapse Real-Time Analytics experience. This can be found in the bottom left corner of your Microsoft Fabric environment.

Microsoft Fabric

Microsoft Fabric Real-Time Analytics

Fabric Capacity

Make sure you have a Microsoft Fabric or Power BI Premium capacity assigned to this workspace. 

Create an Eventstream in Microsoft Fabric

Within our Fabric workspace, select New in the upper left corner and select Eventstream.

Microsoft Fabric Create Event Stream

Define a name for the Eventstream and click Create.

Microsoft Fabric Name Eventstream

This can take a couple of minutes to set up, but don't worry: there is a lot happening in the background. Microsoft Fabric is a SaaS application, so things need to be deployed for you.

The great advantage for you is that things will be much easier to set up.

So once everything is ready you will see this new screen:

Microsoft Fabric Overview Eventstream

Create the Eventstream Source

The next step is to connect our source, in this case the connection to the Event Hub.

Microsoft Fabric create source Eventstream

Select Azure Event Hubs; a new pane will open.

Microsoft Fabric configure source eventstream

   
  • Source name: Define a name for your source; you can use the name of the Event Hub or a custom name.
  • Cloud connection: Select the connection you created at the beginning of this blog.
  • Data format: Define the correct format based on your event stream.

Microsoft Fabric Configure data format Eventstream

  • Consumer group: Select the group you created at the beginning of this blog, or create a new one here.

Microsoft Fabric set consumer group

 

Note: Never use an existing consumer group, because the application currently connected to that consumer group will stop receiving data.

Once all the required fields are filled in, click Create. The source of your Eventstream will now be created.

Microsoft Fabric Create Source

After the connection is set up successfully, you can click Data preview to see what kind of data is coming in and whether it is the correct data.

Microsoft Fabric Created Source Successfully

Microsoft Fabric Preview Source Eventstream

If your data is not shown correctly, you can change the data format to CSV or Avro. If nothing shows up at all, you can push a few test events yourself, as in the sketch below.
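A minimal sketch with the azure-eventhub Python package, reusing the placeholder connection string from the earlier sketch and assuming a JSON data format (the payload shape is purely illustrative):

    import json
    from azure.eventhub import EventHubProducerClient, EventData

    # Same placeholder values as in the earlier connection-string sketch.
    conn_str = (
        "Endpoint=sb://xxxxxxx.servicebus.windows.net/;"
        "SharedAccessKeyName=fabric-policy;"
        "SharedAccessKey=<primary-key>;"
        "EntityPath=my-eventhub"
    )

    producer = EventHubProducerClient.from_connection_string(
        conn_str, eventhub_name="my-eventhub"
    )
    with producer:
        # Send one small JSON event; it should appear in Data preview shortly.
        batch = producer.create_batch()
        batch.add(EventData(json.dumps({"deviceId": "sensor-01", "temperature": 21.5})))
        producer.send_batch(batch)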

Destination

One of our last steps in our configuration is to setup the destination for the Eventstream.

In this blog we will use a Lakehouse (more destinations are available), so that we can store our data and use it at a later stage to build reports on top of it.

Lakehouse

You can choose whether to create a new Lakehouse or use an existing one.

If you have not created a Lakehouse yet, you need to create one first.

In the bottom left corner, select the Data Engineering option.

Microsoft Fabric Data Engineering

Create a new Lakehouse, define a name, and click Create.

Microsoft Fabrics Create Lakehouse

After creating a Lakehouse, you will see that a Dataset and a SQL Endpoint are automatically created by default. How easy is that!

Microsoft Fabric Lakehouse artifact

Create the Eventstream Destination

Create Lakehouse as Eventstream Destination

Microsoft Fabric Eventstream Destination

A new window will open where we can configure the Lakehouse connection/destination.

Microsoft Fabric create table in Lakehouse for Eventstream

  • Destination name: The name of the destination.
  • Workspace: The workspace where your Lakehouse is located.
  • Lakehouse: The Lakehouse you want to use (you can have more than one in the same workspace).
  • Delta table: The Delta table where you want to store the data; you can also create a new table from here.
  • Data format: Usually the same format as the data you defined in the source.

Event Processing

Before you create the destination, you can transform and preview the data that is being ingested for the destination with the Event Processor. The event processor editor is a no-code experience that provides a drag-and-drop canvas to design the event data processing logic.

Microsoft Fabric Real-Time Analytics Event Processing detailed

As you can see there’re a lot of operations/transformation possible to transform your data in a correct way, renaming a field is a matter of seconds with a no-code experience.

The last step is to create the destination, and it is just as easy: click Create.

Microsoft Fabric Real-Time Analytics Eventstream working

The Eventstream is ready: the source is streaming data and the destination is ingesting it.

Navigate to your Lakehouse to verify the ingested data.

Microsoft Fabric Eventstream Lakehouse

If you prefer to verify with a T-SQL command, you can easily switch to SQL endpoint mode, which is located in the upper right corner.

Microsoft Fabric switch to sql endpoint

And now you can run any type of query you want; a sketch of querying the endpoint from outside Fabric follows below.

Microsoft Fabric Warehouse T-SQL query
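For reference, the SQL endpoint is a regular TDS endpoint, so you can also query it from your own tools. A minimal sketch with pyodbc, assuming ODBC Driver 18 and Azure AD interactive sign-in; the server, database, and table names are placeholders, so copy the real connection string from the Lakehouse SQL endpoint settings:

    import pyodbc

    # Placeholder server/database/table names: copy the SQL connection string
    # from the Lakehouse SQL endpoint settings in Fabric.
    conn = pyodbc.connect(
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=<sql-endpoint-from-fabric>;"
        "Database=MyLakehouse;"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT TOP 10 * FROM dbo.MyDeltaTable;")
    for row in cursor.fetchall():
        print(row)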

 

Next Steps

Build a Power BI report with the ingested event data in the Lakehouse. As mentioned before, a default dataset has already been created.

In my next blog I will explain how we can start using the KQL database as a destination, so stay tuned.

Documentation

Click below to read more about Microsoft Fabric and Real-Time Analytics.

Microsoft Fabric Real-Time Analytics documentation

Exploring the Fabric technical documentation

OneLake in Fabric blog


More information about Microsoft Fabric can be found at:

Microsoft Fabric Content Hub

 

Like always, in case you have some questions left, don't hesitate to contact me.