Provision users and groups from AAD to Azure Databricks (part 2)

Jan 18, 2023

Assign and provision users and groups in the Enterprise Application

In the previous blog you learned how to configure the Enterprise application. In this blog, you will learn how to assign and provision users and groups.

Once the users and groups are assigned to the Enterprise application, you can provision them to your Azure Databricks account or Azure Databricks workspace.

Add users and groups

Click on Add user/group in the left pane of the Enterprise application to add the required users and groups.

Screenshot: Azure Databricks SCIM users and groups
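If you prefer to script this step, the same assignment can also be made through the Microsoft Graph appRoleAssignedTo endpoint. Below is a minimal sketch, not the exact portal flow: the token and object ids are placeholders you need to supply for your own tenant.

```python
# Minimal sketch: assign a user or group to the Enterprise application via
# Microsoft Graph instead of the portal's "Add user/group" dialog.
# TOKEN, SP_OBJECT_ID and PRINCIPAL_ID are placeholders for your tenant; the
# token needs the AppRoleAssignment.ReadWrite.All permission.
import requests

TOKEN = "<graph-access-token>"
SP_OBJECT_ID = "<enterprise-app-object-id>"   # object id of the service principal
PRINCIPAL_ID = "<user-or-group-object-id>"    # the user or group to assign

resp = requests.post(
    f"https://graph.microsoft.com/v1.0/servicePrincipals/{SP_OBJECT_ID}/appRoleAssignedTo",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "principalId": PRINCIPAL_ID,
        "resourceId": SP_OBJECT_ID,
        # the all-zero GUID means "default access" when the app defines no roles
        "appRoleId": "00000000-0000-0000-0000-000000000000",
    },
)
resp.raise_for_status()
print(resp.json()["id"])  # id of the newly created app role assignment
```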

Screenshot: License warning in the Enterprise application

When you see the above message, it means you don’t have a Premium Azure Active Directory edition. Don’t worry, you can still provision users; for groups you need a Premium edition.

Note: If you have existing Azure Databricks workspaces and you sync at the account level, make sure you add all existing users and groups from those workspaces to the Enterprise application above.
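To find out which users already exist in a workspace, you can list them with the Databricks SCIM API. This is a minimal sketch, assuming a workspace-level personal access token; the host and token values are placeholders.

```python
# Minimal sketch: list the users that already exist in an Azure Databricks
# workspace, so you know which ones to assign to the Enterprise application.
# DATABRICKS_HOST and TOKEN are placeholders for your workspace URL and a
# personal access token.
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<databricks-personal-access-token>"

resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.0/preview/scim/v2/Users",
    headers={"Authorization": f"Bearer {TOKEN}",
             "Accept": "application/scim+json"},
)
resp.raise_for_status()
for user in resp.json().get("Resources", []):
    print(user["userName"])  # the user's email address / AAD UPN
```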

Start the provisioning

The last step is to provision the users and the groups. The provisioning will automatically sync the assigned users and groups to your Azure Databricks account.

Go back to the Provisioning option in the left pane.

Mappings

Enable the user and group sync options in the Mappings section.

Screenshot: Mapping details for provisioning users and groups in the Enterprise application

Settings

Set the scope to Sync only assigned users and groups; otherwise all users in your Azure Active Directory will be synced, which is not necessary.

Next, set the Provisioning Status toggle to On.

Screenshot: Settings for provisioning users and groups in the Enterprise application
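The toggle corresponds to the provisioning job that Azure AD creates behind the Enterprise application, and that job can also be started through the Microsoft Graph synchronization API. The sketch below is an assumption-laden outline rather than the portal flow; the token and object id are placeholders, and the token needs the Synchronization.ReadWrite.All permission.

```python
# Minimal sketch: start the provisioning job through the Microsoft Graph
# synchronization API instead of the Provisioning Status toggle.
# TOKEN and SP_OBJECT_ID are placeholders for your tenant.
import requests

TOKEN = "<graph-access-token>"
SP_OBJECT_ID = "<enterprise-app-object-id>"
BASE = (f"https://graph.microsoft.com/v1.0/servicePrincipals/"
        f"{SP_OBJECT_ID}/synchronization/jobs")
headers = {"Authorization": f"Bearer {TOKEN}"}

# look up the provisioning job created for this Enterprise application
# (assumes there is exactly one job on the service principal)
jobs = requests.get(BASE, headers=headers)
jobs.raise_for_status()
job_id = jobs.json()["value"][0]["id"]

# start (or resume) the job; Graph returns 204 No Content on success
start = requests.post(f"{BASE}/{job_id}/start", headers=headers)
start.raise_for_status()
```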

After a few minutes your users will be synced.

There are two more options you can set:

  • Notification Email: send an email notification when a failure occurs.
  • Prevent accidental deletion: set a threshold for accidental deletions; more on how this works can be found here.

Checking the Provisioning Logs

Once the users and groups have been provisioned, you can check the details in the provisioning logs.
Click Provisioning in the left pane:
Screenshot: Log details in the Enterprise application
The details of the provisioning should be visible now. Good to know: the sync interval is fixed at 40 minutes.

Click View provisioning logs to see a detailed overview of the sync.
Screenshot: Log overview in the Enterprise application
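The same log entries can also be retrieved programmatically from the Microsoft Graph audit log. A minimal sketch, assuming a token with the AuditLog.Read.All permission; the printed fields come from the provisioningObjectSummary resource.

```python
# Minimal sketch: read the latest provisioning log entries through Microsoft
# Graph. TOKEN is a placeholder; it needs the AuditLog.Read.All permission.
import requests

TOKEN = "<graph-access-token>"

resp = requests.get(
    "https://graph.microsoft.com/v1.0/auditLogs/provisioning?$top=20",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for entry in resp.json().get("value", []):
    # each entry is a provisioningObjectSummary: when, which action, for whom
    print(entry.get("activityDateTime"),
          entry.get("provisioningAction"),
          entry.get("sourceIdentity", {}).get("displayName"))
```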

Tips and tricks for Provisioning

  • The sync interval is fixed at 40 minutes; the initial sync starts immediately.
  • The username or email address of an Azure Databricks workspace user cannot be updated.
  • The admin group cannot be used as a group name.
  • Groups cannot be renamed in Azure Databricks or in Azure Active Directory.
  • Nested groups or service principals cannot be synced.
  • More tips and tricks can be found here.

In my next blog I will explain how to create a metastore in your Azure Databricks account and assign it to an Azure Databricks workspace.

Feel free to leave a comment

