Provision users and groups from AAD to Azure Databricks (part 4)

Jan 23, 2023

In the previous blog, you created a metastore in your Azure Databricks account and assigned it to an Azure Databricks Workspace. In this blog, you will learn how to assign users and groups to an Azure Databricks Workspace and define the correct entitlements.

You need to assign the synced groups to your Azure Databricks workspace, and this needs to be done for every workspace. That’s one of the reasons to create a group of users for every environment:

  • SG_DATABRICKS_USERS_DVLM: for the users who are allowed to use the Development environment.
  • SG_DATABRICKS_USERS_PROD: for the users who are allowed to use the Production environment.
  • SG_DATABRICKS_ACCOUNT_ADMIN: for the users who need to be assigned the Account Admin role.

You can add the same users to both groups, but this way you are already prepared for the future if you want to separate the users from each other at a later stage.
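Assigning a group to every workspace by hand gets tedious when you have many workspaces. The same assignment can also be done through the account-level Workspace Assignment API. Below is a minimal sketch that only builds the request; the account ID, workspace ID, group ID, and token are placeholders you would replace with your own values:

```python
import json
from urllib import request

# Assumption: Azure Databricks account console host.
ACCOUNT_HOST = "https://accounts.azuredatabricks.net"


def assignment_request(account_id: str, workspace_id: str,
                       group_id: str, token: str) -> request.Request:
    """Build a PUT request that grants a group the USER permission
    on a workspace via the account-level Workspace Assignment API."""
    url = (f"{ACCOUNT_HOST}/api/2.0/accounts/{account_id}"
           f"/workspaces/{workspace_id}"
           f"/permissionassignments/principals/{group_id}")
    body = json.dumps({"permissions": ["USER"]}).encode()
    return request.Request(
        url,
        data=body,
        method="PUT",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )


# Example with placeholder IDs; send with urllib.request.urlopen(req)
# once you fill in real values.
req = assignment_request("my-account-id", "1234567890",
                         "0987654321", "my-admin-token")
print(req.full_url)
```

Using a group ID here (rather than individual user IDs) keeps the assignment aligned with the group-based approach used in this series.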

Azure Databricks Workspace

Log in to your Workspace. In case you’re still logged in to your account console, you can open the workspace directly from the Data settings icon on the left side.

Once the Workspace is open, select the Admin Console in the upper right corner.

Select Groups

adb-admin-console

Select Add Group.

adb-admin-console-add

Select the groups you want to add one by one.

adb-admin-console-group-add

 

The groups are now visible, and you can assign the correct entitlements to each group.

adb-admin-console-entitlement

Workspace access:

  • When granted to a user or service principal, they can access the Data Science & Engineering and Databricks Machine Learning persona-based environments.
  • Can’t be removed from workspace admins.

adb-admin-console-entitlement-enable

Databricks SQL access:

  • When granted to a user or service principal, they can access Databricks SQL.

Allow unrestricted cluster creation:

  • When granted to a user or service principal, they can create clusters. You can restrict access to existing clusters using cluster-level permissions.
  • Can’t be removed from workspace admins.
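The entitlements above can also be set programmatically instead of through the admin console. As a sketch, the workspace-level SCIM Groups API accepts a PatchOp body that adds entitlements such as workspace-access, databricks-sql-access, and allow-cluster-create to a group; the group ID and token would come from your own workspace:

```python
import json

# SCIM 2.0 PatchOp schema URN, as used by the Databricks SCIM API
# (PATCH /api/2.0/preview/scim/v2/Groups/{group_id}).
SCIM_PATCH = "urn:ietf:params:scim:api:messages:2.0:PatchOp"


def entitlement_patch(entitlements: list[str]) -> dict:
    """Build a SCIM PatchOp body that adds the given entitlements
    to a workspace group."""
    return {
        "schemas": [SCIM_PATCH],
        "Operations": [{
            "op": "add",
            "path": "entitlements",
            "value": [{"value": e} for e in entitlements],
        }],
    }


# Grant workspace access and Databricks SQL access in one call.
body = entitlement_patch(["workspace-access", "databricks-sql-access"])
print(json.dumps(body, indent=2))
```

Applying the patch to a group rather than to individual users matches the recommendation later in this post to manage workspace permissions at the group level.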

 

Account admins are synced by default to all workspaces.

Users added through a group have a separate icon displayed.

add-user-group

Please note that Databricks recommends that you assign group permissions to workspaces, instead of assigning workspace permissions to users individually.

In my next blog I will explain how to Add Service Principals to your Azure Databricks account using the account console.

Other blog posts in this series:

  1. Configure the Enterprise Application(SCIM) for Azure Databricks Account Level provisioning
  2. Assign and Provision users and groups in the Enterprise Application(SCIM)
  3. Creating a metastore in your Azure Databricks account to assign an Azure Databricks Workspace
  4. Assign Users and groups to an Azure Databricks Workspace and define the correct entitlements
  5. Add Service Principals to your Azure Databricks account using the account console
  6. Configure the Enterprise Application(SCIM) for Azure Databricks Workspace provisioning

Feel free to leave a comment

