Provision users and groups from AAD to Azure Databricks (part 6)

Jan 25, 2023

Configure the Enterprise Application (SCIM) for Azure Databricks Workspace provisioning

In one of my previous blogs, I explained how to set up the Enterprise Application for Azure Databricks account-level provisioning. This blog covers the Azure Databricks workspace level. Because the setup is slightly different, it gets a separate blog.

Preview

Provisioning identities to your Azure Databricks workspace is still in preview.

Steps to configure

Log in to the Azure Databricks workspace as a workspace admin.

Generate a personal access token and store it in a safe location (for example, Azure Key Vault).
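If you want to script that step, the sketch below stores the token in Key Vault using Python. It is a minimal sketch, assuming the azure-identity and azure-keyvault-secrets packages and an authenticated Azure session (for example via az login); the vault URL and secret name are placeholders.

    # Sketch: store the Databricks personal access token in Azure Key Vault.
    # The vault URL and secret name are placeholders; adjust them to your setup.
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    vault_url = "https://my-keyvault.vault.azure.net"  # hypothetical Key Vault
    client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

    # Paste the token you generated on the Databricks user settings page.
    client.set_secret("databricks-scim-pat", "<your-personal-access-token>")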


Configure the Enterprise Application

In the Azure portal, go to Azure Active Directory > Enterprise Applications.

Click on New application and search for “Azure Databricks SCIM Provisioning Connector”.


Click on the app:


Enter a name for the application; I used Azure Databricks SCIM MyAzureDatabricksWorkspace.

Click on Create and wait until the application is created.


Click on Provisioning and set Provisioning Mode to Automatic.


Set the Tenant URL to https://<databricks-instance>/api/2.0/preview/scim

Set Secret Token to the Azure Databricks token that we generated and saved earlier in our Key Vault.

Click on Test Connection to see if everything is configured correctly.
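As an extra sanity check next to the Test Connection button, you can call the SCIM endpoint yourself. Below is a minimal Python sketch, assuming the requests, azure-identity and azure-keyvault-secrets packages and the hypothetical Key Vault secret from earlier; replace <databricks-instance> with your workspace URL.

    # Sketch: verify the workspace SCIM endpoint responds with the stored token.
    import requests
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    vault = SecretClient("https://my-keyvault.vault.azure.net", DefaultAzureCredential())
    token = vault.get_secret("databricks-scim-pat").value

    resp = requests.get(
        "https://<databricks-instance>/api/2.0/preview/scim/v2/Users",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/scim+json"},
    )
    resp.raise_for_status()
    print(f"SCIM endpoint reachable, {resp.json().get('totalResults', 0)} users returned")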

You can learn how to assign and sync users in the following blogs, as the approach is similar to the account-level setup.

  1. Assign and Provision users and groups in the Enterprise Application (SCIM)
  2. Assign Users and groups to an Azure Databricks Workspace and define the correct entitlements

 

Add a service principal to your Azure Databricks workspace

To add a service principal using Postman:

  1. Create a new HTTP request (File > New > HTTP Request).
  2. In the HTTP verb drop-down list, select POST.
  3. Enter the request URL: https://<databricks-instance>.azuredatabricks.net/api/2.0/preview/scim/v2/ServicePrincipals
  4. On the Authorization tab, in the Type list, select Bearer Token.
  5. For Token, enter your Databricks personal access token for your workspace user.
  6. On the Headers tab, add the key Content-Type with the value application/scim+json.
  7. On the Body tab, select raw and JSON.
  8. Add the following JSON body:
    {
      "displayName": "sp-edk-databricks-dvlm-deployment",
      "applicationId": "xxxxx",
      "entitlements": [
        {
          "value": "allow-cluster-create"
        }
      ],
      "schemas": [
        "urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"
      ],
      "active": true
    }

Click on Send and the service principal is added to the Databricks workspace, ready for further use. You can find the service principals in the admin console, under groups and users.

When you use Postman, I advise you to work with environments and variables; this makes reusing requests a lot easier. In this blog I have not done that, for simplicity.
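If you prefer scripting over Postman, the same call can be made with a few lines of Python. This is a minimal sketch using the requests package; the workspace URL and application ID are placeholders, and the token is read from an environment variable here just to keep the example short.

    # Sketch: create the service principal via the SCIM API, mirroring the Postman steps above.
    import os
    import requests

    workspace_url = "https://<databricks-instance>.azuredatabricks.net"
    token = os.environ["DATABRICKS_TOKEN"]  # your workspace personal access token

    payload = {
        "displayName": "sp-edk-databricks-dvlm-deployment",
        "applicationId": "xxxxx",  # the service principal's application (client) ID
        "entitlements": [{"value": "allow-cluster-create"}],
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
        "active": True,
    }

    resp = requests.post(
        f"{workspace_url}/api/2.0/preview/scim/v2/ServicePrincipals",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/scim+json"},
        json=payload,
    )
    resp.raise_for_status()
    print("Created service principal with id:", resp.json().get("id"))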

This was my last blog in the series; I hope you enjoyed reading it. A summary of the posts can be found below. If there are any questions or ambiguities, I would of course be happy to answer them.

Other blog posts in this series:

  1. Configure the Enterprise Application (SCIM) for Azure Databricks Account Level provisioning
  2. Assign and Provision users and groups in the Enterprise Application (SCIM)
  3. Creating a metastore in your Azure Databricks account to assign an Azure Databricks Workspace
  4. Assign Users and groups to an Azure Databricks Workspace and define the correct entitlements
  5. Add Service Principals to your Azure Databricks account using the account console
  6. Configure the Enterprise Application (SCIM) for Azure Databricks Workspace provisioning

 

Feel free to leave a comment

