Provision users and groups from AAD to Azure Databricks (part 6)

Jan 25, 2023

Configure the Enterprise Application (SCIM) for Azure Databricks Workspace provisioning

In one of my previous blogs, I explained how to set up the Enterprise Application for Azure Databricks account-level provisioning. This blog covers the Azure Databricks workspace level. Since the setup is slightly different, it gets its own blog.

Log in to the Azure Databricks workspace as a workspace admin.

Generate a personal access token and store it in a safe location (for example, Azure Key Vault).

(Screenshot: generating a personal access token in Databricks)
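If you want to script the Key Vault part, the sketch below shows one possible way to store and later retrieve the token with the Azure SDK for Python. The vault URL and secret name are placeholders I chose for illustration, not values from this setup.

# Minimal sketch (assumed names): store the Databricks PAT in Key Vault once,
# then read it back whenever a script needs it, so the token never lives in code.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://my-key-vault.vault.azure.net"   # hypothetical Key Vault
SECRET_NAME = "databricks-scim-pat"                  # hypothetical secret name

client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

# Store the token right after generating it in the Databricks workspace
client.set_secret(SECRET_NAME, "<paste-your-personal-access-token-here>")

# Retrieve it later instead of hard-coding it in scripts or pipelines
databricks_pat = client.get_secret(SECRET_NAME).value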

Configure the Enterprise Application

In the Azure portal, go to Azure Active Directory > Enterprise Applications.

Click on New application and search for the “Azure Databricks SCIM Provisioning Connector”.


Click on the app:


Enter a name for the application; I used Azure Databricks SCIM MyAzureDatabricksWorkspace.

Click on Create and wait until the application is created.


Click on Provisioning and set Provisioning Mode to Automatic.


Set the Tenant URL to https://<databricks-instance>/api/2.0/preview/scim, where <databricks-instance> is the URL of your Azure Databricks workspace.

Set Secret Token to the Azure Databricks token that we generated and saved earlier in our Key Vault.

Click on Test Connection to see if everything is configured correctly.
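Besides the portal's Test Connection button, you can also verify the endpoint and token yourself. The sketch below is my own addition, with a placeholder workspace URL; it simply lists users via the workspace SCIM API, and if it returns a result the URL and token are correct.

# Sanity check of the workspace SCIM endpoint with a plain HTTP call.
# DATABRICKS_INSTANCE and TOKEN are placeholders for your own values.
import requests

DATABRICKS_INSTANCE = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"  # e.g. read from Key Vault as shown earlier

response = requests.get(
    f"https://{DATABRICKS_INSTANCE}/api/2.0/preview/scim/v2/Users",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/scim+json",
    },
)
response.raise_for_status()
print(response.json().get("totalResults"))  # number of users returned by SCIM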

You can learn how to assign and sync users in the following blogs; the approach is similar to the account-level setup.

  1. Assign and Provision users and groups in the Enterprise Application (SCIM)
  2. Assign Users and groups to an Azure Databricks Workspace and define the correct entitlements

 

Add a Service Principal to your Azure Databricks workspace

To add a service principal using Postman:

  1. Create a new HTTP request (File > New > HTTP Request).
  2. In the HTTP verb drop-down list, select POST.
  3. Enter the request URL: https://<databricks-instance>/api/2.0/preview/scim/v2/ServicePrincipals
  4. On the Authorization tab, in the Type list, select Bearer Token.
  5. For Token, enter your Databricks personal access token for your workspace user.
  6. On the Headers tab, add the key Content-Type with the value application/scim+json.
  7. On the Body tab, select raw and JSON.
  8. Add the following JSON body:
    {
      "displayName": "sp-edk-databricks-dvlm-deployment",
      "applicationId": "xxxxx",
      "entitlements": [
        {
          "value": "allow-cluster-create"
        }
      ],
      "schemas": [
        "urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"
      ],
      "active": true
    }

Click on Send; the service principal is added to the Databricks workspace and is ready for further use. You can find the service principals in the admin console, under Groups and Users.
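If you prefer a script over Postman, the sketch below sends roughly the same request with Python. The workspace URL, token and applicationId are placeholders; the body is the same JSON as in the Postman steps above.

# Same ServicePrincipals POST as in the Postman walkthrough, as a Python sketch.
import requests

DATABRICKS_INSTANCE = "adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<personal-access-token>"

payload = {
    "displayName": "sp-edk-databricks-dvlm-deployment",
    "applicationId": "xxxxx",  # application (client) ID of the service principal in AAD
    "entitlements": [{"value": "allow-cluster-create"}],
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:ServicePrincipal"],
    "active": True,
}

response = requests.post(
    f"https://{DATABRICKS_INSTANCE}/api/2.0/preview/scim/v2/ServicePrincipals",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/scim+json",
    },
    json=payload,
)
response.raise_for_status()
print(response.json())  # the created service principal, including its SCIM id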

When you use Postman, I advise you to work with environments and variables; this makes reusing requests a lot easier. For simplicity, I have not done that in this blog.

This was my last blog in the series; I hope you enjoyed reading these blogs. A summary of the series can be found below. If there are any questions or ambiguities, I would of course be happy to hear and answer them.

  1. Configure the Enterprise Application (SCIM) for Azure Databricks Account Level provisioning
  2. Assign and Provision users and groups in the Enterprise Application (SCIM)
  3. Creating a metastore in your Azure Databricks account to assign an Azure Databricks Workspace
  4. Assign Users and groups to an Azure Databricks Workspace and define the correct entitlements
  5. Add Service Principals to your Azure Databricks account using the account console
  6. Configure the Enterprise Application (SCIM) for Azure Databricks Workspace provisioning

 

Feel free to leave a comment

