How to Set Up a Code Repository in Azure Data Factory

Nov 5, 2020

Why activate a Git Configuration?

The main reasons are:

  1. Source Control: Ensures that all your changes are saved and traceable, and that you can easily go back to a previous version in case of a bug.
  2. Continuous Integration and Continuous Delivery (CI/CD): Allows you to create build and release pipelines for easy releases to other Data Factory instances, manually or triggered (DTAP).
  3. Collaboration: You can easily collaborate in the same Data Factory with different colleagues.
  4. Performance: Loading your Data Factory from Git is ten times faster than loading it directly from the Data Factory service.

So enough reasons to start enabling your Git Configuration.

How to set up your Code Repository in Azure Data Factory

During the initial setup of your Data Factory, you have the option to select either Azure DevOps or GitHub as your Git Configuration. If you haven't done that, you can still configure this integration in an existing Azure Data Factory. The procedure for both options is the same.
Create ADF Version Control
In my previous article, Creating an Azure Data Factory Instance, I skipped the Git Configuration. In this article I will explain how to do this in an already created Data Factory.

Azure Data Factory Source Control

On the right of the splash screen when opening your Data Factory, select Set up Code Repository. Other places to start configuring your Code Repository are the Management Hub, or the top left of the authoring canvas in the UX. If you don't see the option, a Code Repository is already configured; you can verify this in the Management Hub or the UX.
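You can also check this outside the UI: the factory's resource JSON exposes a `repoConfiguration` object under `properties` once Git is configured. Below is a minimal sketch, with hypothetical factory and repository names, that inspects such a JSON document (for example, as exported with `az resource show`):

```python
import json

# Sample resource JSON as returned for a Data Factory instance;
# the "repoConfiguration" block is only present once Git is configured.
# All names below are placeholders.
factory_json = """
{
  "name": "my-data-factory",
  "properties": {
    "repoConfiguration": {
      "type": "FactoryVSTSConfiguration",
      "accountName": "my-devops-org",
      "projectName": "my-project",
      "repositoryName": "my-repo",
      "collaborationBranch": "main",
      "rootFolder": "/"
    }
  }
}
"""

factory = json.loads(factory_json)
repo = factory["properties"].get("repoConfiguration")
if repo is None:
    print("No code repository configured yet")
else:
    print(f"Connected to {repo['type']}: {repo['repositoryName']}")
```

If `repoConfiguration` is absent, the factory is still working in "live mode" against the service directly.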

We have the option to configure Azure DevOps or GitHub.

Azure DevOps integration

First I will take you through the configuration of Azure DevOps, and then create a similar configuration in GitHub. If you want to start directly with GitHub, click here.

Select Azure DevOps Git:

Azure Dev Ops Config

  1. Azure Active Directory: Select the AAD where your Azure DevOps environment is located. If you use another AAD, make sure that your account has rights to that environment.
  2. Azure DevOps Account: Select your account.
  3. Project Name: Select the project in which you want to store your repository.
  4. Git Repository: Create a new repository or select an existing one.
  5. Collaboration Branch: Change this to Main.
  6. Publish Branch: Leave this on adf_publish.
  7. Root folder: If you want to create a complete project with SQL, Azure Analysis Services, Azure Databricks, etc., define a root folder and create your repository in that folder.
  8. Import: If this is a blank Data Factory, you can disable this option. If you have already created resources in your Data Factory, you should enable it so that the existing resources are committed to the repository.
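Behind the scenes, the settings above map onto the `FactoryVSTSConfiguration` object of the Data Factory ARM/REST API. A minimal sketch with hypothetical values, annotated with the numbered settings it corresponds to:

```python
# Hypothetical values; the keys mirror the FactoryVSTSConfiguration
# object used by the Data Factory ARM/REST API.
vsts_config = {
    "type": "FactoryVSTSConfiguration",
    "tenantId": "<your-aad-tenant-id>",  # 1. Azure Active Directory
    "accountName": "my-devops-org",      # 2. Azure DevOps Account
    "projectName": "my-project",         # 3. Project Name
    "repositoryName": "adf-repo",        # 4. Git Repository
    "collaborationBranch": "main",       # 5. Collaboration Branch
    "rootFolder": "/",                   # 7. Root folder
}

# The publish branch (6.) is not part of this object; it defaults
# to adf_publish.
print(vsts_config["collaborationBranch"])  # -> main
```

Keeping these values in source form like this is also useful later, when you automate the Git configuration as part of a deployment.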

Click on Apply and you will see that your repository is connected.

Repo Connected

When you log in to your Azure DevOps environment, you will see that a new repository has been created with a Main branch.

Azure Dev Ops Main

Go back to your Data Factory and click on Publish.

Data Factory Publish

In Azure DevOps the adf_publish branch is now also created.

Azure Dev Ops Publish
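The adf_publish branch is where Data Factory writes the generated ARM templates on every Publish; these are the artifacts your release pipeline will deploy later. A small sketch (with an assumed default root folder of "/") of what to expect there:

```python
# Files that Data Factory generates in the adf_publish branch after a
# Publish; the layout below assumes the default root folder "/".
expected = [
    "ARMTemplateForFactory.json",            # the full factory as an ARM template
    "ARMTemplateParametersForFactory.json",  # its parameters (connection strings, etc.)
]

# Simulated listing of the adf_publish branch after a Publish.
published = {"ARMTemplateForFactory.json", "ARMTemplateParametersForFactory.json"}

missing = [f for f in expected if f not in published]
print("Publish complete" if not missing else f"Missing: {missing}")
```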

GitHub Integration

In the repository screen, select GitHub:

Github login

The first time you connect from your Data Factory, you need to log in to GitHub.

Github authorize

Once connected, you need to authorize your Data Factory.

Github Configuration

All the settings are almost the same as in Azure DevOps:

  1. Use GitHub Enterprise Server: If enabled, fill in the GitHub Enterprise root URL.
  2. GitHub Account: Select your account.
  3. Project Name: Select the project in which you want to store your repository.
  4. Git Repository: Create a new repository or select an existing one.
  5. Collaboration Branch: Leave this on Main.
  6. Publish Branch: Leave this on adf_publish.
  7. Root folder: If you want to create a complete project with SQL, Azure Analysis Services, Azure Databricks, etc., define a root folder and create your repository in that folder.
  8. Import: If this is a blank Data Factory, you can disable this option. If you have already created resources in your Data Factory, you should enable it so that the existing resources are committed to the repository.
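As with Azure DevOps, these settings map onto an API object, in this case `FactoryGitHubConfiguration`. A sketch with hypothetical values:

```python
# Hypothetical values; the keys mirror the FactoryGitHubConfiguration
# object in the Data Factory ARM/REST API.
github_config = {
    "type": "FactoryGitHubConfiguration",
    "hostName": "https://github.mycompany.com",  # 1. GitHub Enterprise root URL (omit for github.com)
    "accountName": "my-github-account",          # 2. GitHub Account
    "repositoryName": "adf-repo",                # 4. Git Repository
    "collaborationBranch": "main",               # 5. Collaboration Branch
    "rootFolder": "/",                           # 7. Root folder
}

# No AAD tenant is needed here: authentication happens through the
# GitHub login/authorize flow shown above.
print(github_config["type"])
```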

Click on Apply and you will see that your repository is connected.

Repo Connected

Log in to GitHub and you will see that a new repository has been created with a Main branch. Go back to your Data Factory and click on Publish.

Data Factory Publish

In GitHub the adf_publish Branch is now also created.

GitHub Publish

As you can see, the setup for Azure DevOps and GitHub is mostly the same. You have now learned how to connect your Data Factory to a Code Repository. You're now ready to start building your build and release pipelines.

Thanks for reading and in case you have some questions, please leave them in the comments below.

