Azure Data Factory: New functionalities and features

May 22, 2020

New functionalities and features

Last week, a number of great new functionalities and features were added to Azure Data Factory. I would like to walk you through the details in the blog below.


Customer-managed key

With this new functionality you can add extra security to your Azure Data Factory environment. Where the data was previously encrypted with a randomly generated key from Microsoft, you can now use the customer-managed key feature, also known as Bring Your Own Key (BYOK). If you use the customer-managed key functionality, the data is encrypted with your key in combination with the ADF system key. You can create your own key or have it generated by the Azure Key Vault API.
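If you prefer to script this rather than click through the portal, the encryption settings live on the factory resource itself and can be set through the ARM REST API. Below is a minimal, assumption-based sketch in Python: the subscription, resource group, factory, vault and key names are placeholders, and the exact property names under encryption (vaultBaseUrl, keyName, keyVersion) should be verified against the customer-managed key documentation. The factory's managed identity also needs access to the key in Key Vault (typically get, wrap key and unwrap key permissions).

```python
# Minimal sketch: set a customer-managed key on a data factory via the ARM REST API.
# Assumptions: azure-identity and requests are installed, and the names below are
# placeholders for your own subscription, resource group, factory and Key Vault.
import requests
from azure.identity import DefaultAzureCredential

# Acquire an ARM token with whatever identity DefaultAzureCredential resolves to.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/my-rg/providers/Microsoft.DataFactory/factories/my-adf"
    "?api-version=2018-06-01"
)

body = {
    "location": "westeurope",
    "identity": {"type": "SystemAssigned"},  # the factory identity must have access to the vault
    "properties": {
        # Assumed shape of the encryption block; check the official docs for your API version.
        "encryption": {
            "vaultBaseUrl": "https://my-vault.vault.azure.net",
            "keyName": "adf-cmk",
            "keyVersion": "<key-version>",
        }
    },
}

response = requests.put(url, json=body, headers={"Authorization": f"Bearer {token}"})
response.raise_for_status()
print(response.json().get("properties", {}).get("encryption"))
```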

You can read more in this article, which I wrote earlier.

Pipeline Consumption Report

Last week, the Azure Data Factory team added the Pipeline Consumption Report.

The report can be used for your Triggered runs: go to your Triggered runs and click on the new icon.

[Image: ADF Monitor]

The consumption of the selected Pipeline will be displayed. The data shown is only from this Pipeline and not from other Pipelines fired by it. It would be a nice addition if the report showed the aggregation of the complete Triggered run.

For your debug runs, click on the right side of your Output pane:

[Image: ADF DEBUG button]

[Image: ADF DEBUG Report]

The ADF consumption report only surfaces Azure Data Factory related units. There may be additional units billed from other services that you are using and accessing which are not accounted for here, including Azure SQL Database, Synapse Analytics, Cosmos DB, ADLS, etc. More details can be found here.

Parameters from Execute Pipeline Activity

When calling a Pipeline, you previously had to add the parameters yourself; now they are automatically taken over from the Pipeline you select. Very handy, and it saves time if you use a lot of parameters.

Define a Parameter in one of your Pipelines:

[Image: ADF Parameter]
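For reference, the same thing can be scripted with the azure-mgmt-datafactory Python SDK. The sketch below creates a hypothetical child Pipeline with one String Parameter; all names (resource group, factory, ChildPipeline, myParam) are made up, and exact model or argument names may differ slightly between SDK versions.

```python
# Sketch: a child pipeline with one parameter, created via azure-mgmt-datafactory.
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ParameterSpecification,
    PipelineResource,
    WaitActivity,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

child = PipelineResource(
    # The parameter that will later show up automatically in the Execute Pipeline activity.
    parameters={"myParam": ParameterSpecification(type="String", default_value="hello")},
    # A trivial activity so the pipeline is valid; replace with your real activities.
    activities=[WaitActivity(name="Wait1", wait_time_in_seconds=1)],
)
adf_client.pipelines.create_or_update("my-rg", "my-adf", "ChildPipeline", child)
```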

 

Create another Pipeline and add the Execute Pipeline activity. On the Settings tab, where you select the Pipeline you want to execute, you will discover that the option to manually add the parameters is no longer there. Instead, all the Parameters you had defined in your Pipeline are shown directly. Very handy, and it reduces errors.
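Continuing the sketch above: in the parent Pipeline, the Execute Pipeline activity simply carries a name/value mapping for the child's Parameters. The portal now pre-fills those names for you from the selected Pipeline; in code (or in the generated JSON) you still provide them yourself. Names are again hypothetical.

```python
# Sketch: a parent pipeline that calls ChildPipeline and passes its parameter.
# Reuses the adf_client from the previous sketch.
from azure.mgmt.datafactory.models import (
    ExecutePipelineActivity,
    PipelineReference,
    PipelineResource,
)

parent = PipelineResource(
    activities=[
        ExecutePipelineActivity(
            name="RunChild",
            pipeline=PipelineReference(
                reference_name="ChildPipeline", type="PipelineReference"
            ),
            # Keys must match the parameters defined on the child pipeline.
            parameters={"myParam": "value-from-parent"},
            wait_on_completion=True,
        )
    ]
)
adf_client.pipelines.create_or_update("my-rg", "my-adf", "ParentPipeline", parent)
```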

Old Situation:

[Image: ADF Parameter 3]

New Situation:

[Image: ADF Parameter Pipeline]

General Tab moved to new Properties Pane

Your General tab has now moved to the right side of the canvas.

[Image: ADF Pane General]

To edit your properties, click on the Properties pane icon located in the top-right corner of the canvas.

[Image: ADF Pane Properties]

So these were some nice and useful additions to Azure Data Factory. I am very happy with them. What do you think?

Feel free to leave a comment

