
Azure Data Factory: New functionalities and features


Last week, a number of great new functionalities and features were added to Azure Data Factory. In the blog below I would like to walk you through them in some detail:


Customer-managed key

With this new functionality you can add an extra layer of security to your Azure Data Factory environment. Where the data was previously encrypted with a randomly generated key from Microsoft, you can now use the customer-managed key feature. With this Bring Your Own Key (BYOK) approach, the data is encrypted with your own key in combination with the ADF system key. You can create the key yourself or have it generated by the Azure Key Vault API.

You can read more in this article, which I wrote earlier.
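As a rough illustration, below is a minimal sketch of how a customer-managed key could be set on a factory with the Azure SDK for Python (azure-mgmt-datafactory). The subscription, resource group, factory and Key Vault values are placeholder assumptions, not values from this post.

```python
# Minimal sketch (assumed setup): enabling a customer-managed key (BYOK) on a
# Data Factory via the Azure SDK for Python. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    EncryptionConfiguration,
    Factory,
    FactoryIdentity,
)

subscription_id = "<subscription-id>"   # placeholder
resource_group = "my-resource-group"    # placeholder
factory_name = "my-data-factory"        # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

factory = Factory(
    location="westeurope",
    # The factory's managed identity needs access to the key in Key Vault.
    identity=FactoryIdentity(type="SystemAssigned"),
    # Bring Your Own Key: reference your own key in Azure Key Vault.
    encryption=EncryptionConfiguration(
        key_name="adf-cmk-key",                                 # placeholder
        vault_base_url="https://my-keyvault.vault.azure.net",   # placeholder
        key_version="<key-version>",                            # placeholder
    ),
)

client.factories.create_or_update(resource_group, factory_name, factory)
```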

Pipeline Consumption Report

Last week the Azure Data Factory team added the Pipeline Consumption Report.

The report can be used for your Triggered runs: just go to your Triggered runs and click on the new icon.

ADFMonitor

The consumption of the selected Pipeline will be displayed. The data shown covers only this Pipeline, not other Pipelines fired by this Pipeline. It would be a nice addition if the report showed the aggregation of the complete Triggered run.

For your debug runs, click the icon on the right side of your Output pane:

ADF DEBUG button

ADF DEBUG Report

The ADF consumption report only surfaces Azure Data Factory related units. There may be additional units billed from other services that you are using and accessing which are not accounted for here, including Azure SQL Database, Synapse Analytics, Cosmos DB, ADLS, etc. More details can be found here.
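The report itself is a portal feature, but if you want to pull the underlying Triggered runs programmatically, something like the sketch below should work with the Azure SDK for Python; the resource group and factory names are placeholders.

```python
# Minimal sketch (assumed setup): listing the Pipeline runs of the last 7 days
# with the Azure SDK for Python. Names are placeholders; the consumption
# report itself is only shown in the ADF monitoring UI.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

subscription_id = "<subscription-id>"   # placeholder
resource_group = "my-resource-group"    # placeholder
factory_name = "my-data-factory"        # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Query all Pipeline runs updated in the last 7 days.
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=7),
    last_updated_before=datetime.utcnow(),
)

runs = client.pipeline_runs.query_by_factory(resource_group, factory_name, filters)
for run in runs.value:
    print(run.run_id, run.pipeline_name, run.status, run.duration_in_ms)
```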

Parameters from Execute Pipeline Activity

When calling a Pipeline you previously had to add the parameters yourself; now they are automatically taken over from the Pipeline you select. Very handy, and it saves time if you use a lot of parameters.

Define a Parameter in one of your Pipelines:

ADF Parameter

 

Create another Pipeline and add the Execute Pipeline activity. On the Settings tab, where you select the Pipeline you want to execute, you will discover that the option to add the parameters manually is no longer there. Instead, all the Parameters you defined in the called Pipeline are shown directly. Very handy, and it reduces errors. A small code sketch of what this looks like follows the screenshots below.

Old Situation:

ADF Parameter 3

New Situation:

ADF Parameter Pipeline
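For reference, this is roughly what the same setup looks like when you define both Pipelines in code with the Azure SDK for Python. The Pipeline names and the FileName Parameter are made-up examples; in the authoring UI the parameter fields are simply pre-populated for you.

```python
# Minimal sketch (assumed setup): a child Pipeline with one Parameter and a
# parent Pipeline that calls it via the Execute Pipeline activity. All names
# and the "FileName" Parameter are illustrative placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ExecutePipelineActivity,
    ParameterSpecification,
    PipelineReference,
    PipelineResource,
)

subscription_id = "<subscription-id>"   # placeholder
resource_group = "my-resource-group"    # placeholder
factory_name = "my-data-factory"        # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Child Pipeline with a single string Parameter.
child = PipelineResource(
    parameters={"FileName": ParameterSpecification(type="String")},
    activities=[],  # the actual activities would go here
)
client.pipelines.create_or_update(resource_group, factory_name, "ChildPipeline", child)

# Parent Pipeline that executes the child and passes a value for the Parameter.
parent = PipelineResource(
    activities=[
        ExecutePipelineActivity(
            name="RunChild",
            pipeline=PipelineReference(
                type="PipelineReference", reference_name="ChildPipeline"
            ),
            parameters={"FileName": "sales.csv"},
            wait_on_completion=True,
        )
    ],
)
client.pipelines.create_or_update(resource_group, factory_name, "ParentPipeline", parent)
```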

General Tab moved to new Properties Pane

Your General tab has now moved to the right side of the canvas.

ADF Pane General

To edit your properties, click on the pane icon located in the top-right corner of the canvas.

ADF Pane Properties

So these were some nice and useful additions to Azure Data Factory. I am very happy with them; what do you think?

Feel free to leave a comment
