Parameterize Linked Services in ADF

by Erwin | Jul 9, 2020

Parameterize Linked Services 

For my Azure Data Factory solution I wanted to parameterize properties in my Linked Services. Not all properties can be parameterized by default through the UI, but there's another way to achieve this.

 

 

Linked Service Azure Data Factory

Open your existing Linked Services.

Linked Services ADF

In this situation I want to parameterize my FTP connection so that I can change the host name based on an Azure Key Vault secret.

By default this is not possible through the UI, but at the bottom of your Linked Service there is an Advanced box.

Linked Services Advanced

 

If you enable this box you can start building your own connection definition, and also create your own parameters for this connection.

How to start:

As a base we will use the default code of our connection.

Linked Services Code

{
    "name": "LS_FTP_SOURCE",
    "properties": {
        "annotations": [
            "stage: none",
            "scenario: demo",
            "environments: dvlm"
        ],
        "type": "FtpServer",
        "typeProperties": {
            "host": "ftp.erwindekreuk.com",
            "port": 21,
            "enableSsl": true,
            "enableServerCertificateValidation": true,
            "authenticationType": "Basic",
            "userName": "ftp_down",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": "secretname"
            }
        }
    },
    "type": "Microsoft.DataFactory/factories/linkedservices"
}
Copy this code into the Advanced box and enable the option Specify dynamic contents in JSON format.

Now you can start adding new parameters.

Linked Services Advanced

If you want to parameterize the host name of your connection, you have to add a new parameter at the top of the code, under the type of your connection:

    "properties": {
        "type": "FtpServer",
        "parameters": {
            "ConnectionKeyvaultSecret": {
                "type": "string"
            }

After you have done this, you need to specify for which properties you want to use this parameter. In my case I want to read the host property from my Azure Key Vault, using the parameter as the secret name.

The JSON code below will now use the above parameter as an input.

        "typeProperties": {
            "host": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().ConnectionKeyvaultSecret",
                    "type": "Expression"
                }
            }

Save your connection and you will see that the UI has changed: you now have to define all your settings through the Advanced editor.

If you test your connection you will now see that you have to fill in a value for the parameter.

Linked Service Test Connection
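
To give an idea of how this parameter is supplied, here is a minimal sketch of a dataset that references the parameterized Linked Service and passes in the secret name (the dataset name, dataset type and secret name below are hypothetical examples, not part of the original setup):

{
    "name": "DS_FTP_SOURCE",
    "properties": {
        "linkedServiceName": {
            "referenceName": "LS_FTP_SOURCE",
            "type": "LinkedServiceReference",
            "parameters": {
                "ConnectionKeyvaultSecret": "ftp-host-secretname"
            }
        },
        "type": "Binary",
        "typeProperties": {
            "location": {
                "type": "FtpServerLocation",
                "folderPath": "inbox"
            }
        }
    }
}

Every dataset (and therefore every pipeline) that uses this Linked Service now has to provide a value for the parameter, which is what makes the connection reusable across environments.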

And now you can create parameters for every typeProperty within your connection.

The code below will create parameters for your host, username and password entries with Azure Key Vault enabled. For the authenticationType you have to choose between Basic and Anonymous, but you can also add this value to your Azure Key Vault.

{
    "name": "LS_FTP_SOURCE",
    "properties": {
        "type": "FtpServer",
        "parameters": {
            "ConnectionKeyvaultSecret": {
                "type": "string"
            },
            "UsernameKeyvaultSecret": {
                "type": "string"
            },
            "PasswordKeyvaultSecret": {
                "type": "string"
            },
            "authenticationType": {
                "type": "string"
            }
        },
        "annotations": [ ],
        "typeProperties": {
            "host": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().ConnectionKeyvaultSecret",
                    "type": "Expression"
                }
            },
            "port": 21,
            "enableSsl": false,
            "enableServerCertificateValidation": false,
            "authenticationType": "@linkedService().authenticationType",
            "userName": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().UsernameKeyvaultSecret",
                    "type": "Expression"
                }
            },
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": {
                    "value": "@linkedService().PasswordKeyvaultSecret",
                    "type": "Expression"
                }
            }
        }
    }
}
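
If you also want to keep the authenticationType out of the Linked Service definition, the same AzureKeyVaultSecret construction can be used for that property. A minimal sketch, assuming a secret that contains the value Basic or Anonymous (the secret name ftp-authentication-type is just an illustration):

            "authenticationType": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_AKV_OXGN",
                    "type": "LinkedServiceReference"
                },
                "secretName": "ftp-authentication-type"
            },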

Thanks for reading my blog post and have fun parameterizing your Linked Services in ADF.

Feel free to leave a comment

2 Comments

  1. David Laplante

    Thanks for the article! Just a note, proper indentation of the json goes a long way in helping everyone understand the logic 🙂

    • Erwin

      Thanks for the feedback David, it looks like my editor removed the indentation. I'm going to look into whether I can better integrate/visualize this within my webpage.

