Azure Purview Pricing example

September 9, 2021

Azure Purview pricing?

Note: Billing for Azure Purview will commence November 1, 2021.

Updated October 31st, 2021

Pricing for the Elastic Data Map and Scanning for Other Sources has changed and is updated in the blog below.

Since my last post on Azure Purview announcements and new functionalities, I have received some questions regarding pricing. In the meantime the pricing page has been updated, and I have also created a new Azure Purview instance in my subscription (after August 18th). Currently most of the Azure Purview components are still free until further notice. For more details I still recommend everyone to watch the Azure Purview event from September 28th, 2021: https://azuredatagovernance.eventcore.com/

Updated September 29th, 2021

Yesterday Microsoft announced the General Availability of Azure Purview; more on the announcement can be found in the blog from Rohan Kumar.

Since September 28, 2021, the price of Azure Purview has been adjusted. The main change is that use of the Elastic Data Map will remain free until November 1, 2021. As Microsoft puts it: "To encourage trial of the Elastic Data Map, we are providing all customers free usage of Data Map from August 16, 2021 to October 31, 2021." I've updated the pricing details below.

As a small recap:

Azure Purview Elastic Data Map

Item            Price
Capacity Unit   €0.353 per 1 Capacity Unit Hour

Billing for Data Map capacity unit consumption will commence November 1, 2021.

When you have created your Azure Purview instance after August 18th, you will see that you are currently not charged for the Data Map capacity units.

[Image: Azure Purview pricing – Data Map]

As you can see, there is no charge anymore for the Data Map; I'm only charged for my scanning, which I only run manually to save some costs.

[Image: Azure Purview pricing – details]

Automated Scanning & Classification

Source                  Price
Power BI online         Free for a limited time
SQL Server on-premises  Free for a limited time
Other data sources      €0.540 per 1 vCore Hour

 

Other features

Feature       Price
Resource Set  €0.18 per 1 vCore Hour

Billing for scanning duration will commence November 1, 2021.

Pricing Example

Based on the example published on the pricing page, I've done a calculation:

Example Scenario:
Data Map can scale capacity elastically based on the request load. Request load is measured in terms of data map operations per second. As a cost control measure, a Data Map is configured by default to elastically scale up to a peak of 8 times the steady state capacity.
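To make the elastic scaling concrete, here is a minimal sketch of how the default peak capacity relates to the steady state. The throughput figure of 25 operations per second per capacity unit is an assumption on my side (check the current Azure Purview documentation for the exact number); the 8x factor is the default peak described above.

```python
# Sketch: estimate the peak request load a Data Map can absorb.
# ASSUMPTION: one capacity unit handles ~25 operations/second;
# the default elastic limit is 8x the steady-state capacity.

OPS_PER_CAPACITY_UNIT = 25   # assumed throughput per capacity unit
DEFAULT_SCALE_FACTOR = 8     # default peak = 8x steady state

def peak_ops_per_second(steady_state_units: int,
                        scale_factor: int = DEFAULT_SCALE_FACTOR) -> int:
    """Maximum operations/second before the elastic limit is reached."""
    return steady_state_units * scale_factor * OPS_PER_CAPACITY_UNIT

# A dev/trial setup with 1 steady-state capacity unit:
print(peak_ops_per_second(1))  # 1 unit x 8 x 25 ops/s = 200 ops/s
```

So even the smallest configuration can briefly absorb a burst well above its steady-state load, which is why the scale factor acts as a cost control rather than a hard performance limit.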

For dev/trial usage:

Data Map (Always on): 1 capacity unit x Price per capacity unit per hour x 730 hours per month

Scanning (Pay as you go): Total duration (in minutes) of all scans in a month / 60 min per hour x 32 vCore per scan x €0.540 per vCore per hour

Resource Set: Total duration (in hours) of processing resource set data assets in a month x Price per vCore per hour

The total cost per month for Azure Purview = cost of Data Map + cost of Scanning + cost of Resource Set

Assuming the above scenario, we use only 1 Capacity Unit and no more than 2 GB of metadata storage, and we scan our data once a week for 2 hours.

Data Map: 1 CU x €0.353 x 730 hours = €257,69

Scanning: 4 scans x 2 hours x 32 vCores x €0.540 per vCore per hour = €138,24

Resource Set: 4 scans x 1 hour x €0.18 per vCore per hour = €0,72

In total: €396,65, including 4 scans. If you leave Azure Purview as is and do no scanning, your base fee will be €257,69.
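The calculation above can be sketched as a small helper, using the prices quoted in this post (which may change over time):

```python
# Sketch of the monthly cost calculation from the example above,
# using the prices quoted in this post.

HOURS_PER_MONTH = 730
DATA_MAP_PRICE = 0.353        # EUR per capacity unit hour
SCAN_PRICE = 0.540            # EUR per vCore hour (other data sources)
RESOURCE_SET_PRICE = 0.18     # EUR per vCore hour
VCORES_PER_SCAN = 32          # vCores used per scan in the example

def monthly_cost(capacity_units: int,
                 scans_per_month: int,
                 hours_per_scan: float,
                 resource_set_hours: float) -> float:
    """Total monthly cost = Data Map + Scanning + Resource Set."""
    data_map = capacity_units * DATA_MAP_PRICE * HOURS_PER_MONTH
    scanning = scans_per_month * hours_per_scan * VCORES_PER_SCAN * SCAN_PRICE
    resource_set = resource_set_hours * RESOURCE_SET_PRICE
    return round(data_map + scanning + resource_set, 2)

# 1 capacity unit, 4 scans of 2 hours, 4 hours of resource set processing:
print(monthly_cost(1, 4, 2, 4))  # -> 396.65
```

Plugging in zero scans and zero resource set hours gives the €257,69 base fee for an idle instance.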

Like always, in case you have questions, leave them in the comments or send me a message.


2 Comments

  1. lpribson

    where do you get that detailed breakdown of activity/cost on your purview account

    • Erwin

      Hi,

You can select the details in the cost analysis by selecting the table view. Create a filter based on your Purview account. In the table view, group by MeterSubCategory.

Let me know if it works for you.

      Erwin


