Moving SharePoint Online workflow task metadata into the data warehouse using Nintex Flows and custom Web API

This post describes an approach for automatically copying SharePoint Online (SPO) workflow task metadata into an external data warehouse. In this scenario, the workflow tasks themselves become the subject of another workflow that copies each task's data into an external database, using a custom Web API endpoint as the interface to that database. The requirement to move workflow task data elsewhere commonly arises from SPO limitations: in particular, SPO throttles requests for workflow data, which makes it virtually impossible to build a meaningful workflow reporting system once the number of workflow tasks grows large. The easiest way to solve the problem is to use a Nintex workflow to “listen” for changes to the workflow tasks, request the task data via the SPO REST API and, finally, send the data to the data warehouse's Web API endpoint.

Some SPO solutions require a reporting system that includes workflow task metadata, for example a report on documents together with the statuses of the workflows linked to those documents. Using a conventional approach (e.g. the SPO REST API) to obtain this data is unfeasible because SPO throttles requests for workflow data. In fact, the throttling is so tight that generating a report with more than a hundred records is unrealistic. In addition, many companies want to build Business Intelligence (BI) systems that analyse workflow task data. A data warehouse holding all the workflow task metadata serves both needs very well.

A few prerequisites must be met to implement the solution. You must know the basics of Nintex workflow creation and be able to build a backend solution consisting of the database of your choice and a custom Web API endpoint that writes the data model to that database. In this post we used Visual Studio 2015 and created an ordinary REST Web API 2.0 project backed by an Azure SQL Database.

The solution involves the following steps:

  1. Get a sample of your workflow task metadata and create your data model.
  2. Create a Web API capable of writing the data model to the database.
  3. Expose a POST endpoint on the Web API that accepts a JSON model of the workflow task metadata.
  4. Create a Nintex workflow in the SPO list that stores your workflow tasks.
  5. Design the Nintex workflow: call the SPO REST API to get the task's JSON metadata and pass that JSON object to your Web API hook.

Each step is described in detail below.

We are looking to export the metadata of a workflow task. Find the SPO list that holds all your workflow tasks and navigate to it; you will need the list's name to call the SPO REST API. It is easiest to use a REST client to perform the Web API requests; many people use Fiddler or Postman (a Chrome extension) for this job. Query the SPO REST API to get a sample of the JSON data that you want to put into your database. The request will look similar to this example:

Picture 1
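As an illustration, assuming a hypothetical tenant and a workflow tasks list called "Workflow Tasks", the request would be shaped roughly like this (item 1 is used as the sample task):

GET https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Workflow Tasks')/items(1)
Accept: application/json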

The key element in this request is getbytitle(“list name”), where “list name” is the name of the SPO list holding your workflow tasks. Remember to add the “Accept” header with the value “application/json”; it tells SPO to return JSON instead of XML. As a result, you will get a JSON object containing the metadata of the requested task. This JSON object is an example of the data that you will need to put into your database. Not all fields are required in the data warehouse, so we create a data model containing only the fields of our choice. For example, it can look like the following C# class, whose properties are based on the model returned above:
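A minimal sketch of such a model. The class and property names below are illustrative; replace them with the fields you actually selected from your own task list:

using System;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class EmployeeFileReviewTask
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.None)] // keep the SPO list item ID as the key
    public int Id { get; set; }

    public string Title { get; set; }       // Task title
    public string Status { get; set; }      // Workflow task status/outcome
    public string AssignedTo { get; set; }  // Assignee
    public DateTime Created { get; set; }   // Task creation date
    public DateTime Modified { get; set; }  // Last modification date
}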

The next step is to create a Web API that exposes a single method accepting our model as a parameter from the body of the request. You can choose any REST Web API design. We created a simple Web API 2.0 project in Visual Studio 2015 using the standard MVC / Web API 2.0 wizard, then added an empty controller and filled it with code that uses Entity Framework to write the data model to the database. We also created a code-first EF database context that works with just the one entity described above.

The code of the controller:
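A minimal sketch of such a controller, assuming the illustrative EmployeeFileReviewTask model above and an EF context named TasksDbContext (both names are ours, not prescribed by the original project):

using System.Threading.Tasks;
using System.Web.Http;

public class EmployeeFileReviewTasksWebHookController : ApiController
{
    // POST api/EmployeeFileReviewTasksWebHook
    public async Task<IHttpActionResult> Post([FromBody] EmployeeFileReviewTask task)
    {
        if (task == null || !ModelState.IsValid)
        {
            return BadRequest("Invalid task payload.");
        }

        using (var db = new TasksDbContext())
        {
            // Upsert keyed on the SPO list item ID, so repeated modifications of the same task update the same row
            var existing = await db.Tasks.FindAsync(task.Id);
            if (existing == null)
            {
                db.Tasks.Add(task);
            }
            else
            {
                db.Entry(existing).CurrentValues.SetValues(task);
            }

            await db.SaveChangesAsync();
        }

        return Ok();
    }
}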

The code of the database context for Entity Framework:
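And a minimal code-first context to go with it (the connection string name "TasksDb" and the class names are, again, illustrative):

using System.Data.Entity;

public class TasksDbContext : DbContext
{
    // "TasksDb" is the name of a connection string in Web.config pointing at the Azure SQL Database
    public TasksDbContext() : base("TasksDb") { }

    public DbSet<EmployeeFileReviewTask> Tasks { get; set; }
}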

Once you have created the Web API, you should be able to call its method at a URL like this:
https://yoursite.azurewebsites.net/api/EmployeeFileReviewTasksWebHook

You will need to put your model data in the request body as a JSON object. Don't forget to include the proper authentication headers, add the “Accept” header with “application/json”, and set the request type to POST. Once you've tested the method, you can move on to the next steps. For example, below is how we tested it in our project.

Picture 4
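For reference, the raw request behind such a test might look like the following sketch (the JSON field values are made up and follow the illustrative model above):

POST https://yoursite.azurewebsites.net/api/EmployeeFileReviewTasksWebHook HTTP/1.1
Content-Type: application/json
Accept: application/json

{
  "Id": 1,
  "Title": "Review employee file",
  "Status": "Not Started",
  "AssignedTo": "Jane Citizen",
  "Created": "2016-05-01T10:00:00Z",
  "Modified": "2016-05-02T11:30:00Z"
}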

Next, we will create a new Nintex Workflow in the SPO list with our workflow tasks. It is all straightforward. Click Nintex Workflows, then create a new workflow and start designing it.

Picture 5

Picture 6

Once you've created a new workflow, click the Workflow Settings button. In the displayed form, set the parameters as shown in the screenshot below. We set “Start when items are created” and “Start when items are modified”. With these settings, any modification of a workflow task starts this workflow automatically, including cases where the task has been modified by other workflows.

Picture 7.1

Create five steps in this workflow, as shown on the following screenshots labelled 1 to 5. Keep in mind that blocks 3 and 5 are there only to assist in debugging and are not required in production use.

Picture 7

Step 1. Create a Dictionary variable that contains the SPO REST API request headers. You can add any required headers, including authentication headers. It is essential to include the Accept header with “application/json” to tell SPO that we want JSON responses. We set the output variable to SPListRequestHeaders so we can use it later.

Picture 8

Step 2. Call HTTP Web Service. We call the SPO REST API here. It is important to make sure that the getbytitle parameter is correctly set to your workflow tasks list, as discussed before. The list of fields to be returned is defined in the “$select=…” parameter of the OData request; we need only the fields included in our data model. The other settings are straightforward: we supply the request headers created in Step 1 and create two more variables for the response. SPListResponseContent will receive the resulting JSON object that we will need in Step 4.

Picture 9
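For reference, the request URL configured in this step might look like the following sketch, where the item ID is supplied via a workflow reference to the current task and the site, list and field names are illustrative:

https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Workflow Tasks')/items([current task ID])?$select=Id,Title,Status,AssignedTo,Created,Modified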

Step 3 is optional. We added it to debug our workflow: it sends an email with the contents of the JSON response from the previous step, showing us what the SPO REST API returned.

Picture 10

Step 4. Here we call our custom API endpoint, passing the JSON object that we received from the SPO REST API. We supply the full URL of our web hook, set the method to POST and inject SPListResponseContent from Step 2 into the request body. We also capture the response code so it can be displayed later in the workflow history.

Picture 11

Step 5 is also optional. It writes a log message with the response code that we have received from our API Endpoint.

Picture 12

Once all five steps are completed, we publish this Nintex workflow. Now we are ready for testing.

To test the system, open the list of workflow tasks, click on any task, modify any of its properties and save the task. This initiates our workflow automatically. You can monitor the execution in the workflow history. Once the workflow has completed, you should see messages like those displayed below. Notice that our workflow has also written the Web API response code at the end.

Picture 13

To make sure that everything went well, open your database and check the records written by your Web API. After every workflow task modification you will see corresponding changes in the database. For example:

Picture 14


In this post we have shown that automatically copying workflow task metadata into your data warehouse can be achieved with a simple Nintex workflow that performs only two REST Web API requests. The solution is quite flexible: you can select the required properties from the SPO list and export only those into the data warehouse. More tables can easily be added if there is more than one workflow tasks list. This solution enables the creation of a powerful reporting system on top of the data warehouse and also lets you employ the BI data analytics tool of your choice.

Creating Accounts on Azure SQL Database through PowerShell Automation

In the previous post, Azure SQL Pro Tip – Creating Login Account and User, we briefly walked through how to create login accounts on Azure SQL Database through SSMS. Using SSMS is of course very convenient. However, as a DevOps engineer, I want to automate this process with PowerShell. In this post, we're going to walk through how to achieve this goal.

Step #1: Create Azure SQL Database

First of all, we need an Azure SQL Database. This can easily be done by deploying an ARM template from PowerShell like:
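A minimal sketch, assuming a resource group and an ARM template file that defines the SQL server and database; the names and paths below are placeholders, and the exact cmdlet names depend on your Azure PowerShell version:

New-AzureRmResourceGroup -Name "my-resource-group" -Location "Australia East"

New-AzureRmResourceGroupDeployment -ResourceGroupName "my-resource-group" `
                                   -TemplateFile ".\azuredeploy.json" `
                                   -TemplateParameterFile ".\azuredeploy.parameters.json"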

We're not going to dig into this any further, as it is beyond our topic. Now we've got an Azure SQL Database.

Step #2: Create SQL Script for Login Account

In the previous post, we used the following SQL script:
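A sketch of that script, using the same placeholders as the previous post (run against the master database):

IF EXISTS (SELECT * FROM sys.sql_logins WHERE name = '<LOGIN_ACCOUNT>')
BEGIN
    DROP LOGIN <LOGIN_ACCOUNT>
END
GO

CREATE LOGIN <LOGIN_ACCOUNT> WITH PASSWORD = '<ACCOUNT_PASSWORD>'
GO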

Now, we're going to automate this process by providing the username and password as parameters to the SQL script. The main part of the script above is the CREATE LOGIN statement, so we slightly modify it like:
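A sketch of the modified script, with the username and password supplied as parameters:

CREATE LOGIN @Username WITH PASSWORD = @Password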

Now the SQL script is ready.

Step #3: Create PowerShell Script for Login Account

We need to execute this in PowerShell. Look at the following PowerShell script:
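A minimal sketch of such a script; the server name, admin credentials and file path are placeholders and would normally be passed in as script parameters:

# Connect to the master database of the Azure SQL server with the admin account
$connectionString = "Server=tcp:my-sql-server.database.windows.net,1433;Database=master;" +
                    "User ID=sqladmin;Password=P@ssw0rd!;Encrypt=True;Connection Timeout=30;"

# Load the SQL script and supply the username/password as ADO.NET parameters
$query = Get-Content -Path ".\create-login.sql" -Raw

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$command.Parameters.AddWithValue("@Username", "loginaccount") | Out-Null
$command.Parameters.AddWithValue("@Password", "L0ginP@ssw0rd!") | Out-Null

$connection.Open()
$command.ExecuteNonQuery() | Out-Null
$connection.Close()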

Looks familiar? Yes, indeed. It’s basically the same as using ADO.NET in ASP.NET applications. Let’s run this PowerShell script. Woops! Something went wrong. We can’t run the SQL script. What’s happening?

Step #4: Update SQL Script for Login Account

CREATE LOGIN won't take variables. In other words, the parameterised SQL script above will never work as written. In this case we don't want to, but have to, use dynamic SQL, ugly as it is. Therefore, let's update the SQL script:
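A sketch of the updated script, building the statement as dynamic SQL from the same two parameters:

DECLARE @sql NVARCHAR(MAX)
SET @sql = N'CREATE LOGIN [' + @Username + N'] WITH PASSWORD = ''' + @Password + N''''
EXEC (@sql)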

Then run the PowerShell script again and it will work. Please note that using dynamic SQL here isn't a big issue, as these scripts are not exposed to the public anyway.

Step #5: Update SQL Script for User Login

In a similar way, we need to create a user in the Azure SQL Database. This also requires dynamic SQL like:
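A sketch of that script, again driven by a parameter and executed against the target database rather than master:

DECLARE @sql NVARCHAR(MAX)
SET @sql = N'CREATE USER [' + @Username + N'] FROM LOGIN [' + @Username + N']'
EXEC (@sql)

SET @sql = N'ALTER ROLE [db_owner] ADD MEMBER [' + @Username + N']'
EXEC (@sql)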

This is to create a user with a db_owner role. In order for the user to have only limited permissions, use the following dynamic SQL script:
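And a sketch of the more restricted version, granting only the basic data permissions:

DECLARE @sql NVARCHAR(MAX)
SET @sql = N'CREATE USER [' + @Username + N'] FROM LOGIN [' + @Username + N']'
EXEC (@sql)

SET @sql = N'GRANT SELECT, INSERT, UPDATE, DELETE TO [' + @Username + N']'
EXEC (@sql)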

Step #6: Modify PowerShell Script for User Login

In order to run the SQL script right above, run the following PowerShell script:
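This is essentially the same sketch as before; the only real differences are the target database in the connection string and the SQL file being executed (all names remain placeholders):

# Connect directly to the target database, not master
$connectionString = "Server=tcp:my-sql-server.database.windows.net,1433;Database=MyDatabase;" +
                    "User ID=sqladmin;Password=P@ssw0rd!;Encrypt=True;Connection Timeout=30;"

$query = Get-Content -Path ".\create-user.sql" -Raw

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command = New-Object System.Data.SqlClient.SqlCommand($query, $connection)
$command.Parameters.AddWithValue("@Username", "loginaccount") | Out-Null

$connection.Open()
$command.ExecuteNonQuery() | Out-Null
$connection.Close()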

So far, we have walked through how to use PowerShell to create login accounts and users on Azure SQL Database. With this approach, DevOps engineers can easily create accounts on Azure SQL Database by running a PowerShell script on their build or deployment server.

Azure SQL Pro Tip – Creating Login Account and User

With Azure Resource Manager (ARM), while creating an Azure SQL Database instance, we can only set up an admin account. As we all know, using this admin account is not safe in most cases. Therefore, we need to create other accounts with fewer privileges.

However, unlike MS SQL Server, Azure SQL Database has some restrictions. Those restrictions also apply to creating login accounts and users. In this post, we are going to create login accounts with limited permissions on Azure SQL Database.

Creating Login Account

With ARM, once the admin account is ready, we need to connect to the SQL Database instance using that admin account's credentials.

Once you connect to the Azure SQL Database through SSMS, you will see a screen like the one above. Make sure that you are now on the master database, then run the following SQL query:
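A sketch of that query (substitute your own values for the placeholders):

IF EXISTS (SELECT * FROM sys.sql_logins WHERE name = '<LOGIN_ACCOUNT>')
BEGIN
    DROP LOGIN <LOGIN_ACCOUNT>
END
GO

CREATE LOGIN <LOGIN_ACCOUNT> WITH PASSWORD = '<ACCOUNT_PASSWORD>'
GO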

  • <LOGIN_ACCOUNT> is the one you want to create.
  • <ACCOUNT_PASSWORD> is the password of the account you want to create.

NOTE: We can’t use the DEFAULT_DATABASE option when creating a login account as we’re on Azure SQL Database. For more details, find this MSDN document.

The first query checks whether the login account already exists and, if it does, drops it; you can of course skip this part. The second query actually creates the login account.

If you are on a database other than master and run the SQL query above, you will get an error message like:

Make sure that you are on the master database. 🙂 We have created a new login account. Let's move on to the next step.

Creating User on Database with Appropriate Permissions

In order to create a user and grant permissions to that user, in SSMS we would usually do something like:
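On a regular SQL Server instance this would look roughly like the following sketch (database name and placeholders are illustrative):

USE [MyDatabase]
GO

CREATE USER [<LOGIN_ACCOUNT>] FROM LOGIN [<LOGIN_ACCOUNT>]
GO

ALTER ROLE [db_owner] ADD MEMBER [<LOGIN_ACCOUNT>]
GO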

However, we are on Azure SQL Database, so you will see an error message like:

According to the MSDN document, USE (Transact-SQL), we can’t use the USE statement on Azure SQL Database.

In Azure SQL Database, the database parameter can only refer to the current database. The USE statement does not switch between databases, error code 40508 is returned. To change databases, you must directly connect to the database.

Yes, we need to directly connect to the database. How can we do it?

As the admin account uses master as its default database, instead of connecting to the <default> database we should specify a particular database name, as shown above. After connecting directly to the database, run the following SQL query to create a user and give permissions to that user:
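A sketch of that query:

CREATE USER [<LOGIN_ACCOUNT>] FROM LOGIN [<LOGIN_ACCOUNT>]
GO

ALTER ROLE [db_owner] ADD MEMBER [<LOGIN_ACCOUNT>]
GO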

This script creates a user and gives it the db_owner role. If you want a more restricted user, try the following:
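A sketch of the more restricted version:

CREATE USER [<LOGIN_ACCOUNT>] FROM LOGIN [<LOGIN_ACCOUNT>]
GO

GRANT SELECT, INSERT, UPDATE, DELETE TO [<LOGIN_ACCOUNT>]
GO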

This gives the user only SELECT, INSERT, UPDATE and DELETE permissions.

That's it. We have walked through how to create a login account and a database user on Azure SQL Database. By creating login accounts with limited privileges, we can use Azure SQL Database with fewer worries.

Highly available WordPress deployment on Azure

WordPress is the leading content management system today, owning more than 50% of the CMS market share. WordPress on Microsoft Azure is becoming a very popular offering, with the ability to host WordPress as an Azure WebApp. While Microsoft has made the process of spinning up a WordPress site very easy with built-in gallery images, running business-critical applications on a cloud platform also introduces challenges in terms of availability and scalability.

WordPress Architecture

A typical WordPress deployment consists of the following two tiers:

  • Web Frontend – PHP web site
  • Backend Data store
    • Relational data store – Hierarchical entity store
    • Object data store – Used to store uploaded images and other artefacts

In order to guarantee high availability and scalability, we need to ensure that each of these service tiers is decoupled and can be managed separately.

Web frontend

Azure supports deploying WordPress as an Azure WebApp. Azure WebApps are part of the Azure App Service offering, a fully managed service that can host mission-critical web applications.

Azure WebApps natively support scaling, which is achieved by increasing the number of instances of the hosting WebApp. The native Azure load balancer takes care of distributing traffic amongst these instances in a round-robin manner. WebApps also support schedule-driven and automatic scaling.

HAWP1

Azure Portal can be used to configure scaling rules for the WebApp.

Azure WebApps offer an uptime SLA of 99.95% for the Basic, Standard and Premium tiers, even with a single-instance deployment. The Azure load balancer takes care of failover amongst instances within a region. To achieve higher availability, the application front end can be deployed across different geographical regions, with Azure Traffic Manager employed to handle load balancing, network performance optimisation and failover.

HAWP2.PNG

Backend Store

The WordPress backend consists of two data stores: a relational data store used to hold WordPress entities along with their hierarchies, and an object store used to persist uploaded images and other artefacts.

The Azure Gallery hosts a prebaked image for Scalable WordPress, which lets you configure a scalable MySQL backend as the relational store and uses Azure Storage as the object store for your WordPress deployment. Scalable WordPress uses Azure Blob storage to host uploaded artefacts. This works well for most scenarios.

HAWP3.PNG

It is important to understand that by using a MySQL backend for your WordPress site you are engaging with a third-party database-as-a-service provider (in this case ClearDB). This means that the availability SLA associated with your WordPress backend is not provided by Microsoft. Always check the SLA associated with your chosen pricing tier with the third-party provider.

An alternative is to use Project Nami, which offers a WordPress image configured to run against Azure SQL Database as the backend. You can deploy the WordPress site with a Project Nami image either from the Azure gallery or from the project's website. Project Nami supports WordPress version 4.4.2 with a fully configurable Azure SQL Database backend. Once deployed, the WordPress instance can be configured to use Azure Storage as the object store by employing the Windows Azure Storage for WordPress plugin.

HAWP4.PNG

Using Azure SQL Database and Azure Storage as the backend for your WordPress site has the following key advantages:

  • Eligibility for Microsoft SLA – A minimum uptime of 99.9% is guaranteed on Azure Storage and an uptime of 99.99% is guaranteed on Basic, Standard, or Premium tiers of Microsoft Azure SQL Database.
  • Easily manageable – Azure SQL databases can be provisioned and managed using Azure Resource Manager templates or through the portal.
  • On-demand scaling – Azure SQL database supports on-demand scaling to meet changing business demands and traffic.
  • More secure – Azure SQL Database offers better support for auditing and compliance, in addition to highly secure connections.

Conclusion

To conclude, you can deploy a highly available WordPress site in Microsoft Azure by ensuring that the front end and backend tiers are fault tolerant and designed for failure. The deployment strategy can be influenced by the following factors:

  • Choice of technology for the backend – Microsoft (SQL Azure)/Non-Microsoft (MySQL)
  • SLA requirements
  • Logging and Auditing requirements
  • Scaling, security and manageability requirements

References

Project Nami – http://projectnami.org/

Scalable WordPress – https://azure.microsoft.com/en-us/marketplace/partners/wordpress/scalablewordpress/


Azure Security Fundamentals: Azure SQL Database

Microsoft Azure has many Platform-as-a-Service (PaaS) features, with one of the oldest being Azure SQL Database (which has had many variations on that name in its time!)

Over the last few months Microsoft has released a raft of updates to Azure SQL Database that bolster its security chops. As a result I thought it would be good to cover off some basic best practices along with an overview of the new features and how they can help you improve your security stance when implemented.

Basics for any Azure SQL DB engine version

The first thing to note: any TDS connection to Azure SQL Database is running over an encrypted transport. This reduces the chance of your session being hijacked or suffering from Man-in-the-Middle (MitM) style attacks.

Beyond this there are a few configuration items we can cover off that are not tied to the new v12 database engine – namely Firewalls and Auditing.

Firewalls

When you create a new Azure SQL Database server it has a firewall associated with the server instance that blocks any inbound connection to your server and any databases it hosts. There are two options relating to the Azure SQL Database firewall:

  • Allow other Azure services: this option will allow any service deployed into an Azure Region to connect to your server instance. Note this is not limited to just your subscription – any service resolving to a public IP address of an Azure Region will be allowed to at least open a TDS connection to your server.
  • Allowed IP addresses: this is a way to restrict client access to being only from public IP addresses (or ranges) you explicitly list. This could include Azure services deployed using static IPs if you wanted to really lock down the clients accessing your database server.

Auditing

This feature is available in both releases of the Azure SQL Database engine and is useful for tracking activity in databases. You can enable auditing at the server level and have each child database inherit the server setting, or you can turn it on only for the databases you want to audit. The result is a series of log files written to a storage account of your choosing.

You can report on these logs using an Excel/PowerQuery-based solution you can download for free from Microsoft. This setup also requires all your clients to connect to the database using a secured connection string.

Say hello to your new friend – the secured connection string

A lot of these newer security features are available only when clients connect using secured connection strings. More modern client applications that support TDS redirection don’t need to be changed, with others (such as JDBC and Tedious) needing an explicit change in connection string.

The old “insecure” connection strings contain:

{servername}.database.windows.net.

with the newer secured strings containing:

{servername}.database.secure.windows.net.
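As an illustration, a full ADO.NET connection string using the secured endpoint might look like the following sketch (server, database and credentials are placeholders):

Server=tcp:servername.database.secure.windows.net,1433;Database=YourDatabase;User ID=appuser@servername;Password=YourP@ssw0rd;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;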

Next step: update to v12 engine

The newer features for Azure SQL Database are being driven out of the v12 engine that is now generally available. If you have an older “v11” database you can upgrade it to v12 with a minor service disruption. If you are unsure of the version of your database you can run this PowerShell Cmdlet to check:

# ARM Mode (pre 1.0.0.0 Cmdlets)
Get-AzureSqlServer -ServerName 'servername' `
                   -ResourceGroupName 'Default-SQL-WestUS'

Note that you can get the ResourceGroupName value by viewing the database in the newer Preview Portal (https://portal.azure.com/). Older pre-Resource Group items in Azure will have been assigned a default group based on their resource type (SQL) and Azure Region (West US).

This call will return the following metadata about your Server instance. If “ServerVersion” displays “2.0” then you are not running on the v12 engine and it’s time to upgrade!

ResourceGroupName        : Default-SQL-WestUS
ServerName               : servername
Location                 : West US
SqlAdministratorLogin    : lolcatz
SqlAdministratorPassword :
ServerVersion            : 2.0
Tags                     :

You can upgrade an existing server by running the following Cmdlet:

# ARM Mode (pre 1.0.0.0 Cmdlets)
Start-AzureSqlServerUpgrade -ServerName 'servername' `
                            -ResourceGroupName 'Default-SQL-WestUS' `
                            -ServerVersion 12.0

Microsoft's guidance is that you can run this on in-use databases and that you will only have an outage at the very end of the upgrade process. That outage lasts between 40 seconds and 5 minutes, depending on the size of the databases and the load on the server.

The above Cmdlet returns output immediately after the request has been made; unless you provide additional arguments, the upgrade will be queued and undertaken as soon as possible.

If you want to find out the status of your upgrade request you can run the following Cmdlet which will return ‘Completed’ for the status when your server has moved to v12.

# ARM Mode (pre 1.0.0.0 Cmdlets)
Get-AzureSqlServerUpgrade -ServerName 'servername' `
                          -ResourceGroupName 'Default-SQL-WestUS'

Moar! options from the v12 engine

Now that we have upgraded to the v12 engine we have a few more things we can do to protect our data – Dynamic Data Masking, Row Level Security, Transparent Data Encryption (TDE) and Azure Active Directory user credentials.

Dynamic Data Masking (DDM)

I blogged earlier in the year about DDM, so I’m not going to repeat myself here.

The important items to note, however, about the use of DDM are the following restrictions:

  • Your clients must be using the new secured connection string mentioned earlier in this post.
  • Administrator logins will never be included in the data masking (you’re not using admin users in your connection strings anyway right?!)

Row Level Security (RLS)

Like DDM, I already blogged about RLS earlier in the year so I’m not going into detail here.

The key activity when starting to use RLS is to plan your permissions structure properly and map to users/roles or consuming applications which are then used in the predicate functions to filter results returned from the database.

Transparent Data Encryption (TDE)

This has been one of those feature requests since forever for Azure SQL Database. The good news is that it’s here and it’s already generally available. The small downside is that you must decrypt any on-premises database you might move to Azure SQL Database first and then re-encrypt once uploaded. The process to enable TDE is pretty easy – either use the new Azure Preview Portal, T-SQL:

ALTER DATABASE [YourDatabase] SET ENCRYPTION ON;
GO

or via some Azure PowerShell-fu:

# ARM Mode (pre 1.0.0.0 Cmdlets)
Set-AzureSqlDatabaseTransparentDataEncryption -ServerName 'servername' `
                               -ResourceGroupName 'Default-SQL-WestUS' `
                               -DatabaseName 'YourDatabase' `
                               -State "Enabled"

Then you need to wait while the database is initially encrypted (note that the database can be online and in use while this process occurs). You can utilise:

Get-AzureSqlDatabaseTransparentDataEncryption

to check on the progress of encryption (or view in the Portal).

Azure AD user credentials

This is a new (in preview at time of posting) feature that introduces the ability to use Azure Active Directory credentials to manage and access databases.

The important items to note about this feature are:

  • You must be using SQL Server Management Studio 2016 (currently in CTP) if you want to manage databases and connect using Azure AD credentials.
  • Currently only the most recent .Net ADO libraries support using these credentials in connection strings for applications.

Always Encrypted

I’m cheating a little with this one – it’s not actually yet available (even in preview format) but has been announced by Microsoft.

It’s interesting to see Microsoft learning from their SaaS services and driving true multi-tenant capabilities back into offerings such as SQL Server.

Always Encrypted is supported in the upcoming SQL Server 2016 release, along with Azure SQL Database and allows clients to leverage “Bring Your Own Key” (BYOK) so their data remains always encrypted and the keys used are inaccessible to DBAs and other SQL admins.

This is worth looking at if you are architecting and delivering multi-tenant applications sitting on Azure SQL Database.

But, wait, there’s more!

The new Preview Portal and its associated Role Based Access Control (RBAC) provides us with additional capabilities to lock down and control who can perform service management features on the databases and servers we deploy.

The key role for us is the SQL Security Manager role which “can manage the security related policies of SQL servers and databases” (without the need to be a server or database admin).

Azure SQL DB Roles

So, there we have it: all the ways you can help secure your Azure SQL Database instances and minimise the chances that someone will break into your Azure SQL Database, view or steal your sensitive data, or take a backup and restore it to their own server. Keep an eye out for future announcements from Microsoft, as they are innovating quite a lot in this space at the moment.

Azure SQL Database – Row Level Security Walkthrough


According to De La Soul three is the Magic Number, so here’s the third post on the new Azure SQL Database features released over the last couple of months.

Previously I looked at Dynamic Data Masking and Auditing, both features that deliver Azure-specific functionality (or that rely on Azure features). This post will cover Row-Level Security (RLS) which would be familiar to many on-prem SQL Server admins and until this release was a feature missing from Azure SQL Database.

As with my previous posts I’m going to work with the AdventureWorks sample database that I have loaded into Azure SQL Database.

Note that I'm not looking to take a “best practices” approach here to how I implement policies and permissions – for that you can read MSDN or other SQL Server-specific blogs :).

Testing it out

The remainder of this post is mostly going to be SQL…


Azure SQL Database – Auditing Walkthrough


I enjoyed writing my post on the new Dynamic Data Masking feature in Azure SQL Database so much that I thought I’d take a look at another recently added feature – Auditing.

Getting setup

I’m going to use the same AdventureWorks sample database from last post for this one too.

Firstly, I'm going to initiate some changes at the actual server level, which individual databases can inherit. So I'm going to open the Database Server and switch to the Auditing & Security tab as shown below.

Server Audit Settings

On this page I'm going to make the following settings and select a storage account (which I've blanked out below). The Auditing feature utilises Azure Table Storage, which is a great choice as it (theoretically) supports up to 500TB of storage per subscription and, as Troy Hunt blogged, easily handles millions of rows.

Server Audit Default Settings

Note: while here make sure to download the report template…


Azure SQL Database – Dynamic Data Masking Walkthrough


Microsoft recently announced the public preview of the Dynamic Data Masking (DDM) feature for Azure SQL Database that holds a lot of potential for on-the-fly data obfuscation that traditionally would have required either custom business logic or third party systems.

In this post I am going to take the opportunity to walk through how we can set this up for an existing database. For the purpose of this post I am going to utilise the AdventureWorks sample database for Azure SQL Database which you can download from Codeplex.

Our Azure SQL Database Server

Firstly we need to understand our Azure SQL Database Server instance configuration. At time of writing any existing Azure SQL Database Server or one created without specifying a Version flag will use the “2.0” Azure SQL Database engine which does not support DDM.

If this is your scenario then you will need to upgrade your server to…


How to fix 403 errors when managing Azure SQL Database from Visual Studio

I was recently trying to manage Azure SQL Databases via Visual Studio in a new Azure subscription and was unable to open the SQL Databases node at all and received the following error message.

Screenshot of Visual Studio error dialog.

The text reads:

Error 0: Failed to retrieve all server data for subscription ‘GUID’ due to error ‘Error code: 403 Message: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.’.

and my Server Explorer window looked like this:

How Server Explorer Looked

I must admit that I don’t often manage my Azure assets via Visual Studio so it had been a while since I’d used this tooling. I tried a few ways to get this to work and double checked that I had the right details for the subscription registered on my local machine. Storage worked fine, Virtual Machines worked fine… everything looked good except SQL Databases!

(At this point I’d say… hint: I should have read the error message more closely!)

After some additional troubleshooting it turns out that unlike many other Azure offerings, Azure SQL Database does not support OAuth-based connections and instead uses certificates (you know, like the error message says…).

Unfortunately, it turns out that if you have an expired or otherwise invalid certificate for any Azure subscription registered then Visual Studio will fail to enumerate SQL Database instances in the subscription you are currently using even if its certificate is fine.

The use of a subscription GUID isn’t that helpful when troubleshooting because I completely missed that the problematic certificate wasn’t even from the subscription I was currently trying to use!

You can fix this issue by managing your registered Azure subscriptions from within Visual Studio as follows.

Manage Certificates in Visual Studio

  • Right-click on the top Azure node in Server Explorer.
  • Select Manage Subscriptions… from the menu.
  • Click on the “Certificates” tab and find the Subscription with the GUID matching the error.
  • Click “Remove…” and then close the dialog.

You should now be able to open the Azure SQL Database node in Server Explorer and manage them as you expect!

HTH.