Azure Functions Deployment Strategies

As a serverless offering, Azure Functions gives us a great benefit: we don't have to worry about server maintenance. However, we still need to manage our code and set up a proper deployment strategy. In this post, I am going to describe a list of deployment strategies for Azure Functions.

Git Integration

Azure Functions provides Git integration as an out-of-the-box feature. We can simply integrate a local Git repository, GitHub, Bitbucket or Visual Studio Team Services. For more details, see the other post, Code Management in Serverless Computing – AWS Lambda and Azure Functions.

Git integration is the easiest and simplest way to deploy Azure Functions. However, there are two significant downsides to using it.

We can’t modify any code directly on the web editor console

Once deployed, if anything goes wrong, we MUST fix it in the developer's local environment and push it to the Git repository to be deployed to Azure Functions. This is OK, as long as we all agree to update function code in our local environment first.

We need a separate Git repository from the main repository

Many application development environments don't contain only Azure Functions code; the repository also holds other bits and pieces. In this case, if we use one single repository, pushing changes that are unrelated to Azure Functions still triggers a deployment, which might cause unforeseen side effects.

Then, how do we overcome these issues? We still want to use Git as a version control system, but we also want to be free from those two restrictions. There are two other options for us.

KUDU REST API

Azure Functions is integrated with Kudu as its backend, and Kudu provides us with a bunch of REST APIs. With these, we can easily upload a zip file that contains our code for deployment. Let's have a look.

Suppose we've got several Azure Functions and they are zipped into ase-dev-fn-demo.zip. Once this zip file is uploaded through the Kudu REST API, all the code in the zip file will be deployed. For now, we are using PowerShell. Here's the script.
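Below is a minimal sketch of such a script, assuming the Kudu ZIP API endpoint (PUT /api/zip/{path}) and site-level (publishing profile) credentials:

```powershell
# Site-level (publishing profile) credentials and the target Function App.
$username = "[USERNAME]"
$password = "[PASSWORD]"
$functionSite = "[FUNCTION_SITE]"
$filePath = "C:\temp\ase-dev-fn-demo.zip"

# Build the Basic authentication header from the deployment credentials.
$base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username, $password)))

# PUT the zip file to the Kudu ZIP API; Kudu expands it into site/wwwroot.
$apiUrl = "https://$functionSite.scm.azurewebsites.net/api/zip/site/wwwroot"
Invoke-RestMethod -Uri $apiUrl `
                  -Headers @{ Authorization = "Basic $base64Auth" } `
                  -Method PUT `
                  -InFile $filePath `
                  -ContentType "multipart/form-data"
```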

Change the [USERNAME], [PASSWORD], [FUNCTION_SITE] and $filePath values and run it. It will upload the ase-dev-fn-demo.zip file to Kudu, and Kudu will unzip and deploy it automatically. Once the deployment is done, we can see that all the functions are properly deployed.

Of course, we can use any environment other than PowerShell, as long as it supports REST API calls with file uploads.

With this approach, we:

  • Use our own repository structure,
  • Deploy Azure Functions by creating a zip file, and
  • Are able to use the web editor when we need it.

MSdeploy.exe / WAWSDeploy.exe

Another option is the classic msdeploy.exe utility. If we prefer, we can alternatively use WAWSDeploy to utilise the Azure Web App publish settings.

Suppose we have the WAWSDeploy utility and the ase-dev-fn-demo.PublishSettings file downloaded from the Azure Web App. Simply run the following command from a Command Prompt console.
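The invocation is something along these lines (a sketch; WAWSDeploy deploys the zip contents to site\wwwroot by default):

```powershell
.\WAWSDeploy.exe ase-dev-fn-demo.zip ase-dev-fn-demo.PublishSettings
```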

With this, WAWSDeploy uploads the package directly from your local machine (or build server) to the Azure Web App. Once it is done, we will be able to see the same result as with the Kudu REST API approach.

This also enables us to:

  • Use our own repository structure,
  • Deploy Azure Functions by creating a zip file, and
  • Use the web editor when we need it.

So far, I have briefly described the strategies we can take for Azure Functions deployment. Each approach has its own benefits and drawbacks, so we can choose wisely among them based on our business requirements.

Interacting with Azure Web Apps Virtual File System using PowerShell and the Kudu API

Introduction

Azure Web Apps or App Services are quite flexible regarding deployment. You can deploy via FTP, OneDrive or Dropbox, different cloud-based source control services like VSTS, GitHub, or Bitbucket, your on-premises Git, multiple IDEs including Visual Studio, Eclipse and Xcode, and using MSBuild via Web Deploy or FTP/FTPS. And this list is very likely to keep expanding.

However, there might be some scenarios where you just need to update some reference files and don't need to build or update the whole solution. Additionally, it's quite common that corporate firewall restrictions leave you with only the HTTP or HTTPS ports open to interact with your Azure App Service. I had such a scenario, where we had to automate the deployment of new public keys to an Azure App Service to support client certificate-based authentication, but we were restricted by policies and firewalls.

The Kudu REST API provides a lot of handy features which support Azure App Services source code management and deployment operations, among others. One of these is the Virtual File System (VFS) API. This API is based on the VFS HTTP Adapter, which wraps a VFS instance as an HTTP RESTful interface. The Kudu VFS API allows us to upload and download files, get a list of files in a directory, create directories, and delete files from the virtual file system of an Azure App Service; and we can use PowerShell to call it.

In this post I will show how to interact with the Azure App Service Virtual File System (VFS) API via PowerShell.

Authenticating to the Kudu API.

To call any of the Kudu APIs, we need to authenticate by adding the corresponding Authorization header. To create the header value, we need to use the Kudu API credentials, as detailed here. Because we will be interacting with an API related to an App Service, we will be using site-level credentials (a.k.a. publishing profile credentials).

Getting the Publishing Profile Credentials from the Azure Portal

You can get the publishing profile credentials by downloading the publishing profile from the portal. Once downloaded, the XML document will contain the site-level credentials.

Getting the Publishing Profile Credentials via PowerShell

We can also get the site-level credentials via PowerShell. I’ve created a PowerShell function which returns the publishing credentials of an Azure App Service or a Deployment Slot, as shown below.
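What follows is a sketch of it; it relies on the AzureRM module and invokes the list action on the site's publishingcredentials resource:

```powershell
function Get-AzureRmWebAppPublishingCredentials($resourceGroupName, $webAppName, $slotName = $null) {
    # Work out the resource type and name, depending on whether a slot is targeted.
    if ([string]::IsNullOrWhiteSpace($slotName)) {
        $resourceType = "Microsoft.Web/sites/config"
        $resourceName = "$webAppName/publishingcredentials"
    }
    else {
        $resourceType = "Microsoft.Web/sites/slots/config"
        $resourceName = "$webAppName/$slotName/publishingcredentials"
    }

    # The 'list' action returns the site-level (publishing profile) credentials.
    Invoke-AzureRmResourceAction -ResourceGroupName $resourceGroupName `
                                 -ResourceType $resourceType `
                                 -ResourceName $resourceName `
                                 -Action list `
                                 -ApiVersion 2015-08-01 `
                                 -Force
}
```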

Bear in mind that you need to be logged in to Azure in your PowerShell session before calling these cmdlets.

Getting the Kudu REST API Authorisation header via PowerShell

Once we have the credentials, we are able to get the Authorization header value. The instructions to construct the header are described here. I’ve created another PowerShell function, which relies on the previous one, to get the header value, as follows.
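A sketch of this second function, building on the one above:

```powershell
function Get-KuduApiAuthorisationHeaderValue($resourceGroupName, $webAppName, $slotName = $null) {
    # Get the site-level credentials and turn them into a Basic auth header value.
    $publishingCredentials = Get-AzureRmWebAppPublishingCredentials $resourceGroupName $webAppName $slotName
    $userName = $publishingCredentials.Properties.PublishingUserName
    $password = $publishingCredentials.Properties.PublishingPassword
    return ("Basic {0}" -f [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $userName, $password))))
}
```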

Calling the App Service VFS API

Once we have the Authorization header, we are ready to call the VFS API. As shown in the documentation, the VFS API has the following operations:

  • GET /api/vfs/{path}    (Gets a file at path)
  • GET /api/vfs/{path}/    (Lists files at directory specified by path)
  • PUT /api/vfs/{path}    (Puts a file at path)
  • PUT /api/vfs/{path}/    (Creates a directory at path)
  • DELETE /api/vfs/{path}    (Deletes the file at path)

So the URI to call the API would be something like:

  • GET https://{webAppName}.scm.azurewebsites.net/api/vfs/

To invoke the REST API operation via PowerShell we will use the Invoke-RestMethod cmdlet.
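For example, relying on the helper functions above, listing the files under wwwroot would look something like this sketch:

```powershell
# Get the Authorization header value using the functions defined earlier.
$authHeader = Get-KuduApiAuthorisationHeaderValue $resourceGroupName $webAppName

# The trailing slash makes this a directory listing operation.
$apiUrl = "https://$webAppName.scm.azurewebsites.net/api/vfs/site/wwwroot/"
Invoke-RestMethod -Uri $apiUrl -Headers @{ Authorization = $authHeader } -Method GET
```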

We have to bear in mind that when trying to overwrite or delete a file, the web server implements ETag behaviour to identify specific versions of files.

Uploading a File to an App Service

I have created the PowerShell function shown below, which uploads a local file to a path in the virtual file system. To call this function you need to provide the App Service name, the Kudu credentials (username and password), the local path of your file and the Kudu path. The function assumes that you want to upload the file under the wwwroot folder, but you can change that if needed.
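Here is a sketch of the upload function (the parameter names are my own):

```powershell
function Upload-FileToWebApp($webAppName, $kuduUserName, $kuduPassword, $localPath, $kuduPath) {
    # Build the Basic Authorization header from the site-level credentials.
    $base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $kuduUserName, $kuduPassword)))

    # The target file is assumed to live under site/wwwroot.
    $kuduApiUrl = "https://$webAppName.scm.azurewebsites.net/api/vfs/site/wwwroot/$kuduPath"

    # "If-Match" = "*" disables the ETag version check on the server side.
    Invoke-RestMethod -Uri $kuduApiUrl `
                      -Headers @{ Authorization = "Basic $base64Auth"; "If-Match" = "*" } `
                      -Method PUT `
                      -InFile $localPath `
                      -ContentType "multipart/form-data"
}
```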

As you can see in the script, we are adding the "If-Match" = "*" header to disable the ETag version check on the server side.

Downloading a File from an App Service

Similarly, I have created a function to download a file on an App Service to the local file system via PowerShell.
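A sketch of the download counterpart:

```powershell
function Download-FileFromWebApp($webAppName, $kuduUserName, $kuduPassword, $kuduPath, $localPath) {
    # Build the Basic Authorization header from the site-level credentials.
    $base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $kuduUserName, $kuduPassword)))

    $kuduApiUrl = "https://$webAppName.scm.azurewebsites.net/api/vfs/site/wwwroot/$kuduPath"

    # -OutFile writes the response body straight to the local file system.
    Invoke-RestMethod -Uri $kuduApiUrl `
                      -Headers @{ Authorization = "Basic $base64Auth" } `
                      -Method GET `
                      -OutFile $localPath
}
```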

Using the ZIP API

In addition to using the VFS API, we can also use the Kudu ZIP API, which allows us to upload zip files and have them expanded into folders, and to compress server folders as zip files and download them.

  • GET /api/zip/{path}    (Zip up and download the specified folder)
  • PUT /api/zip/{path}    (Upload a zip file which gets expanded into the specified folder)

You could create your own PowerShell functions to interact with the ZIP API based on what we have previously shown.
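For instance, a sketch to zip up a server folder and download it could look like this:

```powershell
function Download-FolderFromWebApp($webAppName, $kuduUserName, $kuduPassword, $kuduPath, $localZipPath) {
    $base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $kuduUserName, $kuduPassword)))

    # GET /api/zip/{path} zips up the server folder and downloads it.
    $kuduApiUrl = "https://$webAppName.scm.azurewebsites.net/api/zip/site/wwwroot/$kuduPath/"
    Invoke-RestMethod -Uri $kuduApiUrl `
                      -Headers @{ Authorization = "Basic $base64Auth" } `
                      -Method GET `
                      -OutFile $localZipPath
}
```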

Conclusion

As we have seen, in addition to the multiple deployment options we have for Azure App Services, we can also use the Kudu VFS API to interact with the App Service Virtual File System via HTTP. I have shared some functions for some of the provided operations. You could customise these functions or create your own based on your needs.

I hope this has been of help and feel free to add your comments or queries below. 🙂

Monitoring Azure WebJobs Health with Application Insights

Introduction

Azure WebJobs have been available for quite some time and have become very popular for running background tasks with programs or scripts. WebJobs are deployed as part of Azure App Services (Web Apps), which include their companion site, Kudu. Kudu provides a lot of features, including a REST API with operations for source code management (SCM), the virtual file system, deployments, accessing logs, and WebJob management as well. The Kudu WebJobs API provides different operations, including listing WebJobs, uploading a WebJob, and triggering it. One of the operations of this API allows us to get the status of a specific WebJob by name.

Another quite popular Azure service is Application Insights, which provides functionality to monitor and diagnose application issues and to analyse usage and performance as well. One of these features is web tests, which provide a way to monitor the availability and health of a web site.

In this blog post I will go through the required configuration on Application Insights to monitor the health of WebJobs using Application Insights web tests calling the Kudu WebJobs API.

Calling the Kudu WebJobs API.

For this exercise, it is worth getting familiar with the WebJobs API, particularly with the endpoint to get a WebJob status. Throughout this post, I will be working with a triggered WebJob scheduled with a CRON expression, but you can apply the same principles to a continuous WebJob. I will be using Postman to call this API.

To get a WebJob status, we need to call the corresponding Kudu WebJob API endpoint. In the case of triggered WebJobs, the endpoint looks something like:

https://{webapp-name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob-name}/

Before calling the endpoint, we need to add the Authorization header to the GET request. To create the header value, we need to use the corresponding Kudu API credentials, as explained here. Considering we want to monitor the status of a WebJob under a particular web site, I prefer to use site-level credentials (or publishing profile credentials) instead of the user-level ones.

Getting the Publishing Profile Credentials from the Azure Portal

You can get the publishing profile credentials by downloading the publishing profile from the portal. Once downloaded, the XML document will contain the site-level credentials.

Getting the Publishing Profile Credentials via PowerShell

We can also get the site-level credentials via PowerShell. I've created a PowerShell function which returns the publishing credentials of an Azure Web App or a Deployment Slot; it is the same function shown in the previous post on the VFS API.

Bear in mind that you need to be logged in to Azure in your PowerShell session before calling these cmdlets.

Getting the Kudu REST API Authorisation header via PowerShell

Once we have the credentials, we are able to get the Authorization header value. The instructions to construct the header are described here. I've created another PowerShell function, which relies on the previous one, to get the header value; again, it is the same function shown in the previous post.

Once we have the header value, we can call the API. Let's call it using Postman.

You should be getting a response similar to the one shown below:
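The following is an illustrative, abridged example (the exact fields and values will vary per WebJob):

```json
{
  "latest_run": {
    "id": "201703100630294510",
    "status": "success",
    "start_time": "2017-03-10T06:30:29.451Z",
    "end_time": "2017-03-10T06:30:39.923Z",
    "duration": "00:00:10.472",
    "error_url": null
  },
  "name": "{webjob-name}",
  "run_command": "run.cmd",
  "type": "triggered"
}
```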

Note that for this triggered WebJob, there are status and duration fields.

Now that we are familiar with the response, we can start designing an App Insights web test to monitor the health of our WebJob.

Configuring an App Insights Web Test to Monitor the Health of an Azure WebJob

You can find detailed documentation here on how to create web tests to monitor the availability and responsiveness of web endpoints. In the following sections of this post, I will cover how to create an App Insights web test to monitor the health of a WebJob.

As we saw above, to call the WebJobs API we need to add an Authorization Header to the GET request. And once we get the API response, to check the status of the WebJob, we would need to interpret the response in JSON format.

To create the web test on App Insights to monitor a WebJob, I will first create a simple web test via the Azure Portal, and enrich it later.

Creating a Web Test on Application Insights.

I will create a basic web test with the following configuration. You should change it to the values which suit your scenario:

  • Test type: URL ping test
  • URL: My WebJob Rest API, e.g. https://{webapp-name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob-name}/
  • Test frequency: 5 minutes
  • Test locations: SG Singapore and AU Sydney
  • Success criteria:
    • Test timeout: 120 seconds
    • HTTP Response: (checked)
    • Status code must equal: 200
    • Content match: (checked)
    • Content must contain: "status":"success"
  • Alerts
    • Status: Enabled
    • Alert threshold location: 1
    • Alert failure time window: 5 minutes
    • Send alert emails to these email addresses: <my email address>

You could also keep email alerts disabled or configure them later.

If you enable the web test as is, you will see that it starts failing, because we are not yet adding the required Authorization header to the GET request.

To add headers to the test, you could record web tests in Visual Studio Enterprise or Ultimate. This is explained in detail in the Azure documentation. Additionally, in these multi-step web tests you can add more than one validation rule.

Knowing that not everybody has access to a VS Enterprise or Ultimate license, I will explain here how to create a web test using the corresponding XML format. The first step is to extract the web test XML definition from the test manually created on the portal.

Extracting the Web Test XML Definition from a Test Manually Created on the Portal.

Once we have created the web test manually on the portal, to get its XML definition, we have to open the Resource Explorer at https://resources.azure.com/ and navigate to subscriptions/<subscription-guid>/resourceGroups/<resourcegroup>/providers/microsoft.insights/webtests/<webtest>-<app-insight> until we reach the definition of the web test we have just created.

Once there, you need to find the "WebTest" member.
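Its value embeds the whole web test XML definition as an escaped string, something similar to this abridged sketch:

```json
"WebTest": "<WebTest Name=\"webjob-health-test\" Enabled=\"True\" Timeout=\"120\" ...><Items><Request Method=\"GET\" Version=\"1.1\" Url=\"https://{webapp-name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob-name}/\" ... /></Items>...</WebTest>"
```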

Now, we need to extract the XML document by removing the escape characters from the double quotes.
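The result is something like the following (an abridged sketch; the portal generates more attributes than shown here):

```xml
<WebTest Name="webjob-health-test" Enabled="True" Timeout="120"
         xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010"
         PreAuthenticate="True" Proxy="default" StopOnError="False">
  <Items>
    <Request Method="GET" Version="1.1"
             Url="https://{webapp-name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob-name}/"
             ThinkTime="0" Timeout="120" Encoding="utf-8"
             ExpectedHttpStatusCode="200" FollowRedirects="True" RecordResult="True" Cache="False" />
  </Items>
  <ValidationRules>
    <ValidationRule Classname="Microsoft.VisualStudio.TestTools.WebTesting.Rules.ValidationRuleFindText, Microsoft.VisualStudio.QualityTools.WebTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" DisplayName="Find Text">
      <RuleParameters>
        <RuleParameter Name="FindText" Value="&quot;status&quot;:&quot;success&quot;" />
        <RuleParameter Name="IgnoreCase" Value="True" />
        <RuleParameter Name="UseRegularExpression" Value="False" />
        <RuleParameter Name="PassIfTextFound" Value="True" />
      </RuleParameters>
    </ValidationRule>
  </ValidationRules>
</WebTest>
```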

which is the XML definition of the web test we created manually on the portal.

Adding a Header to the Application Insights Web Test Request by updating the Web Test XML definition.

Now we should be ready to edit our web test XML definition to add the Authorization header.

To do this, we just need to add a Headers child element to the Request element, similar to the one shown below. You would need to get the Base64-encoded Authorization header value, similarly to how we did it previously when calling the API via Postman.
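A sketch of the updated Request element (the Base64 value is a placeholder):

```xml
<Request Method="GET" Version="1.1"
         Url="https://{webapp-name}.scm.azurewebsites.net/api/triggeredwebjobs/{webjob-name}/"
         ThinkTime="0" Timeout="120" Encoding="utf-8" ExpectedHttpStatusCode="200">
  <Headers>
    <Header Name="Authorization" Value="Basic [BASE64_ENCODED_CREDENTIALS]" />
  </Headers>
</Request>
```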

Extending the Functionality of the Web Test.

When we created the web test on the portal, we said that we wanted the status to be "success"; however, we might want to add "running" as another valid value. Additionally, in my case, I wanted to check that the duration is less than 10 minutes. For this, I have updated the validation rules to use regular expressions and added a second rule.
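The resulting ValidationRules section looked along these lines (a sketch; the second rule's regular expression passes only when the duration is under 10 minutes):

```xml
<ValidationRules>
  <!-- Accept either "success" or "running" as a healthy status. -->
  <ValidationRule Classname="Microsoft.VisualStudio.TestTools.WebTesting.Rules.ValidationRuleFindText, Microsoft.VisualStudio.QualityTools.WebTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" DisplayName="Find Text">
    <RuleParameters>
      <RuleParameter Name="FindText" Value="&quot;status&quot;:&quot;(success|running)&quot;" />
      <RuleParameter Name="IgnoreCase" Value="True" />
      <RuleParameter Name="UseRegularExpression" Value="True" />
      <RuleParameter Name="PassIfTextFound" Value="True" />
    </RuleParameters>
  </ValidationRule>
  <!-- A duration under 10 minutes looks like "duration":"00:0m:ss..." -->
  <ValidationRule Classname="Microsoft.VisualStudio.TestTools.WebTesting.Rules.ValidationRuleFindText, Microsoft.VisualStudio.QualityTools.WebTestFramework, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" DisplayName="Find Text">
    <RuleParameters>
      <RuleParameter Name="FindText" Value="&quot;duration&quot;:&quot;00:0[0-9]:" />
      <RuleParameter Name="IgnoreCase" Value="True" />
      <RuleParameter Name="UseRegularExpression" Value="True" />
      <RuleParameter Name="PassIfTextFound" Value="True" />
    </RuleParameters>
  </ValidationRule>
</ValidationRules>
```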

You could play around with the web test XML definition and update or extend it according to your needs. In case you are interested in the capabilities of web tests, the documentation is here.

Once our web test XML definition is ready, we save it with a “.webtest” extension.

Uploading the (Multi-Step) Web Test to Application Insights

Having the web test XML definition ready, we can update our Application Insights web test with it. For this, on the portal, we open the Edit Test blade and:

  • Change the Test Type to: Multi-step test, and
  • Upload the web test xml definition file we just saved with the “.webtest” extension.

This will update the web test, and now with the proper Authorization header and the added validation rules, we can monitor the health of our triggered WebJob.

With Application Insights web tests, we can monitor the WebJob via the dashboard, or configure alerts to be sent via email.

Summary

Through this post I have shown how to monitor the health of an Azure WebJob using Application Insights web tests. But along the way, I also showed some tricks which I hope can be useful in other scenarios as well, including:

  1. How to call the Azure WebJobs API via Postman, including how to get the Kudu API Authorization header via PowerShell.
  2. How to manually configure App Insights web tests,
  3. How to get the XML definition of a manually created web test using the Azure Resource Explorer,
  4. How to update the web test XML definition to add a request header and expand the validation rules, all without requiring Visual Studio Enterprise or Ultimate, and
  5. How to update the Application Insights web test by uploading the updated multi-step web test file.

Thanks for reading, and feel free to add your comments or queries below. 🙂 

Azure WebJobs with .NET Core RC2

With .NET Core RC2, publishing an Azure WebJob is a little bit different from the traditional(?) way; it's even different from what RC1 does. In this post, we'll walk through how to publish an Azure WebJob using a .NET Core RC2 console application.

Sample code can be found at https://github.com/devkimchi/.NET-Core-for-Azure-WebJob-Sample.

Sample Hello World Console Application

OK. First things first. Let's create a console app using .NET Core RC2. Take the latest copy from the repository above and build it on your local machine. Then run the console app.
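With the .NET Core RC2 CLI installed, this is just (from the project directory):

```powershell
dotnet restore
dotnet run
```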

Then you can see the hello-world output on the console.

Now, we're ready to publish an Azure WebJob with this.

Preparation

In order to publish this console app, we need to do some more work beforehand. Azure WebJobs looks for run.cmd to run the console application as a WebJob. Therefore, we need to create a file named run.cmd.
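Its content is simply the command that launches the published app. Assuming the entry assembly is named after the project ([PROJECT_NAME] is a placeholder), it would be:

```
dotnet [PROJECT_NAME].dll
```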

Wait. Can Azure Web Apps run .NET Core RC2? Let's double-check. When you go to your Azure Web App's Kudu console, you can confirm that .NET Core RC2 is already up and running.

We now don't have to worry about that. Let's keep preparing for the publish. When we publish this console app, the run.cmd file doesn't automatically get copied to the published directory. We need to include it by modifying project.json.
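The relevant excerpt would look like this, using RC2's publishOptions section:

```json
{
  "publishOptions": {
    "include": [
      "run.cmd"
    ]
  }
}
```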

With this publish option, the run.cmd file will be included in the published directory, alongside the application binaries and their dependencies.

We then need to create a .zip file containing all of the published files. Once that's done, we're ready to deploy to Azure.

We assume this WebJob is a triggered one, run via webhook. If it is a scheduled one, we need to create a settings.job file containing the CRON schedule and include it in the publish as well.

Deployment

Using MSDeploy.exe is the easiest way; however, it needs too many parameters. Therefore, we're going to use WAWSDeploy. More details about WAWSDeploy can be found here. With this, we only need the zip file we prepared and a publish settings file, which can be downloaded from the Web App blade in the Azure Portal.

We're all set! Now, try the following command to deploy the WebJob.
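A sketch of the command, assuming WAWSDeploy's /t switch for the target directory (the file names here are placeholders):

```powershell
.\WAWSDeploy.exe .\[WEBJOB_ZIP_FILE].zip .\[WEBAPP_NAME].PublishSettings /t app_data\jobs\triggered\[WEBJOB_NAME]
```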

The target directory will be app_data\jobs\triggered\[WEBJOB_NAME]. If this WebJob is a continuously running one, replace triggered with continuous.

Now, we've deployed the WebJob to Azure!

Execution on Azure Portal

Once the WebJob is deployed to Azure, we can visually confirm on the portal that it has been deployed.

Run this WebJob and see what the log looks like.

The console app has successfully run as a WebJob!

Entity Framework 7 Data Migration through KUDU

From a DevOps perspective, everything needs to be automated with regard to application setup and deployment. There's no exception for database migration: if a database schema change occurs, it should be automatically applied before/after the application deployment. Unlike Entity Framework 6.x, which uses PowerShell cmdlets for database migration, Entity Framework 7 (EF7) uses DNX.

Applying Database Migration with EF7

In EF7, applying a database change can be done by running the following command.
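From the web application's project directory (assuming an ef command is registered in the project.json commands section), it is:

```powershell
dnx ef database update
```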

If your DbContext is located in another project and your web application has a reference to it, then you can run the following command.
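Something like this sketch, where the DbContext project path is a placeholder:

```powershell
dnx -p ..\[DBCONTEXT_PROJECT] ef database update
```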

By running the command above within your build/deployment pipeline, your database change is easily applied to the existing database. Connection strings are defined in appsettings.json in your ASP.NET Core application.

Visit https://docs.efproject.net for more details.

In most cases, there's no issue accessing Azure SQL Database from your build/deployment server, as long as Azure SQL Database has a proper firewall setup. But what if your enterprise firewall doesn't allow connections to Azure SQL Database, for example by blocking TCP port 1433? Then we can't run this command from our build/deployment server.

REST API in KUDU

Kudu is basically a backend service engine for deployment, tied to your Azure website. If you are running any Azure App Service, your Kudu instance is accessible via https://your-azure-website.scm.azurewebsites.net. It provides a REST API for website maintenance, and one of its endpoints is command. Therefore, we can write a script, say db-migration.cmd, deploy it at the same time as the application, and run it through this REST API. The db-migration.cmd might look like this.
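Here is a hypothetical sketch; the exact path depends on how the DNX application is laid out once deployed:

```
@echo off
REM Run the EF7 migration from the deployed project's folder.
cd /d %HOME%\site\wwwroot\approot\src\[PROJECT_NAME]
call dnx ef database update
```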

So, the application is ready for database migration. Let's write a PowerShell script to run the command through the Kudu REST API. Make sure that we are using Azure Service Management (ASM) cmdlets.
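Below is a sketch. It assumes the ASM module's Get-AzureWebsite cmdlet, which exposes the site-level publishing credentials, and Kudu's command endpoint, which accepts a JSON body with command and dir properties:

```powershell
# Assumes Add-AzureAccount has been run and the right subscription selected.
$websiteName = "[WEBSITE_NAME]"
$website = Get-AzureWebsite -Name $websiteName

# Build the Basic Authorization header from the site-level credentials.
$base64Auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $website.PublishingUsername, $website.PublishingPassword)))

# Kudu's command API runs the given command in the given directory.
$apiUrl = "https://$websiteName.scm.azurewebsites.net/api/command"
$body = @{
    command = "db-migration.cmd"
    dir     = "site\wwwroot"
} | ConvertTo-Json

$result = Invoke-RestMethod -Uri $apiUrl `
                            -Headers @{ Authorization = "Basic $base64Auth" } `
                            -Method POST `
                            -Body $body `
                            -ContentType "application/json"

# An exit code of 0 means the migration has succeeded.
$result.ExitCode
```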

NOTE: You should log in to ASM with the appropriate subscription first.

NOTE: The dir property is the directory where the actual command is run, expressed as a path relative to %HOME% in Azure App Service.

Running the PowerShell script above will complete the database migration through Kudu. Make sure that the $result object has an exit code of 0 by examining $result.ExitCode. If the exit code is anything other than 0, the database migration has failed.

So far, we have briefly looked at Kudu for Azure SQL Database migration. Kudu actually has many more useful functions, such as monitoring, so it is worth taking a look.