Testing Precompiled Azure Functions

Azure Functions recently added a feature that allows functions to run from precompiled assemblies. This gives us great confidence with regard to unit testing. In this post, we walk through how to unit test these functions with ease, just like the tests we write every day.

The sample code used in this post can be found here.

Function without Dependency

We won’t dig into precompiled functions too much here, as they are covered in the documentation. Let’s have a quick look at the HTTP trigger function code:
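
The original snippet isn’t reproduced here, so below is a minimal sketch of what such a precompiled HTTP trigger looks like (the class and route names are illustrative):

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class GetGreetingsHttpTrigger
{
    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req)
    {
        // Build a simple response; no binding-specific logic involved.
        return await Task.FromResult(req.CreateResponse(HttpStatusCode.OK, new { Message = "Hello World" }));
    }
}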

Nothing special. Now let’s write test code for this function using xUnit and FluentAssertions.
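
A sketch of the corresponding test, assuming the GetGreetingsHttpTrigger class above; note that CreateResponse() needs an HttpConfiguration attached to the request message:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using FluentAssertions;
using Xunit;

public class GetGreetingsHttpTriggerTests
{
    [Fact]
    public async Task Given_Request_Run_Should_Return_OK()
    {
        var req = new HttpRequestMessage(HttpMethod.Get, "http://localhost/api/greetings");
        req.SetConfiguration(new HttpConfiguration()); // required for CreateResponse()

        var res = await GetGreetingsHttpTrigger.Run(req);

        res.StatusCode.Should().Be(HttpStatusCode.OK);
    }
}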

How does it look? It’s the same unit test code we write every day. Let’s move on.

Function with Dependency

As I wrote in Managing Dependencies in Azure Functions the other day, dependency management is a bit tricky for Azure Functions due to its static nature. Therefore, we introduce the Service Locator Pattern for dependency management (or injection). Here’s the sample function code:
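
The sample code isn’t shown here; a minimal sketch, assuming a hypothetical IMyService dependency, might look like this:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public interface IMyService
{
    Task<string> GetValueAsync();
}

public static class GetValueHttpTrigger
{
    // Exposed as a static property so tests can swap dependencies in.
    public static ServiceLocator Locator { get; } = new ServiceLocator();

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req)
    {
        var service = Locator.GetInstance<IMyService>();
        var value = await service.GetValueAsync();
        return req.CreateResponse(HttpStatusCode.OK, value);
    }
}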

As we can see in the code above, the value is retrieved from the service locator instance. Of course, this is just a simple implementation of the service locator pattern (if we need a more sophisticated one, we should consider an IoC container library like Autofac). And here’s the poor man’s service locator:
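
A poor man’s service locator can be as simple as a dictionary keyed by service type; this sketch is one way of doing it:

using System;
using System.Collections.Generic;

public class ServiceLocator
{
    private readonly Dictionary<Type, object> _instances = new Dictionary<Type, object>();

    // Register an instance against its service type.
    public void Register<T>(T instance)
    {
        this._instances[typeof(T)] = instance;
    }

    // Resolve the instance registered against the service type.
    public T GetInstance<T>()
    {
        return (T)this._instances[typeof(T)];
    }
}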

Let’s see the test code for the function with dependencies. With the service locator, we can inject mocked objects for unit testing, which is convenient for developers. For mocking, we use Moq in the following test code.
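
A sketch of such a test against the function above, using Moq to fake the hypothetical IMyService:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using FluentAssertions;
using Moq;
using Xunit;

public class GetValueHttpTriggerTests
{
    [Fact]
    public async Task Given_MockedService_Run_Should_Return_MockedValue()
    {
        // Inject the mocked dependency through the service locator.
        var service = new Mock<IMyService>();
        service.Setup(p => p.GetValueAsync()).ReturnsAsync("lorem ipsum");
        GetValueHttpTrigger.Locator.Register<IMyService>(service.Object);

        var req = new HttpRequestMessage(HttpMethod.Get, "http://localhost/api/value");
        req.SetConfiguration(new HttpConfiguration());

        var res = await GetValueHttpTrigger.Run(req);

        res.StatusCode.Should().Be(HttpStatusCode.OK);
        (await res.Content.ReadAsAsync<string>()).Should().Be("lorem ipsum");
    }
}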

We create a mocked instance and inject it into the service locator. The injected value (or instance) is then consumed within the function. How different is this from everyday testing? There’s no difference at all. In other words, implementing a service locator gives us the same development experience on Azure Functions, from a testing point of view.

I wrote another article about testing a few months ago, using ScriptCs. That used to be the approach, back when Azure Functions didn’t support precompiled assemblies. Now that precompiled functions are supported, I hope this post proves useful for designing functions with better testability.


How to create an Azure Function App to Simultaneously Start|Stop all Virtual Machines in a Resource Group

Just on a year ago I wrote this blog post that detailed a method to “Simultaneously Start|Stop all Azure Resource Manager Virtual Machines in a Resource Group”. It’s a simple script that I use quite a lot and I’ve received a lot of positive feedback on it.

One year on though and there are a few enhancements I’ve been wanting to make to it. Namely;

  • host the script in an environment that is in a known state. Often I’m authenticated to different Azure Subscriptions: my personal, my employer’s and my customers’.
  • prioritize the order the virtual machines startup|shutdown
  • allow for a delay between starting each VM (to account for environments where the VMs have roles with cross dependencies; e.g. a Domain Controller, a SQL Server, Application Servers). You want the DC to be up and running before the SQL Server, and so forth
  • and if I do all of those, the most important;
    • secure it so not just anyone can start|stop my environments at their whim

Overview

This blog post is the first, and implements the first part: hosting the script in an environment that is in a known state, aka implementing it as an Azure Function App. This won’t be a perfect implementation as you will see, but it will set the foundation for the other enhancements. Subsequent posts (as I make time to develop the enhancements) will add the new functionality. This post covers;

  • Creating the Azure Function App
  • Creating the foundation for automating management of Virtual Machines in Azure using Azure Function Apps
  • Starting | Stopping all Virtual Machines in an Azure Resource Group

Create a New Azure Function App

First up we are going to need a Function App. Through your Azure Resource Manager Portal create a new Function App.

For mine I’ve created a new Resource Group and a new Storage Account as this solution will flesh out over time and I’d like to keep everything organised.

Now that we have the Azure App Plan set up, create a new PowerShell HTTP Trigger Function App.

Give it a name and hit Create.

 

Create Deployment Credentials

In order to get some of the dependencies into the Azure Function we need to create deployment credentials so we can upload them. Head to the Function App Settings and choose Go to App Service Settings.

Create a login and give it a password. Record the FTP/Deployment username and the FTP hostname along with your password as you’ll need this in the next step.

Upload our PowerShell Modules and Dependencies

Just as my original PowerShell script did I’m using the brilliant Invoke Parallel Powershell Script from Rambling Cookie Monster. Download it from that link and save it to your local machine.

Connect to your Azure Function App with your favourite FTP client, using the credentials you created earlier. I’m using WinSCP. Create a new sub-directory under /site/wwwroot/ named “bin” as shown below.

Upload the Invoke-Parallel.ps1 file from wherever you saved it on your local machine to the bin folder you just created in the Function App.

We are also going to need the AzureRM Powershell Modules. Download those via Powershell to your local machine (e.g. Save-Module -Name AzureRM -Path c:\temp\azurerm). There are a lot of modules obviously and you’re not going to need them all. At a minimum for this solution you’ll need;

  • AzureRM
  • AzureRM.profile
  • AzureRM.Compute

Upload them under the bin directory also as shown below.

Test that our script dependencies are accessible

Now that we have our dependent modules uploaded, let’s test that we can load and utilise them. Below are the commands to load the Invoke-Parallel script and test that it has loaded by getting its help.

# Load the Invoke-Parallel Powershell Script
. "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\Invoke-Parallel.ps1"

# See if it is loaded by getting some output
Get-Help Invoke-Parallel -Full

Put those lines into the code section, hit Save and Run, and select Logs to see the output. If successful you’ll see the help. If you don’t, you probably have a problem with the path to where you put the Invoke-Parallel script. You can use the Kudu Console from the Function App Settings to get a command line and verify your path.

Mine worked successfully. Now to test that our AzureRM module loads. Update the Function to load the AzureRM Profile PSM as per below and check you have your path correct.

# Import the AzureRM Powershell Module
import-module 'D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1'
Get-Help AzureRM

Success. Fantastic.

Create an Azure Service Principal

In order to automate the access and control of the Azure Virtual Machines we are going to need to connect to Azure using a Service Principal with the necessary permissions to manage the Virtual Machines.

The following script does just that. You only need to run this once, as part of the setup for the Azure Function, so we have an account we can use for our automation tasks. Update line 6 for your naming and the password you want to use. I’m assigning the Service Principal the “DevTest Labs User” Azure Role (line 17) as that allows the ability to manage the Virtual Machines. You can find a list of the available roles here.
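
The script itself isn’t reproduced here, but a minimal sketch of the approach looks like the following (names and password are placeholders; the line numbers mentioned above refer to the original script):

# Create the AAD Application and Service Principal, then assign the Role
$password = "PlaceholderPassword!"
$app = New-AzureRmADApplication -DisplayName "RGStartStopVM" -HomePage "https://rgstartstopvm" -IdentifierUris "https://rgstartstopvm" -Password $password
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId
Start-Sleep -Seconds 15   # give the Service Principal time to replicate
New-AzureRmRoleAssignment -RoleDefinitionName "DevTest Labs User" -ServicePrincipalName $app.ApplicationId

# The key outputs to record
$app.ApplicationId
(Get-AzureRmSubscription).TenantId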

Take note of the key outputs from this script. You will need to note the;

  • ApplicationID
  • TenantID

I’m also securing the credential that has the permissions to Start|Stop the Virtual Machines using the example detailed here in Tao’s post.

For reference here is an example to generate the keyfile. Update your path in line 5 if required and make sure the password you supply in line 18 matches the password you supplied in the script (line 6) when creating the Service Principal.
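
The gist isn’t reproduced here either; a minimal sketch of the key file approach is below (the path is a placeholder, and again the line numbers above refer to the original script):

# Generate an AES key file
$keyPath = "C:\temp\PassEncryptKey.key"
$aesKey = New-Object Byte[] 32
[Security.Cryptography.RNGCryptoServiceProvider]::Create().GetBytes($aesKey)
Set-Content -Path $keyPath -Value $aesKey -Encoding Byte

# Encrypt the Service Principal password with the key and output the encrypted string
$securePwd = Read-Host "Enter the Service Principal password" -AsSecureString
$securePwd | ConvertFrom-SecureString -Key (Get-Content -Path $keyPath -Encoding Byte)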

Take note of the password encryption string from the end of the script to pair with the ApplicationID and TenantID from the previous steps. You’ll need these shortly in Application Settings.

Additional Dependencies

I created another sub-directory under the function app site named ‘keys’ again using WinSCP. Upload the passkey file created above into that directory.

Whilst we’re there I also created a “logs” directory for any erroneous output (aka logfiles created when you don’t specify them) from the invoke-parallel script.

Application Variables

Using the identity information you have created and generated, we will populate Application Settings on the Function App that we can then leverage in our Function script. Go to your Azure Function App, Application Settings and add an application setting (with the respective values you have gathered in the previous steps) for;

  • AzureAutomationPWD
  • AzureAutomationAppID
  • AzureAutomationTennatID (bad speed typing there)

Don’t forget to click Save up the top of the Application Settings screen.

 

The Function App Script

Below is the sample script for your testing purposes. If you plan to use something similar in a production environment you’ll want to add more logging and error handling.
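
The full script isn’t reproduced here; this sketch shows the shape of it given the setup above (the keys path and module versions are assumptions from my environment):

# Read the incoming request
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$mode = $requestBody.mode
$resourceGroup = $requestBody.resourcegroup

# Load the script and module dependencies uploaded earlier
. "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\Invoke-Parallel.ps1"
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.profile\2.4.0\AzureRM.Profile.psm1"
Import-Module "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\bin\AzureRM.Compute\2.4.0\AzureRM.Compute.psm1"

# Rebuild the credential from the Application Settings and the uploaded key file
$keyBytes = Get-Content "D:\home\site\wwwroot\RG-Start-Stop-VirtualMachines\keys\PassEncryptKey.key" -Encoding Byte
$securePwd = $env:AzureAutomationPWD | ConvertTo-SecureString -Key $keyBytes
$cred = New-Object System.Management.Automation.PSCredential ($env:AzureAutomationAppID, $securePwd)
Add-AzureRmAccount -ServicePrincipal -Credential $cred -TenantId $env:AzureAutomationTennatID | Out-Null   # setting name typo is as created above

# Start or stop the VMs in parallel, with the 30 second runspace timeout
Get-AzureRmVM -ResourceGroupName $resourceGroup | Invoke-Parallel -RunspaceTimeout 30 -ScriptBlock {
    if ($Using:mode -eq "start") { Start-AzureRmVM -ResourceGroupName $Using:resourceGroup -Name $_.Name }
    else { Stop-AzureRmVM -ResourceGroupName $Using:resourceGroup -Name $_.Name -Force }
}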

Testing the Function

Select the Test option from the right-hand side pane and update the request body with what the Function takes (mode and resourcegroup) as below. Select Run and watch the logs. You will need to select Expand to get more screen real estate for them.
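
For example, a request body along these lines (the resource group name is a placeholder):

{
  "mode": "start",
  "resourcegroup": "MyResourceGroup"
}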

You will see the VMs enumerate, then the script starting them all up. My script has a 30 second timeout for the Invoke-Parallel runspace, as the VMs will take longer than 30 seconds to start up. And you pay for use, so we want to keep this lean. Increase the timeout if you have more VMs, or latency that doesn’t see all your VMs’ states transitioning.

Checking in the Azure Portal, I can see my VMs all starting up (too fast on the screenshot for the spfarm-mim host).

 

Sample Remote PowerShell Invoke Script

Below is a sample PowerShell script that remotely calls the Azure Function, providing the info the Function takes (mode and resourcegroup), the same as we did in the Test Request Body in the Azure Function Portal. This time to stop the VMs.
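
The script isn’t shown here; a minimal sketch using Invoke-RestMethod, with a placeholder URL and Function Key, would be:

# Call the Function remotely to stop the VMs in the Resource Group
$uri = "https://myfunctionapp.azurewebsites.net/api/RG-Start-Stop-VirtualMachines?code=<your function key>"
$body = @{ mode = "stop"; resourcegroup = "MyResourceGroup" } | ConvertTo-Json
Invoke-RestMethod -Uri $uri -Method Post -Body $body -ContentType "application/json"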

Looking in the Azure Portal, we can see all the VMs shutting down.

 

Summary

A foundational implementation of an Azure Function App to perform orchestration of Azure Virtual Machines.

The Function App is rudimentary in that the script exits (per the runspace timeout described above) after 30 seconds, which is before the VMs have fully transitioned after starting|stopping. The early exit is acceptable because the Function App would time out after 5 minutes anyway.

Now to work out the enhancements to it.

Finally, yes I have renewed/changed the Function Key so no-one else can initiate my Function 🙂

Follow Darren Robinson on Twitter

Is Azure Functions over Web API Beneficial?

Whenever I meet clients and give a talk about Azure Functions, they are immediately interested in replacing their existing Web API features with Azure Functions. In this post, I’d like to discuss:

  • Can Azure Functions replace Web API?
  • Is it worth doing?

It would be a good idea to have a read through this article, Serverless Architectures, before starting.

HTTP Trigger Function == Web API Action

One of the characteristics of Serverless Architecture is being “event-driven”. In other words, all functions written in Azure Functions are triggered by events, and those events of course include HTTP requests. From this HTTP request point of view, an HTTP trigger function and a Web API action work exactly the same way. Let’s compare the two:
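
The original snippets aren’t shown here; minimal sketches of each make the point (names are illustrative):

// Web API action
public class GreetingsController : ApiController
{
    [HttpGet]
    [Route("api/greetings")]
    public IHttpActionResult Get()
    {
        return this.Ok("Hello World");
    }
}

// Azure Functions HTTP trigger
public static class GreetingsHttpTrigger
{
    public static HttpResponseMessage Run(HttpRequestMessage req)
    {
        return req.CreateResponse(HttpStatusCode.OK, "Hello World");
    }
}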

How do they look? Pretty similar to each other. Both take an HTTP request, process it and return a response. Therefore, with minor modifications, it seems that Web API can be easily migrated to Azure Functions.

HTTP Trigger Function != Web API Action

However, life is not easy. There are some major differences we should know before migration:

Functions are always static methods

Even though Azure Functions is an extension of Azure WebJobs, each function has the static modifier by design, whereas Azure WebJobs can be written without it.

Actions of Web API, by contrast, don’t have the static modifier. This results in a significant architectural change during migration, especially around dependency injection (DI). We will touch on this later.

Functions always receive HttpRequestMessage instance as a parameter

Within the HTTP request/response pipeline, a Web API controller internally creates an HttpContext instance to handle data like headers, cookies, sessions, querystrings and request body (of course, querystrings and request body can be handled in different ways). The HttpContext instance works as an internal property, so any action can directly access it. As a result, each action only passes the necessary details as its parameters.

On the other hand, each function goes through a different HTTP request/response pipeline from Web API, which passes an HttpRequestMessage instance to the function as a parameter. The HttpRequestMessage instance only handles headers, querystrings and request body. It doesn’t look after cookies or sessions. This is the big difference between Web API and Azure Functions in terms of statelessness.

Functions define HTTP verbs and routes in function.json

In Web API, we put decorators like HttpGet, HttpPost, HttpPut, HttpPatch and HttpDelete on each action to declare which HTTP verbs map to which actions, combined with the Route decorator.

On the other hand, each function has a definition of HTTP verbs and routes on function.json. With this definition, different functions having the same route URI can handle requests based on HTTP verbs.
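
For example, a function.json along these lines binds a GET-only function to a greetings route (values are illustrative):

{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [ "get" ],
      "route": "greetings"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}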

Functions define base endpoint URI in host.json

Other than the host part of the URI, e.g. https://api.myservice.com, the base URI is usually defined at the controller level of Web API by adding the Route decorator. This is dead simple.

However, as there’s no controller in Azure Functions, it is defined in host.json. The default value is api, but we can remove or redefine it by modifying host.json.
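
For example, this host.json fragment changes the base endpoint from /api to /services:

{
  "http": {
    "routePrefix": "services"
  }
}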

While function.json can be managed at the function level through the GUI or an editor, unfortunately it’s not possible to edit host.json directly in the Function App. There’s a workaround using the Azure App Service Editor to modify host.json, by the way.

Functions should consider service locator pattern for dependency management

There are many good IoC container libraries for Web API to manage dependencies. However, as discussed in my previous post, Managing Dependencies in Azure Functions, the Service Locator Pattern should be considered for DI in Azure Functions, and it is actually the only way to deal with dependencies for now. This is because every Azure Function has the static modifier, which prevents us from doing DI the same way as in Web API.

We know there are differing opinions on the service locator pattern for Azure Functions out there, but that is beyond this topic, so we will discuss it in another post.

Is Azure Functions over Web API Beneficial?

So far, we have discussed what is the same and what is different between Web API and Azure Functions HTTP Triggers. Back to the initial question: is it really worth migrating Web API to Azure Functions? Which of the situations below does yours fall under?

  • My Web API is designed for a microservices architecture: Then it’s good to go; migrate to Azure Functions.
  • My Web API takes a long time to respond: Then consider Azure Functions running on an empty instance in an App Service Plan, as that costs nothing more. A Consumption Plan (or Dynamic Service Plan) would cost too much in this case.
  • My Web API is refactored to use queues: Then calculate the price carefully: not only the price of Azure Functions but also the prices of Azure Service Bus Queue/Topic and Azure Storage Queue. In addition, check the number of executions, as each Web API call refactored this way becomes one HTTP Trigger function plus at least one Queue Trigger function (at least two executions in total). Based on the calculations, we can make a decision to stay or move.
  • My Web API needs a significant amount of effort to refactor: Then it’s better to stay until it’s restructured and suitable for a microservices architecture.
  • My Web API is written in ASP.NET Core: Then stay there; don’t even think about migration until Azure Functions supports ASP.NET Core.

To sum up, unless your Web API requires a significant amount of refactoring or is written in ASP.NET Core, it surely is worth considering migrating to Azure Functions. It is an easier to use and more cost-effective solution for your Web API.

Debugging Azure Functions in Our Local Box

Because of the nature of Azure Functions and Serverless Architecture, it’s a bit tricky to run functions on a local machine for debugging purposes.

There is an approach related to this issue in the post Testing Azure Functions in Emulated Environment with ScriptCs. According to that article, we can use ScriptCs for local unit testing. However, the question of debugging still remains, because testing and debugging are different stories. Fortunately, Microsoft has recently released Visual Studio Tools for Azure Functions. It’s still at a preview stage, but worth having a look. In this post, I’m going to walk through how to debug Azure Functions within Visual Studio.

Azure Functions Project & Templates

After we install the tooling, we can create an Azure Functions project.

That gives us the same development experience. Pretty straightforward. Once we create the new project, we find nothing but a couple of .json files: appsettings.json and host.json. The appsettings.json file is only used for our local development, not for production, to hook up the actual Azure Functions in the cloud. We are going to touch on this later in this post.

Now let’s create a function in C#. Right-click on the project and add a new function.

Then we can see a list of templates to start with. We select the HttpTrigger function in C#.

Now we have a fresh new function.

As we can see, we have a couple of other .json files for settings: function.json defines input and output bindings, and project.json defines the list of NuGet packages to import, the same as .NET Core projects do.

That’s all we need. How do we debug the function then? Let’s move on.

Debugging Functions – HTTP Trigger

Open the run.csx file. Set a break point wherever we want.

Now, it’s time for debugging! Just hit the F5 key. If we haven’t installed the Azure Functions CLI, we will be asked to install it.

We can manually install the CLI through npm by typing:
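
(the package name below is the one the preview tooling used at the time of writing)

npm install -g azure-functions-cli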

Once the CLI is installed, a command prompt console opens. The console shows a few pieces of useful information:

  • Functions in debugging mode only listen on port 7071. If an application running locally has already taken that port, we’re in trouble.
  • The endpoint of the function is always http://localhost:7071/api/Function_Name, which is the same format as the actual Azure Functions in the cloud. We can’t change it.

Now it’s waiting for requests to come in. As it’s basically an HTTP request, we can send it through our web browser, Postman or even curl.
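
For example, with curl (the function name below is whatever we created above):

curl "http://localhost:7071/api/HttpTriggerCSharp?name=AzureFunctions"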

When any of the requests above is executed, it hits the breakpoint within Visual Studio.

And the CLI console prints out the log like:

Super cool, isn’t it? Now let’s do a queue trigger function.

Debugging Functions – Queue Trigger

Let’s create another function called QueueTriggerCSharp. Unlike HTTP-triggered functions, this doesn’t have an endpoint URL (at least not a publicly exposed one), as it relies on Azure Storage Queue (or Azure Service Bus Queue). We have the Azure Storage Emulator that runs on our dev machine. With this emulator, we can debug our queue-triggered functions.

In order to run this function, we need to set up both appsettings.json and function.json.

Here’s the appsettings.json file. We simply assign the value UseDevelopmentStorage=true to AzureWebJobsStorage for now. Then, open function.json and assign AzureWebJobsStorage to the connection key.
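
A sketch of the two files, with an assumed queue name:

appsettings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true"
  }
}

function.json:

{
  "bindings": [
    {
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "myqueue-items",
      "connection": "AzureWebJobsStorage"
    }
  ]
}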

We’re all set. Hit the F5 key and see how it goes.

It seems nothing has happened. But when we look at the console, the queue-triggered function is certainly up and running. How can we pass a queue value then? We have to use the CLI command here. If the PATH environment variable doesn’t recognise func.exe, we have to use its full path to run it.
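
Assuming the CLI’s run command syntax at the time, something like this drops a message onto the function:

func run QueueTriggerCSharp -c "Hello Azure Functions"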

Now we can see the breakpoint hit in Visual Studio.

So far, we’ve briefly walked through how to debug Azure Functions in Visual Studio. There are, however, some known issues. One of them is that, due to the nature of the .csx format, IntelliSense doesn’t work as expected. Other than that, it works great! So, if your organisation has been hesitant to use Azure Functions due to the lack of a debugging feature, now is the time to play around with it!


How to use a Powershell Azure Function to Tweet IoT environment data

Overview

This blog post details how to use a Powershell Azure Function App to get information from a RestAPI and send a social media update.

The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI. Essentially the difference in this post is outputting the manipulated data to social media (Twitter), whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.

Prerequisites

The following are prerequisites for this solution;

Create a folder on your local machine for the Powershell module, then save the module to your local machine using the Powershell command ‘Save-Module’ as per below.

Save-Module -Name InvokeTwitterAPIs -Path c:\temp\twitter

Create a Function App Plan

If you don’t already have a Function App Plan, create one by searching for Function App in the Azure Management Portal. Give it a name, select Consumption so you only pay for what you use, and select an appropriate location and Storage Account.

Create a Twitter App

Head over to http://dev.twitter.com and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don’t worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.

Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.

Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function. For my app I’m changing from the default 5 minute schedule to hourly, on the top of the hour. I did this after I’d already created the Function App as shown below. To update the schedule I edited the function.json file and changed the schedule to “schedule”: “0 0 * * * *”
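
The resulting function.json looks something like this (only the schedule value was changed):

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ],
  "disabled": false
}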

Give your Function App a name and select Create.

Configure Azure Function App Application Settings

In your Azure Function App select “Configure app settings”. Create new App Settings for your Twitter Account, Twitter Account AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.

Deployment Credentials

If you haven’t already configured Deployment Credentials for your Azure Function Plan do that and take note of them so you can upload the Twitter Powershell module to your app in the next step.

Take note of your Deployment Username and FTP Hostname.

Upload the Twitter Powershell Module to the Azure Function App

Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using a FTP Client. I’m using WinSCP.

From the Application Settings option, start Kudu.

Traverse the folder structure to get the path to the Twitter Powershell Module and note it.


Validating our Function App Environment

Update the code, replacing the sample from the creation of the Trigger Azure Function, as shown below, to import the Twitter Powershell module. Include the Get-Help lines for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.
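
A sketch of that code, with the module path and version as assumptions; check Kudu for your actual path:

# Import the Twitter Powershell Module (path/version from Kudu)
Import-Module 'D:\home\site\wwwroot\<YourFunctionName>\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1'

# Confirm the module loaded and see the cmdlets it contains
Get-Command -Module InvokeTwitterAPIs
Get-Help Invoke-TwitterRestMethod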

Below is my output. I can see the output from the Twitter Module.

Function Application Script

Below is my sample script. It has no error handling etc. so isn’t production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
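
The script isn’t reproduced in full here; a minimal sketch of the idea, with the sensor URL, App Setting names and cmdlet parameters as assumptions, looks like this:

# Load the Twitter module uploaded earlier
Import-Module 'D:\home\site\wwwroot\<YourFunctionName>\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1'

# Get the environment data from the IoT sensor's REST API
$temp = Invoke-RestMethod -Uri "https://us.wio.seeed.io/v1/node/GroveTempHumD0/temperature?access_token=$($env:WioAccessToken)"
$status = "The office is currently $($temp.celsius_degree) degrees C"

# Build the OAuth settings from the Application Settings created earlier
$oauth = @{
    ApiKey            = $env:TwitterAPIKey
    ApiSecret         = $env:TwitterAPISecret
    AccessToken       = $env:TwitterAccessToken
    AccessTokenSecret = $env:TwitterAccessTokenSecret
}

# Send the tweet
Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/statuses/update.json' -RestVerb 'POST' -Parameters @{ status = $status } -OAuthSettings $oauth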

Viewing the Tweet

And here is the successful tweet.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways; in this example, a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.

Follow Darren on Twitter @darrenjrobinson

Azure Functions: Send email using SendGrid

siliconvalve

Prior to Azure Functions announcing their General Availability (GA) I had previously used SendGrid as an output binding in order to send email messages.

Since GA, however, the ability to use SendGrid remains undocumented (I assume to give the Functions team time to test and document the binding properly) and the old approach I was using no longer seems valid.

As I needed to use this feature I spent some time digging into getting this working with the GA release of Azure Functions (version ~1). Thankfully as Functions is an abstraction over WebJobs I had plenty of information on how to do it right now thanks to the WebJobs documentation and extensibility :).

Here’s how you can get this working too:

1. Register your SendGrid API key in Application Settings: you must utilise the documented approach of setting your API key in an App Setting called “AzureWebJobsSendGridApiKey”. Without this your…

View original post 151 more words

Managing Dependencies in Azure Functions

Just before the Connect(); event, Azure Functions became GA. Now more and more developers and IT pros are interested in it. One of the main attractions of Azure Functions is that, as C# developers, we can directly import existing private assemblies into Functions. In other words, we can easily migrate our existing applications to Azure Functions with minimal changes!

However, as we all know, migration is not that easy, especially if a huge architectural change is required. One of the biggest challenges for migration is probably “Managing Dependencies”. In this post, we are going to look at which approach would suit us to migrate applications smoothly.

We can find the source code used in this post here: https://github.com/devkimchi/Azure-Functions-Dependency-Injections-Sample

Azure Function Structure

Let’s see the basic function structure. Here’s the function code we get when we create a manual trigger function:
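
The template isn’t reproduced here, but it is essentially this:

using System;

public static void Run(string input, TraceWriter log)
{
    log.Info($"C# manually triggered function called with input: {input}");
}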

What can we see here? That’s right. It’s a static method! That simply means our existing IoC container won’t fit here, if we use one for dependency management. So, what can we do? We now need to consider the service locator pattern.

Autofac and CommonServiceLocator

If we use Autofac for our application’s dependency management, we need to create a service locator object using Autofac.Extras.CommonServiceLocator. Let’s see some code bits:
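
The code bits below are a sketch using Autofac.Extras.CommonServiceLocator, with a hypothetical IMyService/MyService pair:

using Autofac;
using Autofac.Extras.CommonServiceLocator;
using Microsoft.Practices.ServiceLocation;

// Register dependencies as usual with Autofac.
var builder = new ContainerBuilder();
builder.RegisterType<MyService>().As<IMyService>();
var container = builder.Build();

// Wrap the container in an AutofacServiceLocator and set it as the current locator.
var locator = new AutofacServiceLocator(container);
ServiceLocator.SetLocatorProvider(() => locator);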

First of all, like what we normally do with Autofac, we just add dependencies using builder.RegisterType(). Then we create an AutofacServiceLocator instance and set it as the current service locator. That’s it! Let’s create a function.

Azure Functions with Service Locator

We don’t need to follow this step exactly, but as a best practice suggested by the Azure Functions team, we’re putting our private assemblies into the Shared folder using Kudu.

Then we create a .csx file within the Shared folder to create the service locator instance.

Now we have a static locator instance. We’re all set. Let’s create a function using this instance.
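
A sketch of such a function, assuming the Shared file is named locator.csx and consuming the hypothetical IMyService through it:

#load "..\Shared\locator.csx"

using Microsoft.Practices.ServiceLocation;

public static void Run(string input, TraceWriter log)
{
    // Resolve the dependency through the locator created in the Shared folder.
    var service = ServiceLocator.Current.GetInstance<IMyService>();
    log.Info(service.GetValue(input));
}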

If we run this function, we get the result as expected.

This also works in an async function:
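
A sketch of the async variant, assuming the hypothetical service also exposes an async method:

#load "..\Shared\locator.csx"

using System.Threading.Tasks;
using Microsoft.Practices.ServiceLocation;

public static async Task Run(string input, TraceWriter log)
{
    var service = ServiceLocator.Current.GetInstance<IMyService>();
    log.Info(await service.GetValueAsync(input));
}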

So far, we’ve had a brief look at how the service locator pattern works for Azure Functions to manage dependencies. Migration to Azure Functions from an existing application won’t be that easy, of course. However, we can still use existing private assemblies, as long as we introduce the service locator pattern for dependency management. The service locator pattern might be old-fashioned, but it surely works. There might be a better solution for managing dependencies; if you find anything better, let’s discuss it here!


Using an Azure Function to search the FIM/MIM Metaverse, create a Set and update the Set membership in the FIM/MIM Service

Introduction

This is the third and final post in this series on integrating Microsoft Identity Manager with Azure Functions.

The first detailed how to use an Azure Function to retrieve data from the MIM Service Server. The second detailed how to use an Azure Function to retrieve data from the MIM Sync (Metaverse) Server.

This third post combines the two and then performs an action in the MIM Service. The practical purpose of this could be functions like “find all users in location y” and “enable them for entitlement x” or “add an attribute value on each of their objects”.

Overview

The reasoning for the two-stage approach is that in my experience it is a lot easier to search the Metaverse than the MIM Service to find an object (or objects), but also that the Metaverse has all the information about objects, whereas the MIM Service is a ShadowVerse of the Metaverse containing a subset of the managed objects’ metadata.

Moving forward then, the architecture is a hybrid of the first two posts that introduced the concepts associated with integrating MIM with Azure Functions. As with the other two posts, this is a base working example and concept.

Prerequisites

The prerequisites are the same as for the 1st and 2nd posts in this series. You’ll need to work through those examples to set up the dependencies and prerequisites. From there you can create one more Azure Function that brings everything together. That’s what I’m covering in this post.

Therefore the prerequisites are;

  • Azure Tenant and a Function Plan
  • Microsoft Identity Manager implementation
  • Remote Powershell configured for your MIM Sync Server
  • Lithnet FIM MIIS Automation Powershell Module installed on your MIM Sync Server
  • The necessary Firewall Rules on your MIM Sync Server and your Azure Network Security Group (assuming your MIM Infrastructure is in Azure) to allow Azure Functions to communicate with MIM Sync and Service Servers

 

This Example performs the following

In this example the HTTP Trigger Azure Function;

  • Takes input for ObjectType, Attribute, AttributeValue, SetName
  • Searches the MIM Sync Metaverse for the input ObjectType, with the AttributeValue in the Attribute
  • Connects to the MIM Service
  • Creates the Set based on the input SetName if it doesn’t exist
  • Adds the objects from the search to the Set
  • Returns the objects added to the Set

In a real-world implementation you’d do the above with a criteria-based Set. This post is a concept of search and find, then create and update. That has many practical applications.

Create your new Azure Function

Just like the other two posts, we’re going to create a new Powershell HTTP Trigger Azure Function.

Upload the Lithnet RMA PS Module to your new Azure Function (as per blog post 1 in this series). You should also be using protected credentials now as well. So upload your username/password encryption key.

 

Here is the Azure Function Powershell Script that performs the process detailed above.
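
The full script isn’t reproduced here; this sketch shows only the Set-handling portion against the MIM Service using the Lithnet RMA module (the path, names and the $matchedObjects variable from the Metaverse search are assumptions):

# Connect to the MIM Service with the Lithnet RMA module
Import-Module 'D:\home\site\wwwroot\<YourFunctionName>\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1'
Set-ResourceManagementClient -BaseAddress 'http://mimservice.yourdomain.com:5725' -Credentials $cred

# Create the Set based on the input SetName if it doesn't exist
$set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue $setName
if ($set -eq $null)
{
    $set = New-Resource -ObjectType Set
    $set.DisplayName = $setName
}

# Add the objects found in the Metaverse search and save the Set
$set.ExplicitMember = $matchedObjects
Save-Resource $set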

Test it out. Looks good. 88 users matched the value of Sydney in their location attribute.

Verify that the Set was created and the membership updated.

Test calling the Azure Function remotely

Now that it is all working in the Azure Function, let’s try doing it from Powershell remotely. This time I’m again looking for Person objects that have Sydney in their location attribute, and I’ll create a Set named Sydney-NSW and put them in it.
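
A sketch of the remote call, with a placeholder URL and Function Key:

# Find Person objects with Sydney in their location attribute and put them in the Sydney-NSW Set
$body = @{
    objectType     = "person"
    attribute      = "city"    # assumed name of the attribute holding location
    attributeValue = "Sydney"
    setName        = "Sydney-NSW"
} | ConvertTo-Json

Invoke-RestMethod -Uri "https://yourfunctionapp.azurewebsites.net/api/<YourFunctionName>?code=<your function key>" -Method Post -Body $body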

Brilliant, that works nicely. Let’s verify that the Set was created and has the correct number of users in it. Yes, a perfect match.

Summary

Putting Azure Functions and Powershell together along with the Lithnet Powershell Modules opens up a world of possibilities for automation and integration of the MIM Service without the need for any additional infrastructure or any considerable effort.

Experiment and let me know what you do with this style of integration.

Follow Darren on Twitter @darrenjrobinson

Azure Functions: Access KeyVault Secrets with a Cert-secured Service Principal

siliconvalve

Azure Functions is one of those services in Azure that is seeing a massive amount of uptake. People are using it for so many things, some of which require access to sensitive information at runtime.

At time of writing this post there is a pending Feature Request for Functions to support storing configuration items in Azure KeyVault. If you can’t wait for that Feature to drop here’s how you can achieve this today.

Step 1: Create a KeyVault and Register Secrets

I’m not going to step through doing this in detail as the documentation for KeyVault is pretty good, especially for adding Secrets or Keys. For our purposes we are going to store a password in a Secret in KeyVault and have the most recent version of it be available from this URI:

https://mytestvault.vault.azure.net/secrets/remotepassword

Step 2: Setup a Cert-secured Service Principal in Azure AD

a. Generate a self-signed…

View original post 423 more words


Get Users/Groups/Objects from Microsoft/Forefront Identity Manager with Azure Functions and the Lithnet Resource Management Powershell Module

Introduction

As an Identity Management consultant, if I had $1 for every time I’ve been asked “what is user x’s current status in IDAM”, “is user x active?”, “does user x have an account in y?”, “what is user x’s primary email address?”, particularly after Go Live of an IDAM solution, my holidays would be a lot more exotic.

From a Service Desk perspective IDAM implementations are often a black box in the middle of the network that for the most part do what they were designed and implemented to do. However as soon as something doesn’t look normal for a user the Service Desk are inclined to point their finger at that black box (IDAM Solution). And the “what is the current value of ..”, “does user x ..” type questions start flying.

What if we could give the Service Desk a simple query interface into FIM/MIM without needing to give them access to another complicated application?

This is the first (of potentially a series of) blog posts on leveraging community libraries and Azure PaaS services to provide visibility of FIM/MIM information. This first post really just introduces the concept with a working example in an easy way to understand and replicate. It is not intended for production implementation without additional security and optimisation.

Overview

The following graphic shows the concept of using Azure Functions to take requests from a client (web app, Powershell, some other script), query the FIM/MIM Service and return the result. This post details the setup and configuration for the section in the yellow shaded box, with the process outlined in the numbers 1, 2 & 3. This post assumes you already have your FIM/MIM implementation set up and configured for your connected integrated applications/services, such as Active Directory. In my example my connected datasource is actually Twitter.

Prerequisites

The prerequisites for this solution are;

 

Creating your Azure App Service

First up you’ll need to create an Azure App Service in your Azure Tenant. To keep everything logically structured for this example I created an Azure App Service in the same resource group that contains my MIM IaaS infrastructure (MIM Sync Server, MIM Service Server, SQL Server, AD Domain Controller etc).

In the Azure Marketplace select New (+) and search for Function App. Select the Function App item from the results and select Create.

Give your Azure App Service a name and choose the Resource Group where you want to locate it. Choose Dynamic for the Hosting Plan. This means you don’t have to worry about resource management for your Web App, and you only pay for execution time, which, unless you put this into production and go crazy with it, should cost nothing as you will (should) be well under the free grant tier. Put the application in an appropriate location, such as close to the FIM/MIM resources it’ll be interacting with, and select Create.

Now that you have your Azure App Service setup, you need to create your Azure Function.

Create an HTTP Trigger Powershell Function App

In the Azure Portal locate your App Services Blade and select the Function App created in the steps above. Mine was named MIMMetaverseSearch in the example above. Select PowerShell as the Language and HttpTrigger-Powershell as the Function type.

Give your Function a name. I’ve kept it simple in this example and named it the same as my App Service Plan. Select Create.

Adding the Lithnet Powershell Module into your Function App.

As you’d expect, the Powershell Function App by default only has a handful of core Powershell modules. As we’re using something pretty specific, we’ll need to put the module into our Function App so we can load it and use the library.

Download and save the Lithnet Resource Management Powershell Module to your local machine. Something like the Powershell command below will do that.
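
A sketch, assuming the module’s name on the PowerShell Gallery:

Save-Module -Name LithnetRMA -Path c:\temp\lithnetrma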

Next follow this great blog post here from Tao to upload the Lithnet RMA PS Module you downloaded earlier into your function directory. I used WinSCP as my FTP client as I’ve shown below to upload the Lithnet RMA PS Module.

FTP to the host for your App Service and navigate to the /site/wwwroot/ directory.

Create a bin folder and upload via FTP the Lithnet RMA PS Module.

Using Kudu, navigate to the path and version of the Lithnet RMA PS Module. I’m using v1.0.6088 and my app is named MIMMetaverseSearch, so my path is D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088

Note: the Lithnet RMA PS Module is 64-bit so you’ll need to configure your Web App for 64-bit as per the info in the same blog you followed to upload the module here.

Test loading the Lithnet RMA PS Module in your Function App

In your Function App select </> Develop. Remove the sample script and, in your first line, import the Lithnet RMA PS Module using the path from the previous step. Then, to check that it loads, add a line that references a cmdlet in the module. I used Help Get-Resource. Select Save then Run.
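
A sketch of those lines, using the path noted above:

# Import the Lithnet RMA PS Module from the bin directory
Import-Module 'D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1'

# Reference a cmdlet in the module to confirm it loaded
Help Get-Resource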

If you’ve done everything correct when you select Run and look at the Logs you’ll see the module was loaded and the Help Get-Resource command was run in the Logs.

Allow your Function App to access your FIM/MIM Service Server

Even though you have logically placed your Function App in the same Resource Group (if you did it like I have), you’ll need to actually allow the Function App, which runs in a shared PaaS environment, to connect to your FIM/MIM Service Server.

Create an inbound rule in your Network Security Group to allow access to your FIM/MIM Service Server. The example below isn’t as secure as it could (and should) be as it allows access from anywhere. You should restrict the source of the request(s) accordingly. I’m just showing how to quickly get a working example. TCP Port 5725 is required to access your MIM Service Server. Enter the details as per below and select Ok.

Using an Azure Function to query FIM/MIM Service

Note: Again, this is an example to quickly show the concept. In the script below your credentials are in the script in clear text (and of course those below are not valid). For anything other than validating the concept you must protect your credentials. A great example is available here in Tao’s post.

The PS Azure Function gets the incoming request and converts it from JSON. In my request, which you’ll see in the next step, I’m passing in “displayName” and “objectType”.

In this example I’m using Get-Resource from the Lithnet RMA to get an object from the FIM/MIM Service. First you need to open a connection to the FIM/MIM Service Server. On my Azure IaaS MIM Service Server I’ve configured a DNS name, so you can see I’m using that name in line 17 to connect to it, using the unsecured credentials from earlier in the script. If you haven’t set up a DNS name for your FIM/MIM Service Server, you can use the Public IP Address instead.

Line 20 queries for the ObjectType and DisplayName passed into the Function (see calling the Function in the next step) and line 22 returns the response. Again, this is just an example; there is no error checking, validation or anything. I’m just introducing the concept in this post.
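
The script itself isn’t reproduced here (the line numbers above refer to it); a minimal sketch of the concept, with placeholder credentials and DNS name, looks like this:

# Get the incoming request and convert it from JSON
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$displayName = $requestBody.displayname
$objectType = $requestBody.objectType

# Connect to the MIM Service Server (clear-text credentials for illustration only)
Import-Module 'D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1'
$password = ConvertTo-SecureString 'NotMyRealPassword' -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ('mimdomain\svc-mimservice', $password)
Set-ResourceManagementClient -BaseAddress 'http://mimservice.yourdomain.com:5725' -Credentials $cred

# Query for the ObjectType and DisplayName passed in, and return the response
$resource = Get-Resource -ObjectType $objectType -AttributeName DisplayName -AttributeValue $displayName
Out-File -Encoding Ascii -FilePath $res -InputObject ($resource | ConvertTo-Json)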

Testing your Function App

Now that you have the function script saved, you can test it from the Function App itself. Select Test from the options up in the right from your function. Change the Request Body for what the Function is expecting. In my case displayname and objectType. Select Run and in the Logs if you’ve got everything configured correctly (like inbound network rules, DNS name, your FIM/MIM Service Server is online, your query is for a valid resource) you should see an object returned.

Calling the Function App from a Client

Now that we have our Function App all set up and configured (and tested in the Function App), let’s send a request to the Azure Function using the Powershell Invoke-RestMethod function. The following call I did from my laptop. It is important to note that there is no authN in this example, and the Function App will be using whatever credentials you gave it to execute the request. In a deployed solution you’ll need to scope who can make requests, limit who can submit requests via the inbound network rules, and of course further protect the account credentials used to connect to your FIM/MIM Service Server.
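
A sketch of that call, with a placeholder URL and Function Key:

# Build the request as a hashtable, convert it to JSON and submit it
$body = @{ displayname = "Darren Robinson"; objectType = "Person" } | ConvertTo-Json
Invoke-RestMethod -Uri "https://mimmetaversesearch.azurewebsites.net/api/MIMMetaverseSearch?code=<your function key>" -Method Post -Body $body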

Successful Response

The following screenshot shows calling the Function App and getting the responding object. Success! In a couple of lines I created a hashtable for the request, converted it to JSON, submitted it and got a response. How powerful is that!?

 

Summary

Using the awesome Lithnet Resource Management PowerShell Module with Azure Functions it is pretty quick and flexible to access a wealth of information we may want to expose for business benefit.

Now if only there was an affiliation program for Azure Functions that could deposit funds for each IDAM request to an Azure Functions App into my holiday fund.

Stay tuned for more posts on taking this concept to the next level.

Follow Darren on Twitter @darrenjrobinson