Automation and Creation of Office 365 groups using Flow, Microsoft Graph and Azure Function – Part 2

In the Part 1 blog here, we discussed an approach for the Group creation process and important considerations for provisioning groups. In this blog, we will look at getting a Graph App ID and App secret for invoking the Graph service, and then at the implementation of the group provisioning process.

MS Graph App Set up

Before we start creating groups, we will need to set up a Graph App that will be used to create the group in the Office 365 tenancy. The details on how to create a Microsoft Graph app are in this blog here.

Regarding permissions, below are the settings necessary to allow group creation through the Graph service.

GroupApp_Rights

Creating a Group

As discussed in Part 1 here, below are the broad-level steps for automating group creation using a SharePoint inventory list, Microsoft Flow and an Azure Function.

1. Create a SharePoint list with the metadata necessary for Group and SharePoint asset provisioning

We can use a SharePoint list as a trigger to create groups, holding the custom metadata necessary for provisioning the groups (such as Owners) and the metadata necessary for creating site assets for SharePoint sites. As a best practice, I recommend creating multiple master lists to manage the details separately if there are too many fields to manage in a single list. In our case, we created three separate lists for managing the Group details.

1. Group details and metadata
2. Owners and Team Members List
3. Site Assets configuration list

2. Create a Microsoft Flow. The flow will validate a new or existing group and pick the unique Group Alias from the list, which will allow us to find the group if it exists.

The flow acts as a trigger to start the provisioning process and calls the Azure Function, passing the appropriate metadata as shown below. The flow also allows for error handling scenarios, as described in the Part 1 blog here.

Note: The GroupAlias is the unique name of the Group and is not necessarily the SharePoint URL. For example, in the case where a group was created and subsequently deleted, the unique alias could be used again but the Site URL will be different (unless cleared from the SharePoint recycle bin).

Group_FlowAzureFunctionCall
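
For illustration only, the metadata the Flow's HTTP action posts to the Azure Function might look something like the JSON below; the field names are hypothetical rather than taken from the actual inventory list.

{
  "GroupAlias": "FinanceTeam",
  "GroupName": "Finance Team",
  "Description": "Collaboration space for the Finance team",
  "IsPrivate": true,
  "OwnersAndMembers": [ "jane.doe@contoso.com", "john.smith@contoso.com" ]
}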

3. Create the Group in an Azure Function using SharePoint Online CSOM

In order to create the group, we will need to authenticate to the Graph service using the Graph App created earlier. For authenticating the app through Azure AD, install the NuGet package Microsoft.IdentityModel.Clients.ActiveDirectory.

After authenticating, we will create the group using the UnifiedGroupsUtility class provided through the OfficeDevPnP Core (PnP) CSOM extensions.

Below is a quick snapshot of the code. Note the inclusion of the Graph module from the OfficeDevPnP Core library.
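
The following is a minimal sketch of what that code might look like (assumptions: the Graph App's tenant ID, client ID and secret are held in the Function's application settings, and groupName, groupDescription, groupAlias and ownersAndMembers come from the Flow payload):

using System;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using OfficeDevPnP.Core.Entities;
using OfficeDevPnP.Core.Framework.Graph;

// Authenticate to Microsoft Graph with the Graph App (client credentials flow)
var authContext = new AuthenticationContext(
    "https://login.microsoftonline.com/" + Environment.GetEnvironmentVariable("TenantId"));
var credential = new ClientCredential(
    Environment.GetEnvironmentVariable("GraphAppClientId"),
    Environment.GetEnvironmentVariable("GraphAppClientSecret"));
AuthenticationResult authResult = await authContext.AcquireTokenAsync("https://graph.microsoft.com", credential);

// Create the Office 365 Group via the PnP Graph helper.
// Note: the same email array is passed for owners and members, and isPrivate is set explicitly.
UnifiedGroupEntity group = UnifiedGroupsUtility.CreateUnifiedGroup(
    displayName: groupName,            // from the Flow payload
    description: groupDescription,     // from the Flow payload
    mailNickname: groupAlias,          // the unique Group Alias from the inventory list
    accessToken: authResult.AccessToken,
    owners: ownersAndMembers,          // string[] of email addresses
    members: ownersAndMembers,
    isPrivate: true);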

Note: One important point is that, in the above code, the owners and members email arrays are the same. If the owners and members arrays differ, group provisioning is delayed significantly. It is also important to keep the other parameters the same as they were during creation when calling the method below, otherwise the other properties may be reset to their defaults. For example, if isPrivate is not set, the group becomes public.

4. After the group is created, we can fetch the details of the group as below.
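
A minimal sketch of that lookup, reusing the access token and group ID from the previous step (the PnP UnifiedGroupsUtility.GetUnifiedGroup helper is assumed):

// Fetch the provisioned group's details, including the associated SharePoint site URL
UnifiedGroupEntity createdGroup = UnifiedGroupsUtility.GetUnifiedGroup(group.GroupId, authResult.AccessToken);
log.Info($"Group '{createdGroup.DisplayName}' created. Site URL: {createdGroup.SiteUrl}");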

5. Group provisioning generally takes about 2-3 minutes. However, if there are multiple hits, one request might override another, causing the group creation to fail. In such cases, we can sequence the Azure Function executions by modifying the host.json file. A quick blog covering this can be found here.

Provisioning SharePoint Assets in Azure Function after Group Creation

1. For provisioning the SharePoint assets, we might have to wait for the Office 365 AD sync to finish before access is granted to the Admin account.

Sometimes the AD sync process takes much longer, so to grant direct access to the SharePoint site collection using the tenant admin, we could use the code below. Recommendation: only proceed with this approach if access still fails after more than a few minutes.
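
A minimal sketch of that approach using the tenant admin CSOM context (the admin site URL, account name and password variables are placeholders and would normally come from the Function's application settings):

using Microsoft.Online.SharePoint.TenantAdministration;
using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core;

// Connect to the tenant admin site and make the provisioning account a
// site collection administrator of the newly created group site.
var authManager = new AuthenticationManager();
using (ClientContext adminContext = authManager.GetSharePointOnlineAuthenticatedContextTenant(
    "https://contoso-admin.sharepoint.com", adminUserName, adminPassword))
{
    var tenant = new Tenant(adminContext);
    tenant.SetSiteAdmin(createdGroup.SiteUrl, adminUserName, true);
    adminContext.ExecuteQueryRetry();
}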

Note: As a best practice, I would recommend using a Service Account when working on the SharePoint site. We could also use an App, as suggested in the Site Scripting blog here.

2. Once you have access, you can use the normal SharePoint CSOM for the activities pertaining to SharePoint asset provisioning, such as creating libraries, site page content, lists, etc.

3. After you’re done, you can return success from the Azure Function as below.
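
A minimal sketch for a v1 HTTP-triggered C# function returning an HttpResponseMessage; the response body fields are illustrative:

// Return a response the Flow can inspect. Accepted is returned (see the note below)
// so that error handling is done in the Flow rather than by re-triggering the Function.
return req.CreateResponse(HttpStatusCode.Accepted, new
{
    GroupAlias = groupAlias,
    SiteUrl = createdGroup.SiteUrl,
    Status = "Provisioned"
});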

Note: Return HttpStatusCode.Accepted rather than an error status code if error handling is implemented in the Flow; otherwise, the Flow will trigger another instance when the Azure Function fails.

Conclusion:

Above, we saw how we can use a SharePoint inventory list to create groups using Flow and Azure Functions. For quick reference, below are links to the other related blogs.

Part 1 – Automation and Creation of Office 365 groups approach

How to create a Microsoft Graph App

Sequencing calls in Azure Functions

Global Azure Bootcamp 2018 – Creating the Internet of YOUR Things

Today is the 6th Global Azure Bootcamp and I presented at the Sydney Microsoft Office on Creating the Internet of YOUR Things.

In my session I gave an overview of where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now that can enrich your life.

I then moved on to building IoT devices and leveraging Azure, the focus of my presentation: how to get started quickly with devices, integration and automation. I provided a working example based on my previous posts Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell, Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller, and Adding a Display to the Teenager Notification Service Azure IoT Device.

I provided enough information and hopefully inspiration to get you started.

Here is my presentation.

 

How to quickly copy Azure Functions between Azure Tenants and implement ‘Run From Zip’

As mentioned in this post yesterday, I needed to copy a bunch of Azure Web Apps and Functions from one tenant to another. With anything cloud based, things move fast, and some of the methods I found were more onerous and complex than they needed to be. There is of course the Backup option for Azure Functions, but it requires a storage account associated with the Function App plan. My Functions didn’t need storage, and the plan tier they were on meant that wasn’t a prerequisite, so I had no desire to add a storage account just to back up and then migrate.

Overview

In this post I show my method to quickly copy Azure Functions from one Azure Tenant to another. My approach is:

  • In the Source Tenant from the Azure Functions App
    • Using Kudu take a backup of the wwwroot folder (that will contain one or more Functions)
  • In the Target Tenant
    • Create an Azure Function App
    • Using Kudu, upload the wwwroot archive into the new Azure Function App
    • Configure Azure Function Run From Zip

Backing up the Azure Functions in the Source Tenant

Using the Azure Portal in the Source Tenant, go to your Function App => Application Settings and select Advanced Tools. Select Debug Console – PowerShell and navigate to the site folder. Next to wwwroot, select the download icon to obtain an archive of your functions.

Download WWWRoot Folder 2.PNG

Copying the Azure Functions to the Target Tenant

In the Target Tenant, first create a new Azure Function App. I did this because I wanted to change the naming, the plan and a few other configuration items. Then, using the Azure Portal, go to your new Function App => Application Settings and select Advanced Tools.

Function Advanced Tools

Create a folder under D:\home\data named SitePackages.

Create Site Packages Folder

Drag and drop your wwwroot.zip file into the SitePackages Folder.

Drag Drop wwwroot

In the same folder, select the + icon to create a file named siteversion.txt.

Site Packages

Inside the file, enter the name of your archive file, e.g. wwwroot.zip, then select Save.

Siteversion.txt.png

Back in your new Function App, select Application Settings.

Application Settings

Under Application Settings, add a new setting named WEBSITE_USE_ZIP with a value of ‘1’.

Website Use Zip.PNG

Refresh your Function App and you’ll notice it is now Read Only as it is running from Zip. All the Functions that were in the Zip are displayed.

Functions Migrated.PNG

Summary

This is a quick and easy method to get your functions copied from one tenant to another. Keep in mind that if your functions use Application Settings, Key Vault, Managed Service Identity or similar options, you’ll need to add those settings, certificates and credentials in the target environment.

Adding a Display to the Teenager Notification Service Azure IoT Device

Overview

A couple of weeks back I wrote this post detailing Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller.

Over the Easter break I enhanced it with the inclusion of a display. I was rummaging around in a box of parts when I found a few displays I’d purchased on speculation some time ago. They are SSD1306-driven OLED units that can be found on Amazon here. A quick upgrade later and …

… scrolling text to go with the rotating lights. Adding the display requires the following changes to the previous project, which are detailed in this post:

  • inclusion of the SSD1306 library
  • configure your micro controller for the display
  • a few changes in the Mongoose OS Init.JS file to have the appropriate text displayed for the notification
  • change to the Notifier Base case to integrate the display
    • it is available in the Thingiverse Project for this thing here and named NodeMCU with Display Window.stl

Incorporating the SSD1306 Library

Before starting, with your micro controller connected and using the MOS UI, take a copy of your Init.js configuration file by selecting Device Files, then Init.js, and copying the content to somewhere safe. Also back up the Device Config by choosing Device Config, Expert View and Save Configuration.

From the MOS UI, select Projects, select the AzureIoT-Neopixel-js project, then from the drop-down menu select mos.yml.

Add the line - origin: https://github.com/mongoose-os-libs/arduino-adafruit-ssd1306, then select the Spanner icon to rebuild the app. Once completed, select the Flash icon to update your micro controller.

Include SSD1306 Library.PNG

Once it has been written to your micro controller, check your Init.js and copy back your backup. Check your configuration and make sure your MQTT settings are still present; copy your previous config back if required.

Configure your Micro Controller for the SSD1306 Display

We need to tell the micro controller which GPIO pins the display is attached to. I also moved the GPIO pin used for the Neopixel as part of this. The configuration is:

  • Neopixel connected to GPIO 12
  • SSD1306 SDA connected to GPIO 4
  • SSD1306 SCL connected to GPIO 5

In the Expert Device Config mode update the I2C section as shown below. Save the configuration.

 "i2c": {
 "enable": true,
 "freq": 100000,
 "debug": false,
 "sda_gpio": 4,
 "scl_gpio": 5
 },

Wiring the SSD1306 to the Micro Controller

Looking at the NodeMCU diagram you can see where the connections need to be made for the NeoPixel and SSD1306 display. SSD1306 SCL to D1, SDA to D2. The Neopixel data connection is now on D6. Power and GND using the PWR and GND pins. I’m using them all on the same side of the NodeMCU to make it fit cleanly into the case later.

NodeMCU.png

Init.js code additions

Incorporate the display library in your Init.js by including the line below.

load('api_arduino_ssd1306.js');

With that done, we need to initialize the display, also in Init.js. The following lines initialize the display address, the SCL pin the display is connected to, the size of the text we are going to display, and the color. Put them before or after the initialization for the Neopixel.

//------------ Setting up Display ----------------
let oled_addr = 0x3C; // I2C Address for SSD1306
let oled = Adafruit_SSD1306.create_i2c(5 /* RST GPIO */, Adafruit_SSD1306.RES_128_32);

// Initialize the display. 
oled.clearDisplay();
oled.setTextSize(2);
oled.setTextColor(Adafruit_SSD1306.WHITE);

In the MQTT subscriber section, where you look at the MQTT message sent from Microsoft Flow and display a color on the Neopixel, add the following lines to send output to the display. The example below outputs PINK to the display. If Pink indicates a particular task, then change oled.write(‘PINK’); to oled.write(‘TASK’); or similar.

 if (msg === "Pink"){
 // PINK 
 oled.clearDisplay();
 oled.setTextSize(2);
 oled.setCursor(1, 10);
 oled.write('PINK');
 oled.display();
 oled.startScrollLeft(0x00, 0x0F);
 }

Following the Neopixel loop after

 strip.clear();
 strip.show(strip);

add the following to clear the display once the Neopixel has finished displaying its color notification.

 oled.clearDisplay();
 oled.display();

Repeat for the differing colors and their tasks/meanings.

Summary

Now the notifier includes both a visual color notification AND the text associated with the notification. No confusion here, or does it need a buzzer as well?

 

Evaluating the migration of Azure Functions to Microsoft Flow – Twitter IoT Integration

Introduction

Almost 18 months ago I wrote this post on integrating Twitter with Azure Functions to tweet IoT data. A derivative of that solution has been running successfully for about the same period. Azure Functions have been bulletproof for me.

After recently implementing Microsoft Flow, as detailed in my Teenager Notification Device post here, I started looking at a number of the Azure Functions I have running to see which would be better suited to being implemented with Flow. What could I simplify by migrating to Microsoft Flow?

The IoT Twitter Function linked above was one of the simpler Functions I had running; I’ve transposed it and it has been running seamlessly. I chose this particular Function to migrate because the actions it performed were ones that Microsoft Flow supports. Keep in mind (see the Summary) that there isn’t a one-size-fits-all: Flow and Functions each have their place and often work even better together.

Comparison

Transposing the IoT Twitter Function App to Microsoft Flow gave me the same outcome, however the effort to get to that outcome was considerably less. As a quick comparison, I’ve compared the key steps needed with the Azure Function to enable the integration versus what it took to implement with Microsoft Flow.

Function vs Flow.PNG

That’s pretty compelling. For the Azure Function I needed to register an App with Twitter and I needed to create an Azure Function App Plan to host my Azure Function. With Microsoft Flow I just created a Flow.

To set up and configure the Azure Function I needed to configure Deployment Options to upload the Twitter PowerShell Module (a third-party module), and I needed to store the two credential sets associated with the Twitter Account/App. In Microsoft Flow I just chose Twitter as an Action and provided consent to the OAuth2 challenge.

Finally for the logic of the Azure Function I had to write the script to retrieve the data, manipulate it, and then post it to Twitter. In Microsoft Flow it was simply a case of configuring the workflow logic.

Microsoft Flow

As detailed above, the logic is still the same. On a schedule, get the data from the IoT Devices via a RestAPI, manipulate/parse the response and output a Tweet with the environment info. Doing that in Flow though means selection of an action and configuring it. No code, no modules, no keys.

Below is the resulting Flow (overview) that achieves the same result as the Azure Function I originally implemented, as detailed here.

MS Flow - Twitter.PNG

The schedule part is triggered hourly. Using Recurrence, it is easy to set the schedule (much easier than the CRON format in Azure Functions), complete with timezone (within the advanced section). I then get the current time to acquire the date and time in a format that I will use in the resulting tweet.

Schedule

Next, perform the first REST API call to get the data from the first of the IoT devices and parse the JSON response to get the temperature value.

GET

Repeat the above step for the other IoT Device located in a different environment and parse that. Formulate the Tweet using elements of information from the Flow.

Repeat and Tweet

Looking at Twitter we see a resultant Tweet from the Flow.

Tweet.PNG

Summary

This is a relatively simple flow. Bear in mind I haven’t included any logic to validate what is returned or perform any conditional operations during processing. But very quickly it is possible to retrieve, manipulate and output to a different medium.

So why don’t I use Flow for everything? The recent post I mentioned at the beginning, for the Teenager Notification Device, uses both a Flow and an Azure Function. For that use case, the integration of the IoT device with Azure IoT is via MQTT, and there isn’t currently that capability in Flow. Instead, Flow was used to trigger an Azure Function that in turn sent an MQTT message to the IoT device. The combination of Flow with Functions provides a lot of flexibility and power.

 

Create Modern Pages and update metadata using SPFx Extensions, SP PnP JS and Azure Functions

Modern Site Pages (the Site Page content type) have a constraint when it comes to associating custom metadata with them. In other words, the “Site Page” content type cannot have other site columns added to it, as can be seen below.

SitePageContentTypeMissing

On another note, even though we can create child content types from the Site Page content type, the new Site Page creation process (screenshot below) doesn’t associate the new content type when the page is created, so the fields from the child content type cannot be associated.

For example, in the screenshot below we have created a new site page, test.aspx, using the “Intranet Site Page” content type, which is a child of the “Site Page” content type. After the page is created, it is associated with the Site Page content type instead of the Intranet Site Page content type. We can edit it again to associate it with the Intranet Site Page content type, but that adds another step for end users and additional training effort.

 

 

 

Solution Approach:

To overcome the above constraints, we implemented a solution that associates custom metadata with Modern Site Pages at creation time using a SharePoint Framework (SPFx) List View Command Set extension and an Azure Function. In this blog, I am going to briefly talk about the approach so it can be useful for anyone trying to do the same.

1. Create a List View Command Item for creating site pages, editing properties of site pages and promoting site pages to news
2. Create an Azure function that will create the Page using SharePoint Online CSOM
3. Call the Azure Function from the SPFx command.

A brief screenshot of the resulting SPFx extension dialog is below.

NewSitePage

Steps:

To override the process for modern page creation, we will use an Azure Function with the SharePoint Online PnP Core CSOM. Below is an extract of the code. At a broad level, the Azure Function does the following:

1. Get the value of the Site Url and Page name from the Query parameters
2. Check if the Site page is absent
3. Create the page if absent
4. Save the page

Note: The code below also checks whether the page already exists.
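
A minimal sketch of such a function (assumptions: a v1 HTTP-triggered C# function, the OfficeDevPnP Core assembly referenced alongside the CSOM assemblies, service account credentials in application settings, and illustrative query parameter names):

#r "Microsoft.SharePoint.Client.dll"
#r "Microsoft.SharePoint.Client.Runtime.dll"

using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core;
using OfficeDevPnP.Core.Pages;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    // 1. Get the Site Url and Page name from the query parameters
    var query = req.GetQueryNameValuePairs()
        .ToDictionary(kv => kv.Key, kv => kv.Value, StringComparer.OrdinalIgnoreCase);
    string siteUrl = query["siteUrl"];
    string pageName = query["pageName"];

    var authManager = new AuthenticationManager();
    using (ClientContext ctx = authManager.GetSharePointOnlineAuthenticatedContextTenant(
        siteUrl,
        Environment.GetEnvironmentVariable("SPUserName"),
        Environment.GetEnvironmentVariable("SPPassword")))
    {
        // 2. Check whether the page already exists in the Site Pages library
        List sitePages = ctx.Web.Lists.GetByTitle("Site Pages");
        ListItemCollection items = sitePages.GetItems(CamlQuery.CreateAllItemsQuery());
        ctx.Load(items, c => c.Include(i => i["FileLeafRef"]));
        ctx.ExecuteQueryRetry();
        bool pageExists = items.Any(i => (string)i["FileLeafRef"] == pageName);

        // 3 & 4. Create and save the page only if it is absent
        if (!pageExists)
        {
            ClientSidePage page = ctx.Web.AddClientSidePage(pageName, true);
            page.PageTitle = pageName;
            page.Save(pageName);
        }
    }

    return req.CreateResponse(HttpStatusCode.OK, $"Page {pageName} processed");
}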

Next, create an SPFx List View Command Set extension and an SP dialog component that will allow us to call the Azure Function from the Site Pages library to create pages. The code uses the Fetch API to call the Azure Function, passing the site URL and page name the Azure Function needs to create the page. After the page is created, the Azure Function responds with a success status, which can be used to confirm the page creation.

Note: Make sure the dialog is locked while this operation is in progress; implement code to prevent the form from being closed or resubmitted.

After the pages are created, let’s update the properties of the item using the PnP JS library, as in the code below.

Conclusion:

As we can see above, we have overridden the page creation process with our own Azure Function using an SPFx List View command and PnP JS. I will be detailing the SP dialog for the SPFx extension in another upcoming blog, so keep an eye out for it.

There are still some limitations to the above approach, listed below; you might have to get business approval for these.

1. The out-of-the-box ‘New Page’ option cannot be hidden from inside the extension.
2. The order of the command control cannot be rearranged; it is displayed after the out-of-the-box SharePoint elements.

Automate SharePoint Document Libraries Management using Flow, Azure Functions and SharePoint CSOM

I’ve been working on a client requirement to automate SharePoint library management via scripts, implementing a document lifecycle across many document libraries that have custom content types and require regular housekeeping of ownership and permissions.

Solution overview

To provide a seamless user experience, we decided to do the following:

1. Create a document library template (.stp) with all the prerequisite folders and content types applied.

2. Create a list to store the data about entries for these libraries. Add the owner and contributors for each library as columns in that list.

3. Whenever the title, owners or contributors are changed, the destination document library will be updated.

Technical Implementation

The solution has two main elements to automate this process:

1. Microsoft Flow – Trigger when an item is created or modified

2. Two Azure Functions – Create the library and update permissions

The broad steps and code are as follows:

1. When the flow is triggered, we check the status field to determine whether it is a new entry or a change.

Note: Since Microsoft Flow doesn’t have conditional triggers to differentiate between created and modified list item events, use a text column in the SharePoint list, set to start, in progress or completed, to identify create and update events.

2. The flow will call an Azure Function via an HTTP POST action. Below is the configuration for this.

AzureFunctionFromFlow

3. For the “Create Library” Azure Function, create an HTTP-triggered C# Function in your Azure subscription.

4. In the Azure Function, open Properties -> App Service Editor, add a folder called bin and copy the following two assemblies into it.

  • Microsoft.SharePoint.Client
  • Microsoft.SharePoint.Client.Runtime

KuduTools_AzureFunction

Create Lib App Service Editor

Please make sure to get the latest copy of the SharePoint PnP Online CSOM NuGet package. To do that, you can set up a VS solution and copy the files from there, or download the NuGet package directly and extract the files from it.

5. After copying the files, reference them in the Azure function using the below code

#r "Microsoft.SharePoint.Client.dll"
#r "Microsoft.SharePoint.Client.Runtime.dll"
#r "System.Configuration"
#r "System.Web"

6. Then create the SharePoint client context and create a connection to the source list.

7. After that, use the ListCreationInformation class to create the document library from the library template, using the code below.
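
A minimal sketch of steps 6 and 7 (assumptions: the service account credentials are in application settings, siteUrl and libraryTitle come from the request payload, and the template name is a placeholder):

using System;
using System.Linq;
using System.Security;
using Microsoft.SharePoint.Client;

// 6. Build the client context against the target site
var securePassword = new SecureString();
foreach (char c in Environment.GetEnvironmentVariable("SPPassword")) securePassword.AppendChar(c);
var ctx = new ClientContext(siteUrl)
{
    Credentials = new SharePointOnlineCredentials(Environment.GetEnvironmentVariable("SPUserName"), securePassword)
};

// 7. Find the custom library template (.stp) in the List Template gallery
ListTemplateCollection templates = ctx.Site.GetCustomListTemplates(ctx.Web);
ctx.Load(templates);
ctx.ExecuteQuery();
ListTemplate libraryTemplate = templates.First(t => t.Name == "DocumentLifecycleLibrary");

// Create the document library from that template
var creationInfo = new ListCreationInformation
{
    Title = libraryTitle,          // from the Flow payload
    ListTemplate = libraryTemplate
};
List newLibrary = ctx.Web.Lists.Add(creationInfo);
ctx.Load(newLibrary);
ctx.ExecuteQuery();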

8. After the library is created, break the role inheritance for the library as required (see the sketch after step 10).

9. Update the library permissions using the role assignment object (also shown below).

10. To differentiate between People, SharePoint Groups and AD Groups, find the unique ID and add the group as per the script below.
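
A minimal sketch of steps 8–10: breaking inheritance and then assigning permissions, where people and AD groups are resolved with EnsureUser and SharePoint groups are taken from the site's group collection (the login, group and role names are placeholders):

// 8. Break role inheritance on the new library (don't copy existing assignments, clear sub-scopes)
newLibrary.BreakRoleInheritance(false, true);
ctx.ExecuteQuery();

// 9. Prepare the role definition binding used for the assignments
RoleDefinition contributeRole = ctx.Web.RoleDefinitions.GetByType(RoleType.Contributor);
var bindings = new RoleDefinitionBindingCollection(ctx) { contributeRole };

// 10a. A person or AD group, resolved by login/claims name
User owner = ctx.Web.EnsureUser(ownerLoginName);
newLibrary.RoleAssignments.Add(owner, bindings);

// 10b. An existing SharePoint group, resolved by name
Group membersGroup = ctx.Web.SiteGroups.GetByName("Library Contributors");
newLibrary.RoleAssignments.Add(membersGroup, bindings);

ctx.ExecuteQuery();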

Note: In case you have people objects that are not in AD anymore because they have left the organisation, please refer to this blog for validating them before updating – Resolving “User not found” issue while assigning permissions using SharePoint CSOM

Note: Avoid calling item.Update() from the Azure Function, as that will trigger another Flow run and cause an iterative loop; use item.SystemUpdate() instead.

11. After the update is done, return the success value from the Azure Function to the Flow, which completes the loop.

As shown above, we can automate document library creation from a template, and permissions management, using Flow and Azure Functions.

Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Manager 2003 Resource Toolkit (which only documented the Sync Server Configuration).

Running the tool (a PowerShell module) against a base out-of-the-box reference configuration for FIM/MIM servers, reconciled against an exported configuration from an implementation’s MIM Sync and Service servers, generates an HTML report that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I’ve automated another daily task for a large development environment where a number of changes are going on, and I wanted documentation generated that detailed the configuration for each day: partly to be able to quickly work out what has changed when rolling back or re-validating changes, and also to keep the individual configs from each day so they can be used if we need to roll back.

The process uses an Azure Function App that uses Remote PowerShell into MIM to:

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied to the daily config/documented solution

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example in my Dev environment I have the MIM Sync and Service/Portal all on a single server.

Then update Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes:

  • Update the following lines for your version, include the new variable $schedulePath and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Remark out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Remark out the last line as this will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process:

  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file and uploaded it to my Function App, and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
  • I also needed to increase the timeout for the Azure Function, as generating the files and executing the report exceeds the default timeout of 5 minutes in my environment (19 Management Agents). I increased the timeout to the maximum of 10 minutes as detailed here, essentially by adding the following to the host.json file in the wwwroot directory of my Function App.
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below to change references to http://localhost:5725 to whatever your MIM Service host is.

Testing the Function App

With everything configured, manually run the Function App and check the output window; if you’ve configured everything correctly you will see success in the logs, as shown below. In this environment, with 19 Management Agents, it takes 7 minutes to run.

Running the Azure Function.PNG

The Report

The outcome every day, just after 6am, is that I have (via automation):

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • I have the MIM Configuration Documenter Report generated
  • If I need to roll back changes, I have the ability to do that on a daily interval (either for a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\Report directory is the HTML Configuration Report.

Report Output.PNG

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report

 

Azure Functions Cold Start Workaround

Intro

I love Azure Functions. So much power for so little effort or cost. The only downside is that the consumption model that keeps the cost so dirt-cheap means that unless you are using your Function constantly (in which case, you might be better off with the non-consumption options anyway), you will often be hit with a long delay as your Function wakes up from hibernation.

So very cold…

This isn’t a big deal if you are dealing with a fire-and-forget queue trigger scenario, but if you have a web app that is calling the HTTP trigger and you need to wait for the Function to do its job before responding with a 200 OK… that’s a long wait (well over 15 seconds in my experience with a PowerShell function that loads a bunch of modules).

Now, the blunt way to mitigate this (as suggested by some in GitHub issues on the subject) is to set up a timer function in the same Function App to run every 5 minutes to keep things warm. To me this seems like a wasteful and potentially expensive approach. For my use case, there was a better way.

The Classic CRUD Use-case

Here’s my use case: I’m building some custom SharePoint forms for a customer and using my preferred JS framework, good old Angular 1.x. Don’t believe the hype around the newer frameworks, ng1 still gets the job done with no performance problems at the scale I’m dealing with in SharePoint. It also comes with a very strong ecosystem of libraries to support doing amazing things in the browser. But that’s a topic for another blog.

Anyway, the only thing I couldn’t do effectively on the client side was break permissions on the list item created using the form and secure it to the creator and some other users (e.g. their manager). You need elevated permissions for that. I called on the awesome power of the PnP PowerShell library (specifically the Set-PnPListItemPermission cmdlet) to do this and wrapped it in a PowerShell Azure Function:

Pretty simple. Nice and clean – gets the job done. My Angular service calls this right after saving the item based on the form input by the user. If the Azure Function is starting from cold, that adds an extra 20 seconds to the save operation. Unacceptable and avoidable.

A more nuanced warmup for those who can…

This seemed like such an obvious solution once I hit on it – but I hadn’t thought of it before. When the form is first opened, it’s pretty reasonable to assume that (unless the form is a monster) the user will hit that submit/save button within 5 minutes. So that’s when we hit our Function with a modified HTTP payload of ‘WARMUP’.

Just a simple ‘if’ statement bypasses the function body if the ‘WARMUP’ payload is detected, and the Function immediately responds with a 200. We ignore that response – this is effectively fire-and-forget. Yes, this would be even simpler if we had a separate warmup Function that did absolutely nothing except warm up the Function App, but I like that this ensures that my dependent PnP dlls (in the ‘modules’ folder of my Function) have been loaded on the current instance before I hit the function for real. Maybe it makes no difference. Don’t really care – this is simple enough anyway.

Here’s the Angular code that calls the Function (both as a warmup and for real):

Anyway, nothing revolutionary here, I know. But I hadn’t come across this approach before, so I thought it was worth writing up as it suits this standard CRUD forms over data scenario so nicely.

Till next time!

Azure Function Proxies for API Mocking

In my previous posts, Is Your Serverless Application Testable? – Azure Logic Apps and API Mocking for Developers, we looked at how to mock APIs with various approaches, including Azure API Management, AWS API Gateway, MuleSoft and Azure Functions. Quite recently, the Azure Functions team released a new mocking feature in Azure Function Proxies. With this feature, API mocking couldn’t be easier. In this post, I’m going to show how to use this mocking feature in several ways.

Enable API Mocking via Azure Portal

This is pretty straightforward. Within the Azure Portal, simply enable the Azure Function Proxies feature:

If you have already enabled this feature, make sure that its runtime version is 0.2. Then click the plus sign next to Proxies (preview) and enter details like:

At this stage, we might have noticed something interesting: the proxy doesn’t have to link to an existing HTTP endpoint. In other words, we don’t need a real API for mocking; we just create a mocked HTTP endpoint and use it. Once we save this, we’ll have an endpoint like:

Use Postman to hit this mocked URL:

We receive the mocked HTTP status code and response body. How easy!

But if we have an existing Azure Functions app and want to integrate this mocking feature via a CI/CD pipeline, what can we do? Let’s move on.

Enable Azure Function Proxy Option via ARM Template

An Azure Functions app instance is basically the same as a web app instance, so we can use the same ARM template definition for app settings configuration. In order to enable Azure Function Proxies, we simply add a ROUTING_EXTENSION_VERSION key with a value of ~0.2. Here’s a sample ARM template definition:
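
An illustrative fragment only (the resource name and API version below are assumptions, and this is not a complete template):

{
  "apiVersion": "2016-08-01",
  "type": "Microsoft.Web/sites",
  "kind": "functionapp",
  "name": "[parameters('functionAppName')]",
  "location": "[resourceGroup().location]",
  "properties": {
    "siteConfig": {
      "appSettings": [
        { "name": "FUNCTIONS_EXTENSION_VERSION", "value": "~1" },
        { "name": "ROUTING_EXTENSION_VERSION", "value": "~0.2" }
      ]
    }
  }
}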

Once we update our ARM template, we can simply deploy this template to Azure.

Enable Azure Function Proxy Option via PowerShell

If an ARM template is not feasible for enabling this proxy option, we can use a PowerShell script (or the Azure CLI equivalent) instead. Here’s a sample script:

This PowerShell script can then be invoked with parameters like:

After either approach, ARM template or PowerShell, has been executed, the Function app instance will have a value like:

Now, we have proxies feature enabled through code! Let’s move on.

Deploy Azure Function Proxy with Azure Function App

Defining Azure Function proxies is just a matter of adding another JSON file, proxies.json, to the app instance. The JSON schema definition can be found at http://json.schemastore.org/proxies. So, basically, proxies.json looks like this:
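
Based on the description that follows, a minimal proxies.json might look like this (a sketch following the published schema; the proxy name is arbitrary):

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "mock-hello-world": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/mock/hello-world"
      },
      "responseOverrides": {
        "response.statusCode": "200",
        "response.body": { "message": "Hello World" }
      }
    }
  }
}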

As defined above, the mocked API has a URL of /mock/hello-world, a response code of 200 and a body of { "message": "Hello World" }. As long as we follow this structure, we can add as many mocked APIs as we like.

Once we’ve added this proxies.json to our Azure Functions project, deploy it to Azure. Then we’ll be able to see the list of mocked APIs.

So far, we have looked at how to add mocked APIs to an Azure Functions instance in various ways. Depending on your preferences, any approach would be fine, but this is arguably the easiest and cheapest way to mock API endpoints.