Office 365 URLs and IP address updates for firewall and proxy configuration, using Flow and Azure Automation

tl;dr

To use Microsoft Office 365, an organisation must allow traffic to (and sometimes from) the respective cloud services over the internet, on specific ports and protocols, to various URLs and/or IP addresses, or, if you meet the requirements, via Azure ExpressRoute. Oh duh?!

To expand on that: connections to trusted networks (a category we assume Office 365 falls into) that are also high in volume (since most communication and collaboration infrastructure resides there) should go via a low-latency egress that is as close to the end user as possible.

As more and more customers use the service, and more and more services and functionality are added, so too will the URLs and IP addresses need to change over time. Firewalls and proxies need to be kept up to date with the destination details of Office 365 services. This is an evergreen solution, let’s not forget. So, it’s important to put processes in place to correctly optimise connectivity to Office 365. It’s also very important to note that if these change management processes are ignored, services will end up blocked or delivering inconsistent experiences for end users.

Change is afoot

Come October 2nd 2018, Microsoft will change the way customers keep up to date with changes to these URLs and IP addresses. A new web service is coming online that publishes Office 365 endpoints, making it easier for you to evaluate, configure, and stay up to date with changes.

Furthermore, the holistic overview of these URLs and IP addresses is being broken down into three new key categories: OPTIMISE, ALLOW and DEFAULT.

You can get more details on these 3x categories from the following blog post on TechNet: https://blogs.technet.microsoft.com/onthewire/2018/04/06/new-office-365-url-categories-to-help-you-optimize-the-traffic-which-really-matters/

 

It’s not all doom and gloom now that the RSS feed no longer works. The new web service (still in public preview at the time of writing this blog) is rather zippy and allows for some great automation. So, that’s the target state: automation.

Microsoft wants to make it nice and easy for firewall, proxy or any other edge security appliance vendor or service provider to programmatically interact with the web service and offer dynamic updates for Office 365 URL and IP address information. In practice, change management and governance processes will still need to be followed; in most circumstances, organisations follow whatever ITIL or ITIL-like methodologies they have in place for those sorts of things.
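
If you just want a quick taste of the web service, a couple of Invoke-RestMethod calls are all it takes. This is only a sketch; the ClientRequestId is any GUID you generate yourself to identify your client:

$clientRequestId = [guid]::NewGuid()
# Current global version number of the published endpoint data
Invoke-RestMethod -Uri "https://endpoints.office.com/version/Worldwide?ClientRequestId=$clientRequestId"
# Full endpoint set for the Worldwide instance, filtered here to the Optimize category
$endpoints = Invoke-RestMethod -Uri "https://endpoints.office.com/endpoints/Worldwide?ClientRequestId=$clientRequestId"
$endpoints | Where-Object { $_.category -eq "Optimize" } | Select-Object serviceArea, urls, ips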

The dream Microsoft has, though, is actually one that is quite compelling.

Before we get to this streamlined utopia where my customers’ edge devices update automatically, I’ve needed to come up with a process for the interim tactical state. That process runs as follows:

  • Check daily for changes in Office 365 URLs and IP addresses
  • Download changes in a user readable format (So, ideally, no XML or JSON. Perhaps CSV for easy data manipulation or even ingestion into production systems)
  • Email intended parties that there has been a change in the global version number of the current Office 365 URLs and IP addresses
  • Allow intended parties to download the output

NOTE – for my use case here, the output is purely IP addresses. That’s because the infrastructure managed by the teams I’ll be sending this information to only works with that data type. If you tweak the web service request (details further down), you can grab both URLs and IP addresses, or one or the other.

 

Leveraging Microsoft Flow and Azure Automation

My first instinct here was to use Azure Automation and run one very long PowerShell script full of Ifs and Thens and so on. However, when working through the script, 1) my PowerShell skills aren’t advanced enough to bang that out quickly, and 2) Flow is an amazing tool for handling some of the tricky bits in a more effortless way.

So, leveraging the goodness of Flow, here’s a high level rundown of what the solution looks like:

 

The workflow runs as follows:

  1. Microsoft Flow
  2. On a daily schedule, the flow is triggered at 6am
  3. Runbook #1
    1. Runbook is initiated
    2. Runbook imports CSV from Azure Blob
    3. PowerShell runs a command to query the web service and saves the output to CSV
    4. CSV is copied to Azure Blob
  4. Runbook #2
    1. Runbook is initiated
    2. Runbook imports CSV from Azure Blob
    3. The last cell in the version column is compared to the previous
    4. An Output of “NEW-VERSION-FOUND” is saved to Azure Automation if a newer version is found
  5. The Output is taken from the previous Azure Automation Runbook run
  6. A Flow Condition is triggered – YES if Output is found, NO if nothing found

Output = YES

  • 7y1 = Runbook #3 is run
    • Runbook queries the web service for all 3 categories: Optimise, Allow and Default
    • Each query for that day’s IP address information is saved into 3 separate CSV files
  • 7y2 = CSV files are copied to Azure Blob
  • 7y3 = Microsoft Flow queries Azure Blob for the three files
  • 7y4 = An email template is used to email the respective interested parties about the change to the IP address information
    • The 3x files are added as attachments

Output = Nothing or NO

  • 7n1 = Send an email to the service account mailbox to say there were no changes to the IP address information for that day

 

The process

Assuming, dear reader, that you have some background with Azure and Flow, here’s a detailed outline of the process I went through (and one that you can replicate) to automate checking and providing relevant parties with updates to the Office 365 URLs and IP address data.

Let’s begin!

Step 1 – Azure AD
  • I created a service account in Azure AD that has been given an Office 365 license for Exchange Online and Flow
  • The user details don’t really matter here as you can follow your own naming convention
  • My example username is as follows: svc-as-aa-01@[mytenant].onmicrosoft.com
    • Naming convention being: “Service account – Australia South East – Azure Automation – Sequence number”
Step 2 – Azure setup – Resource Group
  • I logged onto my Azure environment and created a new Resource Group
  • My solution has a couple of components (Azure Automation account and a Storage account), so I put them all in the same RG. Nice and easy
  • My Resource Group details
    • Name = [ASPRODSVCAA01RG]
    • Region = Australia South East as that’s the local Azure Automation region
    • That’s a basic naming convention of: “Australia South East – Production environment – Purpose, being for the SVC account and Azure Automation – Sequence number – Resource Group”
  • Once the group was created, I added my service account as a Contributor to the group
    • This allows the account downstream permissions to the Azure Automation and Storage accounts I’ll add to the resource group
Step 3 – Azure Setup – Storage account
  • I created a storage account and stored that in my resource group
  • The storage account details are as follows
    • Name = [asprodsvcaa01] = Again, follow your own naming convention
    • Deployment model = Resource manager
    • Storage General Purpose v2
    • Local redundant storage only
    • Standard performance
    • Hot access tier
  • Within the storage account, I’ve used Blob storage
    • There are two containers that I used:
      • Container #1 = “daily”
      • Container #2 = “ipaddresses”
    • This is where the output CSV files will be stored
  • Again, we don’t need to assign any permissions here as we assigned Contributor permissions at the resource group level (a quick PowerShell sketch of this storage setup follows below)
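
For reference, here’s a rough PowerShell sketch of that storage setup using the same AzureRM cmdlets the runbooks below rely on (the names match my environment, so substitute your own, and treat it as a guide rather than gospel):

New-AzureRmStorageAccount -ResourceGroupName "ASPRODSVCAA01RG" -Name "asprodsvcaa01" -Location "australiasoutheast" -SkuName Standard_LRS -Kind StorageV2 -AccessTier Hot

# Create the two Blob containers the runbooks will read from and write to
$acctKey = (Get-AzureRmStorageAccountKey -Name "asprodsvcaa01" -ResourceGroupName "ASPRODSVCAA01RG").Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "asprodsvcaa01" -StorageAccountKey $acctKey
New-AzureStorageContainer -Name "daily" -Context $storageContext -Permission Off
New-AzureStorageContainer -Name "ipaddresses" -Context $storageContext -Permission Off
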
Step 4 – Azure Setup – Azure Automation
  • I created a new Azure Automation account with the following parameters
    • Name = [SVCASPROD01AA] = Again, follow your own naming convention
    • Default parameters, matching my resource group settings
    • Yes, I also had a Run As account created (default option)
  • I created three Runbooks, as per below

 

  • Step1-GetGlobalVersion = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion

#region CONNECT
$pass = ConvertTo-SecureString "[pass phrase here]" -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList "[credential account]", $pass
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId "[tenant id]"
#endregion

#region IMPORT CSV FILE FROM BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
# Use the account key retrieved above to build the storage context
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey $acctKey
Get-AzureStorageBlob -Context $storageContext -Container "[name here]" | Get-AzureStorageBlobContent -Destination . -Context $storageContext -Force
#endregion

#region GET CURRENT VERSION
# Append today's global version number to the daily CSV
$DATE = $(((Get-Date).ToUniversalTime()).ToString("yyyy-MM-dd"))
Invoke-RestMethod -Uri "https://endpoints.office.com/version/Worldwide?ClientRequestId=b10c5ed1-bad1-445f-b386-b919946339a7" | Select-Object @{Label="VERSION";Expression={($_.Latest)}},@{Label="DATE";Expression={($Date)}} | Export-Csv [daily-export.csv] -NoTypeInformation -Append
#endregion

#region SAVE TO BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey $acctKey
Set-AzureStorageBlobContent -File [.\daily-export.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
#endregion

#region OUTPUT
Write-Output "SCRIPT-COMPLETE"
#endregion

 

  • Step2-CheckGlobalVersion = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion

#region CONNECT
$pass = ConvertTo-SecureString "[pass phrase here]" -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList "[credential account]", $pass
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId "[tenant id]"
#endregion

#region IMPORT CSV FILE FROM BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey $acctKey
Get-AzureStorageBlob -Context $storageContext -Container "[name here]" | Get-AzureStorageBlobContent -Destination . -Context $storageContext -Force
#endregion

#region CHECK IF THERE IS A DIFFERENCE IN THE VERSION
$ExportedCsv = Import-Csv [.\daily-export.csv]
$Last = $ExportedCsv | Select-Object -Last 1 -ExpandProperty Version # Last value in Version column
$SecondLast = $ExportedCsv | Select-Object -Last 1 -Skip 1 -ExpandProperty Version # Second last value in Version column
If ($Last -gt $SecondLast) {
Write-Output '[NEW-VERSION-FOUND]'
}
#endregion

 

  • Step3-GetURLsAndIPAddresses = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion

#region CONNECT
# Authenticate as in the previous two runbooks so the storage account key can be retrieved
$pass = ConvertTo-SecureString "[pass phrase here]" -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList "[credential account]", $pass
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId "[tenant id]"
#endregion

#region EXECUTE PROCESS TO DOWNLOAD NEW VERSION
# Grab the current endpoint set and split the IP addresses into one CSV per category
$endpoints = Invoke-RestMethod -Uri "https://endpoints.office.com/endpoints/Worldwide?ClientRequestId=b10c5ed1-bad1-445f-b386-b919946339a7"
$endpoints | Foreach {if ($_.category -in ('Optimize')) {$_.IPs}} | Sort-Object -Unique | Out-File [.\OptimizeFile.csv]
$endpoints | Foreach {if ($_.category -in ('Allow')) {$_.IPs}} | Sort-Object -Unique | Out-File [.\AllowFile.csv]
$endpoints | Foreach {if ($_.category -in ('Default')) {$_.IPs}} | Sort-Object -Unique | Out-File [.\DefaultFile.csv]

$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey $acctKey
Set-AzureStorageBlobContent -File [.\OptimizeFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
Set-AzureStorageBlobContent -File [.\AllowFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
Set-AzureStorageBlobContent -File [.\DefaultFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
#endregion

#region OUTPUT
Write-Output "SCRIPT COMPLETE"
#endregion
  • Note that we don’t need to import the complete set of AzureRM PowerShell modules
  • If you do something “lazy” like that, you’ll find there’s a whole lot of module dependencies to satisfy in Azure Automation
    • You’d need to manually add in all the sub-modules, which is very time consuming
Step 5 – Microsoft Flow
  • With my service account having a Flow license, I created my Flow there
  • This means that I can pass this onto Managed Services to run with and maintain going forward
  • I started with a blank Flow
  • I added a schedule
    • The schedule runs at 6am every day

  • Step 1 is to add in an Azure Automation Create Job task
    • This is to execute the Runbook “Step1-GetGlobalVersion”
    • Flow will try and connect to Azure with our Service account
    • Because we added all the relevant permissions earlier in Azure, the Resource Group and downstream resources will come up automatically
    • Enter in the relevant details

  • Step 2 is to add in another Azure Automation Create Job task
    • This is to execute the Runbook “Step2-CheckGlobalVersion”
    • Again, Flow will connect and allow you to select resources that the service account has Contributor permissions to

  • Step 3 is to add in an Azure Automation Get Job Output
    • This is to grab the Output data from the previous Azure Automation runbook
    • The details are pretty simple
    • I selected the “JobID” from the Step 2 Azure Automation runbook job

  • Step 4 is where things get interesting
  • This is a Flow Condition
  • This is where we specify that if a value of “NEW-VERSION-FOUND” is found in the content of the Output from the Step 2 job, the Flow does something; otherwise it does nothing

  • Step 5 is where I added in all the “IF YES” flow to Do something because we have an output of “NEW-VERSION-FOUND”
  • The first sub-process is another Azure Automation Create Job task
  • This is to execute the Runbook “Step3-GetURLsandIPaddresses”
  • Again, Flow will connect and allow you to select resources that the service account has Contributor permissions to

  • Step 6 is to create 3 x Get Blob Content actions
  • This will allow us to connect to Azure Blob storage and grab the 3x CSV files that the previous step’s runbook created in Blob storage
  • We’re doing this so we can embed them in an email we’re going to send to relevant parties in need of this information

  • Step 7 is to create an email template
  • As we added an Exchange Online license to our service account earlier, we’ll have the ability to send email from the service account’s mailbox
  • The details are pretty straight forward here:
    • Enter in the recipient address
    • The sender
    • The subject
    • The email body
      • This can be a little tricky, but I’ve found that if you enable HTML (the last option in the Send An Email action), you can use <br> line breaks to space out your email nicely
    • Lastly, we’ll attach the 3x Blobs that we picked up in the previous step
    • We just need to manually set the file name for each email attachment
    • Then select the Content via the Dynamic Content option
      • Note: if you see this error “We can’t find any outputs to match this input format. Select to see all outputs from previous actions.” – simply hit the “See more” button
      • The See more button will show you the content option in the previous step (step 6 above)

  • Step 8 is to go over to the If No condition
  • This is probably optional because, as the old saying goes, “no news is good news”
  • However, to easily track how often changes happen, I thought I’d email the service account and store a daily email if no action was taken
    • I’ll probably CC myself here as well to keep an eye on Flow and make sure it’s running
    • I can use inbox rules to move the emails out of my inbox and into a folder to streamline it further and keep my inbox clean
  • The details are pretty much the same as the previous Step 7
    • However, there are no attachments required here
    • This is a simple email notification where I entered the following in the body: “### NO CHANGE IN O365 URLs and IP ADDRESSES TODAY ###”

 

Final words

Having done many Office 365 email migrations, I’ve come to use PowerShell and CSVs quite a lot to make my life easier when there are thousands of records to work with. This process draws on that experience and on the speed of working up a solution with CSV files. I’m sure there are better ways to streamline that component, for example using Azure Table Storage.

I’m also sure there are better ways of storing credential information which, for the time being, isn’t a problem while I work out this new process. The overall governance will get ironed out and I’ll likely leverage the Azure Automation credential store, or even Azure Key Vault.

If you, dear reader, have found a more streamlined and novel way to achieve this that requires even less effort to setup, please share!

Best,

Lucian

#WorkSmarterNotHarder

 

Automation and Creation of Office 365 groups using Flow, Microsoft Graph and Azure Function – Part 2

In the Part 1 blog here, we discussed an approach for the Group creation process and important considerations for provisioning groups. In this blog, we will look at getting a Graph App ID and App secret for invoking the graph service and then implementation of the group provisioning process.

MS Graph App Set up

Before we start creating groups we will need to set up a Graph App that will be used to create the group in the Office 365 tenancy. The details are in this blog here on how to create a Microsoft Graph app.

Regarding the permissions, below are the settings that are necessary to allow creating groups through the graph service.

GroupApp_Rights

Creating a Group

As discussed in Part 1 here, below are the broad level steps for automating group creation using a SharePoint inventory list, Microsoft Flow and Azure Function

1. Create a SharePoint list, with the metadata necessary for Group and SharePoint assets provisioning

We can use a SharePoint list to act as a trigger to create groups with the custom metadata necessary for provisioning the groups such as Owners and metadata necessary for creating site assets for SharePoint sites. As a best practice, I recommend you create multiple master lists to manage the details separately if there are too many to manage. In our case, we have created three separate lists for managing the Group details.

1. Group details and metadata
2. Owners and Team Members List
3. Site Assets configuration list

2. Create a Microsoft flow. The flow will validate a new or existing group and pick the unique Group Alias from the list which will allow us to find the group if it exists.

The flow will act as a trigger to start the provisioning process and call the Azure function passing the appropriate metadata as shown below. The flow also allows error handling scenarios as described in the Part 1 blog here

Note: The GroupAlias is the unique name of the Group and is not necessarily the SharePoint URL. For example, in the case where a group was created and subsequently deleted, the unique alias could be used again but the Site URL will be different (unless cleared from the SharePoint recycle bin).

Group_FlowAzureFunctionCall

3. Create the Group in an Azure Function using SharePoint Online CSOM

In order to create the group, we will need to authenticate to the Graph service using the Graph App created earlier. For authenticating the app through Azure AD, please install the NuGet Package for Microsoft.IdentityModel.Clients.ActiveDirectory.

After authenticating, we will create the group using the UnifiedGroup Utility provided through the SharePoint Online CSOM.

Below is a quick snapshot of the code. Note the inclusion of the Graph module of the OfficeDevPnP class.

Note: One important bit to note is that, in the above code, the owners and members email arrays are the same. If the owners and members arrays differ, then group provisioning is delayed significantly. It is also important to keep the other parameters the same as at creation when calling the method below, because it might otherwise reset those properties to their defaults. For example, if isPrivate is not set, the group becomes public.
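
The code snapshot itself is embedded in the original post as an image. Purely as an illustration of what that C# does (it calls the OfficeDevPnP UnifiedGroupsUtility), here is a hedged PowerShell sketch of the equivalent Graph calls; the tenant, App ID, App Secret and user UPNs are placeholders from the Graph app set up earlier:

$tenantId  = "yourtenant.onmicrosoft.com"
$appId     = "[Graph App ID]"
$appSecret = "[Graph App Secret]"

# Client credentials token for Microsoft Graph
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $appId
    client_secret = $appSecret
    scope         = "https://graph.microsoft.com/.default"
}
$token = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $tokenBody).access_token

# Create a private Office 365 (unified) group; owners and members use the same array, as per the note above
$ownerIds = @("https://graph.microsoft.com/v1.0/users/owner1@yourtenant.com")
$group = @{
    displayName          = "Finance Team"
    mailNickname         = "financeteam"     # the unique Group Alias from the inventory list
    mailEnabled          = $true
    securityEnabled      = $false
    groupTypes           = @("Unified")
    visibility           = "Private"         # omit this and the group defaults to Public
    "owners@odata.bind"  = $ownerIds
    "members@odata.bind" = $ownerIds
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/groups" -Headers @{ Authorization = "Bearer $token" } -ContentType "application/json" -Body $group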

4. After the group is created, we can fetch the details of the group as below.

5. Group provisioning generally takes about 2-3 minutes. However, if there are multiple requests at once, one request might override another, causing the group creation to fail. In such cases, we can sequence the Azure Functions to run by modifying the host.json file. A quick blog covering this can be found here.

Provisioning SharePoint Assets in Azure Function after Group Creation

1. For provisioning of the SharePoint assets, we might have to wait for the Office 365 AD sync to finish granting access to the Admin account.

Sometimes the AD sync process takes much longer, so to grant direct access to the SharePoint site collection using the tenant admin, we could use the below code. Recommendation: only proceed with the below code approach if access still fails after more than a few minutes.
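
The code for this step is embedded in the original post; as a rough alternative illustration of the same idea, the grant can also be done with the SharePoint Online Management Shell (the site URL and account names below are placeholders):

Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com" -Credential (Get-Credential)
# Make the working/service account a site collection administrator on the newly created site
Set-SPOUser -Site "https://yourtenant.sharepoint.com/sites/financeteam" -LoginName "svc-provisioning@yourtenant.onmicrosoft.com" -IsSiteCollectionAdmin $true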

Note: As a best practice, I would recommend using a Service Account when working on the SharePoint Site. We could also use an App as suggested in the Site Scripting blog here.

2. Once you have access, you can use the normal SharePoint CSOM to do the activities that are pertaining to SharePoint asset provisioning such as Libraries, Site Pages content, Lists, etc.

3. After you’re done, you can return the success from the Azure function as below.

Note: Use HttpStatusCode.Accepted rather than returning an error status code when the error handling is done in the Flow, or else Flow will trigger another instance of the flow when the Azure Function fails.

Conclusion:

Above we saw how we can have a SharePoint Inventory list and create groups using Flow and Azure Functions. For a quick reference, below are the links to the other related blogs.

Part 1 – Automation and Creation of Office 365 groups approach

How to create a Microsoft Graph App

Sequencing calls in Azure Functions

Protecting Application Credentials when implementing Modular Azure Functions with Microsoft Flow

This weekend I was attempting to rework some older Azure Automation tasks I wrote some time ago that were a combination of PowerShell scripts and Azure Functions (PowerShell). I was looking to leverage Microsoft Flow so that I could have them handy as ‘Buttons’ in the Microsoft Flow mobile app.

Quite quickly I realized that Microsoft Flow didn’t have the capability to perform some of the automation I required, so I handed that off to an Azure Function. The Azure Function then needed to leverage a Registered AAD Application. That required an Application ID and Secret (or a certificate).  This wasn’t going the way I wanted so I took a step back.

The Goals I was attempting to achieve were;

  • A set of Azure Functions that perform small repetitive tasks that can be re-used across multiple Flows
  • Separation of permissions associated with function/object orientated Azure Functions

The Constraints I encountered were;

  • Microsoft Flow doesn’t currently have Azure Key Vault Actions
  • The Flows I was looking to build required functionality that isn’t currently covered by available Actions within Flow

With my goal to have a series of Functions that can be re-used for multiple subscriptions I came up with the following workaround (until Flow has actions for Key Vault or Managed Service Identity).

Current working Workaround/Bodge;

  • I created an Azure Function that can be passed Key Vault URIs for credential and subscription information
    • typically this is the Application ID, Application Secret, Azure Subscription. These are retrieved from Azure Key Vault using Managed Service Identity
    • returns to the Flow the parameters listed above
  • Flow calls another Azure Function to perform required tasks
    • that Azure Function can be leveraged for an AAD App in any Subscription as credentials are passed to it

RG VM Flow Integration 640px.png

Example Scenario (as shown above);

  1. Microsoft Flow triggered using a Flow Button in the mobile application to report on Azure Virtual Machines
  2. Flow calls Azure Function (Get-Creds) to get credentials associated with the Flow for the environment being reported on
  3. Managed Service Identity used from the Azure Function to obtain credentials from Azure Key Vault (see the sketch after this list)
    • Application ID, Application Secret and Azure Subscription returned to Flow
  4. Flow calls Azure Function (Get-VM-Status) that authenticates to Azure AD based on the credentials and subscription passed to it
  5. Azure Resource Group(s) and VM’s queried from the Function App with the details returned to Flow
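
To make the Get-Creds step a little more concrete, here's a minimal sketch of that Key Vault lookup inside a PowerShell Azure Function with Managed Service Identity enabled (the vault name, secret names and the v1 $res output binding are assumptions):

# 1. Get an access token for Key Vault from the local MSI endpoint
$tokenUri = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$token = (Invoke-RestMethod -Method Get -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }).access_token

# 2. Pull the three secrets the Flow needs
$vaultBase = "https://myflowvault.vault.azure.net"
$headers = @{ Authorization = "Bearer $token" }
$appId        = (Invoke-RestMethod -Uri "$vaultBase/secrets/AppId?api-version=2016-10-01" -Headers $headers).value
$appSecret    = (Invoke-RestMethod -Uri "$vaultBase/secrets/AppSecret?api-version=2016-10-01" -Headers $headers).value
$subscription = (Invoke-RestMethod -Uri "$vaultBase/secrets/SubscriptionId?api-version=2016-10-01" -Headers $headers).value

# 3. Return them to the Flow as the HTTP response body
$creds = @{ AppId = $appId; AppSecret = $appSecret; SubscriptionId = $subscription } | ConvertTo-Json
Out-File -Encoding Ascii -FilePath $res -InputObject $creds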

Concerns/thoughts;

  1. Passing credentials between integration elements isn’t the best idea
    • obfuscation is the best that can be done for now
    • having the information stored in three different secrets means all information isn’t sent in one call
      • but three web requests are required to get the necessary creds
  2. A certificate for AAD App Authentication would reduce the Key Vault calls to one
    • would this be considered better or worse?
  3. At least the credentials aren’t at rest anywhere other than in the Key Vault.

Summary

We’ve come a long way in a year. Previously we just had Application Settings in Azure Functions and we were obfuscating credentials stored there using encryption techniques. Now with Managed Service Identity and Azure Key Vault we have Functions sorted. Leveraging modular Azure Functions to perform actions not possible in Flow though still seems like a gap. How are you approaching such integration?

 

Global Azure Bootcamp 2018 – Creating the Internet of YOUR Things

Today is the 6th Global Azure Bootcamp and I presented at the Sydney Microsoft office on Creating the Internet of YOUR Things.

In my session I gave an overview on where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now that can enrich your life.

I then moved on to building IoT devices and leveraging Azure, the focus of my presentation: how to get started quickly with devices, integration and automation. I provided a working example based off my previous posts Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell, Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller, and Adding a Display to the Teenager Notification Service Azure IoT Device.

I provided enough information and hopefully inspiration to get you started.

Here is my presentation.

 

Adding a Display to the Teenager Notification Service Azure IoT Device

Overview

A couple of weeks back I wrote this post that detailed Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller.

Over the Easter break I enhanced it with the inclusion of a display. I was rummaging around in a box of parts when I found a few LCD displays I’d purchased on speculation some time ago. They are SSD1306 LCD driven units that can be found on Amazon here. A quick upgrade later and …

… scrolling text to go with rotating lights. The addition of the display requires the following changes to the previous project which are detailed in this post;

  • inclusion of the SSD1306 library
  • configure your micro controller for the display
  • a few changes in the Mongoose OS Init.JS file to have the appropriate text displayed for the notification
  • change to the Notifier Base case to integrate the display
    • it is available in the Thingiverse Project for this thing here and named NodeMCU with Display Window.stl

Incorporating the SSD1306 Library

Before starting, with your micro controller connected and using the MOS UI, take a copy of your Init.js configuration file by selecting Device Files, then Init.js, and copying the content somewhere safe. Also back up the Device Config by choosing Device Config, Expert View and Save Configuration.

From the MOS UI select Projects, select the AzureIoT-Neopixel-js project then from the drop down menu select mos.yml.

Add the line "- origin: https://github.com/mongoose-os-libs/arduino-adafruit-ssd1306" then select the Spanner icon to Rebuild the App. Once completed, select the Flash icon to update your micro controller.

Include SSD1306 Library.PNG

Once written to your micro controller check your Init.js and copy back your backup. Check your Configuration and make sure your MQTT settings are still present. Copy your previous config back if required.

Configure your Micro Controller for the SSD1306 Display

We need to tell your micro controller which GPIO pins we have attached the display to. I also moved the GPIO pin used for the Neopixel as part of this. The configuration is;

  • Neopixel connected to GPIO 12
  • SSD1306 SDA connected to GPIO 4
  • SSD1306 SCL connected to GPIO 5

In the Expert Device Config mode update the I2C section as shown below. Save the configuration.

 "i2c": {
 "enable": true,
 "freq": 100000,
 "debug": false,
 "sda_gpio": 4,
 "scl_gpio": 5
 },

Wiring the SSD1306 to the Micro Controller

Looking at the NodeMCU diagram you can see where the connections need to be made for the NeoPixel and SSD1306 display. SSD1306 SCL to D1, SDA to D2. The Neopixel data connection is now on D6. Power and GND using the PWR and GND pins. I’m using them all on the same side of the NodeMCU to make it fit cleanly into the case later.

NodeMCU.png

Init.js code additions

Incorporate the display library in your Init.js by including the line below.

load('api_arduino_ssd1306.js');

With that done we need to initialize the display, also in Init.js. The following lines set up the display I2C address, the reset GPIO, the display resolution, the size of the text we are going to display and the colour. Put them before or after the initialization for the Neopixel.

//------------ Setting up Display ----------------
let oled_addr = 0x3C; // I2C Address for SSD1306
let oled = Adafruit_SSD1306.create_i2c(5 /* RST GPIO */, Adafruit_SSD1306.RES_128_32);

// Initialize the display. 
oled.clearDisplay();
oled.setTextSize(2);
oled.setTextColor(Adafruit_SSD1306.WHITE);

In the MQTT subscriber section, where you look at the MQTT message being sent from Microsoft Flow and display a color on the Neopixel, add the following lines to send output to the display. The example below outputs Pink to the display. If Pink indicates some task then change oled.write(‘PINK’); to oled.write(‘TASK’); or similar.

 if (msg === "Pink"){
 // PINK
 oled.clearDisplay();
 oled.setTextSize(2);
 oled.setCursor(1, 10);
 oled.write('PINK');
 oled.display();
 oled.startScrollLeft(0x00, 0x0F);
 // ... the existing Neopixel colour logic for Pink from the previous post continues here ...

Following the Neopixel loop after

 strip.clear();
 strip.show(strip);

add the following to clear the display once the Neopixel has finished displaying its color notification.

 oled.clearDisplay();
 oled.display();

Repeat for the differing colors and their tasks/meanings.

Summary

Now the notifier includes both a visual color notification AND the text associated with the notification. No confusion here, or does it need a buzzer as well?

 

Evaluating the migration of Azure Functions to Microsoft Flow – Twitter IoT Integration

Introduction

Almost 18 months ago I wrote this post on integrating Twitter with Azure Functions to Tweet IoT data. A derivative of that solution has been successfully running for about the same period. Azure Functions have been bullet proof for me.

After recently implementing Microsoft Flow as detailed in my Teenager Notification Device post here I started looking at a number of the Azure Functions I have running and looked at what would be better suited to being implemented with Flow. What could I simplify by migrating to Microsoft Flow?

The IoT Twitter Function linked above was one of the simpler Functions I had running, so that’s the one I transposed, and it has been running seamlessly. I chose this particular function to migrate as the actions it performed were all things Microsoft Flow supports. Keep in mind (see the Summary) that there isn’t a one-size-fits-all. Flow and Functions each have their place and often work even better together.

Comparison

Transposing the IoT Twitter Function App to Microsoft Flow provided me with the same outcome, however the effort to get to that outcome is considerably less. As a quick comparison I’ve compared the key steps I needed to perform with the Azure Function to enable the integration vs what it took to implement with Microsoft Flow.

Function vs Flow.PNG

That’s pretty compelling. For the Azure Function I needed to register an App with Twitter and I needed to create an Azure Function App Plan to host my Azure Function. With Microsoft Flow I just created a Flow.

To set up and configure the Azure Function I needed to set up Deployment Options to upload the Twitter PowerShell Module (a third-party module), and I needed to store the two credential sets associated with the Twitter Account/App. In Microsoft Flow I just chose Twitter as an Action and provided consent to the OAuth2 challenge.

Finally for the logic of the Azure Function I had to write the script to retrieve the data, manipulate it, and then post it to Twitter. In Microsoft Flow it was simply a case of configuring the workflow logic.

Microsoft Flow

As detailed above, the logic is still the same. On a schedule, get the data from the IoT Devices via a RestAPI, manipulate/parse the response and output a Tweet with the environment info. Doing that in Flow though means selection of an action and configuring it. No code, no modules, no keys.

Below is the resultant Flow (overview) that achieves the same result as the solution I originally implemented as an Azure Function, as detailed here.

MS Flow - Twitter.PNG

The schedule part is triggered hourly. Using Recurrence it is easy to set the schedule (much easier than a CRON format in Azure Functions) complete with timezone (within the advanced section). I then get the Current time to allow me to acquire the Date and Time in a format that I will use in the resulting tweet.

Schedule

Next is to perform the first RestAPI call to get the data from the first of the IoT devices. Parse the JSON response to get the temperature value.

GET

Repeat the above step for the other IoT Device located in a different environment and parse that. Formulate the Tweet using elements of information from the Flow.

Repeat and Tweet

Looking at Twitter we see a resultant Tweet from the Flow.

Tweet.PNG

Summary

This is a relatively simple flow. Bear in mind I haven’t included any logic to validate what is returned or perform any conditional operations during processing. But very quickly it is possible to retrieve, manipulate and output to a different medium.

So why don’t I use Flow for everything? The recent post I mentioned at the beginning, for the Teenager Notification Device, also uses a Flow together with an Azure Function. For that use case the integration of the IoT Device with Azure IoT is via MQTT, and there isn’t currently that capability in Flow. Instead, Flow was used to trigger an Azure Function that in turn sent an MQTT message to the IoT Device. The combination of Flow with Functions provides a lot of flexibility and power.

 

Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller

Introduction

This is the third and final post on my recent experiments integrating small micro controllers (ESP8266) running Mongoose OS integrated with Azure IoT Services.

In the first post in this series I detailed creating the Azure IoT Hub and registering a NodeMCU (ESP8266 based) micro controller with it. The post detailing that can be found here. Automating the creation of Azure IoT Hubs and the registration of IoT Devices with PowerShell and VS Code

In the second post I detailed communicating with the micro controller (IoT device) using MQTT and PowerShell. That post can be found here. Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell

Now that we have end to end functionality it’s time to do something with it.

I have two teenagers who’ve been trained well to use headphones. Whilst this is great for not having to hear the popular teen bands of today, and the numerous FaceTime, Skype, Snapchat and similar communications, it does come with the downside of them not hearing us when we need their attention and they’re at the other end of the house. I figured that, to avoid the need to shout to get attention, a simple visual notification could be built to achieve the desired result. Different colours for different requests? Sure, why not. This is that project, and the end device looks like this.

Overview

Quite simply the solution goes like this;

  • With the Microsoft Flow App on our phones we can select the Flow that will send a notification

2018-03-25 18.56.38 500px.png

  • Choose the Notification intent which will drive the color displayed on the Teenager Notifier.

2018-03-25 18.56.54 500px

  • The IoT Device will then display the color in a revolving pattern as shown below.

The Architecture

The end to end architecture of the solution looks like this.

IoT Cloud to Device - NeoPixel - 640px

Using the Microsoft Flow App on a mobile device gives a nice way of having a simple interface that can be used to trigger the notification. Microsoft Flow sends the desired message and details of the device to send it to, to an Azure Function that puts a message into an MQTT queue associated with the Mongoose OS driven Azure IoT Device (ESP8266 based NodeMCU micro controller) connected to an Azure IoT Hub. The Mongoose OS driven Azure IoT Device takes the message and displays the visual notification in the color associated with the notification type chosen in Microsoft Flow at the beginning of the process.

The benefits of this architecture are;

  • the majority of the orchestration happens in Azure, yet thanks to Azure IoT and MQTT no inbound connection is required where the IoT device resides. No port forwarding / inbound rules to configure on your home router. The micro controller is registered with our Azure IoT Hub and makes an outbound connection to subscribe to its MQTT topic. As soon as there is a message for the device it triggers its logic and does what we’ve configured
  • You can initiate a notification from anywhere in the world (most simply using the Flow mobile app as shown above)
  • And using Mongoose OS allows for the device to be managed remotely via the Mongoose OS Dashboard. This means that if I want to add an additional notification (color) I can update Flow with a new option to select and update the configuration on the Notifier device to display the new color if it receives such a command.

Solution Prerequisites

This post builds on the previous two. As such the prerequisites are;

  • you have an Azure account and have set up an IoT Hub, and registered an IoT Device with it
  • your IoT device is a micro controller that can run Mongoose OS. I’m using a NodeMCU ESP8266 that I purchased from Amazon here.
  • the RGB LED Light Ring (generic Neopixel) I used I purchased from Amazon here.
  • 3D printer if you want to print an enclosure for the IoT device

With those sorted we can;

  • Install and configure my Mongoose OS Application. It includes all the necessary libraries and sample config to integrate with a Neopixel, Azure IoT, Mongoose Dashboard etc.
  • Create the Azure PowerShell Function App that will publish the MQTT message the IoT Device will consume
  • Create the Microsoft Flow that will kick off the notifications and give use a nice interface to send what we want
  • Build an enclosure for our IoT device

How to build this project

The order I’ve detailed the elements of the architecture here is how I’d recommend approaching this project. I’d also recommend working through the previous two blog posts linked at the beginning of this one as that will get you up to speed with Mongoose OS, Azure IoT Hub, Azure IoT Devices, MQTT etc.

Installing the AzureIoT-Neopixel-js Application

I’ve made the installation of my solution easy by creating a Mongoose OS Application. It includes all the libraries required and sample code for the functionality I detail in this post.

Clone it from Github here and put it into your .mos directory, which should be in the root of your Windows profile directory, e.g. C:\Users\Darren\.mos\apps-1.26. Then from the MOS Configuration page select Projects, select AzureIoT-Neopixel-JS, then select the Rebuild App spanner icon from the toolbar. When it completes, select the Flash icon to update your micro controller. When your micro controller restarts, select Device Setup from the top menu bar and configure it for your WiFi network. Finally configure your device for Azure MQTT as per the details in my first post in this series (which will also require you to create an Azure IoT Hub if you don’t already have one and register your micro controller with it as an Azure IoT Device). You can then test sending a message to the device using PowerShell or Device Explorer as shown in post two in this series.

I have the Neopixel connected to D1 (GPIO 5) on the NodeMCU. If you use a different micro controller and a different GPIO then update the init.js configuration accordingly.

Creating the Azure Function App

Now that you have the micro controller configured and working with Azure IoT, let’s abstract the sending of the MQTT messages into an Azure Function. We can’t send MQTT messages from Microsoft Flow, so I’ve created an Azure Function that uses the AzureIoT PowerShell module to do that.

Note: You can send HTTP messages to an Azure IoT device but … 

Under current HTTPS guidelines, each device should poll for messages every 25 minutes or more. MQTT and AMQP support server push when receiving cloud-to-device messages.

….. that doesn’t suit my requirements 

I’m using the Managed Service Identity functionality to access the Azure Key Vault where credentials for the identity that can interact with my Azure IoT Hub are stored. To enable and use that (which I highly recommend) follow the instructions in my blog post here to configure MSI on an Azure Function App. If you don’t already have an Azure Key Vault then follow my blog post here to quickly set one up using PowerShell.

Azure PowerShell Function App

The Function App is an HTTP Trigger Based one using PowerShell. In order to interact with Azure IoT Hub and integrate with the IoT Device via Azure I’m using the same modules as in the previous posts. So they need to be located within the Function App.

Specifically they are;

  • AzureIoT v1.0.0.5
  • AzureRM v5.5.0
  • AzureRM.IotHub v3.1.0
  • AzureRM.profile v4.2.0

I’ve put them in a bin directory (which I created) under my Function App. Even though AzureRM.EventHub is shown below, it isn’t required for this project. I uploaded the modules from my development laptop (C:\Program Files\WindowsPowerShell\Modules) using WinSCP after configuring Deployment Credentials under Platform Features for my Azure Function App. Note the path relative to mine as you will need to update the Function App script to reflect this path so the modules can be loaded.

Azure Function PS Modules.PNG

The configuration in WinSCP to upload to the Function App for me is

WinSCP Configuration

Edit the AzureRM.IotHub.psm1 file

By default, AzureRM.IotHub.psm1 will pick up the older version of the AzureRM.Profile module that ships inside Azure Functions. As we’ve uploaded the version we need, we need to comment out the following lines in AzureRM.IotHub.psm1 so that it doesn’t do that version check. See below the lines to remark out (put a # in front of the lines indicated below) that are near the start of the module. The AzureRM.IotHub.psm1 file can be edited via WinSCP & Notepad.

#$module = Get-Module AzureRM.Profile
#if ($module -ne $null -and $module.Version.ToString().CompareTo("4.2.0") -lt 0)
#{
# Write-Error "This module requires AzureRM.Profile version 4.2.0. An earlier version of AzureRM.Profile is imported in the current PowerShell session. Please open a new session before importing this module. This error could indicate that multiple incompatible versions of the Azure PowerShell cmdlets are installed on your system. Please see https://aka.ms/azps-version-error for troubleshooting information." -ErrorAction Stop
#}
#elseif ($module -eq $null)
#{
# Import-Module AzureRM.Profile -MinimumVersion 4.2.0 -Scope Global
#}

HTTP Trigger Azure PowerShell Function App

Here is my Function App Script. You’ll need to update it for the location of your PowerShell Modules (I created a bin directory under my Function App D:\home\site\wwwroot\myFunctionApp\bin), your Key Vault details and the user account you will be using. The User account will need permissions to your Key Vault to retrieve the password (credential) for the account you will run the process as and to your Azure IoT Hub.

You can test the Function App from within the Azure Portal where you created the Function App as shown below. Update for the names of the IoT Hub, IoT Device and the Resource Group in your associated environment.

Testing Function App.PNG

Microsoft Flow Configuration

The Flow is very simple. A manual button and a resulting HTTP Post.

Microsoft Flow Config 1

For the message I have configured a list. This is where you can choose the color of the notification.

Manual Trigger.PNG

The Action is an HTTP Post to the Azure Function URL. The body has the configuration for the IoTHub, IoTDevice, Resource Group Name, IoTKeyName and the Message selected from the manual button above. You will have the details for those settings from your initial testing via the Function App (or PowerShell).

The Azure Function URL you get from the top of the Azure Portal screen where you configure your Function App. Look for “Get Function URL”.

HTTP Post

Testing

Now that you have all the elements configured, install the Microsoft Flow app on your mobile if you don’t already have it (Apple iOS App Store and Android Google Play). Log in with the account you created the Flow as, select the Flow, pick the message and you’re done. Depending on your internet connectivity, you should see the notification displayed on the Notifier device in under 10 seconds.

Case 3D Printer Files

Lastly, we need to make it look all pretty and make the notification really pop. I’ve created a housing for the neopixel that sits on top of a little case for the NodeMCU.

As you can see from the final unit, I’ve printed the neopixel holder in a white PLA that allows the RGB LED light to be diffused nicely and display prominently even in brightly lit conditions.

Neopixel Enclosure

I’ve printed the base that holds the micro controller in a different color. The top fits snugly through the hole in the micro controller case. The wires from the neopixel to connect it to the micro controller slide through the shaft of the top housing. It also has a backplate that attaches to the back of the enclosure that I secure with a little hot glue.

Here is a link to the Neopixel (WS2812) 16 RGB LED light holder I created on Thingiverse.

NodeMCU Enclosure.PNG

Depending on your micro controller you will also need an appropriately sized case for that. I’ve designed the neopixel light holder top assembly to sit on top of my micro controller case. Also available on Thingiverse here.

Summary

Using a combination of Azure IoT, Azure PaaS Services, Mongoose OS and a cheap micro controller with an RGB LED light ring we have a very versatile Internet of Things device. The application here is a simple visual notifier. A change of output device or even in conjunction with an input device could change the application, whilst still re-using all the elements of the solution that glues it all together (micro-controller, Mongoose OS, Azure IoT, Azure PaaS). Did you build one? Did you use this as inspiration to build something else? Let me know.

Creating SharePoint Modern Team sites using Site Scripts, Flow and Azure Function

With Site Scripts and Site design, it is possible to invoke custom PnP Provisioning for Modern Team Sites from a Site Script. In the previous blog, we saw how we can provision Simple modern sites using Site Scripts JSON. However, there are some scenarios where we would need a custom provisioning template or process such as listed below:

  • Auto deploy custom web components such as SPFx extension apps
  • Complex site templates which can’t be configured through Site Script actions alone
  • Complex document libraries and content types beyond what the JSON schema provides. For an idea of the supported items in the OOB schema, please check here.

Hence, in this blog, we will see how we can use Flow and Azure Functions to apply more complex templates and customization on SharePoint Modern Sites.

Software Prerequisites:

  • Azure Subscription
  • Office 365 subscription or MS Flow subscription
  • PowerShell 3.0 or above
  • SharePoint Online Management Shell
  • PnP PowerShell
  • Azure Storage Emulator*
  • Postman*

* Optional, helpful for Dev and Testing

High Level Overview Steps:

1. Create an Azure Queue Storage Container
2. Create a Microsoft Flow with Request Trigger
3. Put an item into Azure Queue from Flow
4. Create an Azure Function to trigger from the Queue
5. Use the Azure Function to apply the PnP Provisioning template

Detailed Steps:

This can get quite elaborate, so hold on!!

Azure

1. Create an Azure Queue Store.

Note: For dev and testing, you can use the Azure Storage Emulator to emulate the queue requirements. For more details on configuring the Azure Storage Emulator on your system, please check here.

Microsoft Flow

2. Create a Microsoft Flow with Request trigger and then add the below JSON.

Note: If you have an Office 365 Enterprise E3 license, you get a Flow subscription included, or else you can register for a trial here.

3. Enter a message into the Queue in the Flow using the “Add message to Azure Queue” action.

FlowSiteDesignAzureQueue

Note: The flow trigger URL has an access key which allows it to be called from any tenant. For security reasons, please don’t share it with any third parties unless needed.

Custom SharePoint Site Template (PowerShell)

4. Next create a template site for provisioning. Make all the configurations that you will need for the initial implementation. Then create the template using PnP PowerShell with the provisioning command shown below.

Get-PnPProvisioningTemplate -Out .\TestCustomTeamTemplate.xml -ExcludeHandlers Navigation, ApplicationLifecycleManagement -IncludeNativePublishingFiles

Note: The ExcludeHandlers option depends on your requirements, but the configuration in the above command will save a lot of issues which you could potentially encounter while applying the template later. So, use the above as a starting template.

Note: Another quick tip, if you have any custom theme applied on the template site, then the provisioning template doesn’t carry it over. You might have to apply the theme again!

5. Export and save the PowerShell PnP Module to a local drive location. We will use it later in the Azure Function.

Save-Module -Name SharePointPnPPowerShellOnline -Path "[Location on your system or shared drive]"

SharePoint
6. Register an App key and App Secret in https://yourtenant.sharepoint.com/_layouts/appregnew.aspx and provide the below settings.
7. Copy the App Id and Secret which we will use later for Step 9 and 10. Below is a screenshot of the App registration page.
8. Trust the app at https://yourtenant-admin.sharepoint.com/_layouts/appinv.aspx by providing the below xml. Fill in the App Id to get the details of the App.

Azure Function

9. Create a Queue Trigger PowerShell Azure function
10. After the function is created, go to the Advanced Editor (Kudu) and create a sub folder “SharePointPnPPowerShellOnline” in site -> wwwroot -> [function_name] -> modules. Upload all the files from the PowerShell module folder saved in the step above into this folder.
11. Add the below PowerShell to the Azure Function
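
The script itself is embedded in the original post. As a guide only, a minimal run.ps1 for the Queue Trigger function could look like the sketch below; it assumes the queue binding is named "triggerInput", the SharePointPnPPowerShellOnline module sits in the function's modules folder (step 10) so it is auto-loaded, the App ID/Secret from steps 6-8 are stored as app settings, and the template XML from step 4 is uploaded alongside the function:

# Read the queue message Flow dropped into the queue, e.g. {"webUrl":"https://yourtenant.sharepoint.com/sites/newsite"}
$in = Get-Content $triggerInput -Raw
Write-Output "Queue item received: $in"
$request = $in | ConvertFrom-Json

# App-only connection using the App ID/Secret trusted earlier
Connect-PnPOnline -Url $request.webUrl -AppId $env:SPO_AppId -AppSecret $env:SPO_AppSecret

# Apply the custom provisioning template exported in step 4
Apply-PnPProvisioningTemplate -Path "D:\home\site\wwwroot\[function_name]\TestCustomTeamTemplate.xml"

Write-Output "Provisioning template applied to $($request.webUrl)"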

12. Test the Function by the below input in PowerShell

$uri = "[the URI you copied in step 14 when creating the flow]"
$body = "{webUrl:'somesiteurl'}"
Invoke-RestMethod -Uri $uri -Method Post -ContentType "application/json" -Body $body

PowerShell and JSON

13. Create a Site Script with the below JSON and add it to a Site Design. For more detailed steps, please check the link here.

14. After the above, you are finally ready to run the provisioning process. Yay!!

But before we finish off, one quick tip is that when you click manual refresh, the changes are not immediately updated on the site. It may take a while, but it will apply.

Conclusion:

In the above blog we saw how we can provision sites with a custom provisioning template applied by Flow and an Azure Function, invoked through SharePoint Site Scripts and Site Designs.

Automate SharePoint Document Libraries Management using Flow, Azure Functions and SharePoint CSOM

I’ve been working on a client requirement to automate SharePoint library management via scripts, implementing a document lifecycle across many document libraries that have custom content types and require regular housekeeping for ownership and permissions.

Solution overview

To provide a seamless user experience, we decided to do the following:

1. Create a document library template (.stp) with all the prerequisite folders and content types applied.

2. Create a list to store the data about entries for said libraries. Add the owner and contributors for the library as columns in that list.

3. Whenever the title, owners or contributors are changed, the destination document library will be updated.

Technical Implementation

The solution has two main elements to automate this process

1. Microsoft Flow – Trigger when an item is created or modified

2. Two Azure Functions – Create the library and update permissions

The broad steps and code are as follows:

1. When the flow is triggered, we check the status field to determine whether it is a new entry or a change.

Note: Since Microsoft Flow doesn’t have conditional triggers to differentiate between created and modified list item events, use a text column in the SharePoint list, set to values such as “start”, “in progress” and “completed”, to identify create and update events.

2. The flow will call an Azure Function via an HTTP POST action. Below is the configuration of this.

AzureFunctionFromFlow

3. For the “Create Library” Azure function, create a HTTP C# Function in your Azure subscription.

4. In the Azure Function, open Properties -> App Service Editor. Then add a folder called bin and then copy two files to it.

  • Microsoft.SharePoint.Client
  • Microsoft.SharePoint.Client.Runtime

KuduTools_AzureFunction

Create Lib App Service Editor

Please make sure to get the latest copy of the Nuget package for SharepointPnPOnlineCSOM. To do that, you can set up a VS solution and copy the files from there, or download the Nuget package directly and extract the files from it.
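
One hedged way to do the download-and-extract option is via nuget.exe; the package ID below is the standard SharePoint Online CSOM package, so verify it against the package you actually use, and pick the target framework folder that matches your function runtime:

nuget.exe install Microsoft.SharePointOnline.CSOM -OutputDirectory .\packages
# Copy the two assemblies the function references into its bin folder
Get-ChildItem .\packages -Recurse -Include Microsoft.SharePoint.Client.dll, Microsoft.SharePoint.Client.Runtime.dll |
    Copy-Item -Destination .\bin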

5. After copying the files, reference them in the Azure function using the below code

#r "Microsoft.SharePoint.Client.dll"
#r "Microsoft.SharePoint.Client.Runtime.dll"
#r "System.Configuration"
#r "System.Web"

6. Then create the SharePoint client context and create a connection to the source list.

7. After that, use the ListCreationInformation class to create the Document library from the library template using the code below.

8. After the library is created, break the role inheritance for the library as per the requirement

9. Update the library permissions using the role assignment object

10. To differentiate between People, SharePoint Groups and AD Groups, find the unique ID and add the group as per the script below.
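
The C# snippets for steps 8-10 are embedded in the original post as images. As an indicative sketch of the same CSOM calls (shown here from PowerShell, with the site URL, library name, accounts and password as placeholders), breaking inheritance and granting a principal a role on the library looks roughly like this:

Add-Type -Path ".\bin\Microsoft.SharePoint.Client.dll"
Add-Type -Path ".\bin\Microsoft.SharePoint.Client.Runtime.dll"

$securePassword = ConvertTo-SecureString "[password]" -AsPlainText -Force
$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("https://yourtenant.sharepoint.com/sites/docs")
$ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials("svc-account@yourtenant.com", $securePassword)

$list = $ctx.Web.Lists.GetByTitle("Project Documents")
$list.BreakRoleInheritance($false, $true)   # stop inheriting; don't copy existing assignments

# A person or AD group resolves via EnsureUser; a SharePoint group via SiteGroups instead
$principal = $ctx.Web.EnsureUser("owner@yourtenant.com")
# $principal = $ctx.Web.SiteGroups.GetByName("Project Members")

$roleDef  = $ctx.Web.RoleDefinitions.GetByType([Microsoft.SharePoint.Client.RoleType]::Contributor)
$bindings = New-Object Microsoft.SharePoint.Client.RoleDefinitionBindingCollection($ctx)
$bindings.Add($roleDef)
$list.RoleAssignments.Add($principal, $bindings) | Out-Null

$ctx.ExecuteQuery()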

Note: In case you have people objects that are not in AD anymore because they have left the organisation, please refer to this blog for validating them before updating – Resolving “User not found” issue while assigning permissions using SharePoint CSOM

Note: Try to avoid item.Update() from the Azure Function as that will trigger a second flow run, causing an endless loop of runs; instead use item.SystemUpdate()

11. After the update is done, return to the Flow with the success value from the Azure Function which will complete the loop.

As shown above, we can automate document library creation from a template, along with permissions management, using Flow and Azure Functions.

Integrating Microsoft Flow with Azure Functions for Non-IT People

Microsoft Flow (Flow) creates automated workflows between various apps and services so that users can get notifications, collect data and more. This is similar to Azure Logic Apps (Logic Apps), but targets different audiences such as marketing, sales and other non-IT people. This document provides a high-level comparison between Flow, Logic Apps and Azure Functions.

Flow contains a comprehensive number of pre-defined workflows called templates, so we can simply choose one of them, provide the necessary information and use it. If there is no template suitable for our purpose, we can create a new flow from scratch using pre-defined triggers and actions. If the trigger or action we need isn’t pre-defined, we can use a simple HTTP trigger built with Azure Functions. In this post, we are going to have a look at how to use Azure Functions, HTTP triggers in particular, to integrate with Flow.

As a Marketing Staff, I Want to …

Let’s say someone from a marketing department wants to search all Twitter posts with a hashtag, #ausopen for example, and have those posts fetched into their marketing Slack channel. This can be easily accomplished by using a pre-defined template.

We can easily set the hashtag they want to follow and the Slack channel to fetch the posts into, like:

This is all set! Too easy! Now, as we are on the Free plan, this Flow runs every five minutes. If we want to run the flow more frequently, we should upgrade to a paid plan like Flow Plan 1 (runs every 3 minutes) or Flow Plan 2 (runs every minute). Once the flow runs, the marketing channel in Slack will receive all the tweets, like:

We’ve so far created a Flow item as an example.

As a …, I Want to Handle those Tweets in a Different Way

The marketing staff will probably need more sophisticated analysis, perhaps by storing those tweets in a database, or want to do something else that the pre-defined actions/triggers don’t support out of the box. In this case we can introduce an HTTP Trigger Function to do so. Let’s create an HTTP Trigger Function.
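
If it helps to picture it, a bare-bones PowerShell version of such a function (Azure Functions v1, HTTP trigger, default req/res binding names) could be as small as the sketch below; it simply logs whatever Flow posts to it and returns an acknowledgement:

# Log the raw body that Flow posts to the function
$requestBody = Get-Content $req -Raw
Write-Output "Payload received from Flow:"
Write-Output $requestBody

# Return a simple acknowledgement to Flow
Out-File -Encoding Ascii -FilePath $res -InputObject "Received"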

Of course, in a real scenario we would implement more complex logic in the function. However, this is just an example, so for now we only log how Flow passes the data to the Azure Function. When the function is ready as above, we know its endpoint URL, something like https://my-function-app.azurewebsites.net/api/TwitterWebhoook?code=XXXXXX.

Copy this endpoint URL for Flow. Now we need to modify the existing Flow item like:

When a new tweet with the hashtag #ausopen is found, the entire tweet object is passed to the Azure Function through the POST method, then the tweet is posted to the Slack channel. Wait for up to five minutes (we’re on the Free plan!)

Slack channel has finally been updated.

This is the log from Flow:

And this is the log from Azure Functions:

So far, we have integrated Azure Functions (HTTP Trigger) with Microsoft Flow so that we can do more complex jobs through it. The code used in this post was very simple, but depending on the complexity of your requirements, the function can handle jobs in a more sophisticated way.