Automatically Provision Azure AD B2B Guest Accounts

Azure ‘Business to Business’ (or the catchy acronym ‘B2B’) has seen significant development over the last 12 months in providing access to Azure-based applications and services for identities outside an organisation’s tenancy.

Recently, Ryan Murphy (who has contributed to this blog) and I were tasked with providing an identity-based architecture to share Dynamics 365 services within a large organisation, but across two ‘internal’ Azure AD tenancies.

Dynamics 365 takes its identity store from Azure AD; if you’re assigned a license for Dynamics 365 in the Azure Portal, including in a ‘B2B’ scenario, you’re granted access to the Dynamics 365 application (as outlined here).  Further information about how Dynamics 365 concepts integrate with a single or multiple Azure tenancies is outlined here.

Microsoft provides extensive guidance on completing a typical ‘invitation’-based scenario using the Azure portal (see the links above).  Essentially, this involves inviting users via an email, which relies on each person manually clicking the embedded link to complete the acceptance (and the ‘guest account’ creation in the Dynamics 365 service linked to that Azure AD).

However, this obviously won’t scale when you need to invite thousands of new users initially, and then repeatedly invite new users as part of a Business-As-Usual (BAU) process as they join the organisation (‘identity churn’, as we say).

Therefore, to automate the creation of new Guest User Azure AD accounts without involving the user at all, the following process can be used:

  1. Create a ‘service account’ Guest User from the invited Azure AD (it has to have the same UPN suffix as the users you’re inviting) to be a member of the resource Azure AD.
  2. Assign the ‘service account’ Guest User to be a member of the ‘Guest Inviter’ role of the resource Azure AD.
  3. Use PowerShell to automatically provision new Guest User accounts using the credentials of the ‘service account’ Guest User.

In this blog, we’ll use the terms ‘Resource Azure AD’ or ‘Resource Tenancy’ for the location of the resources you’re sharing out, and ‘Invited Azure AD’ or ‘Invited Tenancy’ for the Azure AD where the user accounts (including usernames & passwords) you’re inviting reside.  The invited users only ever use their credentials in their own Azure AD or tenancy – never credentials of the ‘Resource Azure AD or tenancy’.  The ‘Guest User’ objects created in the ‘Resource Tenancy’ are essentially just linking objects without any stored password.

A ‘Service Account’ Azure AD account dedicated solely to the automatic creation of Guest Users in the Resource Tenancy will need to be created first in the ‘Invited Azure AD’ – for this blog, we used an existing Azure AD account sourced from a synchronised local Active Directory.  This account did not have any ‘special’ permissions in the ‘Invited Azure AD’, but according to some blogs it requires at least ‘read’ access to the user store in the ‘Invited Azure AD’ (which is the default).

This ‘Service Account’ Azure AD account should have a mailbox associated with it, i.e. either an Exchange Online (Office 365) mailbox, or a mail value that has a valid SMTP address for a remote mailbox.  This mailbox is needed to approve the creation of a Guest User account in the Resource Tenancy (only needed for this individual Service Account).

It is strongly recommended that this ‘Service Account’ user in the ‘Invited Azure AD’ has a very strong & complex password, and that any credential used for that account within a PowerShell script be encrypted using the approach in David Lee’s blog.

The PowerShell scripts listed below to create these Guest User accounts could then be actioned by an identity management system, e.g. Microsoft Identity Manager (MIM), or a ‘Runbook’ or workflow system (e.g. SharePoint).

 

Task 1: Create the ‘Service Account’ Guest User using PowerShell

Step 1: Sign into the Azure AD Resource Tenancy’s web portal: ‘portal.azure.com’, using a Global Admin credential.

Step 2:  When you’re signed in, click on the account profile picture on the top right of the screen and select the correct ‘Resource Tenancy’ (There could be more than one tenant associated with the account you’re using):

Screenshot 2017-09-19 at 9.34.18 AM

Step 3:  Once the tenancy is selected, click on the ‘Azure Active Directory’ link on the left pane.

Step 4:  Click ‘User Settings’ and verify the setting (which is set by default for new Azure AD tenancies):  ‘Members can invite’.

Screenshot 2017-09-19 11.31.51

Step 5:  Using a new PowerShell session, connect and authenticate to the Azure AD tenancy where the Guest User accounts are to be created (i.e. the ‘Resource Azure AD’).

Be sure to specify the correct ‘Tenant ID’ of the ‘Resource Azure AD’ using the PowerShell switch ‘-TenantId‘ followed by the GUID value of your tenancy (to find that Tenant ID, follow the instructions here).

$Creds = Get-Credential

Connect-AzureAD -Credential $Creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

 

Step 6:  The following PowerShell command should be executed under a ‘Global Admin’ to create the ‘Service Account’ e.g. ‘serviceaccount@invitedtenancy.com’.

 

New-AzureADMSInvitation -InvitedUserDisplayName "Service Account Guest Inviter" -InvitedUserEmailAddress "serviceaccount@invitedtenancy.com" -SendInvitationMessage $true -InviteRedirectUrl "https://myapps.microsoft.com" -InvitedUserType Member

 

Step 7:  The person behind the ‘Service Account’ will then need to locate the email invitation sent out by this command and click on the link embedded within to authorise the creation of the Guest User object in the ‘Resource Tenancy’.

 

Task 2: Assign the ‘Service Account’ Guest Inviter Role using Azure Portal

Step 1:  Sign into the Azure web portal: ‘portal.azure.com’ with the same ‘Global Admin’ (or lower-permission) credential used in Task 1, or re-use the same ‘Global Admin’ session from Task 1.

Step 2:  Click on the ‘Azure Active Directory’ shortcut on the left pane of the Azure Portal.

Step 3:  Click on the ‘All Users’ tab and select the ‘Service Account’ Guest User.

(I’m using ‘demo.microsoft.com’ pre-canned identities in the screenshot below; any names similar to real persons are purely coincidental – an image for ‘serviceaccount@invitedtenancy’ used as the example in Task 1 could not be reproduced.)

Screenshot 2017-09-19 09.44.36

Step 4:  Once the ‘Service Account’ user is selected, click on the ‘Directory Role’ on the left pane.  Click to change their ‘Directory Role’ type to ‘Limited administrator’ and select ‘Guest Inviter’ below that radio button.  Click the ‘Save’ button.

Screenshot 2017-09-19 09.43.53

Step 5:  The next step is to test that the ‘Service Account’ Guest User can invite users from the same ‘UPN/domain suffix’.   Click on the ‘Azure Active Directory’ link on the left pane of the main Azure Portal.

Step 6:  Click ‘Users and groups’ and click ‘Add a guest user’ on the right:

Screenshot 2017-09-19 09.36.02

Step 7:  On the ‘Invite a guest’ screen, send an email invitation to a user from the same Azure AD as the ‘Service Account’ Guest User.  For example, if your ‘Service Account’ Guest User UPN/domain suffix is ‘serviceaccount@remotetenant.com’, then invite a user with the same UPN/domain suffix, e.g. ‘jim@remotetenant.com’ (again, only an example – any resemblance to current or future email addresses is purely coincidental).

Screenshot 2017-09-19 09.36.03

Step 8:  When the user receives the invitation email, ensure that the following text appears at the bottom of the email:  ‘There is no action required from you at this time’:

image002

Step 9:  If that works, then PowerShell can now automate that invitation process bypassing the need for emails to be sent out.  Automatic Guest Account creation can now leverage the ‘Service Account’ Guest User.

NOTE:  If you try to invite a user with a UPN/domain suffix that does not match the ‘Service Account’ Guest User’s, the invitation will still be sent, but it will require the user to accept it.  The invitation will remain in a ‘pending acceptance’ state, and the Guest User object will not be created, until that is done.

Task 3:  Automatically provision new Guest User accounts using PowerShell

Step 1:  Open Windows PowerShell (or re-use an existing PowerShell session that has rights to the ‘Resource Tenancy’).

Step 2:  Type the following example PowerShell command to send an invitation out, and authenticate when prompted using the ‘Invited Tenancy’ credentials of the ‘Service Account’ Guest User.

In the script, again be sure to specify the Tenant ID of the ‘Resource Tenancy’, not the ‘Invited Tenancy’, for the ‘-TenantId’ switch.

 

#Connect to Azure AD
$Creds = Get-Credential
Connect-AzureAD -Credential $Creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

$messageInfo = New-Object Microsoft.Open.MSGraph.Model.InvitedUserMessageInfo
$messageInfo.customizedMessageBody = "Hey there! Check this out. I created and approved my own invitation through PowerShell"

New-AzureADMSInvitation -InvitedUserEmailAddress "ted@invitedtenancy.com" -InvitedUserDisplayName "Ted at Invited Tenancy" -InviteRedirectUrl "https://myapps.microsoft.com" -InvitedUserMessageInfo $messageInfo -SendInvitationMessage $false

 

Compared to using the Azure portal, this time no email will be sent (the display name and message body will never be seen by the invited user; they’re just required for the command to complete).   To send a confirmation email to the user, you can change the -SendInvitationMessage switch to $true.
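To scale this out for thousands of users, the same command can be wrapped in a loop fed from a file. A minimal sketch, assuming a CSV file with EmailAddress and DisplayName columns (the file name and column names are placeholders of mine):

$invitees = Import-Csv -Path .\invitees.csv

foreach ($user in $invitees) {
    # Each invitation is created and auto-accepted with no email sent
    New-AzureADMSInvitation -InvitedUserEmailAddress $user.EmailAddress -InvitedUserDisplayName $user.DisplayName -InviteRedirectUrl "https://myapps.microsoft.com" -InvitedUserMessageInfo $messageInfo -SendInvitationMessage $false
}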

 

Step 3:  At the end of the PowerShell command’s output, the ‘Status’ value should read ‘Accepted’:

image001

This means the Guest User object has automatically been created and approved by the ‘Resource Tenancy’.   That Guest User object will be associated with the actual Azure AD user object from the ‘Invited Tenancy’.

The next steps for this invited Guest User are to assign them a Dynamics 365 license and then a Dynamics 365 role in the ‘Resource Tenancy’ (which might be topics of future blogs).

Hope this blog has proven useful.

 

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when using Virtual Machines, Web Apps and Azure Functions, that meant implementing methods to obfuscate the credentials stored within them. I touched on one method that I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal, navigate to your Azure Function App. Select it, then from the main pane select the Platform Features tab, then select Managed service identity.

Platform Features

Turn the toggle to On for Register with Azure Active Directory, then select Save.

ManagedServiceIdentity

Back in Platform Features under General Settings select Application Settings. 

General Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there. So let’s keep digging.

App Settings

Under Platform Features select Console.

DevelopmentTools

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment. The next morning they were present. So I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and it takes more than a couple of hours.

Endpoint

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using an HttpTrigger PowerShell Function. Give it a name and select Create.

NewFunction

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Vars

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Vault

Select Add new, then Select Principal, locate your Function App and click Select.

Access Policy 1

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Policy - GET

We now have our Function App enabled to access the Key Vault.

Access Policy 2

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Vault Secret URI

Function App Script

The Function App script connects to the Key Vault and retrieves credentials. The line defining the Key Vault Secret Identifier should be the only one you need to update for the Secret you want to retrieve. Ensure you keep the API version at the end (which isn’t in the URI you copy from the Key Vault): /?api-version=2015-06-01
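A minimal sketch of such a script, using the MSI_ENDPOINT and MSI_SECRET variables from earlier (the vault and secret names are placeholders to replace with your own):

# Get an access token for Key Vault from the local MSI endpoint
$apiVersion = "2017-09-01"
$resourceURI = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret" = "$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# The Secret Identifier copied from the Key Vault, with the API version appended
$vaultSecretURI = "https://<yourvault>.vault.azure.net/secrets/<yoursecret>/?api-version=2015-06-01"

# Retrieve the secret using the MSI-issued token
$secret = Invoke-RestMethod -Method Get -Uri $vaultSecretURI -Headers @{"Authorization" = "Bearer $accessToken"}
$secret.value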

When run, if you have everything correct, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Azure Function Proxies for API Mocking

In my previous posts, Is Your Serverless Application Testable? – Azure Logic Apps and API Mocking for Developers, we looked at how to mock APIs with various approaches including Azure API Management, AWS API Gateway, MuleSoft and Azure Functions. Quite recently, the Azure Functions team released a new mocking feature in Azure Function Proxies. With this feature, API mocking couldn’t be easier. In this post, I’m going to show how to use this mocking feature in several ways.

Enable API Mocking via Azure Portal

This is pretty straightforward. Within the Azure Portal, simply enable the Azure Function Proxies feature:

If you have already enabled this feature, make sure that its runtime version is 0.2. Then click the plus sign next to Proxies (preview) and put details like:

At this stage, we may notice something interesting: this proxy doesn’t have to link to an existing HTTP endpoint. In other words, we don’t need to have a real API for mocking. Just create a mocked HTTP endpoint and use it. Once we save this, we’ll have an endpoint like:

Use Postman to hit this mocked URL:

We receive the mocked HTTP status code and response body. How easy!

But if we have an existing Azure Function app and want to integrate this mocking feature via a CI/CD pipeline, what can we do? Let’s move on.

Enable Azure Function Proxy Option via ARM Template

An Azure Functions app instance is basically the same as a web app instance, so we can use the same ARM template definitions for app settings configuration. In order to enable Azure Function Proxies, we simply add a ROUTING_EXTENSION_VERSION key with a value of ~0.2. Here’s a sample ARM template definition:
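A fragment along these lines (the functionAppName parameter is my own placeholder; note that an appsettings config resource replaces the entire settings collection, so include your existing settings alongside this key):

{
  "type": "Microsoft.Web/sites/config",
  "name": "[concat(parameters('functionAppName'), '/appsettings')]",
  "apiVersion": "2016-08-01",
  "properties": {
    "ROUTING_EXTENSION_VERSION": "~0.2"
  }
}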

Once we update our ARM template, we can simply deploy this template to Azure.

Enable Azure Function Proxy Option via PowerShell

If an ARM template is not feasible for enabling this proxy option, we can use a PowerShell script (or the Azure CLI equivalent) instead. Here’s a sample script:
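A minimal sketch, assuming the AzureRm module and an existing login session (the parameter names are my own):

param(
    [string]$ResourceGroupName,
    [string]$FunctionAppName
)

# Read the existing app settings, add/overwrite the proxies runtime key, and write them back
$webApp = Get-AzureRmWebApp -ResourceGroupName $ResourceGroupName -Name $FunctionAppName
$appSettings = @{}
foreach ($setting in $webApp.SiteConfig.AppSettings) {
    $appSettings[$setting.Name] = $setting.Value
}
$appSettings["ROUTING_EXTENSION_VERSION"] = "~0.2"
Set-AzureRmWebApp -ResourceGroupName $ResourceGroupName -Name $FunctionAppName -AppSettings $appSettings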

This PowerShell script can then be invoked with parameters like:
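Assuming the sketch above is saved as Enable-FunctionProxies.ps1 (a hypothetical file name):

.\Enable-FunctionProxies.ps1 -ResourceGroupName "my-resource-group" -FunctionAppName "my-function-app"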

After either approach (ARM template or PowerShell) is executed, the function app instance will have a value like:

Now we have the proxies feature enabled through code! Let’s move on.

Deploy Azure Function Proxy with Azure Function App

Defining Azure Function Proxies is just a matter of adding another JSON file, proxies.json, to the app instance. The JSON schema definition can be found at http://json.schemastore.org/proxies. Basically, proxies.json looks like:
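A sketch matching the mocked API described below (the proxy name mock-hello-world is arbitrary):

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "mock-hello-world": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/mock/hello-world"
      },
      "responseOverrides": {
        "response.statusCode": "200",
        "response.headers.Content-Type": "application/json",
        "response.body": { "message": "Hello World" }
      }
    }
  }
}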

As defined above, the mocked API has a URL of /mock/hello-world with response code of 200 and body { "message": "Hello World" }. As long as we follow this structure, we can add as many mocked APIs as we like.

Once we add this proxies.json to our Azure Functions project and deploy it to Azure, we’ll be able to see the list of mocked APIs.

So far, we have looked at various ways to add mocked APIs to an Azure Functions instance. Depending on your preferences, any approach will be fine, and this is probably the easiest and cheapest way to mock API endpoints.

Mobile-first development with Xamarin

 

Modern application users have high expectations for applications, even for in-house enterprise apps.  IT leaders realising this have adopted a mobile-first development approach, which ensures a great user experience and reduces overall development and maintenance cost. In this post, I will provide an example of a mobile-first development project for an enterprise application.

Business case

A retailer is using Windows CE devices for performing daily tasks in its stores and is planning to move to the latest Android / iOS devices. Each particular task is performed using a single independent application, developed with different technologies (.Net Compact, C++), and each application communicates with many different back-end services and systems.

Apart from maintaining applications on variant technologies, the company also spent heavily on training employees to use these applications.

The road ahead is to develop these applications for Android powered devices.

Road ahead

The application was initially to be developed only for Android, with a future scope of developing a similar iOS application.

The company relied heavily on the daily activities performed through these applications, which meant they had to be powerful yet simple enough for end users, minimising training and update costs.

Xamarin was chosen as the development platform because it would prove beneficial when porting later to iOS, offers a single C# code base (fully matured async patterns, better type-safety, front and back-end teams working more collaboratively), and can realise the full potential of the native platforms (native UI, geo-location, notifications).

Mobile-first approach

A mobile-first approach in the enterprise means allowing users to perform complex tasks with ease. This means an interactive and intuitive UI, built around the use case of solving a problem from a user’s perspective. It also means developing new infrastructure and services that cater for specific use cases in the application.

Consider a case where the store manager of a hardware store wishes to order new stock for an item. This previously involved getting the following information:

  • Current store stock
  • Stock availability for delivery
  • Back store capacity
  • Shelf capacity
  • Upcoming sales
  • Pending deliveries
  • Pending customer orders
  • Wasted / discarded stock
  • Understanding of stores’ sale patterns (daily / seasonal / monthly)

 

Some of these services interact with the in-store server, some require the central server and a manual review of previous orders, while some of the tasks depend entirely on a person’s experience. The final task is to manually calculate the stock to be ordered.

With the new system, a scheduled task interacts with all of the above services, uses machine learning (items bought together / customer orders / quantity etc.) to find sale patterns, caches results daily and, if an order needs to be placed, pushes a notification to the user who has the authority to submit orders.

On the application side, the user can interact with the notification, review the order and submit or cancel with just 3 clicks.

Thus, focusing on a user-first mobile app made a complex task very easy and efficient for the user.

Additional use case: An item in a store is to be recalled, or is towards the end of its life and needs to be discarded. Since the new web service caches all of the data every day, a new service can be added that identifies such products and utilises the existing notification module to notify the user.

The mobile-first development can turn a ‘pull’ experience into a ‘push’ experience – making use of mobile platform features like push notifications. With powerful services in place, new functionality is easily added, greatly improving store operations.

Code-sharing

The application was designed using Model View Presenter (MVP) architecture. In this pattern the presenter layer acts as the supervising controller for the view. The view itself is quite passive and is responsible for displaying the data provided to it by the presenter and to route UI interactions to the presenter. This allows for moving most of the logic into the presentation layer. The presentation layer is shared across platforms reducing platform specific code. Test coverage is also greatly improved as most of the presentation code is shared.

Use of Xamarin combined with MVP architecture allowed more than 95% of the code to be shared across different platforms, with emphasis on unit and automation testing. This is evident from the fact that the whole project was delivered in under 9 months: the iOS application was started in the 7th month and delivered alongside the Android app.

The components of the application were broken into individual packages, so that they could be used for all future enterprise or customer facing applications.

An excerpt from the OrderService shows how a sync component (LiveDataService), written to work with a live database, could be used for a consumer application as well. In the case below, it notifies the user of current promotional stock.

It is also worth noting here that the use of Xamarin allowed for sharing components between mobile and back-end systems (as separate NuGet packages). This was the case with the custom synchronisation framework, model classes, ORM components, utility libraries etc.

 

ADFS Service Communication Certificate Renewal Steps

Hi guys, the ADFS service comprises certificates which serve different purposes for the federation service. In this blog post I will share a brief description of these certificates and their purpose, and will discuss the renewal process for the service communication certificate.

 

Types of ADFS certificates and their purpose

 

| Certificate Type | Description | Purpose |
| --- | --- | --- |
| Service communication certificate | Standard Secure Sockets Layer (SSL) certificate that is used for securing communications between federation servers, clients, Web Application Proxy, and federation server proxy computers. | Ensures the identity of a remote computer; proves your identity to a remote computer |
| Encryption certificate | Standard X.509 certificate that is used to decrypt tokens received by the federation service. | Token decryption |
| Signing certificate | Standard X.509 certificate that is used for securely signing all tokens issued by the federation service. | Token signing |

 

Renewal Steps

Service Communication certificate

This certificate is very similar to an IIS certificate used to secure a website. It is generally issued by a trusted CA and can be either a SAN or wildcard certificate. This certificate is installed on all ADFS servers in the farm, and the update procedure should be done on the primary ADFS server. Below is the list of steps involved in the renewal.

 

  1. Generate a CSR from the primary ADFS server. This can be done via IIS.
  2. Once the certificate is issued, add it to the certificate store.
  3. Verify the private key on the certificate. Make sure the new certificate has its private key.
  4. Assign permissions to the private key for the ADFS service account: right-click the certificate, click ‘Manage private keys’, add the ADFS service account and assign permissions as shown in the screenshot below.

 

 adfs

  5. From the ADFS console, select “Set Service Communication Certificate”.
  6. Select the new certificate from the prompted list of certificates.
  7. Run Get-AdfsSslCertificate and make a note of the thumbprint of the new certificate.
  8. If it’s unclear which certificate is new, open the MMC certificates snap-in, locate the new certificate and scroll down the list of properties to see the thumbprint.
  9. Run the cmdlet below.
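Assuming the current ADFS PowerShell module, setting the new SSL certificate looks like this (substitute the thumbprint you noted):

Set-AdfsSslCertificate -Thumbprint "<thumbprint of the new certificate>"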

 

  10. Restart the ADFS service.
  11. Copy and import the new certificate to the Web Application Proxy/Proxies.
  12. On each WAP server, run the cmdlet below.
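Assuming the standard Web Application Proxy PowerShell module, that cmdlet looks like:

Set-WebApplicationProxySslCertificate -Thumbprint "<thumbprint of the new certificate>"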

 

That’s it, you are all done. You can verify that the new certificate has been assigned to the ADFS service by running Get-AdfsSslCertificate. Another verification step is to open a browser and navigate to the federation page, where you should be able to see the new certificate. I will discuss the encryption and signing certificate renewal process in upcoming blogs.

 

 

Easier portability of the FIMAutomation powershell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use 3rd-party or open-source code, or other tools may expect exports in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out and no internet access; just work with what is currently there. I have limited access on the servers and management VM, with only 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0).

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins, it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted these files and renamed them:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation

New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)

Import-Module .\FIMAutomation.psd1

Pop-Location

Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see every day, but it is also a useful approach to other legacy PSSnapin problems.

How LUIS can help BOTs in understanding natural language

As bots evolve, you need a mechanism to better understand what a user wants from their language and to take actions or respond to queries appropriately. In these days of increasing automation, bots can certainly help, provided they are backed by tools that understand user language both naturally and contextually.

Azure Cognitive Services has an API that can help identify what a user wants and extract concepts and entities from a sentence (user input): the Language Understanding Intelligent Service (LUIS). It can process natural language using custom-trained language models and can incorporate active learning based on how it was trained.

In this blog post, we will be building a LUIS app that can be utilised in a Bot or any other client application to respond to the user in a more meaningful way.

Create a LUIS app

  1. Go to https://www.luis.ai/ and sign up.
  2. Create a LUIS app by clicking ‘New App’ – this is the app you will be using in Bot Framework.
  3. Fill out the form and give your app a unique name.
  4. Your app will be created, and you can see its details (the page will be redirected to Overview).
  5. You need to create entities to identify the concept; these are a very important part of utterances (input from a user). Let’s create a few simple entities using the form below.
  6. You can also reuse pre-built entities like email, URL, date etc.
  7. The next step is to build intents, each representing a task or an action from an utterance (input from a user). By default, you will have None, which is for utterances irrelevant to your LUIS app.
  8. Once you have defined the series of intents, you need to add possible utterances against each intent, which form the basis of active learning. Make sure to include varied terminology and different phrases to help LUIS identify them. You can build a Phrase list to include words that must be treated similarly, like company names or phone models etc.
  9. As you write utterances, you need to identify or tag entities, like we selected $service-request in our utterance. Remember: you are identifying possible phrases to help LUIS extract intents and entities from utterances.
  10. The next step is to train your LUIS app to help it identify entities and intents from utterances. Ensure you click Train Application when you are done with enough training (you can also train on a per-entity or per-intent basis).
  11. You can repeat step 10 as many times as you like to ensure the LUIS app is trained well enough on your language model.
  12. Publish the app once you have identified all possible entities, intents and utterances, and have trained LUIS well enough to extract them from user input.
  13. Keep a note of the Programmatic API key from the MyKey section and the Application ID from the Settings menu of your LUIS app; you will need these two keys when integrating LUIS with your client application.

Now you are ready to go ahead and use your LUIS app in your Bot or any other client application to process natural language in a meaningful manner – Cheers!
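As a quick sanity check, the published endpoint can be queried directly over HTTP. A minimal PowerShell sketch, assuming the app is published to the westus region (the app ID, key and utterance are placeholders):

$appId = "<your LUIS Application ID>"
$key = "<your subscription key>"
$utterance = "I want to raise a service request"

# Query the published LUIS endpoint; verbose=true returns the top-scoring intent
$uri = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/$appId?subscription-key=$key&verbose=true&q=" + [uri]::EscapeDataString($utterance)
$result = Invoke-RestMethod -Uri $uri -Method Get
$result.topScoringIntent
$result.entities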

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself.  This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is:

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build the information required for the report
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. In concept, as described above, it is similar to the User Object report, which has the same prerequisites; I did a pretty good job of detailing those here. To implement this, that post is the required reading to get you ready.

Azure PowerShell Function App

My Function App script connects to the MIM Sync Server and retrieves the Management Agent statistics for the report.
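As a simplified sketch of that logic (not the full report script): the Sync Server name and credential values below are placeholder app settings of mine, and I’m using the Sync Server’s legacy WMI provider for the per-MA counts rather than the Lithnet module:

# Placeholders - in a real Function App these would come from Application Settings
$syncServer = $env:MIMSyncServer
$password = ConvertTo-SecureString $env:MIMSyncPassword -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ($env:MIMSyncUser, $password)

$session = New-PSSession -ComputerName $syncServer -Credential $creds
$maStats = Invoke-Command -Session $session -ScriptBlock {
    # Enumerate each Management Agent via the Sync Server's WMI provider
    Get-WmiObject -Namespace "root/MicrosoftIdentityIntegrationServer" -Class "MIIS_ManagementAgent" |
        ForEach-Object {
            [pscustomobject]@{
                Name          = $_.Name
                Type          = $_.Type
                CSObjects     = $_.NumCSObjects().ReturnValue
                Connectors    = $_.NumTotalConnectors().ReturnValue
                Disconnectors = $_.NumTotalDisconnectors().ReturnValue
            }
        }
}
Remove-PSSession $session

# Return the report as JSON for the NodeJS WebApp to render
$maStats | ConvertTo-Json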

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service services opens up a world of functionality, including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Quickly creating and using an Azure Key Vault with PowerShell

Introduction

A couple of weeks back I was messing around with Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up, and getting credentials in and out, was a little more cumbersome than I thought it should be. At that same point this tweet appeared in my Twitter timeline via a retweet. I’m not too sure why, but maybe because I’ve been migrating to VSCode myself, I checked out Axel’s project.

Tweet

Axel’s PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I’m creating vaults, adding and removing credentials in the simplified way I’d wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel’s module. If the Resource Group and/or Storage Account you want associated with your Key Vault don’t exist, then it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify if you want to use different names for each.

Done, Key Vault created.

Create Azure KeyVault

Key Vault Created

Connect to the Azure Key Vault

This script assumes you’re now in a new session and wanting to connect to the Key Vault. Again, a simplified version whereby the SG, RG and KV names are all the same.  Update for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

To retrieve a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name

Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault

Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credential

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.

Resolving ‘presence not up-to-date’ and ‘unable to dial in to conferences via PSTN’ issues in Lync 2013

 

I’ve recently been working with an SFB customer. I ran into a unique issue and would like to share the experience of what I did to solve the problem.

Issue Description: After SQL patching on the Lync servers, all users’ presence was not up-to-date and people were unable to dial in to scheduled conferences.

Investigation:

When I used the Lync Management Shell on one FE server to move a test user from SBA pool A to pool B, and then checked the user pool info on SBA pool A, the result still showed the test user under pool A. This indicated either that the FE Lync databases were not syncing with each other properly, or that there was database corruption.

I checked all the Lync FE servers: all the Lync services were running, and all looked good. I re-tested the conference scenarios; the PSTN conference bridge number was unavailable, while people could still make incoming/outgoing calls.

I decided to go back and check the logs on all the Lync FE servers. On one of them I got “Warning: Revocation status unknown. Cannot contact the revocation server specified in certificate”. Weird. Did this mean there was something wrong with the cert on this FE server? Unlikely: I didn’t see this error on the other FE server, and both FE servers are supposed to use the same certs, so it wasn’t a cert issue. Something was wrong with the FE server itself.

Next, I tried turning off all the Lync services on the problematic FE server to see if it made any difference. An interesting thing happened: once I did that, all users’ presence became up-to-date and the PSTN conference bridge number became available, and I could dial in from my mobile. This verified it was a server issue.

Root Cause:

What caused the cert error on the FE server? Which cert was used on it? I manually relaunched the deployment wizard to compare the certs between the two FE servers. I then noticed that the Lync server configuration was not up-to-date at the database store level. This was a surprise to me because there had been no change to the topology, so I had never thought to re-run the deployment wizard after the FE SQL patching. On the other FE server, which was working as expected, I could see green checks on each step of the deployment wizard. Bingo: I believed all the inconsistencies seen by users were related to the inconsistent SQL databases across the two FEs.


Solution:

Eventually, after the change request was approved by the CAB, re-running the deployment wizard to sync the SQL store and re-assigning the certs to the Lync services resolved the issue.

 

Hopefully this can help someone else who has similar issues.