Geographically Visualizing your workforce using Microsoft Identity Manager, xMatters and Power BI

Introduction

In the last couple of weeks I've posted about visualizing relationships of data from Microsoft Identity Manager using Power BI. Earlier this week I posted about building a Management Agent for Microsoft Identity Manager to integrate with xMatters.

In this post I combine the two in order to visualise the geographic office locations for an organisation, along with summary data about each location (how many employees are located there, and which departments).

Prerequisites

You'll need an Azure AD and Office 365 subscription to allow you to create a Power BI Application. To create a Power BI Application see Registering a Power BI Application in this post here.

You’ll also need the Power BI PowerShell Module. I’m using 2.0.0.9 available from the PowerShell Gallery here and of course the Lithnet MIIS PowerShell Module available from here.

Overview

Using our registered Power BI Application we’ll create a Dataset consisting of two tables. One for the xMatters Sites (that we also get the geographic co-ordinates of from the xMatters Management Agent), and the other with our xMatters Users that contains the officeLocation that maps to an xMatters Site.

I create a relationship between the two tables on xMattersSite displayName (which is the location name) and the xMattersUsers officeLocation. We can then create a nice visual using data from both tables.

Create the Dataset (two tables with relationship)

Initially I tried to create the dataset with a relationship as I've previously shown here. However that didn't work. After some trial and error using the Power BI API Explorer I got the result I wanted. So I'll provide you with the raw JSON format for creating a new Dataset, two tables (xMattersSites and xMattersUsers) and a relationship between them (where xMattersSites\displayName joins with xMattersUsers\officeLocation) as per my xMatters Management Agent detailed here.

Start by authenticating to the Power BI API Explorer with an account in the environment where you created your Power BI Application and navigate to the Create Dataset section here.

Create Dataset

Update this JSON formatted object that details the Dataset, Tables and Relationships for your environment.
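The object I used was embedded in the original post. As an indicative sketch only (the columns beyond displayName, geo and officeLocation are assumptions – adjust them to match your xMatters Management Agent schema), the structure the Create Dataset call expects looks like this, shown here as a PowerShell here-string:

$datasetJson = @"
{
  "name": "xMatters",
  "defaultMode": "Push",
  "tables": [
    {
      "name": "xMattersSites",
      "columns": [
        { "name": "displayName", "dataType": "string" },
        { "name": "geo", "dataType": "string" }
      ]
    },
    {
      "name": "xMattersUsers",
      "columns": [
        { "name": "accountName", "dataType": "string" },
        { "name": "department", "dataType": "string" },
        { "name": "officeLocation", "dataType": "string" }
      ]
    }
  ],
  "relationships": [
    {
      "name": "SiteToUsers",
      "fromTable": "xMattersUsers",
      "fromColumn": "officeLocation",
      "toTable": "xMattersSites",
      "toColumn": "displayName"
    }
  ]
}
"@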

Paste your validated JSON object into the Body section of the API Explorer and select Call Resource.

Dataset Body

If your JSON object is formatted correctly you'll get a 201 response and your Dataset and Tables with the Relationship will be created.

Create Success

Switching over to Power BI you'll see the xMatters Dataset in the bottom left, and the two tables on the right hand side with their columns.

xMatters DataSet PBI

Load xMatters User Data into Power BI

Now that we have somewhere to put the data, let's populate the dataset. I'm using the Lithnet MIIS Automation PowerShell Module (detailed in the prerequisites) to query the Metaverse and return all users. I then refine the list down to those that are Active (based on my employeeActive Boolean attribute) and finally to only those users that are connected on the xMatters Management Agent (see lines 14 & 18).

The script will drop any existing values from the xMatters Users table then upload what we have retrieved from the Metaverse (and refined).
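That script was embedded in the original post as a gist. The sketch below shows the general shape, assuming the Lithnet MIIS Automation and PowerBIPS modules, an existing auth token in $authToken, and Metaverse attribute names (employeeActive, accountName, department, officeLocation) specific to my environment – the attribute access syntax follows the Lithnet object model and may need tweaking for your module version:

Import-Module LithnetMIISAutomation
Import-Module PowerBIPS -RequiredVersion 2.0.0.9

# All Active users from the Metaverse
$queries = @()
$queries += New-MVQuery -Attribute employeeActive -Operator Equals -Value $true
$mvUsers = Get-MVObject -Queries $queries

# Only users joined to the xMatters Management Agent (property names may differ slightly between module versions)
$xMattersUsers = $mvUsers | Where-Object { $_.CSMVLinks.ManagementAgentName -contains "xMatters" }

# Shape rows to match the xMattersUsers table schema
$rows = foreach ($user in $xMattersUsers) {
    [pscustomobject]@{
        accountName    = $user.Attributes.accountName.Values.ValueString
        department     = $user.Attributes.department.Values.ValueString
        officeLocation = $user.Attributes.officeLocation.Values.ValueString
    }
}

# Drop the existing rows then upload the refreshed set
$dataSet = Get-PBIDataSet -AuthToken $authToken | Where-Object { $_.name -eq "xMatters" }
Clear-PBITableRows -AuthToken $authToken -DataSetId $dataSet.id -TableName "xMattersUsers"
Add-PBITableRows -AuthToken $authToken -DataSetId $dataSet.id -TableName "xMattersUsers" -Rows $rows -BatchSize 1000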

Upload Users

Load xMatters Site Data into Power BI

Again I'm using the Lithnet MIIS Automation PowerShell Module to query the Metaverse, this time returning all xMatters Sites.

The script will drop any existing values from the xMatters Sites table then upload what we have retrieved from the Metaverse.

Upload Sites

Creating the Power BI Visual

Now that we have data we can build the visual. I'm using the ArcGIS Maps for Power BI visual, which is available in the default set of visuals. Then by selecting displayName and geo the map will automagically show all xMatters Sites at their respective co-ordinates.

xMatters Sites to Map

We can then add a Card Visual, choose officeLocation and configure the visual for Count of officeLocation, and we'll get a count of the employees at that location. As we can see below, with the Sydney location selected from the map the card updates to tell me there are 665 employees at that officeLocation.

Count of Employees at Selected Location

Pretty quickly we can also expand out other data points, like departments at a location, employees etc as shown below (I’ve obfuscated the departments and a number of the other office locations).

Summary

Conclusion

We haven't generated any new data. We've taken information we already have in Microsoft Identity Manager from connected systems and quickly visualized it via Power BI. However, providing this to the business, with the ability for consumers of the information to export it from the visual, can be pretty powerful.

Building a FIM/MIM Management Agent for xMatters

Introduction

A couple of weeks ago one of my customers had a requirement to provision and manage identities into xMatters. The xMatters API documentation looked straight-forward and I figured it would be pretty quick to knock up a PowerShell Management Agent.

The identification of users (People) in xMatters was indeed straightforward. I was quickly able to enumerate all users (that had initially been seeded independently of FIM/MIM) and join them to corresponding users in the MetaVerse.

It was then as I started digging deeper that the relationship between Sites (Locations) and Email/Mobile (Devices) attributes became apparent. This post details how I approached it and a base xMatters MA that should get you started if you need to do something similar.

Overview

A key concept to keep in mind is that at the simplest level there are 3 key Object Types in xMatters;

  • People
    • User Objects along with basic naming attributes
  • Device
    • Each contact medium is a device. Email Address, Mobile Phone, Home Phone, Text Phone (SMS) etc.
  • Site
    • Location of the entity (person)

Associated with each is an id which can be either dynamically created on provisioning (by xMatters) or specified. For People there is also targetName which is the equivalent of UID/sAMAccountName. When using the API (for people) you can use either their ID or their targetName. For all other entities you need to use the ID.

For each entity, as you'd expect, there are different API URIs.
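The URIs themselves were embedded in the original post. Based on the xMatters REST API they take the form below – substitute your own xMatters instance name, and treat these as indicative rather than authoritative:

# Base URI for the xMatters REST API, plus the per-entity endpoints
$apiBase    = "https://yourcompany.xmatters.com/api/xm/1"
$peopleUri  = "$apiBase/people"
$devicesUri = "$apiBase/devices"
$sitesUri   = "$apiBase/sites"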

Finally to retrieve devices for a person use;
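Again only as an indicative form ($personId here is a placeholder for the person's id or targetName):

# Devices belonging to a specific person
$personDevicesUri = "$apiBase/people/$personId/devices"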

Other key points to consider that I uncovered are;

  • if you are updating a Device (e.g. someone's Email Address or Phone Number) don't specify the owner attribute (as you do when you create the Device). It considers that you are trying to change the owner and won't allow it.
  • to update a Device you need to know the ID of the Device. I catered for this on my Import by bringing through People and Device IDs.
  • when creating/updating a user's location you need to specify the Site ID and Site Name. I brought these through as a separate ObjectClass into FIM/MIM and query the MV for them when Exporting
  • in my initial testing the API returned a number of different errors: 400 (Bad Request), 409 (Conflict, when trying to Add a Device that already exists) and 404 (Not Found), along with API timeouts. You need to account for these and perform processing appropriately
  • on success of an Update, Create or Delete the API returns the full object that you performed the operation on. You need to capture this and let MIM know that a full object being returned indicates Success, not an error
  • xMatters expects phone numbers to be in E.164 format (e.g. +61 400 123 456). I catered for this on an import on another Management Agent
  • xMatters timezones are in the format of Country/Region. For Australia the mappings I used are as follows (and no, it doesn't accept Australia/Canberra for the ACT);
    • "NSW" = "Australia/Sydney"
      "VIC" = "Australia/Melbourne"
      "QLD" = "Australia/Brisbane"
      "ACT" = "Australia/Sydney"
      "WA"  = "Australia/Perth"
      "TAS" = "Australia/Hobart"
      "NT"  = "Australia/Darwin"

xMatters PowerShell Management Agent

With all that introduction, here is a base xMatters PowerShell MA (implemented using the Granfeldt PowerShell MA) to get you started. You'll need to tailor it for your environment, trigger Provisioning, Deletes and Flow Rules as appropriate, and handle the xMatters API for your integration.

Schema Script

I've created two Object Classes: User and Site. User incorporates the user's Devices; Site is the locations (Sites) from xMatters.
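The schema script itself was embedded in the original post. As a minimal sketch of the Granfeldt PSMA schema convention (the attribute names below are assumptions based on the description above – align them with your own flows):

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
$obj | Add-Member -Type NoteProperty -Name "targetName|String" -Value "jsmith"
$obj | Add-Member -Type NoteProperty -Name "firstName|String" -Value "John"
$obj | Add-Member -Type NoteProperty -Name "lastName|String" -Value "Smith"
$obj | Add-Member -Type NoteProperty -Name "emailAddress|String" -Value "jsmith@example.com"
$obj | Add-Member -Type NoteProperty -Name "emailID|String" -Value "guid"
$obj | Add-Member -Type NoteProperty -Name "mobileNumber|String" -Value "+61400123456"
$obj | Add-Member -Type NoteProperty -Name "mobileID|String" -Value "guid"
$obj | Add-Member -Type NoteProperty -Name "siteID|String" -Value "guid"
$obj

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Site"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "Sydney"
$obj | Add-Member -Type NoteProperty -Name "latitude|String" -Value "-33.86"
$obj | Add-Member -Type NoteProperty -Name "longitude|String" -Value "151.20"
$obj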

Import Script

Credentials for the Import script to connect to xMatters are flowed in from the Management Agent Username and Password attributes. This isn’t using Paged Imports. If you have a large number of users you may want to consider that. After retrieving all of the People entities each is queried to obtain their Devices. I’m only bringing through SMS and Email Devices. You’ll need to modify for additional Devices.

Ensure that you flow into the MetaVerse (onto custom attributes) the IDs associated with your Devices (e.g. MobileID and EmailID). That will allow you to use the ID when updating those attributes.

For Sites, I created a custom ObjectClass (Site) in the MV and used objectID of the SiteID and displayName for the Site Name (as shown below).

Attribute Flows

Export Script

This is where it gets a little more complicated. As PowerShell is not great at reporting web request responses, we have to deal with the return from each API call and determine whether we were successful or not, then let FIM/MIM know so it can report that via the UI.

The Export script below deals with Adding, Deleting and Updating users. Update line 31 for your API URI for xMatters.
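The full export script was embedded in the original post. Purely as an illustration of the success/error handling pattern described above (the URI, headers and body here are placeholders, not the complete MA logic):

try {
    # xMatters returns the full object on success, so a response body on a 2xx status means the operation worked
    $result = Invoke-RestMethod -Uri $url -Method Post -Headers $headers -Body $body -ContentType "application/json"
    $success = $true
}
catch {
    # 400 (Bad Request), 404 (Not Found), 409 (Conflict, e.g. adding a Device that already exists) and timeouts land here
    $statusCode = if ($_.Exception.Response) { [int]$_.Exception.Response.StatusCode } else { "none (timeout?)" }
    Write-Output "xMatters call failed. HTTP status: $statusCode. $($_.Exception.Message)"
    $success = $false
}
# $success is then used to tell FIM/MIM whether to report success or an error for this object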

Summary

The detail above will get you started and give you a working Management Agent to import Users and Sites. You'll need to do the usual steps (Set, Workflow, Sync Rule and MPR) to trigger Provisioning on the MA, and decide how you handle deletes.

 

Graphically Visualizing Identity Hierarchy and Relationships

Almost 15 years ago Microsoft released Microsoft Identity Integration Server (MIIS) 2003. Microsoft also released a couple of Resource Toolkits for MIIS to assist customers and IT Integrators implement the product as, up to that time, its predecessor (Microsoft Metadirectory Services) was only available as part of a Microsoft Consulting engagement.

At the same time Microsoft provided a Beta product – Microsoft PolyArchy Server. For someone whose brain is wired in a highly visual way, this was a wow moment. PolyArchy Server took a dataset from the Synchronisation Server and wrapped a small IIS website around it to expose intersecting relationships between data. When you selected a datapoint the visual would flip to the new context and display a list of entities associated with that relationship.

Microsoft proposed to deliver PolyArchy Server in calendar year 2006. However, the product never made it to market. The concept of visualizing identity data was seeded in my brain though, and it's something I've surfaced in one method or another as part of many Identity Management projects.

In this post I’ll detail how I’ve recently used Power BI to visualize relationship data from Microsoft Identity Manager.  The graphic below is an example (with node labels turned off) that represents Managers by Department by State.

Managers by Dept by State - Graphical

Using filters in the same report allows whoever is viewing the report to refine the visual based on State and Dept. By selecting a State from the map the visual will dynamically update to show that state only. Selecting a department only will show that department in each state.

Managers by Dept by State - Filtered

Hovering over the nodes will display the detail. I've turned the node labels off so as not to expose the source of my dataset.

Managers by Dept by State - NSW Detail

Getting MIM MV User MetaData into Power BI

My recent post here details the necessary steps to get started publishing data directly in a Power BI Dataset using PowerShell. Follow the details listed there to register a Power BI Application.

Creating the DataSet

With that done the script below will create a DataSet in Power BI. My dataset is obviously specific to the environment I developed it in. You probably won't have some of the attributes, so you will need to update accordingly. The script is designed to run on the MIM Sync Server, and the MIM Sync Server will need to be able to connect to Azure and Power BI.
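That script was embedded in the original post. A cut-down sketch of the same idea using the PowerBIPS module is below – the dataset and table names are placeholders, and the columns reflect the attributes used later in the report (State, managerName, accountName) plus a couple of assumptions:

Import-Module PowerBIPS -RequiredVersion 2.0.0.9

$clientID  = "<your Power BI Application Client ID>"
$authToken = Get-PBIAuthToken -ClientId $clientID

# Table schema as a nested hashtable, in the format New-PBIDataSet expects
$dataSetSchema = @{
    name   = "MIMUserRelationships"
    tables = @(
        @{
            name    = "Users"
            columns = @(
                @{ name = "accountName"; dataType = "String" }
                @{ name = "displayName"; dataType = "String" }
                @{ name = "department";  dataType = "String" }
                @{ name = "managerName"; dataType = "String" }
                @{ name = "State";       dataType = "String" }
            )
        }
    )
}

New-PBIDataSet -AuthToken $authToken -DataSet $dataSetSchema -Verbose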

Publish data to the DataSet

Now that we have a Power BI DataSet (Table) we need to extract the data from the MIM MV and push it into the table. Using the Lithnet MIIS Automation PowerShell Module makes this extremely simple. Using the table schema created above I retrieve the values for each Active User, build a PowerShell Object and use the Power BI PowerShell Module to push the data to Power BI.

Creating the Power BI Visualization

The visualisation I’m using is the Journey Chart by MAQ Software which is available in the Power BI Store (free).

Journey Visual

With the Journey Visualization selected and dropped in we just have to select the attributes we want to visualize and the order of the relationships. The screenshot below shows the data sorted by State => managerName => accountName with Measure Data being accountName.

Visual Config

Conclusion

We never got PolyArchy Server from Microsoft, but we can quickly visualize basic relationship data from MIM with Power BI.

Automate the update of the data into Power BI, embed the Power BI Reports into your MIM Portal and provide access to the appropriate personnel.

 

A modern way to track FIM/MIM Attribute Value History utilizing Power BI

Introduction

Microsoft Identity Manager is fantastic for keeping data consistent between connected systems. Often, however, you want to know what a previous value of an attribute was. FIM/MIM can only tell you the current value, the Management Agent it was received on, and when.

In the past, where I've had to provide a solution to either make sure an attribute has a unique value forever (e.g. email address or loginID – don't reuse email addresses or loginIDs) or just keep attribute value history, I've used two different approaches;

  • Store previous values in an SQL Table and have an SQL MA that flows out the values
  • Store historical values in a Multi-Valued attribute on the user object in the Metaverse

Both are valid approaches but often fall down when you want to quickly get a report on that metadata.

Recently we had a similar request to be able to know when Employees' EndDates were updated in HR. This is especially useful for contractors who have their contracts extended. Instead of stuffing the info into a Multi-Valued attribute or a SQL DB, this time I used Power BI. This provides the benefit of being able to quickly develop a graphical report and embed it in the FIM/MIM Portal.

Such a report looks like the screenshot below. Power BI Report

Using the filters on the right hand side of the report you can find a user (by EmployeeID or DisplayName), select them and see attribute value history details for that user in the main part of the report. As per the screenshot below Andrew’s EndDate was originally the 8th of December (as received on the 5th of November), but was changed to the 24th of November on the 13th of November.

End Date History

In this Post I describe how I quickly built the solution.

Overview

The process to do this involves;

  • creating a Power BI Application
  • creating a Power BI Dataset
  • creating a script to retrieve the data from the MV and inject it into the Power BI Dataset
  • creating a Power BI Report for the data
  • embedding the Report in the MIM Portal

Registering a Power BI Application

Head over to Power BI for Developers and Register an Application for Power BI. Log in to Power BI with an account for the tenant you'll be reporting data for. Give your Application a name and choose Native Application. Set the Redirect URL to https://localhost

CreatePBIApp

Choose the permissions for your Application. As we'll be writing data into Power BI you'll need a minimum of Read and Write all Datasets. Select Register App.

Create PBI App Permissions

Record your Client ID for your Application. We’ll need this to connect to Power BI.

Register the App

We need to authenticate to Power BI the first time using a UI to provide Authorization for our Application. In order to do that we need to add another Reply URL to our application. Head to the Apps Dev Portal, select your application and Edit the Application Manifest. Add an additional Reply URL for https://login.live.com/oauth20_desktop.srf as shown below.

Add Reply URL for AuthZ

The following PowerShell commands will then allow us to authenticate utilizing the Power BI PowerShell module. If you don't have the Power BI PowerShell Module installed, un-comment Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force to install it.

Update for your Client ID for the App you registered in the previous steps.

# Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# PowerBI App
$clientID = "4036df76-4de6-43cb-afe6-1234567890"

$authtoken = Get-PBIAuthToken -ClientId $clientID

Sign in with an account for the Tenant where you created the Power BI App.

Interactive Login for Dataset Creation

Accept the permissions you chose when registering the Power BI App.

Authorize PowerBI App

Creating the Power BI Dataset

Now we will create the Power BI Table (Dataset) that we will use when we insert the records.

My table is named Employee and the DataSet EmployeeEndDateReport. I'm keeping the table slim, with just enough info for our purpose: the date added to the dataset, the employee's AccountName, DisplayName, Active state, EndDate and EndDateReceived. The following script will create the Dataset.

Populating the Dataset

With our table created, let's populate the table with employees that have an EndDate. As this is the first time we run it, we set a watermark date to add people from. I've gone with the previous year. I then query the MV for Employees with an EndDate within the last 365 days, build a PowerShell Object with the columns from our table and insert them into Power BI. I also set a watermark of the last time we had an EndDate received from the MA and output that to the watermark file. This is so that next time we can quickly get only users that have an EndDate that was received since the last time we ran the process.

NOTE: for full automation you'll need to change line 6 for your secure method of choice for providing credentials to scripts.
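The population script itself was embedded in the original post (the NOTE above refers to it). Purely as a sketch of the overall flow – the attribute names, the watermark file path, and the assumption that EndDateReceived is surfaced as a Metaverse attribute are all placeholders to adjust for your environment:

Import-Module LithnetMIISAutomation
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# Watermark: the first run goes back a year, subsequent runs use the stored value
$watermarkFile = "C:\Scripts\EndDateWatermark.txt"
$watermark = if (Test-Path $watermarkFile) { Get-Date (Get-Content $watermarkFile) } else { (Get-Date).AddDays(-365) }

# Employees in the MV that have an EndDate
$queries = @()
$queries += New-MVQuery -Attribute EndDate -Operator IsPresent
$mvUsers = Get-MVObject -Queries $queries

# Build rows matching the Employee table, keeping only EndDates received since the watermark
$rows = foreach ($user in $mvUsers) {
    [pscustomobject]@{
        Date            = (Get-Date).ToString("s")
        AccountName     = $user.Attributes.accountName.Values.ValueString
        DisplayName     = $user.Attributes.displayName.Values.ValueString
        Active          = $user.Attributes.employeeActive.Values.ValueString
        EndDate         = $user.Attributes.EndDate.Values.ValueString
        EndDateReceived = $user.Attributes.EndDateReceived.Values.ValueString
    }
}
$rows = $rows | Where-Object { (Get-Date $_.EndDateReceived) -gt $watermark }

# Push to the Employee table and record the newest EndDateReceived as the next watermark
$dataSet = Get-PBIDataSet -AuthToken $authtoken | Where-Object { $_.name -eq "EmployeeEndDateReport" }
Add-PBITableRows -AuthToken $authtoken -DataSetId $dataSet.id -TableName "Employee" -Rows $rows -BatchSize 1000
($rows | Sort-Object { Get-Date $_.EndDateReceived } | Select-Object -Last 1).EndDateReceived | Out-File $watermarkFile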

Create a Power BI Report

Now in Power BI select your Dataset and design your report. Here is a sample one that I've put together. I simply selected the columns from the dataset and updated the look and feel. I then added AccountName, DisplayName and Active individually as Filters so that I have various ways of finding whoever I'm looking for.

Power BI Report

Once you have run the process for a while and values have changed for the attribute you are keeping history for, selecting a user with changed values will show their history.

End Date History

Summary

To complete the solution you’ll want to automate the script that queries the MV for changes (probably after each run from the MA that provides the attribute you are recording history for), and you’ll want to embed the report in the MIM Portal. In this post here I detail how to do that step by step.

 

MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, but also contains all data changes not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration“, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (Check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great, doesn’t require administrative access to install/use, doesn’t impact other installation of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config user.name "Service Account"
git config user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in Git GUI.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script
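The full Backup-MIMConfig.ps1 is in the gist cloned earlier. In outline only (the tool paths and cmdlet usage below are indicative – check the gist for the working version) it does something like this:

# Outline only – see the cloned gist for the complete Backup-MIMConfig.ps1
$env:PSModulePath = "C:\MIMBackup\Modules;$env:PSModulePath"
Import-Module FIMAutomation   # the Lithnet modules are used for further exports in the full script
New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe

Set-Location -Path C:\MIMBackup\MIMConfig

# MIM Service: export policy and schema configuration to XML
Export-FIMConfig -PolicyConfig -PortalConfig | ConvertFrom-FIMResource -File .\MIMServicePolicy.xml
Export-FIMConfig -SchemaConfig | ConvertFrom-FIMResource -File .\MIMServiceSchema.xml

# MIM Sync: export the server configuration (path/tool may differ in your install)
& "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin\svrexport.exe" C:\MIMBackup\MIMConfig\SyncConfig

# Commit whatever changed
git add --all
git commit --message "MIM configuration backup $(Get-Date -Format s)"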

‘Generic’ LDAP Connector for Azure AD Connect

I'm working for a large corporate that has a large user account store in Oracle Unified Directory (LDAP). They want to use these existing accounts and synchronise them to Azure Active Directory for Azure application services (such as future Office 365 services).

Microsoft state here that Azure Active Directory Connect (AAD Connect) will, in a 'Future Release' version, provide native LDAP support ("Connect to single on-premises LDAP directory"), so timing-wise I'm in a tricky position – do I guide my customer to attempt to use the current version (at the time of writing: v1.1.649.0), or wait for this 'future release' version?

This blog may not have a very large lifespan – indeed a new version of AAD Connect might be released at any time with native LDAP tree support, so be sure to research AAD Connect prior to providing a design or implementation.

My customer doesn’t have any requirement for ‘write back’ services (where data is written back from Azure Active Directory to the local directory user store) so this blog post covers just a straight export from the on-premises LDAP into Azure Active Directory.

I contacted Microsoft and they stated it’s supported ‘today’ to provide connectivity from AAD Connect to LDAP, so I’ve spun up a Proof of Concept (PoC) lab to determine how to get it working in this current version of AAD Connect.

Good news first, it works!  Bad news, it’s not very well documented so I’ve created this blog just to outline my learnings in getting it working for my PoC lab.

I don’t have access to the Oracle Unified Directory in my PoC lab, so I substituted in Active Directory Lightweight Directory Services (AD LDS) so my configuration reflects the use of AD LDS.

Tip #1 – You still need an Active Directory Domain Service (AD DS) for installation purposes

During the AAD Connect installation wizard (specifically the 'Connect your directories' page), it expects to connect to an AD DS forest to progress the installation. In this version of AAD Connect, the 'Directory Type' listbox only shows 'Active Directory' – which I expect will include more options when the 'Future Release' version is available.

I created a single Domain Controller (forest root domain) and used the local HOST file of my AAD Connect Windows Server to point the forest root domain FQDN e.g. ‘forestAD.internal’ to the IP address of that Domain Controller.

I did not need to ‘join’ my AAD Connect Windows Server to that domain to complete the installation, which will make it easier to decommission this AD DS (if it’s put into Production) if Microsoft releases an AAD Connect version that does not require AD DS.

Screen Shot 2017-10-25 at 10.39.34 am

Tip #2 – Your LDAP Connector service account needs to be a part of the LDAP tree

After getting AAD Connect installed with mostly the default options, I then went into the “Synchronization Engine” to install the ‘Generic LDAP’ connector that is available by clicking on the ‘Connector’ tab and clicking ‘Create’ under the ‘Actions’ heading on the right hand side:

Screen Shot 2017-11-01 at 11.50.41 am

For the 'User Name' field, I created and used an account (in this example: 'CN=FIMLDSAdmin,DC=domain,DC=internal') that was part of the LDAP tree itself instead of the installation account created by AD LDS, which is part of the local Windows Server user store.

If I tried to use a local Windows Server user, i.e. 'Server\username', it would just give me generic connection errors and not bind to the LDAP tree, even if that user had full administrative rights to the AD LDS tree. I gave up troubleshooting this – so I've resigned myself to needing a service account in the LDAP tree itself.

Screen Shot 2017-11-03 at 9.50.55 am

Tip #3 – Copy (and modify) the existing Inbound Rules for AD

When you create a custom Connector, the data mapping rules for that Connector are taken from the 'Synchronization Rules Editor' program and are not located in the Synchronization Engine. There are basically two choices at the moment for a custom connector: copy existing rules from another Connector, or create a brand new (blank) rule set for that connector (per object type e.g. 'User').

I chose the first option – I copied the three existing ‘Inbound’ Rules from the AD DS connector:

  • In from AD – User Join
  • In from AD – User AccountEnabled
  • In from AD – User Common

If you 'edit' each of these existing AD DS rules, you'll get a choice to create a copy of that rule set and disable the original. Selecting 'Yes' will create a copy of that rule, and you can then modify the 'Connected System' to use the LDAP Connector instead of the AD DS Connector:

Screen Shot 2017-11-03 at 10.10.51 am

I also modified the priority numbering of each of the rules from ‘201’ to ‘203’ to have these new rules applied last in my Synchronization Engine.

I ended up with the configuration of these three new ‘cloned’ rules for my LDAP Connector:

Screen Shot 2017-11-03 at 10.04.27 am

I found I had to edit or remove any rules that required the following:

  • For any rule looking for 'sAMAccountName', I modified the text of the rule to look for 'UID' instead (which is the schema attribute name most closely resembling that account name field in my custom LDAP)
  • I deleted the following rule from the Transformation section of the ‘In from AD – User Join’ cloned rule.  I found that it was preventing any of my LDAP accounts reaching Azure AD:

Screen Shot 2017-11-03 at 10.32.39 am

  • In the 'In from AD – User AccountEnabled' Rule, I modified the existing Scoping Filter to not look at the 'userAccountControl' bit and instead use the AD LDS attribute 'msDS-UserAccountDisabled = FALSE':

Screen Shot 2017-11-03 at 10.35.36 am

There are obviously many, many ways of configuring the rules for your LDAP tree but I thought I'd share how I did it with AD LDS. The reason I 'cloned' existing rules was primarily to protect the data integrity of Azure AD. There are many default data mapping rules for Azure AD that come with the AD DS rule set – a lot of them use 'TRIM' and 'LEFT' functions to ensure the data reaches Azure AD with the correct formatting.

It will be interesting to see how Microsoft tackles these rule sets in a more 'wizard' driven approach – particularly since LDAP trees can be highly customised with unique attribute names and data approaches.

Before closing the 'Synchronization Rules Editor', don't forget to 're-enable' each of the (e.g. AD DS) Connector rules you previously cloned, as the editor disabled the originals when you created the copies. Select the original rule you cloned, and uncheck the 'Disabled' box.

Tip #4 – Create ‘Delta’ and ‘Full’ Run Profiles

Lastly, you might be wondering: how does the AAD Connect Scheduler (the one based entirely in PowerShell with seemingly no customisation commands) pick up the new LDAP Connector?

Well, it’s simply a matter of naming your ‘Run Profiles’ in the Synchronization Engine with the text: ‘Delta’ and ‘Full’ where required.  Select ‘Configure Run Profiles’ in the Engine for your LDAP Connector:

Screen Shot 2017-11-03 at 10.16.29 am

I then created ‘Run Profiles’ with the same naming convention as the ones created for AD DS and Azure AD:

Screen Shot 2017-11-03 at 10.17.58 am

Next time I ran an ‘Initial’ (which executes ‘Full Import’ and ‘Full Sync.’ jobs) or a ‘Delta’ AD Scheduler job (I’ve previously blogged about the AD Scheduler, but you can find the official Microsoft doc on it here), my new LDAP Connector Run Profiles were executed automatically along with the AD DS and AAD Connector Run Profiles:

Screen Shot 2017-11-03 at 10.19.57 am
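For completeness, the scheduler itself can be inspected and triggered with the standard ADSync PowerShell cmdlets, for example:

Import-Module ADSync

# Show the scheduler configuration, including the next scheduled sync cycle
Get-ADSyncScheduler

# Trigger a delta cycle now (use -PolicyType Initial for a full import / full synchronization pass)
Start-ADSyncSyncCycle -PolicyType Delta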

Before I finish up, my colleague David Minnelli has found IDAMPundit's blog post about a current bug upgrading to AAD Connect v1.1.649.0 if you already have an LDAP Connector. In a nutshell, just open up the existing LDAP Connector, step through each page and re-save it to clear the issue.

** Edit: I have had someone query how I’m authenticating with these accounts, well I’m leveraging an existing SecureAuth service that uses the WS-Federation protocol to communicate with Azure AD.  So ‘Federation’ basically – I’m not extracting passwords out of this LDAP or doing any kind of password hash synchronization.  Thanks for the question though!

Hope this helped, good luck!

 

Validate Your Authoritative Sources – Creating a Fuse for FIM/MIM Import and Sync run cycles

 

Introduction

The Microsoft Identity Manager Synchronisation Engine has been around for close to 20 years and is highly functional and very reliable.

The Achilles heel though for any IDAM Sync Engine will always be an authoritative source and the information it provides to the Sync Engine.

I'm seeing more and more SaaS services being used as the Authoritative Source for identity management systems. Think Success Factors and Workday. Connecting to these across the internet, the rate of change within organisations, the volume of change data I'm seeing, and the common human factor of en-masse changes all mean it is even more important to validate your import feeds before processing them through your Sync Engine's business logic.

Overview

This post details the foundation of a little logic that will call an Import from the Authoritative Source and analyse what is returned, evaluate it against the previous Import to understand the number of objects expected, and determine if it is within an acceptable tolerance (I'm using changes of 1% of total managed objects). If it doesn't check out, don't run a Sync; instead send an email to someone to check it out and make a rational human decision (and maybe run a manual sync). If the Import is valid then run the Sync.

Simply put;

  • Query the Authoritative Management Agent and get the last Import Run
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
  • Run an Import cycle only
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
    • Evaluate each of the Staging Adds, Renames, Updates and Deletes and see if the number of changes is less than the tolerance (1%)
      • if yes proceed with a Sync Run
      • if no send a notification and don’t run the Sync

Enhancements

This will need to be tailored for each environment. What is normal for the number of changes expected in your environment? You may see a lot of Updates and the global 1% tolerance I have here doesn’t work for that. So you may want a tolerance per Adds, Renames, Updates and Deletes.

Implementation

Where I’ve used this, I’ve saved the PowerShell script below into the same directory as the other scripts that automate the MIM Sync Run Sequence. I’ve updated the previous automation script and removed the Authoritative Delta Import / Delta Sync, Full Import / Delta Sync etc and called this Auth Fuse Script instead.

The Script

As always this uses the awesome LithnetMIISAutomation PowerShell Module from Ryan. Update;

  • lines 5-11 for your Auth Source.
  • lines 20-24 for SMTP Server and Notification settings
  • $smtpBody lines for what you want the notification emails to say
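The script itself is embedded above (the line numbers refer to it). The core of the 'fuse' is just the tolerance comparison and the branch, which looks something like the sketch below – the staging counts and the total come from the run statistics of the Import you just executed, and are shown here as placeholder variables:

# Placeholders – populate these from the last Import run statistics of your Authoritative Source MA
# ($stagingAdds, $stagingRenames, $stagingUpdates, $stagingDeletes, $totalManagedObjects)

$tolerance = [math]::Ceiling($totalManagedObjects * 0.01)   # 1% of total managed objects

$withinTolerance = ($stagingAdds    -le $tolerance) -and
                   ($stagingRenames -le $tolerance) -and
                   ($stagingUpdates -le $tolerance) -and
                   ($stagingDeletes -le $tolerance)

if ($withinTolerance) {
    # Within tolerance – safe to continue; kick off your normal Sync run profile here
    Write-Output "Import within tolerance ($tolerance). Proceeding with Sync."
}
else {
    # Outside tolerance – notify a human and do not run the Sync
    Send-MailMessage -SmtpServer $smtpServer -From $smtpFrom -To $smtpTo `
        -Subject "MIM Import fuse tripped for $authSourceMA" `
        -Body "Staging changes exceeded the 1% tolerance ($tolerance). Review the Import before running a Sync."
}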

Summary

A simple piece of logic to check and validate your imports can save hours/days of work.

If your Auth Source doesn’t provide a full dataset and you haven’t checked your import before processing you could be deleting a LOT of accounts.

If HR changed the Org Structure and didn’t inform you or take into account IDAM Business Logic you could be about to process a lot of AD Account Moves. If it involved redundancies and they haven’t informed anyone yet, you could be exposing that information to an entire organisation. CHECK AND VALIDATE YOUR AUTHORITATIVE SOURCE IMPORTS !!

Continuous Credential Prompt when accessing MIM Password Registration Portal

First published at https://nivleshc.wordpress.com

Recently I was at a customer site, setting up a Microsoft Identity Manager (MIM) 2016 environment, which included the deployment of the Self Service Password Registration and Self Service Password Reset portals. For additional security, I was using Kerberos instead of the default NTLM.

I finished installing the MIM Portal, Service, Password Registration and Password Reset Portals without any issues.

I then proceeded to secure all http endpoints by enabling them for SSL and after that removing the http bindings, so that you could access the MIM Portal, Password Registration and Password Reset Portals only via https. No issues there either.

By this time I was pretty pleased with myself 😉 Everything was going as planned, no issues faced at all. Finally, lady luck was showering me with her blessings.

Having finished the installation and configuration, I proceeded to testing the solution.

The first thing to check was the MIM Portal site. I opened up a web browser and navigated to the Microsoft Identity Manager Portal. When prompted, I logged in with the mimadmin domain account credentials. I was successfully logged in and could access all the parts without any hitch.

Now kids, if you are faint at heart, be very wary of what happens next (hint. this is the time when you cover your eyes with your hands when watching a horror movie).

I then tried accessing the Self Service Password Registration Portal and got prompted for credentials.

SSPRegistration_CredentialPrompt

I entered the mimadmin account credentials and pressed enter. Just as I thought I had successfully logged in, the credential prompt returned! hmm, that is weird. I was pretty sure I had typed the username and password correctly. Oh well, maybe I didn’t.

I typed the credentials again and pressed enter. Quick as a flash, the credential prompt returned! Uh? What was happening here?{scratching my head} Hmm, I seem to be making a lot of typos today. I carefully entered the username and password again, taking my time this time, to ensure I was entering it correctly. I then pressed enter and waited.

Well, I didn’t have to wait for long since within a second, I got greeted with the Not Authorized screen!

SSPRegistration_NotAuthorised

Fascinating. It seems that lady luck had flown away, because here indeed was an issue with the Self Service Password Registration Portal! Ok, Mister. Let's have a look at what's causing this kerfuffle!

I opened up the event viewer on the Self Service Password Registration server and went through each of the logged events in the Application and System logs, however I couldn’t find any clue as to why the credentials were not working. I secretly had suspicions that the issue could be due to Kerberos token errors, however I couldn’t find anything in the event logs to substantiate my suspicion. Hmm, the plot was indeed getting thicker!

I next started doing some Google searches, thinking that someone else might have encountered the same issue. Alas, it seemed that I was alone in my woes as the results seemed to be quite thin in regards to any possible solution for my issue.

Finally, I decided to follow my dear ol’ Sherlock’s advice “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”

I went through the whole Self Service Password Registration setup process, checking each and every part of the configuration, to ensure that the values were as expected. After 10 minutes, I was almost done checking and no clues so far 😦

Lastly, I opened IIS Manager and checked all the settings. Nothing here as well. Hey back up a bit. What is this?

The Self Service Password Registration Portal site had its useAppPoolCredentials set to False.

SSPRegistration_useAppPoolCredentialsFalse

Now, this should be True! Is this what was causing the issue?

I quickly changed the value for useAppPoolCredentials from False to True.

SSPRegistration_useAppPoolCredentialsTrue
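For reference, the same change can also be scripted rather than made through IIS Manager – a hedged example using the WebAdministration module (adjust the site name for your Password Registration site):

Import-Module WebAdministration

# Set useAppPoolCredentials to True for Windows Authentication on the Password Registration site
Set-WebConfigurationProperty -PSPath "MACHINE/WEBROOT/APPHOST" -Location "MIM Password Registration Site" `
    -Filter "system.webServer/security/authentication/windowsAuthentication" `
    -Name "useAppPoolCredentials" -Value $true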

I then opened my web browser again and navigated to the Self Service Password Registration Portal. Once again the familiar credential prompt came up. I entered the same credentials as before and pressed enter.

Woo hoo!! This time around I was successfully logged in.

I sincerely hope that this post helps others who might be encountering the same error.

Have a great day 😉

Automatically Provision Azure AD B2B Guest Accounts

Azure ‘Business to Business’ (or the catchy acronym ‘B2B’) has been an area of significant development in the last 12 months when it comes to providing access to Azure based applications and services to identities outside an organisation’s tenancy.

Recently, Ryan Murphy (who has contributed to this blog) and I were tasked with providing an identity based architecture to share Dynamics 365 services within a large organisation, but across two 'internal' Azure AD tenancies.

Dynamics 365 takes its identity store from Azure AD; if you're assigned a license for Dynamics 365 in the Azure Portal, including in a 'B2B' scenario, you're granted access to the Dynamics 365 application (as outlined here). Further information about how Dynamics 365 concepts integrate with a single or multiple Azure tenancies is outlined here.

Microsoft provide extensive guidance on completing a typical 'invitation' based scenario using the Azure portal (using the links above). Essentially this involves inviting users via an email, which relies on that person manually clicking the embedded link inside the email to complete the acceptance (and the 'guest account' creation in the Dynamics 365 service linked to that Azure AD).

However, this obviously won't scale when you need to invite thousands of new users initially, and then repeatedly invite new users as part of a Business-As-Usual (BAU) process as they join the organisation ('identity churn' as we say).

Therefore, to automate the creation of new Guest User Azure AD accounts, without involving the user at all, this process can be followed:

  1. Create a ‘service account’ Guest User from the invited Azure AD (has to have the same UPN suffix as the users you’re inviting) to be a member of the resource Azure AD.
  2. Assign the ‘service account’ Guest User to be a member of the ‘Guest Inviter’ role of the resource Azure AD.
  3. Use PowerShell to automatically provision new Guest User accounts using the credentials of the 'service account' Guest User.

In this blog, we'll use the terms 'Resource Azure AD' or 'Resource Tenancy' for the location whose resources you're trying to share out, and 'Invited Azure AD' or 'Invited Tenancy' for the Azure AD where the user accounts (including usernames & passwords) you're inviting reside. The invited users only ever use their credentials in their own Azure AD or tenancy – never credentials of the 'Resource Azure AD or tenancy'. The 'Guest User' objects created in the 'Resource Tenancy' are essentially just linking objects without any stored password.

A 'Service Account' Azure AD account dedicated solely to the automatic creation of Guest Users in the Resource Tenancy will need to be created first in the 'Invited Azure AD' – for this blog, we used an existing Azure AD account sourced from a synchronised local Active Directory. This account did not have any 'special' permissions in the 'Invited Azure AD', although according to some blogs it requires at least 'read' access to the user store in the 'Invited Azure AD' (which is the default).

This ‘Service Account’ Azure AD account should have a mailbox associated with it, i.e. either an Exchange Online (Office 365) mailbox, or a mail value that has a valid SMTP address for a remote mailbox.  This mailbox is needed to approve the creation of a Guest User account in the Resource Tenancy (only needed for this individual Service Account).

It is strongly recommended that this 'Service Account' user in the 'Invited Azure AD' has a very strong & complex password, and that any credential used for that account within a PowerShell script is encrypted, for example using the approach in David Lee's blog.
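As one common way of doing that (not necessarily the exact method from the referenced post), the credential can be stored DPAPI-encrypted under the account that will run the script and re-hydrated at run time:

# Run once, interactively, as the account that will execute the provisioning script
Get-Credential | Export-Clixml -Path "C:\Scripts\B2BInviter.cred"

# Then, inside the provisioning script
$creds = Import-Clixml -Path "C:\Scripts\B2BInviter.cred"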

The PowerShell scripts listed below to create these Guest accounts could then be actioned by an identity management system e.g. Microsoft Identity Manager (MIM) or a 'Runbook' or workflow system (e.g. SharePoint).

 

Task 1: Create the ‘Service Account’ Guest User using PowerShell

Step 1: Sign into the Azure AD Resource Tenancy’s web portal: ‘portal.azure.com’, using a Global Admin credential.

Step 2:  When you’re signed in, click on the account profile picture on the top right of the screen and select the correct ‘Resource Tenancy’ (There could be more than one tenant associated with the account you’re using):

Screenshot 2017-09-19 at 9.34.18 AM

Step 3:  Once the tenancy is selected, click on the ‘Azure Active Directory’ link on the left pane.

Step 4:  Click ‘User Settings’ and verify the setting (which is set by default for new Azure AD tenancies):  ‘Members can invite’.

Screenshot 2017-09-19 11.31.51

Step 5:  Using a new PowerShell session, connect and authenticate to the Azure AD tenancy where the Guest User accounts are required to be created into (i.e. the ‘Resource Azure AD’).

Be sure to specify the correct ‘Tenant ID’ of the ‘Resource Azure AD’ using the PowerShell switch ‘-TenantId‘ followed by the GUID value of your tenancy (to find that Tenant ID, follow the instructions here).

$Creds = Get-Credential

Connect-AzureAD -Credential $creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

 

Step 6:  The following PowerShell command should be executed under a ‘Global Admin’ to create the ‘Service Account’ e.g. ‘serviceaccount@invitedtenancy.com’.

 

New-AzureADMSInvitation -InvitedUserDisplayName "Service Account Guest Inviter" -InvitedUserEmailAddress "serviceaccount@invitedtenancy.com" -SendInvitationMessage $true -InviteRedirectUrl http://myapps.microsoft.com -InvitedUserType member

 

Step 7:  The 'Service Account' user will then need to locate the email invitation sent out by this command and click on the link embedded within it to authorise the creation of the Guest User object in the 'Resource Tenancy'.

 

Task 2: Assign the ‘Service Account’ Guest Inviter Role using Azure Portal

Step 1:  Sign into the Azure web portal: ‘portal.azure.com’ with the same ‘Global Admin’ (or lower permission account) credential used in Task 1 (or re-use the same ‘Global Admin’ session from Task 1).

Step 2:  Click on the ‘Azure Active Directory’ shortcut on the left pane of the Azure Portal.

Step 3:  Click on the ‘All Users’ tab and select the ‘Service Account’ Guest User.

(I'm using 'demo.microsoft.com' pre-canned identities in the screenshot below; any similarity to real persons is purely coincidental – an image for 'serviceaccount@invitedtenancy' used as the example in Task 1 could not be reproduced)

Screenshot 2017-09-19 09.44.36

Step 4:  Once the ‘Service Account’ user is selected, click on the ‘Directory Role’ on the left pane.  Click to change their ‘Directory Role’ type to ‘Limited administrator’ and select ‘Guest Inviter’ below that radio button.  Click the ‘Save’ button.

Screenshot 2017-09-19 09.43.53

Step 5:  The next step is to test to ensure that the 'Service Account' Guest User account can invite users from the same 'UPN/Domain suffix'. Click on the 'Azure Active Directory' link on the left pane of the main Azure Portal.

Step 6:  Click 'Users and groups' and click 'Add a guest user' on the right:

Screenshot 2017-09-19 09.36.02

Step 7:  On the 'Invite a guest' screen, send an email invitation to a user from the same Azure AD as the 'Service Account' Guest User. For example, if your 'Service Account' Guest User UPN/Domain suffix is 'serviceaccount@remotetenant.com', then invite a user from the same UPN/domain suffix e.g. 'jim@remotetenant.com' (again, only an example – any match to a current or future email address is purely coincidental).

Screenshot 2017-09-19 09.36.03

Step 8:  When the user receives the invitation email, ensure that the following text appears at the bottom of the email:  ‘There is no action required from you at this time’:

image002

Step 9:  If that works, then PowerShell can now automate that invitation process bypassing the need for emails to be sent out.  Automatic Guest Account creation can now leverage the ‘Service Account’ Guest User.

NOTE:  If you try to invite a user with a UPN/Domain suffix that does not match the 'Service Account' Guest User, the invitation will still be sent but it will require the user to accept the invitation. The invitation will be in a 'pending acceptance' state until that is done, and the Guest User object will not be created until that is completed.

Task 3:  Automatically Provision new Guest User accounts using PowerShell

Step 1:  Open Windows PowerShell (or re-use an existing PowerShell session that has rights to the ‘Resource Tenancy’).

Step 2:  Type the following example PowerShell commands to send an invitation out, and authenticate when prompted using the 'Invited Tenancy' credentials of the 'Service Account' Guest User.

In the script, again be sure to specify the Tenant ID of the 'Resource Tenancy' (not the 'Invited Tenancy') for the -TenantId switch.

 

#Connect to Azure AD

$Creds = Get-Credential

Connect-AzureAD -Credential $creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

$messageInfo = New-Object Microsoft.Open.MSGraph.Model.InvitedUserMessageInfo

$messageInfo.customizedMessageBody = "Hey there! Check this out. I created and approved my own invitation through PowerShell"

New-AzureADMSInvitation -InvitedUserEmailAddress "ted@invitedtenancy.com" -InvitedUserDisplayName "Ted at Invited Tenancy" -InviteRedirectUrl https://myapps.microsoft.com -InvitedUserMessageInfo $messageInfo -SendInvitationMessage $false

 

Compared to using the Azure portal, this time no email will be sent (the display name and message body will never be seen by the invited user; they're just required for the command to complete). To send a confirmation email to the user, you can change the switch -SendInvitationMessage to $True.

 

Step 3:  The output of the PowerShell command should show 'Accepted' next to 'Status':

image001

This means the Guest User object has automatically been created and approved by the 'Resource Tenancy'. That Guest User object will be associated with the actual Azure AD user object from the 'Invited Tenancy'.

The next steps for this invited Guest User will be then to assign them a Dynamics 365 license and then a Dynamics 365 role in the ‘Resource Tenancy’ (which might be topics of future blogs).

Hope this blog has proven useful.

 

Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken and egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when working with Virtual Machines, Web Apps and Azure Functions, that meant having to implement methods to obfuscate the credentials that were stored within them. I touched on one method that I've used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it, then from the main pane select the Platform Features tab and select Managed service identity.

Platform Features

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

ManagedServiceIdentity

Back in Platform Features under General Settings select Application Settings. 

General Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don't see the Managed Service Identity variables there, so let's keep digging.

App Settings

Under Platform Features select Console.

DevelopmentTools

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren't immediately available in my environment. The next morning they were present. So I'm assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and it takes more than a couple of hours.

Endpoint

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a new Function. I'm using an HttpTrigger PowerShell Function. Give it a name and select Create.

NewFunction

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Vars

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Vault

Select Add new, select Principal, locate your Function App and click Select.

Access Policy 1

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Policy - GET

We now have our Function App enabled to access the Key Vault.

Access Policy 2

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Vault Secret URI

Function App Script

Here is my Sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update for your Key Vault Secret that you want to retrieve. Ensure you still have the API version at the end (which isn’t in the URI you copy from the Key Vault) /?api-version=2015-06-01
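That script was embedded in the original post (hence the reference to line 12). As a sketch of the pattern it follows – the vault name and secret below are placeholders, and the MSI token endpoint api-version shown is the App Service one that I believe applies here:

# Request a Key Vault access token from the local Managed Service Identity endpoint
$apiVersion   = "2017-09-01"
$resourceURI  = "https://vault.azure.net"
$tokenAuthURI = "$($env:MSI_ENDPOINT)?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Uri $tokenAuthURI -Headers @{ "Secret" = $env:MSI_SECRET }
$accessToken   = $tokenResponse.access_token

# Retrieve the secret – update this line for the Secret Identifier you copied from your Key Vault
$secretURI  = "https://myvault.vault.azure.net/secrets/mysecret/?api-version=2015-06-01"
$credential = Invoke-RestMethod -Method Get -Uri $secretURI -Headers @{ "Authorization" = "Bearer $accessToken" }
$credential.value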

When run, if you have everything correct, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.