Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Integration Server (MIIS) 2003 Resource Toolkit (which only documented the Sync Server configuration).

Running the tool (a PowerShell module) against a base out-of-the-box reference configuration for FIM/MIM servers, reconciled against a configuration exported from an implementation's MIM Sync and Service servers, generates an HTML report that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I've automated another daily task for a large development environment where a number of changes are going on. I wanted documentation generated each day that detailed the configuration: partly to be able to quickly work out what has changed when rolling back or re-validating changes, and partly to retain the individual configs from each day so they could be used if we need to roll back.

The process uses an Azure Function App that uses Remote PowerShell into MIM to;

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied to the daily config/documented solution (a sketch of this orchestration follows below)
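
For illustration, the orchestration inside the Function App looks conceptually like this. This is a sketch only; the server name, paths and credential handling are placeholders rather than my exact script;

$username = $env:MIMSyncCredUser
$password = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
$session = New-PSSession -ComputerName "yourmimsyncserver.yourdomain.com" -Port 5986 -UseSSL -Credential $cred

Invoke-Command -Session $session -ScriptBlock {
    # Create the daily config sub-directory under the Documenter Data folder
    $dailyPath = "C:\FIMDoco\Data\Customer\Dev\$(Get-Date -Format dd-MM-yyyy)"
    if (-not (Test-Path $dailyPath)) { New-Item -ItemType Directory -Path $dailyPath | Out-Null }
    # Export the MIM Service and Sync configs into $dailyPath (per the backup post),
    # then run the Documenter script edited below
    & "C:\FIMDoco\Invoke-Documenter-Contoso.ps1"
}
Remove-PSSession $session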

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example, my Dev environment has the MIM Sync and Service/Portal all on a single server.

Then update the Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes;

  • Update the following lines for your version, include the new variable $schedulePath, and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Remark out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Remark out the last line as this will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process;

  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
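(Azure Functions uses the six-field NCRONTAB format of second, minute, hour, day, month and day-of-week, so this expression fires at 6:00:00 am each day.)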
  • I also needed to increase the timeout for the Azure Function, as the generation of the files and the execution of the report exceeded the default timeout of 5 mins in my environment (19 Management Agents). I increased the timeout to the maximum of 10 mins as detailed here, essentially adding the following to the host.json file in the wwwroot directory of my Function App.
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below to change references to http://localhost:5725 to whatever your MIM Service host is.
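
The full Run.ps1 is specific to my environment, but the core MIM Service export inside the remote session is the standard FIMAutomation pattern, along the lines of the sketch below. The Lithnet sync-export cmdlet name and the paths are assumptions to verify against your module versions;

Add-PSSnapin FIMAutomation
# Export the MIM Service policy and schema configs for the Documenter
$policy = Export-FIMConfig -Uri http://localhost:5725 -PolicyConfig -MessageSize 9999999
$policy | ConvertFrom-FIMResource -File "$dailyPath\Policy.xml"
$schema = Export-FIMConfig -Uri http://localhost:5725 -SchemaConfig -MessageSize 9999999
$schema | ConvertFrom-FIMResource -File "$dailyPath\Schema.xml"
# Export the Sync Server configuration via the Lithnet MIIS Automation module
Import-Module LithnetMIISAutomation
Export-MIMSyncConfiguration -Path "$dailyPath\SyncConfig"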

Testing the Function App

With everything configured, manually running the Function App and checking the output window will show success in the Logs (as shown below) if you've configured everything correctly. In this environment with 19 Management Agents it takes 7 minutes to run.

Running the Azure Function.PNG

The Report

The outcome every day just after 6am is that I have (via automation);

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • the MIM Configuration Documenter Report generated
  • the ability to rollback changes on a daily interval if I need to (either for a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\Report directory is the HTML Configuration Report.

Report Output.PNG

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report

 

Provisioning Hybrid Exchange/Exchange Online Mailboxes with Microsoft Identity Manager

Introduction

Working for Kloud, all our projects involve cloud services, and all our customers have varying and unique requirements. Recently one of our customers embarked on their migration from On-Premise Exchange to Exchange Online. Nothing groundbreaking there, however they had a number of unique requirements, including management of Litigation Hold, and that needed to be integrated with their existing Microsoft Identity Manager implementation (which currently provisions new users to their Exchange 2013 environment). They also required that management of the Exchange environment still be possible via the Exchange Management Console against a local Exchange server. This post details how I integrated the environments using MIM.

Overview

In order to integrate the provisioning and lifecycle management of Exchange Online mailboxes in a Hybrid Exchange topology with Microsoft Identity Manager, I created a custom PowerShell Management Agent, simply because it provides the flexibility I needed.

Provisioning is based on the following process;

  1. MIM Creates new user in Active Directory (no changes to existing MIM provisioning process)
  2. Azure Active Directory Connect synchronises the user to Azure Active Directory
  3. The Exchange Online MIM Management Agent sees the corresponding AAD account for the new user
  4. MIM Declarative Rules trigger the creation of a new Remote Mailbox for the AD/AAD user against the local Exchange 2013 On Premise Server. This allows the EMC to be used to manage mailboxes On Premise even though the mailbox resides in Office365/Exchange Online
  5. AADC/Exchange synchronises the information as part of the Hybrid Exchange topology
  6. MIM sees the EXO Mailbox configuration for the new user and enables Litigation Hold against the EXO Mailbox (if required)

The following diagram graphically depicts this process.

EXO IDM Provisioning Solution.png

Exchange Online PowerShell MA

As always I'm using my favourite PowerShell Management Agent, the Granfeldt PS MA, now available on Github here.

Schema Script

The Schema script configures the schema required for current and future EXO management requirements. The Schema is based on a single Object Class "MailUser", but pulls the information from a combination of the Azure AD User and Exchange Online Mailbox object classes for an associated account. Azure AD User attributes are prefixed with 'AAD'. Non-AAD-prefixed attributes are EXO Mailbox attributes.

Import Script

The Import script connects to both Azure AD and Exchange Online to retrieve Azure AD User accounts and if present the associated mailbox for a user.

It retrieves all Member AAD User Accounts and puts them into a Hash Table. Connectivity to AAD is via the AzureADPreview PowerShell module. It retrieves all Mailboxes and puts them into a Hash Table. It then processes all the mailboxes first including the associated AAD User account (utilising a join via userPrincipalName).

Following processing all mailboxes the remainder of the AAD Accounts (without mailboxes) are processed.
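
Conceptually, the hash-table join works like the sketch below. The attribute names are trimmed-down illustrations, and the connection plumbing to AAD and EXO is omitted;

# Index AAD users and EXO mailboxes by UserPrincipalName
$aadUsers = @{}
Get-AzureADUser -All $true | ForEach-Object { $aadUsers[$_.UserPrincipalName] = $_ }
$mailboxes = @{}
Get-Mailbox -ResultSize Unlimited | ForEach-Object { $mailboxes[$_.UserPrincipalName] = $_ }

# Mailboxes first (joined to their AAD user), then AAD users without mailboxes
foreach ($upn in $mailboxes.Keys) {
    $obj = @{}
    $obj.'AADDisplayName' = $aadUsers[$upn].DisplayName
    $obj.'LitigationHoldEnabled' = $mailboxes[$upn].LitigationHoldEnabled
    # ... emit $obj to the connector space
}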

Export Script

The Export script performs the necessary integration against the On-Premise Exchange 2013 Server for provisioning, and against Exchange Online for the rest of the management. Both utilise Remote PowerShell. It also leverages the Lithnet MIIS Automation PowerShell Module to query the Metaverse to validate current object statuses.
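
Conceptually, the two key operations in the Export script are sketched below. The identities and routing address are placeholders, and the remote session setup to On-Premise Exchange and EXO is omitted;

# On-Premise Exchange 2013: create the Remote Mailbox so EMC can still manage it
Enable-RemoteMailbox -Identity $samAccountName -RemoteRoutingAddress "$alias@yourtenant.mail.onmicrosoft.com"

# Exchange Online: once the mailbox appears, enable Litigation Hold if required
Set-Mailbox -Identity $userPrincipalName -LitigationHoldEnabled $true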

Wiring it all up

The scripts above will allow you to integrate a FIM/MIM implementation with AAD/EXO for management of users' EXO mailboxes. You'll need connectivity from the MIM Sync Server to AAD/O365 in order to manage them. Everything else I wired up using a few Sets, Workflows, Sync Rules and MPRs.

 

Geographically Visualizing your workforce using Microsoft Identity Manager, xMatters and Power BI

Introduction

In the last couple of weeks I've posted about visualizing relationships of data from Microsoft Identity Manager using Power BI. Earlier this week I posted about building a Management Agent for Microsoft Identity Manager to integrate with xMatters.

In this post I combine data from the last two in order to visualise the geographic office locations for an organisation, along with summary data about each location (how many employees are located there, and in which departments).

Prerequisites

You'll need an Azure AD and Office 365 subscription to allow you to create a Power BI Application. To create a Power BI Application see Registering a Power BI Application in this post here.

You’ll also need the Power BI PowerShell Module. I’m using 2.0.0.9 available from the PowerShell Gallery here and of course the Lithnet MIIS PowerShell Module available from here.

Overview

Using our registered Power BI Application we’ll create a Dataset consisting of two tables. One for the xMatters Sites (that we also get the geographic co-ordinates of from the xMatters Management Agent), and the other with our xMatters Users that contains the officeLocation that maps to an xMatters Site.

I create a relationship between the two tables on xMattersSite displayName (which is the location name) and the xMattersUsers officeLocation. We can then create a nice visual using data from both tables.

Create the Dataset (two tables with relationship)

Initially I tried to create the dataset with a relationship as I've previously shown here. However that didn't work. After some debugging and trial and error with the Power BI API Explorer I got the result I wanted. So I'll provide you with the raw JSON format for creating a new Dataset, two tables (xMattersSites and xMattersUsers) and a relationship between them (where xMattersSites\displayName joins with xMattersUsers\officeLocation) as per my xMatters Management Agent detailed here.

Start by authenticating to the Power BI API Explorer with an account in the environment where you created your Power BI Application and navigate to the Create Dataset section here.

Create Dataset

Update this JSON formatted object that details the Dataset, Tables and Relationships for your environment.
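
If the embedded JSON doesn't render for you, the request body has the shape below. The column names are illustrative assumptions based on my xMatters MA (I'm assuming separate latitude/longitude columns for the site co-ordinates); adjust for your attributes;

{
  "name": "xMatters",
  "defaultMode": "Push",
  "tables": [
    {
      "name": "xMattersSites",
      "columns": [
        { "name": "displayName", "dataType": "string" },
        { "name": "latitude", "dataType": "double" },
        { "name": "longitude", "dataType": "double" }
      ]
    },
    {
      "name": "xMattersUsers",
      "columns": [
        { "name": "targetName", "dataType": "string" },
        { "name": "officeLocation", "dataType": "string" },
        { "name": "department", "dataType": "string" }
      ]
    }
  ],
  "relationships": [
    {
      "name": "UsersToSites",
      "fromTable": "xMattersUsers",
      "fromColumn": "officeLocation",
      "toTable": "xMattersSites",
      "toColumn": "displayName",
      "crossFilteringBehavior": "bothDirections"
    }
  ]
}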

Paste your validated JSON object into the Body section of the API Explorer and select Call Resource.

Dataset Body

If your JSON object is formatted correctly you'll get a 201 response, and your DataSet and tables with relationship will be created.

Create Success

Switching over to Power BI you'll see the xMatters Dataset in the bottom left, and the two tables on the right-hand side with their columns.

xMatters DataSet PBI.PNG

Load xMatters User Data into Power BI

Now that we have somewhere to put the data, let's populate the dataset. I'm using the Lithnet MIIS Automation PowerShell Module (detailed in the prerequisites) to query the Metaverse and return all users. Then I refine the list down to those that are Active (based on my employeeActive Boolean attribute) and finally to only those users that are connected on the xMatters Management Agent (see lines 14 & 18).

The script will drop any existing values from the xMatters Users table then upload what we have retrieved from the Metaverse (and refined).
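
In outline the script does something like the sketch below. The cmdlet and parameter names are from the Lithnet MIIS Automation and PowerBIPS modules as I recall them, so verify against your installed versions, and treat the attribute names as assumptions;

Import-Module LithnetMIISAutomation
Import-Module PowerBIPS

# Query the Metaverse for active users
$queries = @(New-MVQuery -Attribute employeeActive -Operator Equals -Value $true)
$users = Get-MVObject -ObjectType person -Queries $queries
# Keep only users connected on the xMatters Management Agent
$users = $users | Where-Object { $_.CSMVLinks.ManagementAgentName -contains 'xMatters' }

$rows = foreach ($user in $users) {
    [pscustomobject]@{
        targetName     = $user.Attributes['accountName'].Values.ValueString
        officeLocation = $user.Attributes['officeLocation'].Values.ValueString
    }
}

# Drop the existing rows, then push the refreshed set
$clientId = 'your-powerbi-app-client-id'
$authToken = Get-PBIAuthToken -ClientId $clientId
$dataSet = Get-PBIDataSet -AuthToken $authToken -Name 'xMatters'
Clear-PBITableRows -AuthToken $authToken -DataSetId $dataSet.id -TableName 'xMattersUsers'
Add-PBITableRows -AuthToken $authToken -DataSetId $dataSet.id -TableName 'xMattersUsers' -Rows $rows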

Upload Users.PNG

Load xMatters Site Data into Power BI

Again I’m also using the Lithnet MIIS Automation PowerShell Module to query the Metaverse and return all xMatters Sites.

The script will drop any existing values from the xMatters Sites table then upload what we have retrieved from the Metaverse.

Upload Sites.PNG

Creating the Power BI Visual

Now we have data we can build the visual. I’m using the ArcGIS Maps for Power BI visual which is available in the default set of visuals. Then by selecting displayName and geo the map will automagically show all xMatters Sites in their respective co-ordinates.

xMatters Sites to Map

We can then add a Card Visual and choose officeLocation and then configure the visual for Count of officeLocation and we’ll get a count of the employees at that location. As we can see below with the Sydney location selected from the map the card updates to tell me there are 665 Employees at that officeLocation.

Count of Employees at Selected Location

Pretty quickly we can also expand out other data points, like departments at a location, employees etc as shown below (I’ve obfuscated the departments and a number of the other office locations).

Summary.PNG

Conclusion

We haven't generated any new data. We've taken information we already have in Microsoft Identity Manager from connected systems and quickly visualized it via Power BI. However, providing this to the business, with the ability for consumers of the information to export it from the visual, can be pretty powerful.

Building a FIM/MIM Management Agent for xMatters

Introduction

A couple of weeks ago one of my customers had a requirement to provision and manage identities into xMatters. The xMatters API documentation looked straightforward and I figured it would be pretty quick to knock up a PowerShell Management Agent.

The identification of users (People) in xMatters was indeed pretty quick. I was quickly able to enumerate all users (that had initially been seeded independently of FIM/MIM) and join them to the corresponding users in the Metaverse.

It was then as I started digging deeper that the relationship between Sites (Locations) and Email/Mobile (Devices) attributes became apparent. This post details how I approached it and a base xMatters MA that should get you started if you need to do something similar.

Overview

A key concept to keep in mind is that at the simplest level there are 3 key Object Types in xMatters;

  • People
    • User Objects along with basic naming attributes
  • Device
    • Each contact medium is a device. Email Address, Mobile Phone, Home Phone, Text Phone (SMS) etc.
  • Site
    • Location of the entity (person)

Associated with each is an id which can be either dynamically created on provisioning (by xMatters) or specified. For People there is also targetName which is the equivalent of UID/sAMAccountName. When using the API (for people) you can use either their ID or their targetName. For all other entities you need to use the ID.

For each entity, as you'd expect, there is a different API URI, and there is a separate URI to retrieve the devices for a person.
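
From my notes they are of the form shown below; treat these as a guide and verify against the xMatters REST API documentation for your instance;

https://<yourinstance>.xmatters.com/api/xm/1/people
https://<yourinstance>.xmatters.com/api/xm/1/devices
https://<yourinstance>.xmatters.com/api/xm/1/sites

and to retrieve the devices for a person;

https://<yourinstance>.xmatters.com/api/xm/1/people/{personID}/devices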

Other key points to consider that I uncovered are;

  • if you are updating a Device (e.g. someone's Email Address or Phone Number) don't specify the owner attribute (as you do when you create the Device). xMatters considers that you are trying to change the owner and won't allow it
  • to update a Device you need to know the ID of the Device. I catered for this on my Import by bringing through the People and Device IDs
  • When creating/updating a user's location you need to specify the Site ID and Site Name. I brought these through as a separate ObjectClass into FIM/MIM and query the MV for them when Exporting
  • In my initial testing the API returned a number of different errors: 400 (Bad Request), 409 (Conflict, when trying to add a Device that already exists) and 404 (Not Found), along with API timeouts. You need to account for these and process them appropriately (see the error-handling sketch after this list)
  • On success of an Update, Create or Delete the API returns the full object that you performed the operation on. You need to capture this and let MIM know that a full object being returned means success, not an error
  • xMatters expects phone numbers to be in E.164 format (e.g. +61 400 123 456). I catered for this on import on another Management Agent
  • xMatters timezone is in the format of Country/Region. For Australia these are as follows. Correct, it doesn’t accept Australia/Canberra for ACT;
    "NSW" = "Australia/Sydney"
    "VIC" = "Australia/Melbourne"
    "QLD" = "Australia/Brisbane"
    "ACT" = "Australia/Sydney"
    "WA"  = "Australia/Perth"
    "TAS" = "Australia/Hobart"
    "NT"  = "Australia/Darwin"
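
Here is a sketch of the defensive pattern I mean for the error handling above. Only the status-code handling matters here; the URI, payload and credential are placeholders;

try {
    # On success xMatters returns the full object that was created/updated
    $result = Invoke-RestMethod -Uri $deviceUri -Method Post -Body $json -ContentType "application/json" -Credential $cred
    $exportStatus = "success"
}
catch [System.Net.WebException] {
    $statusCode = [int]$_.Exception.Response.StatusCode
    switch ($statusCode) {
        400 { $exportStatus = "bad-request" }
        404 { $exportStatus = "not-found" }
        409 { $exportStatus = "conflict" }   # e.g. the Device already exists
        default { $exportStatus = "failed-$statusCode" }
    }
}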

xMatters PowerShell Management Agent

With all that introduction, here is a base xMatters PowerShell MA (implemented using the Granfeldt PowerShell MA) to get you started. You’ll need to tailor for your environment and trigger Provisioning, Deletes and Flow Rules for your environment and look to handle the xMatters API for your integration.

Schema Script

I’ve created two Object Classes. User and Site. User incorporates User Devices. Site is the locations (Sites) from xMatters.

Import Script

Credentials for the Import script to connect to xMatters are flowed in from the Management Agent Username and Password attributes. This isn’t using Paged Imports. If you have a large number of users you may want to consider that. After retrieving all of the People entities each is queried to obtain their Devices. I’m only bringing through SMS and Email Devices. You’ll need to modify for additional Devices.

Ensure that you flow into the MetaVerse (onto custom attributes) the IDs associated with your Devices (e.g MobileID and EmailID). That will allow you to use the ID when updating those attributes.

For Sites, I created a custom ObjectClass (Site) in the MV and used objectID of the SiteID and displayName for the Site Name (as shown below).

Attribute Flows.png

Export Script

This is where it gets a little more complicated. As PowerShell is not great at surfacing web request responses, we have to deal with the return from each API call and determine whether we were successful or not, then let FIM/MIM know so it can report that via the UI.

The Export script below deals with Adding, Deleting and Updating users. Update line 31 for your API URI for xMatters.

Summary

The detail above will get you started and give you a working Management Agent to import Users and Sites. You’ll need to do the usual steps (Set, Workflow, Sync Rule and MPR) to trigger Provisioning on the MA along with how you handle deletes.

 

Graphically Visualizing Identity Hierarchy and Relationships

Almost 15 years ago Microsoft released Microsoft Identity Integration Server (MIIS) 2003. Microsoft also released a couple of Resource Toolkits for MIIS to assist customers and IT integrators implement the product, as up to that time its predecessor (Microsoft Metadirectory Services) was only available as part of a Microsoft Consulting engagement.

At the same time Microsoft provided a Beta product – Microsoft PolyArchy Server. For someone whose brain is wired in a highly visual way, this was a wow moment. PolyArchy Server took a dataset from the Synchronisation Server and wrapped a small IIS website around it to expose intersecting relationships between data. When you selected a datapoint the visual would flip to the new context and display a list of entities associated with that relationship.

Microsoft proposed to deliver PolyArchy Server in calendar year 2006, however the product never made it to market. Still, the concept of visualizing identity data was seeded in my brain, and it is something I've always surfaced in one method or another as part of many Identity Management projects.

In this post I’ll detail how I’ve recently used Power BI to visualize relationship data from Microsoft Identity Manager.  The graphic below is an example (with node labels turned off) that represents Managers by Department by State.

Managers by Dept by State - Graphical.png

Using filters in the same report allows whoever is viewing the report to refine the visual based on State and Dept. By selecting a State from the map the visual will dynamically update to show that state only. Selecting a department only will show that department in each state.

Managers by Dept by State - Filtered.png

Hovering over the nodes will display the detail. I've turned off the node labels so as not to expose the source of my dataset.

Managers by Dept by State - NSW Detail.png

Getting MIM MV User MetaData into Power BI

My recent post here details the necessary steps to get started publishing data directly in a Power BI Dataset using PowerShell. Follow the details listed there to register a Power BI Application.

Creating the DataSet

With that done, the script below will create a DataSet in Power BI. My dataset is obviously specific to the environment I developed it in. You probably won't have some of the attributes, so you will need to update accordingly. The script is designed to run on the MIM Sync Server, and the MIM Sync Server will need to be able to connect to Azure and Power BI.

Publish data to the DataSet

Now that we have a Power BI DataSet (Table) we need to extract the data from the MIM MV and push it into the table. Using the Lithnet MIIS Automation PowerShell Module makes this extremely simple. Using the table schema created above I retrieve the values for each Active User, build a PowerShell Object and use the Power BI PowerShell Module to push the data to Power BI.

Creating the Power BI Visualization

The visualisation I’m using is the Journey Chart by MAQ Software which is available in the Power BI Store (free).

Journey Visual.PNG

With the Journey Visualization selected and dropped in we just have to select the attributes we want to visualize and the order of the relationships. The screenshot below shows the data sorted by State => managerName => accountName with Measure Data being accountName.

Visual Config.PNG

Conclusion

We never got PolyArchy Server from Microsoft, but we can quickly visualize basic relationship data from MIM with Power BI.

Automate the update of the data into Power BI, embed the Power BI Reports into your MIM Portal and provide access to the appropriate personnel.

 

MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, but also contains all data changes not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration“, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (Check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great, doesn’t require administrative access to install/use, doesn’t impact other installation of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config user.name "Service Account"
git config user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in Git GUI.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer, but I didn't want them to have to connect to the Synchronization Server (or be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way I did for the User Object Report I recently developed. The approach is;

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report on the required information
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I've covered in other posts. In concept, as described above, it is similar to the User Object Report, which has the same prerequisites; I did a pretty good job of detailing those here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
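
The full script is specific to my environment, but the skeleton of the enumeration is sketched below. It uses the Lithnet MIIS Automation module inside the remote session; the property names and output plumbing are assumptions to adapt;

$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, (ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force))
$session = New-PSSession -ComputerName "yourmimsyncserver.yourdomain.com" -Port 5986 -UseSSL -Credential $cred
$report = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMIISAutomation
    # One summary row per Management Agent
    foreach ($ma in Get-ManagementAgent) {
        [pscustomobject]@{
            Name = $ma.Name
            Type = $ma.Category   # property names vary by module version
        }
    }
}
Remove-PSSession $session
# v1 PowerShell Functions return their HTTP response by writing to $res
Out-File -Encoding ascii -FilePath $res -InputObject ($report | ConvertTo-Json)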

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you'll want to begin with a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code

The only extension I'm using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html)

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user's password is in the pwned password dataset or not. If you haven't read that post this won't make a lot of sense, so read that then come back.

The solution, when receiving a new password for a user (via the Microsoft Password Change Notification Service), checked it against the Have I Been Pwned API. The disclaimer at the start of that blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially taking the Pwned Password Datasets available here and loading them into a local network SQL Server and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

USE PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO

SET ANSI_PADDING ON
GO

CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]
( [password_id] ASC )
INCLUDE ( [passwords] )
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I'm far from a DBA; I'm an identity guy. So, using the tools I'm most familiar with (PowerShell), I created a simple script to open the password dump files as a stream (as Get-Content wasn't going to handle the file sizes), read the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000, and I performed it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved, as I only spent about 20-30 minutes on it. It opens and closes a connection to the SQL DB each time it inserts 1000 rows; that could be moved outside the Insert2DB function and maybe the batch size increased. Either way it is a starting point, and I used it to write millions of rows into the DB successfully.
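
For illustration, here is a stripped-down sketch of the stream-read-and-batch idea using SqlBulkCopy with the Out-DataTable function referenced above. The path, connection string and column handling are assumptions, not my exact script;

$connString = "Server=.;Database=PwnedPasswords;Integrated Security=SSPI"
$conn = New-Object System.Data.SqlClient.SqlConnection($connString)
$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($conn)
$bulk.DestinationTableName = "dbo.pwnedPasswords"
[void]$bulk.ColumnMappings.Add("passwords", "passwords")   # leave password_id to the IDENTITY

$reader = New-Object System.IO.StreamReader("D:\pwned-passwords-1.0.txt")
$batch = New-Object System.Collections.Generic.List[object]
while (($line = $reader.ReadLine()) -ne $null) {
    $batch.Add([pscustomobject]@{ passwords = $line.Trim() })
    if ($batch.Count -ge 1000) {
        $dt = $batch | Out-DataTable
        $conn.Open(); $bulk.WriteToServer($dt); $conn.Close()
        $batch.Clear()
    }
}
# remember to flush the final partial batch the same way
$reader.Close()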

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This then is the only other change to the solution. The Password.ps1 script queries the SQL DB (rather than the PwnedPasswords API) and sets the pwned boolean flag accordingly.
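
Conceptually the check becomes a parameterised SQL lookup like the sketch below. I'm assuming here that the dataset was loaded as SHA-1 hashes, so the incoming password is hashed the same way before the lookup; the server and table names are placeholders;

# Hash the incoming password the same way the dataset is stored (SHA-1, upper-case hex)
$sha1 = [System.Security.Cryptography.SHA1]::Create()
$hash = ($sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($password)) | ForEach-Object { $_.ToString("X2") }) -join ""

$conn = New-Object System.Data.SqlClient.SqlConnection("Server=yoursqlserver;Database=PwnedPasswords;Integrated Security=SSPI")
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @pwd"
[void]$cmd.Parameters.AddWithValue("@pwd", $hash)
$conn.Open()
$pwned = ([int]$cmd.ExecuteScalar() -gt 0)
$conn.Close()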

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory users' passwords.

Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Update: An element of this solution details checking passwords online (using the Have I Been Pwned API). Troy explains succinctly in his blog post announcing the pwned passwords list why this is a bad idea. If you are looking to implement the concept I detail in this post then WE STRONGLY recommend using a local copy of the pwned password list.
THIS POST HERE details using a local SQL Database to hold the Pwned Passwords Datasets and the change to the Management Agent to query the SQL DB instead of the HIBP API.

Introduction

Last week (3 Aug 2017) Troy Hunt released a sizeable list of Pwned Passwords. 320 Million in fact. I encourage you strongly to have a read about the details here.

Troy also extended his HaveIBeenPwned API to include the ability to query as to whether a password has been pwned and is likely to be used in a brute force attack.

Microsoft provide a premium license feature in Azure Active Directory (Azure Active Directory Identity Protection) whereby leaked credential sets are checked and Admins alerted via reports. But what if you aren’t licensed for the Azure AD Premium Features, or you want something a little more customised and you have Microsoft/Forefront Identity Manager? That is what this post covers.

Overview

The following diagram looks a little more complicated than what it really is. The essence though is that password changes can come from a multitude of different scenarios. Using Microsoft’s Password Change Notification Service (PCNS) we can capture password changes and send them to Microsoft Identity Manager so that we can synchronise the password to other systems, or for this use case we can lookup to see if the users new password is on the pwned password list.

This post will cover creating the Pwned Password FIM/MIM Management Agent and flagging a boolean attribute in the MIM Service to indicate whether a user's password is on the pwned password list or not.

PwnedPassword Overview.png

Prerequisites

There are a few components to this solution depicted above. You will need;

  • FIM/MIM Synchronisation Server
    • with an Active Directory Management Agent configured (most likely you will have a Projection Rule on this MA to get your users into the Metaverse)
    • not shown in the diagram above you will also need the MIM MA configured to sync users from the Metaverse to the MIM Service
  • FIM/MIM Service and Portal Server (can be on the same server as above)
  • Microsoft Password Change Notification Service (PCNS). This MS PFE PCNS implementation document covers it quite well and you will need;
    • the PCNS AD Schema Extension installed
    • the PCNS AD Password Filters installed on all your (writeable) Domain Controllers
    • PCNS configured to send password changes to your FIM/MIM Sync Server
  • Granfeldt PowerShell Management Agent (that we will use to check users passwords against the Have I Been Pwned pwned password API)
  • Lithnet Resource Management PowerShell Module
    • download it from here and install it on your FIM/MIM Server as the Pwned Password MA will use this module to populate the Pwned Password Status for users in the MIM Service
  • Windows Management Framework (PowerShell) 5.x

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it go get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

Four items of note for this solution;

  • You must have an Export.ps1 file. Even though we’re not doing exports on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA are the credentials for the account that has permissions to the On Premise Active Directory from which we will be importing users to join to our Metaverse, so we can pass password changes to this Management Agent
  • The same account as above will also need to have permissions in the MIM Service as we will be using the account to update the new attribute we are going to create
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\PwnedPWD

With the Granfeldt PowerShell Management Agent downloaded from Codeplex and installed on your FIM/MIM Server we can create our Pwned Password Management Agent.

Creating the Pwned PowerShell Management Agent

On your FIM/MIM Sync Server create a new sub-directory under your Extensions directory, e.g. PwnedPWD in C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions, then create a sub-directory under PwnedPWD named Debug (C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\PwnedPWD\Debug).

Copy the following scripts (schema.ps1, import.ps1, export.ps1, password.ps1) and put them into the C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions\PwnedPWD directory

Schema.ps1

The following schema.ps1 script sets up the object class (user) and a handful of attributes from Active Directory that will be useful for logic we may implement in the future based on users' password status.

Import.ps1

The import.ps1 script connects to Active Directory to import our AD users into the Pwned Password Management Agent so we can join to the Metaverse object already present for users on the Active Directory Management Agent. The user needs to be joined to the Metaverse on our new MA so they are addressable as a target for PCNS.

Export.ps1

As detailed earlier, we aren’t using an Export script in this solution.

Password.ps1

The Password script receives password changes as they occur from Active Directory, looks up the Have I Been Pwned API to see if the new password is present on the list or not, and sets a boolean attribute for the pwned password status in the MIM Service.
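
At the time of writing, the HIBP pwned-password check was a simple GET that returned 200 if the password is pwned and 404 if it isn't, so the core of the script follows a pattern like the sketch below. The endpoint version may have changed since, so verify against Troy's API documentation;

$pwned = $false
try {
    # 200 OK means the password is in the pwned list
    Invoke-RestMethod -Uri "https://haveibeenpwned.com/api/v2/pwnedpassword/$password" -Method Get
    $pwned = $true
}
catch [System.Net.WebException] {
    # 404 means the password was not found in the list
    if ([int]$_.Exception.Response.StatusCode -eq 404) { $pwned = $false }
}
# then set the boolean attribute in the MIM Service via the Lithnet RMA module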

On your FIM/MIM Sync Server from the Synchronisation Manager select Create Management Agent from the right hand side pane.  Select PowerShell from the list of Management Agents. Select Next.

PwnedPwdMA1a

Give your MA a Name and a Description. Select Next. 

PwnedPwdMA1b

Provide the 8.3 style path to your Schema.ps1 script copied in the steps earlier. Provide an AD sAMAccountName and Password that also has permissions to the MIM Service as detailed in the Prerequisites. Select Next.

PwnedPwdMA2

Provide the paths to the Import.ps1, Export.ps1 and Password.ps1 scripts copied in earlier. Select Next.

PwnedPwdMA3

Select Next.

PwnedPwdMA4

Select the user checkbox. Select Next.

PwnedPwdMA5

Select all the attributes in the list. Select Next.

PwnedPwdMA6

Select Next.

PwnedPwdMA7

Select Next.

PwnedPwdMA8

Create a Join Rule for your environment, e.g. sAMAccountName => person:Accountname. Select Next.

PwnedPwdMA9

Create an import flow rule for user:pwdLastSet => person:pwdLastSet. Select Next.

PwnedPwdMA10

Select Next.

PwnedPwdMA11

Ensure that Enable password management is selected, then select Finish.

PwnedPwdMA12

With the Pwned Password MA created and configured we need to create at least a Stage (Full Import) and Full Sync Run Profiles and execute them to bring in the users from AD and join them to the Metaverse.

This should be something you’re already familiar with.

RunProfiles

When running the Synchronisation we get the joins we expect. In my environment PwdLastSet was configured to sync to the MIM Service, hence the Outbound Sync to the MIM Service MA.

Sync and join

MIM Service Configuration

In the MIM Service we will create a custom boolean attribute that will hold the pwned status of the users password.

Schema

Connect to your MIM Portal Server with Administrator privileges and select Schema Management from the right hand side menu.

Select All Attributes, then select New.

Provide an attribute name (System name) and a Display Name with a Data Type of Boolean. Provide a Description and select Finish.

Select Submit.

Search for User in Resource Types, then select the User checkbox from the search results, select Binding and then select New.

In the Resource Type box type User, then click the validate field button (the one with the green tick). In the Attribute Type box type Pwned Password, then click the validate field button. Select Finish.

Select Submit.

Configure the Active Directory MA to send passwords to the Pwned Passwords MA

On your existing Active Directory Management Agent select Properties. Select Configure Directory Partitions then under Password Synchronization enable the checkbox Enable this partition as a password synchronization source. Select Targets and select your newly created Pwned Password MA. Select Ok then Ok again.

Password Target2.PNG

Testing the End to End Pwned Password Check

Now you should have configured;

  • PCNS including installation of the Active Directory filters
  • The existing Active Directory Management Agent as a Password Source
  • The existing Active Directory Management Agent to send password change events to the Pwned Password MA

Select a user in Active Directory Users and Computers, right click the user and select Reset Password.

ChangePassword1

I first provided a password I know is on the pwned list, Password1

ChangePassword2

ChangePassword3

With PCNS Logging enabled on the MIM Sync Server I can see the password event come through.

ChangePassword4

Checking in the Pwned Password MA debug log we can see the debug logging for the user whose password we changed, and that when the password was checked against Have I Been Pwned it was flagged as pwned.

Note: If you implement the solution in a production environment obviously remove the password from being logged. 

ChangePassword5

In the MIM Portal search for and locate the user whose password we just changed.

ChangePassword7.PNG

Select the user. Scroll to the bottom and select Advanced View. Select the Extended Attributes tab. Scroll down and we can see the Pwned Password shows as checked.

ChangePassword6

Now let's repeat the process with a password that isn't in the Pwned Password list. After changing the password in Active Directory Users and Computers, the password went through its sync path. The log shows the password isn't in the list.

ChangePassword8

And the MIM Portal shows the Boolean value for Pwned Password is now not selected.

ChangePassword9

Summary

Using PCNS and FIM/MIM we can check whether our Active Directory users are using passwords that aren’t in the Pwned Password list.

What we can then do, if their password is in the Pwned Password list, is a number of things based on what the security policy is and even what type of user it is. You'll notice that I've included additional attributes in the MA that we can flow through the Metaverse and into the MIM Service that may help with some of those decisions (such as adminCount, which indicates if the user is an Administrator).

Potentially for Admin users we could create a workflow in the MIM Service that forces their account to change password on next logon. For other users we could create a workflow that sends them a notification letting them know that they should change their password.

Either way, we now have visibility of the state of users passwords. Big thanks to Troy for adding Pwned Passwords to his Have I Been Pwned API.

 

Reiterating: An element of this solution details checking passwords online (using the Have I Been Pwned API). Troy explains succinctly in his blog post announcing the pwned passwords list why this is a bad idea. If you are looking to implement the concept I detail in this post then WE STRONGLY recommend using a local copy of the pwned password list.

Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn't an out-of-the-box solution. This is a bespoke solution that takes a number of elements and puts them together in a unique way. I'm not expecting anyone to implement this specific solution (but you're more than welcome to), rather to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that;

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

 

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM Portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function into the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user's browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user's hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, the report includes the metadata from the MIM Service MA Connector Space and the MIM Service.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need;

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office365/Exchange Online). Check out this post and the additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local and Azure). Reflect your port number if it is different from 3000, and use the DNS name for your Azure WebApp.

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp with the UI to provide Input for LoginID fulfilled by the NodeJS Javascript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and handles the Submit Report button to fulfill the report for the desired object(s). Yes, if you use the UI to select (individually) multiple different objects, all will be returned in their separate output windows.

As the HTML file above indicates you will need to obtain and make available as part of your NodeJS project the typeahead.bundle.js library.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.
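
In skeleton form it does something like the sketch below. The remote-session plumbing is as per the prerequisite posts, and the Metaverse query and output pattern are assumptions to adapt;

$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, (ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force))
$session = New-PSSession -ComputerName "yourmimsyncserver.yourdomain.com" -Port 5986 -UseSSL -Credential $cred
$accountNames = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMIISAutomation
    # Return the accountName of every person object in the MV
    $query = New-MVQuery -Attribute accountName -Operator IsPresent
    (Get-MVObject -ObjectType person -Queries @($query)).Attributes['accountName'].Values.ValueString
}
Remove-PSSession $session
# v1 PowerShell Functions return their HTTP response by writing to $res
Out-File -Encoding ascii -FilePath $res -InputObject ($accountNames | ConvertTo-Json)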

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine whether the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your image attribute; mine is named EXOPhoto) after the Try {} Catch {} in the section $obj = @() ; foreach ($attr in $attributes.Keys)

 # Display the Objects Photo rather than Base64 string 
if ($attr.equals("EXOPhoto")){ 
   $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.Valuestring)$([char]0x22)>" 
   $val = "System.Byte[]" 
}

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data 
$MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment 
$htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report" 

 

As you can see above I've also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section. Somewhere in your script block you will need to define your CSS values, e.g.

 # StyleSheet for nice pretty output 
$htmlcss = "<style> 
   h1, h2, th { text-align: center; } 
   table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; } 
   th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; } 
   td { font-size: 11px; padding: 5px 20px; color: #000; } 
   tr { background: #b8d1f3; } 
   tr:nth-child(even) { background: #dae5f4; } 
   tr:nth-child(odd) { background: #b8d1f3; } 
</style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.