A modern way to track FIM/MIM Attribute Value History utilizing Power BI

Introduction

Microsoft Identity Manager is fantastic for keeping data consistent between connected systems. Often, however, you want to know what the previous value of an attribute was. FIM/MIM can only tell you the current value, the Management Agent it was received on, and when.

In the past, where I’ve had to provide a solution to either ensure an attribute keeps a unique value forever (e.g. never reuse an email address or loginID) or simply track attribute value history, I’ve used two different approaches;

  • Store previous values in an SQL Table and have an SQL MA that flows out the values
  • Store historical values in a Multi-Valued attribute on the user object in the Metaverse

Both are valid approaches but often fall down when you want to quickly get a report on that metadata.

Recently we had a similar request: to know when Employees’ EndDates were updated in HR. This is specifically useful for contractors who have their contracts extended. Instead of stuffing the info into a Multi-Valued attribute or an SQL DB, this time I used Power BI. This provides the benefit of being able to quickly develop a graphical report and embed it in the FIM/MIM Portal.

Such a report looks like the screenshot below.

Power BI Report

Using the filters on the right-hand side of the report you can find a user (by EmployeeID or DisplayName), select them, and see attribute value history details for that user in the main part of the report. As per the screenshot below, Andrew’s EndDate was originally the 8th of December (as received on the 5th of November) but was changed to the 24th of November on the 13th of November.

End Date History

In this Post I describe how I quickly built the solution.

Overview

The process to do this involves;

  • creating a Power BI Application
  • creating a Power BI Dataset
  • creating a script to retrieve the data from the MV and inject it into the Power BI Dataset
  • creating a Power BI Report for the data
  • embedding the Report in the MIM Portal

Registering a Power BI Application

Head over to Power BI for Developers and Register an Application for Power BI. Login to Power BI with an account for the tenant you’ll be reporting data for. Give your Application a name and choose Native Application. Set the Redirect URL to https://localhost

CreatePBIApp

Choose the permissions for your Application. As we’ll be writing data into Power BI you’ll need a minimum of Read and Write all Datasets. Select Register App.

Create PBI App Permissions

Record your Client ID for your Application. We’ll need this to connect to Power BI.

Register the App

We need to authenticate to Power BI the first time using a UI to provide Authorization for our Application. In order to do that we need to add another Reply URL to our application. Head to the Apps Dev Portal, select your application and Edit the Application Manifest. Add an additional Reply URL for https://login.live.com/oauth20_desktop.srf as shown below.

Add Reply URL for AuthZ

The following PowerShell commands will then allow us to authenticate utilizing the Power BI PowerShell module. If you don’t have the module installed, un-comment Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force to install it.

Update the Client ID for the App you registered in the previous steps.

# Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# PowerBI App
$clientID = "4036df76-4de6-43cb-afe6-1234567890"

$authtoken = Get-PBIAuthToken -ClientId $clientID

Sign in with an account for the Tenant where you created the Power BI App.

Interactive Login for Dataset Creation

Accept the permissions you chose when registering the Power BI App.

Authorize PowerBI App

Creating the Power BI Dataset

Now we will create the Power BI Table (Dataset) that we will use when we insert the records.

My table is named Employee and the DataSet EmployeeEndDateReport. I’m keeping the table slim, with just enough info for our purpose: the date added to the dataset, and the employee’s AccountName, DisplayName, Active state, EndDate and EndDateReceived. The following script will create the Dataset.
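
The embedded script isn’t reproduced here, but a minimal sketch of the idea, using the PowerBIPS module we authenticated with earlier, might look like this (the column data types are my assumptions based on the description above):

# Sketch only: dataset schema for the Employee table, per the columns described above
$dataSetSchema = @{
    name   = "EmployeeEndDateReport"
    tables = @(
        @{
            name    = "Employee"
            columns = @(
                @{ name = "DateAdded";       dataType = "DateTime" },
                @{ name = "AccountName";     dataType = "String" },
                @{ name = "DisplayName";     dataType = "String" },
                @{ name = "Active";          dataType = "String" },
                @{ name = "EndDate";         dataType = "DateTime" },
                @{ name = "EndDateReceived"; dataType = "DateTime" }
            )
        }
    )
}

# Create the dataset in Power BI using the auth token from the earlier step
$dataSet = New-PBIDataSet -authToken $authtoken -dataSet $dataSetSchema -Verbose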

Populating the Dataset

With our table created, let’s populate it with employees that have an EndDate. As this is the first time we run it, we set a watermark date to add people from; I’ve gone with the previous year. I then query the MV for Employees with an EndDate within the last 365 days, build a PowerShell object with the columns from our table, and insert the rows into Power BI. I also set a watermark for the last time we received an EndDate from the MA and output that to a watermark file, so that next time we can quickly get only users whose EndDate was received since the last run.

NOTE: for full automation you’ll need to change line 6 to your secure method of choice for providing credentials to scripts.
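
A trimmed sketch of that logic follows; the MV query is represented by a hypothetical helper (Get-EmployeesWithEndDate) and the watermark file path is my own choice, so adapt both for your environment:

# Sketch only: populate the Employee table and maintain a watermark
$watermarkFile = "C:\Scripts\EndDateWatermark.txt"
$watermark = Get-Content -Path $watermarkFile -ErrorAction SilentlyContinue
if (-not $watermark) { $watermark = (Get-Date).AddDays(-365) }   # first run: previous year

# Query the MV for Employees with an EndDate received since the watermark
# (e.g. via the Lithnet MIIS Automation module); hypothetical helper shown here
$employees = Get-EmployeesWithEndDate -Since $watermark

$rows = foreach ($employee in $employees) {
    [pscustomobject]@{
        DateAdded       = (Get-Date).ToString("s")
        AccountName     = $employee.AccountName
        DisplayName     = $employee.DisplayName
        Active          = $employee.Active
        EndDate         = $employee.EndDate
        EndDateReceived = $employee.EndDateReceived
    }
}

# Insert the rows into the Employee table created earlier
$rows | Add-PBITableRows -authToken $authtoken -dataSetId $dataSet.id -tableName "Employee"

# Record the most recent EndDateReceived as the new watermark
($employees | Measure-Object -Property EndDateReceived -Maximum).Maximum | Set-Content -Path $watermarkFile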

Create a Power BI Report

Now in Power BI select your Dataset and design your report. Here is a sample one that I’ve put together. I simply selected the columns from the dataset and updated the look and feel. I then added each of the AccountName, DisplayName and Active columns individually as Filters so that I have various ways of finding whoever I’m looking for.

Power BI Report

Once the process has run for a while and values have changed for the attribute you are keeping history for, selecting a user with changed values will show that history.

End Date History

Summary

To complete the solution you’ll want to automate the script that queries the MV for changes (probably after each run of the MA that provides the attribute you are recording history for), and you’ll want to embed the report in the MIM Portal. In this post I detail how to do that step by step.


MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, and it covers all data changes, not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration“, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (Check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great, doesn’t require administrative access to install/use, doesn’t impact other installations of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig
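
With the prerequisites in place, a quick sanity check (my own addition, assuming the paths above) confirms the modules load from the backup location:

# Make the portable modules resolvable for this session, then load them
$env:PSModulePath = "C:\MIMBackup\Modules;" + $env:PSModulePath
Import-Module LithnetRMA
Import-Module LithnetMIISAutomation
Import-Module FIMAutomation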

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
# Create the config directory if this is the first run
New-Item -ItemType Directory -Path C:\MIMBackup\MIMConfig -ErrorAction SilentlyContinue
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config user.name "Service Account"
git config user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in Git GUI.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script

Validate Your Authoritative Sources – Creating a Fuse for FIM/MIM Import and Sync run cycles

 

Introduction

The Microsoft Identity Manager Synchronisation Engine has been around for close to 20 years and is highly functional and very reliable.

The Achilles heel, though, for any IDAM Sync Engine will always be the authoritative source and the information it provides to the Sync Engine.

I’m seeing more and more SaaS services being used as the Authoritative Source for identity management systems. Think Success Factors and Workday. Connecting to these across the internet, the rate of change within organisations, and the ever-present human factor of en-masse changes all mean it is even more important to validate your import feeds before processing them through your Sync Engine’s business logic.

Overview

This post details the foundation of a little logic that will call an Import from the Authoritative Source and analyse what is returned, evaluate it against the previous Import to understand the number of objects expected, and determine if it is within an acceptable tolerance (I’m using 1% of total managed objects). If it doesn’t check out, don’t run a Sync; instead, send an email to someone to check it out and make a rational human decision (and maybe a manual sync). If the Import is valid, run the Sync.

Simply put;

  • Query the Authoritative Management Agent and get the last Import Run
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
  • Run an Import cycle only
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
    • Evaluate each of the Staging Adds, Renames, Updates and Deletes and see if the number of changes is less than the tolerance (1%)
      • if yes proceed with a Sync Run
      • if no send a notification and don’t run the Sync
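
A minimal sketch of that flow, assuming the LithnetMIISAutomation module, is below. The MA name, run profile names, and the staging-counter property names are placeholders from memory, not the script itself; inspect the object returned by Get-LastRunDetails in your environment for the real shape.

Import-Module LithnetMIISAutomation

$maName    = "HR Authoritative Source MA"   # placeholder MA name
$tolerance = 0.01                           # 1% of total managed objects

# Counts from the last Import Run on the Authoritative MA
$previousRun = Get-LastRunDetails -ManagementAgent $maName

# Run an Import cycle only (run profile name is a placeholder)
Start-ManagementAgent -ManagementAgent $maName -RunProfileName "Full Import (Stage Only)"
$thisRun = Get-LastRunDetails -ManagementAgent $maName

# Property names below (StagingAdds etc.) are illustrative only
$changes = $thisRun.StepDetails[0].StagingCounters
$total   = $previousRun.StepDetails[0].StagingCounters.Total
$delta   = $changes.StagingAdds + $changes.StagingRenames + $changes.StagingUpdates + $changes.StagingDeletes

if ($delta -le ($total * $tolerance)) {
    Start-ManagementAgent -ManagementAgent $maName -RunProfileName "Delta Synchronization"
}
else {
    Send-MailMessage -SmtpServer "smtp.yourorg.com" -From "mim@yourorg.com" -To "idam@yourorg.com" `
        -Subject "Fuse tripped: $maName import exceeded tolerance" `
        -Body "Staged changes ($delta) exceeded $($tolerance*100)% of $total managed objects. Sync was not run."
}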

Enhancements

This will need to be tailored for each environment. What is the normal number of changes expected in your environment? You may see a lot of Updates, and the global 1% tolerance I have here won’t work for that. So you may want a separate tolerance for each of Adds, Renames, Updates and Deletes.

Implementation

Where I’ve used this, I’ve saved the PowerShell script below into the same directory as the other scripts that automate the MIM Sync Run Sequence. I’ve updated the previous automation script, removed the Authoritative Delta Import / Delta Sync, Full Import / Delta Sync etc., and called this Auth Fuse Script instead.

The Script

As always this uses the awesome LithnetMIISAutomation PowerShell Module from Ryan. Update;

  • lines 5-11 for your Auth Source.
  • lines 20-24 for SMTP Server and Notification settings
  • $smtpBody lines for what you want the notification emails to say

Summary

A simple piece of logic to check and validate your imports can save hours/days of work.

If your Auth Source doesn’t provide a full dataset and you haven’t checked your import before processing you could be deleting a LOT of accounts.

If HR changed the Org Structure and didn’t inform you or take IDAM Business Logic into account, you could be about to process a lot of AD Account Moves. If it involved redundancies that haven’t been announced yet, you could be exposing that information to the entire organisation. CHECK AND VALIDATE YOUR AUTHORITATIVE SOURCE IMPORTS!!

Continuous Credential Prompt when accessing MIM Password Registration Portal

First published at https://nivleshc.wordpress.com

Recently I was at a customer site, setting up a Microsoft Identity Manager (MIM) 2016 environment, which included the deployment of the Self Service Password Registration and Self Service Password Reset portals. For additional security, I was using Kerberos instead of the default NTLM.

I finished installing the MIM Portal, Service, Password Registration and Password Reset Portals without any issues.

I then proceeded to secure all http endpoints by enabling them for SSL and then removing the http bindings, so that the MIM Portal, Password Registration and Password Reset Portals could only be accessed via https. No issues there either.

By this time I was pretty pleased with myself 😉 Everything was going as planned, no issues faced at all. Finally, lady luck was showering me with her blessings.

Having finished the installation and configuration, I proceeded to testing the solution.

The first thing to check was the MIM Portal site. I opened up a web browser and navigated to the Microsoft Identity Manager Portal. When prompted, I logged in with the mimadmin domain account credentials. I was successfully logged in and could access all the parts without any hitch.

Now kids, if you are faint at heart, be very wary of what happens next (hint. this is the time when you cover your eyes with your hands when watching a horror movie).

I then tried accessing the Self Service Password Registration Portal and got prompted for credentials.

SSPRegistration_CredentialPrompt

I entered the mimadmin account credentials and pressed enter. Just as I thought I had successfully logged in, the credential prompt returned! hmm, that is weird. I was pretty sure I had typed the username and password correctly. Oh well, maybe I didn’t.

I typed the credentials again and pressed enter. Quick as a flash, the credential prompt returned! Huh? What was happening here? {scratching my head} Hmm, I seem to be making a lot of typos today. I carefully entered the username and password again, taking my time to ensure I was entering them correctly. I then pressed enter and waited.

Well, I didn’t have to wait for long since within a second, I got greeted with the Not Authorized screen!

SSPRegistration_NotAuthorised

Fascinating. It seems that lady luck had flown away, because here indeed was an issue with the Self Service Password Registration Portal! OK, mister, let’s have a look at what’s causing this kerfuffle!

I opened up the event viewer on the Self Service Password Registration server and went through each of the logged events in the Application and System logs, however I couldn’t find any clue as to why the credentials were not working. I secretly had suspicions that the issue could be due to Kerberos token errors, however I couldn’t find anything in the event logs to substantiate my suspicion. Hmm, the plot was indeed getting thicker!

I next started doing some Google searches, thinking that someone else might have encountered the same issue. Alas, it seemed that I was alone in my woes as the results seemed to be quite thin in regards to any possible solution for my issue.

Finally, I decided to follow my dear ol’ Sherlock’s advice “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”

I went through the whole Self Service Password Registration setup process, checking each and every part of the configuration, to ensure that the values were as expected. After 10 minutes, I was almost done checking and no clues so far 😦

Lastly, I opened IIS Manager and checked all the settings. Nothing here as well. Hey back up a bit. What is this?

The Self Service Password Registration Portal site had its useAppPoolCredentials set to False.

SSPRegistration_useAppPoolCredentialsFalse

Now, this should be True! Is this what was causing the issue?

I quickly changed the value for useAppPoolCredentials from False to True.

SSPRegistration_useAppPoolCredentialsTrue
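
If you’d rather script the change than click through IIS Manager, something like the following should do it (a sketch using the WebAdministration module; the site name is a placeholder for your Password Registration site):

# Set useAppPoolCredentials to True for the Password Registration site
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' `
    -Location 'MIM Password Registration Site' `
    -Filter 'system.webServer/security/authentication/windowsAuthentication' `
    -Name 'useAppPoolCredentials' -Value $true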

I then opened my web browser again and navigated to the Self Service Password Registration Portal. Once again the familiar credential prompt came up. I entered the same credentials as before and pressed enter.

Woo hoo!! This time around I was successfully logged in.

I sincerely hope that this post helps others who might be encountering the same error.

Have a great day 😉

Easier portability of the FIMAutomation powershell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use third-party or open source code, or other tools may expect exports to be in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out and no internet access; just work with what is currently there. I have limited access on the servers and management VM, with only 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0) available.

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted these files and renamed them:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation

New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)

Import-Module .\FIMAutomation.psd1

Pop-Location

Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see everyday, but is also a useful approach to other legacy PSSnapin problems.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself.  This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is;

  • Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverage the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report on the information required in the report
  • A NodeJS WebApp calls the Azure PowerShell Function App onload to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. In concept, as described above, this is similar to the User Object report, which has the same prerequisites, and I did a pretty good job of detailing those here. To implement this, that post is the required reading to get you ready.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
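
The raw script isn’t reproduced here; in essence it does something like this sketch (the server name, credential handling, and the property names on the Lithnet Management Agent objects are assumptions):

# Sketch only: connect to the MIM Sync Server and enumerate Management Agents
$secPwd  = ConvertTo-SecureString "PlaceholderPassword" -AsPlainText -Force
$creds   = New-Object System.Management.Automation.PSCredential ("DOMAIN\svc-mimsync", $secPwd)
$session = New-PSSession -ComputerName "mimsync.yourorg.com" -Credential $creds

$report = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMIISAutomation
    Get-ManagementAgent | ForEach-Object {
        [pscustomobject]@{
            ManagementAgent = $_.Name
            Type            = $_.Category   # illustrative property name
        }
    }
}
Remove-PSSession $session

# Return the statistics as JSON for the NodeJS WebApp to render
$report | ConvertTo-Json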

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you named your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service offerings opens up a world of functionality, including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing: it works straight out of the box when you’re in the same domain. Getting it working across Forests, though, can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM, this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewalls in between then you’ll have additional hoops to jump through.

GALSync

This article is the most definitive I’ve found about what is required, but it isn’t easily found even when you know it exists. In the last few months I’ve had to set up GALSync with FIM/MIM a number of times, and I can see that I’ll need to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it’ll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites all sorted. They are pretty clear in the linked article above such as a One-way Cross Forest Trust, connectivity on the necessary ports if there are firewalls in-between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Have that account granted (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to. Or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be opened from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos authentication and allow unencrypted traffic
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true 
Then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must have Kerberos authentication enabled also. Run the following commands in an elevated (Administrator) Powershell ISE/Shell session. The first is to enable Kerberos.
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server
    • Set-item wsman:localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic
    • set-item wsman:\localhost\client\AllowUnencrypted -value true 
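
Before moving to the full verification below, a quick connectivity check from the MIM Sync Server can save some head-scratching (my addition, not part of the original steps; ExchangeCASFQDN is your CAS server’s FQDN):

# Confirm WinRM is reachable on the remote CAS with Kerberos authentication
Test-WSMan -ComputerName ExchangeCASFQDN -Authentication Kerberos -Credential (Get-Credential)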

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote Powershell Tab
  2. enter the ExchangeCASFQDN for the Computer field
  3. enter the Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for User name, in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote powershell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and your AD MA Service account has the necessary permissions in Exchange), run the following script line-by-line. If you have configured Remote PowerShell correctly and have met all the prerequisites, you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
# Update the ConnectionUri below for your Exchange CAS FQDN
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

MIM2016 Upgrade Hanging on Custom Action – SetPermissionEval

I was upgrading a client’s environment from FIM 2010 R2 to MIM 2016. During the upgrade of the Synchronization Service, the installer appeared stuck; I waited for over an hour with no activity and no progress update. I checked the MSI installation log and found the last activity was CustomAction = SetPermissionEval, ActionType=3073. Other than this, there were no errors or any indication of failure.

msilog

According to this TechNet article, SetPermissionEval sets access permissions (ACLs) for file folders, the registry, DCOM launch/access permissions and WMI.

ExtensionsCache

So I opened Process Monitor and discovered the reason: the hidden folder Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache contained over 260,000 folders with approximately 2 million objects, and the SetPermissionEval custom action was applying ACLs to each of them!

procmon

I couldn’t find the exact purpose of the ExtensionsCache; there is no Microsoft documentation on it, nor any mention in the official upgrade guidance or best practice. However, looking at the contents of the folder, I suspect FIM/MIM creates these folders when running synchronisation or export using custom code-based extension rules.

Based on an earlier forum post, I decided to delete the contents of this folder.

delete
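
For reference, a hedged sketch of the cleanup; the path is the default install location, so adjust it for your installation drive, and make sure the Synchronization Service is stopped first:

# Stop the Sync service, then clear the cache folder contents
Stop-Service -Name FIMSynchronizationService
Remove-Item -Path 'C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache\*' -Recurse -Force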

Once all the items were deleted, I restarted the Synchronization Service upgrade; it continued and finished without delay.

I still don’t understand why the installer should try to set file permissions in the cache directory when the whole directory’s contents can be removed without a problem. Why bother?

Anyway, if you are upgrading or patching your FIM or MIM instance, it might be worthwhile checking your hidden ExtensionsCache directory. If you have too many folders there, stop the Synchronization Service and delete those cache folders to avoid this problem.

Receive Push Notifications from Microsoft Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment, a daily automated process was executing, but the task it was performing depended on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say, FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. However Push Notifications I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up, you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script with your Access Token in line 3 and execute it. You should see information returned associated with your new Push Bullet account.
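
The script itself isn’t embedded here, but a minimal version of that test looks like this (the token value is a placeholder for your own):

$apiURI = "https://api.pushbullet.com/"
# Your Push Bullet Access Token
$accessToken = "o.YourPushBulletAccessToken"
$header = @{ "Access-Token" = $accessToken }

# Returns details of the account associated with the Access Token
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")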

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query for the devices connected to the account. In the same PowerShell session you tested the API with above, run this API call:

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Devices

If we want a notification to target a particular device, we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was device two. So I could get the target Iden with;

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one I’ll be using. More info on the other types here.

Sending Test Notification

To perform a notification test, update the following script with your Access Token (line 3). I’ve omitted the Device Identifier so the message goes to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
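
A minimal version of the test push, reusing the $header and $apiURI variables from the API test above (to target a single device you would add a target_device_iden property to the body):

# Send a 'note' type push to all devices on the account
$body = @{
    type  = "note"
    title = "MIM Push Notification Test"
    body  = "If you can read this, push notifications are working."
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $body -ContentType "application/json"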

Success. I received the notification on my iPhone and also in my Chrome browser.

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected: I updated my PowerShell Function App, which is based off this blog post here, to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it only covered full failure, not partial failure (which is what I was experiencing, whereby only one part of the process wasn’t returning data).

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error below when sending the notification via the Function App. I didn’t get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps) insert this line before your invoke-restmethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications, and this post is food for thought on what can easily be enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Set your eyes on the Target!

So in my previous posts I’ve discussed a couple of key points in what I define as the basic principles of Identity and Access Management.

Now that we have all the information needed, we can start to look at your target systems. In the simplest terms this could be your local Active Directory (Authentication Domain), but it could be anything, and with the adoption of cloud services, these target systems are often what drives the need for robust IAM services.

Something that we are often asked as IAM consultants is why: why should corporate applications be integrated with any IAM service? These are valid questions. Sometimes, depending on what the system is and what it does, integrating with an IAM system isn’t a practical solution, but more often there are many benefits to having your applications integrated with an IAM system. These benefits include:

  1. Automated account provisioning
  2. Data consistency
  3. Central Authentication services (if supported)

Requirements

With any target system, much like the IAM system itself, the one thing you must know before you go into any detail is the requirements. Every target system will have individual requirements. Some could be as simple as just needing basic information: first name, last name and date of birth. But for most applications there is a lot more to it, and the requirements will be derived largely from the application vendor, and to a lesser extent the application owners and business requirements.

IAM Systems are for the most part extremely flexible in what they can do; they are built to be customised to an enormous degree, and the target systems used by the business will play a large part in determining the amount of customisation within the IAM system.

This could be as simple as requiring additional attributes that are not standard within either the IAM system or your source systems, or it could be about the way you want the IAM system to interact with the application, i.e. utilising web services and building custom Management Agents to connect and synchronise data sets.

At the root of all this is that, when using an IAM system, you have a constant flow of data that is all stored within the “Vault”. This helps ensure that any changes to a user are flowed to all systems, not just the phone book; it also ensures that changes are tracked through the governance processes established and implemented as part of the IAM system. Changes made to a user’s identity information within a target application can be easily identified, to the point of saying this change was made on this date/time because a change to this person’s data occurred within the HR system at that time.

Integration

Most IAM systems will have management agents or connectors (the terminology varies depending on the vendor you use) built for the typical “out of box” systems, and these will for the most part satisfy many requirements, so you don’t tend to have to worry so much about those. But if you have “bespoke” systems that have been developed and built up over the years for your business, this is where custom management agents play a key part, and how they are built will depend on the applications themselves. In a Microsoft IAM service, custom management agents are built using the Extensible Connectivity Management Agent (ECMA) framework. How you build and develop management agents for FIM or MIM is quite an extensive discussion and better suited to a separate post.

One of the “sticky” points here is that most of the time, in order to integrate applications, you need elevated access to the application’s back end to be able to push data to and pull data from the application. The way this is done through any IAM system is via specific service accounts that are restricted to performing only the functions the integration requires.

Authentication and SSO

Application integration tightens the security of data, with access to applications controlled through various mechanisms; authentication plays a large part in the IAM process.

During the provisioning process, passwords are usually set when an account is created, either by using random password generators (preferred) or by setting a specific temporary password. Either way, it’s always done with the intent that the user resets their password when they first log on. The Self Service functionality that can be introduced to do this enables the user to reset their password without ever having to know what the initial password was.

Depending on the application, separate passwords might be created that need to be managed. In most cases IAM consultants/architects will try to minimise this to the point of it not being required at all, but this isn’t always possible. In these situations, the IAM system has methods to manage this as well. In the Microsoft space this can be controlled through Password Synchronisation using the Password Change Notification Service (PCNS); this basically means that if a user changes their main password, that change can be propagated to all the systems that have separate passwords.

Most applications today use standard LDAP authentication to provide access to their application services; this makes the password management process much simpler. Cloud services, however, generally need to be set up to do one of two things:

  1. Store local passwords
  2. Utilise Single Sign-On Services (SSO)

SSO uses standards-based protocols to allow users to authenticate to applications with managed accounts and credentials which you control. Examples of these standard protocols are the likes of SAML, OAuth, WS-Fed/WS-Trust and many more.

There is a growing shift in the industry towards these being cloud services, such as Microsoft Azure Active Directory or any number of other services available today.

The obvious benefit of SSO is that you have a single username and password to remember. It also greatly reduces the security risk your business carries from an auditing and compliance perspective: having a single authentication directory can help reduce your business’s overall exposure to compromise from external or internal threats.

Well, that about wraps it up. IAM is for the most part an enabler: it enables your business to be adequately prepared for the consumption of cloud services and cloud enablement, which can help reduce your overall IT spend over the coming years. But one thing I think I’ve highlighted throughout this particular series is requirements, requirements, requirements… repetitive I know, but for IAM so crucially important.

If you have any questions about this post or any of my others please feel free to drop a comment or contact me directly.