Validate Your Authoritative Sources – Creating a Fuse for FIM/MIM Import and Sync run cycles

 

Introduction

The Microsoft Identity Manager Synchronisation Engine has been around for close to 20 years and is highly functional and very reliable.

The Achilles heel, though, for any IDAM Sync Engine will always be the authoritative source and the information it provides to the Sync Engine.

I’m seeing more and more SaaS services being used as the Authoritative Source for identity management systems. Think SuccessFactors and Workday. Between connecting to these services across the internet, the rate of change within organisations, and the ever-present human factor of en-masse changes, it is more important than ever to validate your import feeds before processing them through your Sync Engine’s business logic.

Overview

This post details the foundation of a little logic that will call an Import from the Authoritative Source and analyse what is returned, evaluate it against the previous Import to understand the number of objects expected, and determine whether it is within an acceptable tolerance (I’m using 1% of total managed objects). If it doesn’t check out, don’t run a Sync; instead send an email to someone who can check it out and make a rational human decision (and maybe a manual sync). If the Import is valid, run the Sync.

Simply put:

  • Query the Authoritative Management Agent and get the last Import Run
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
  • Run an Import cycle only
    • set variables for the total number of objects processed; Adds, Renames, Updates and Deletes (and Total)
    • Evaluate each of the Staging Adds, Renames, Updates and Deletes and see if the number of changes is less than the tolerance (1%)
      • if yes proceed with a Sync Run
      • if no send a notification and don’t run the Sync

Enhancements

This will need to be tailored for each environment. What is a normal number of changes in your environment? You may see a lot of Updates, in which case the global 1% tolerance I have here won’t work; you may want a separate tolerance for each of Adds, Renames, Updates and Deletes.

Implementation

Where I’ve used this, I’ve saved the PowerShell script below into the same directory as the other scripts that automate the MIM Sync Run Sequence. I’ve updated the previous automation script, removed the Authoritative Delta Import / Delta Sync, Full Import / Delta Sync etc., and called this Auth Fuse Script instead.

The Script

As always this uses the awesome LithnetMIISAutomation PowerShell Module from Ryan. Update:

  • lines 5-11 for your Auth Source.
  • lines 20-24 for SMTP Server and Notification settings
  • $smtpBody lines for what you want the notification emails to say
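
The embedded script (the one the line numbers above refer to) isn’t reproduced here, so below is a minimal sketch of the fuse check it describes. Everything in it is a placeholder to adapt: the MA name, SMTP settings and staging counts, and the Start-ManagementAgent call assumes the Lithnet module’s cmdlet and parameter names, so verify them against your installed version.

# Minimal sketch of the fuse check; all values below are placeholders.
# Populate the counts from the MA's last run and the import just executed.
Import-Module LithnetMIISAutomation

$maName     = "AuthSourceMA"         # placeholder: your authoritative MA
$smtpServer = "smtp.yourorg.com"     # placeholder: notification settings
$notifyTo   = "idamteam@yourorg.com"
$notifyFrom = "mimsync@yourorg.com"

$lastRunTotal = 10000   # total managed objects from the previous import run
$stagedTotal  = 85      # Adds + Renames + Updates + Deletes staged by this import

# The fuse: is the staged change volume within 1% of total managed objects?
$tolerance = [math]::Ceiling($lastRunTotal * 0.01)

if ($stagedTotal -le $tolerance) {
    # Within tolerance; proceed with the sync run
    Start-ManagementAgent -MA $maName -RunProfileName "Delta Sync"
}
else {
    # Fuse tripped; notify a human instead of syncing
    $smtpBody = "Import staged $stagedTotal changes; tolerance is $tolerance. Sync NOT run - please review."
    Send-MailMessage -SmtpServer $smtpServer -To $notifyTo -From $notifyFrom `
        -Subject "MIM Auth Source fuse tripped" -Body $smtpBody
}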

Summary

A simple piece of logic to check and validate your imports can save hours/days of work.

If your Auth Source doesn’t provide a full dataset and you haven’t checked your import before processing you could be deleting a LOT of accounts.

If HR changed the Org Structure and didn’t inform you or take into account IDAM Business Logic, you could be about to process a lot of AD Account Moves. If it involved redundancies and they haven’t informed anyone yet, you could be exposing that information to the entire organisation. CHECK AND VALIDATE YOUR AUTHORITATIVE SOURCE IMPORTS!!

Continuous Credential Prompt when accessing MIM Password Registration Portal

First published at https://nivleshc.wordpress.com

Recently I was at a customer site, setting up a Microsoft Identity Manager (MIM) 2016 environment, which included the deployment of the Self Service Password Registration and Self Service Password Reset portals. For additional security, I was using Kerberos instead of the default NTLM.

I finished installing the MIM Portal, Service, Password Registration and Password Reset Portals without any issues.

I then proceeded to secure all HTTP endpoints by enabling them for SSL and then removing the HTTP bindings, so that the MIM Portal, Password Registration and Password Reset Portals could only be accessed via HTTPS. No issues there either.

By this time I was pretty pleased with myself 😉 Everything was going as planned, no issues faced at all. Finally, lady luck was showering me with her blessings.

Having finished the installation and configuration, I proceeded to test the solution.

The first thing to check was the MIM Portal site. I opened up a web browser and navigated to the Microsoft Identity Manager Portal. When prompted, I logged in with the mimadmin domain account credentials. I was successfully logged in and could access all the parts without any hitch.

Now kids, if you are faint of heart, be very wary of what happens next (hint: this is the time when you cover your eyes with your hands when watching a horror movie).

I then tried accessing the Self Service Password Registration Portal and got prompted for credentials.

SSPRegistration_CredentialPrompt

I entered the mimadmin account credentials and pressed enter. Just as I thought I had successfully logged in, the credential prompt returned! Hmm, that is weird. I was pretty sure I had typed the username and password correctly. Oh well, maybe I didn’t.

I typed the credentials again and pressed enter. Quick as a flash, the credential prompt returned! Uh? What was happening here? {scratching my head} Hmm, I seem to be making a lot of typos today. I carefully entered the username and password again, taking my time this time to ensure I was entering them correctly. I then pressed enter and waited.

Well, I didn’t have to wait for long since within a second, I got greeted with the Not Authorized screen!

SSPRegistration_NotAuthorised

Fascinating. It seems that lady luck had flown away, because here indeed was an issue with the Self Service Password Registration Portal! OK, Mister. Let’s have a look at what’s causing this kerfuffle!

I opened up the event viewer on the Self Service Password Registration server and went through each of the logged events in the Application and System logs; however, I couldn’t find any clue as to why the credentials were not working. I secretly suspected the issue could be due to Kerberos token errors, but I couldn’t find anything in the event logs to substantiate my suspicion. Hmm, the plot was indeed getting thicker!

I next started doing some Google searches, thinking that someone else might have encountered the same issue. Alas, it seemed that I was alone in my woes, as the results were quite thin in regard to any possible solution.

Finally, I decided to follow my dear ol’ Sherlock’s advice: “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”.

I went through the whole Self Service Password Registration setup process, checking each and every part of the configuration, to ensure that the values were as expected. After 10 minutes, I was almost done checking and no clues so far 😦

Lastly, I opened IIS Manager and checked all the settings. Nothing here either. Hey, back up a bit. What is this?

The Self Service Password Registration Portal site had its useAppPoolCredentials set to False.

SSPRegistration_useAppPoolCredentialsFalse

Now, this should be True! Is this what was causing the issue?

I quickly changed the value for useAppPoolCredentials from False to True.

SSPRegistration_useAppPoolCredentialsTrue

I then opened my web browser again and navigated to the Self Service Password Registration Portal. Once again the familiar credential prompt came up. I entered the same credentials as before and pressed enter.

Woo hoo!! This time around I was successfully logged in.

I sincerely hope that this post helps others who might be encountering the same error.

Have a great day 😉

Automatically Provision Azure AD B2B Guest Accounts

Azure ‘Business to Business’ (or the catchy acronym ‘B2B’) has been an area of significant development in the last 12 months when it comes to providing access to Azure based applications and services to identities outside an organisation’s tenancy.

Recently, Ryan Murphy (who has contributed to this blog) and I were tasked with providing an identity-based architecture to share Dynamics 365 services within a large organisation, but across two ‘internal’ Azure AD tenancies.

Dynamics 365 takes its identity store from Azure AD; if you’re assigned a license for Dynamics 365 in the Azure Portal, including in a ‘B2B’ scenario, you’re granted access to the Dynamics 365 application (as outlined here). Further information about how Dynamics 365 concepts integrate with a single or multiple Azure tenancies is outlined here.

Microsoft provide extensive guidance on completing a typical ‘invitation’ based scenario using the Azure portal (using the links above). Essentially this involves inviting users via an email, which relies on that person manually clicking the embedded link inside the email to complete the acceptance (and the ‘guest account’ creation in the Dynamics 365 service linked to that Azure AD).

However, this obviously won’t scale when you’re required to invite thousands of new users initially, and then to repeatedly invite new users as part of a Business-As-Usual (BAU) process as they join the organisation (or ‘identity churn’ as we say).

Therefore, to automate the creation of new Guest User Azure AD accounts, without involving the user at all, this process can be followed:

  1. Create a ‘service account’ Guest User from the invited Azure AD (it has to have the same UPN suffix as the users you’re inviting) to be a member of the resource Azure AD.
  2. Assign the ‘service account’ Guest User the ‘Guest Inviter’ role in the resource Azure AD.
  3. Use PowerShell to automatically provision new Guest User accounts using the credentials of the ‘service account’ Guest User.

In this blog, we’ll use the terms ‘Resource Azure AD’ or ‘Resource Tenancy’ for the location whose resources you’re trying to share out, and ‘Invited Azure AD’ or ‘Invited Tenancy’ for where the user accounts (including usernames & passwords) you’re inviting reside. The invited users only ever use their credentials in their own Azure AD or tenancy – never credentials of the ‘Resource Azure AD or tenancy’. The ‘Guest User’ objects created in the ‘Resource Tenancy’ are essentially just linking objects without any stored password.

A ‘Service Account’ Azure AD account dedicated solely to the automatic creation of Guest Users in the Resource Tenancy will need to be created first in the ‘Invited Azure AD’ – for this blog, we used an existing Azure AD account sourced from a synchronised local Active Directory. This account did not have any ‘special’ permissions in the ‘Invited Azure AD’, but according to some blogs it requires at least ‘read’ access to the user store in the ‘Invited Azure AD’ (which is the default).

This ‘Service Account’ Azure AD account should have a mailbox associated with it, i.e. either an Exchange Online (Office 365) mailbox, or a mail value that has a valid SMTP address for a remote mailbox. This mailbox is needed to approve the creation of a Guest User account in the Resource Tenancy (only needed for this individual Service Account).

It is strongly recommended that this ‘Service Account’ user in the ‘Invited Azure AD’ has a very strong & complex password, and that any credential used for that account within a PowerShell script be encrypted using the approach in David Lee’s blog.
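
That post covers the detail; as one illustrative alternative, PowerShell’s built-in DPAPI-based Export-Clixml/Import-Clixml pattern achieves the same goal (the file path here is a placeholder):

# Capture the Service Account credential once; Export-Clixml encrypts the
# password with DPAPI, tied to the current user and machine
Get-Credential | Export-Clixml -Path "C:\Scripts\svc-guestinviter.cred.xml"

# Scripts can then re-hydrate the credential without prompting
$Creds = Import-Clixml -Path "C:\Scripts\svc-guestinviter.cred.xml"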

The PowerShell scripts listed below to create these Guest User accounts could then be actioned by an identity management system, e.g. Microsoft Identity Manager (MIM), or a ‘Runbook’ or workflow system (e.g. SharePoint).

 

Task 1: Create the ‘Service Account’ Guest User using PowerShell

Step 1: Sign into the Azure AD Resource Tenancy’s web portal: ‘portal.azure.com’, using a Global Admin credential.

Step 2: When you’re signed in, click on the account profile picture on the top right of the screen and select the correct ‘Resource Tenancy’ (there could be more than one tenant associated with the account you’re using):

Screenshot 2017-09-19 at 9.34.18 AM

Step 3: Once the tenancy is selected, click on the ‘Azure Active Directory’ link on the left pane.

Step 4: Click ‘User Settings’ and verify the setting (which is set by default for new Azure AD tenancies): ‘Members can invite’.

Screenshot 2017-09-19 11.31.51

Step 5: Using a new PowerShell session, connect and authenticate to the Azure AD tenancy where the Guest User accounts are required to be created (i.e. the ‘Resource Azure AD’).

Be sure to specify the correct ‘Tenant ID’ of the ‘Resource Azure AD’ using the PowerShell switch ‘-TenantId’ followed by the GUID value of your tenancy (to find that Tenant ID, follow the instructions here).

$Creds = Get-Credential

Connect-AzureAD -Credential $Creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

 

Step 6: The following PowerShell command should be executed under a ‘Global Admin’ to create the ‘Service Account’, e.g. ‘serviceaccount@invitedtenancy.com’.

 

New-AzureADMSInvitation -InvitedUserDisplayName "Service Account Guest Inviter" -InvitedUserEmailAddress "serviceaccount@invitedtenancy.com" -SendInvitationMessage $true -InviteRedirectUrl "http://myapps.microsoft.com" -InvitedUserType Member

 

Step 7: The ‘Service Account’ user will then need to locate the email invitation sent out by this command and click on the link embedded within it to authorise the creation of the Guest User object in the ‘Resource Tenancy’.

 

Task 2: Assign the ‘Service Account’ Guest Inviter Role using Azure Portal

Step 1: Sign into the Azure web portal: ‘portal.azure.com’ with the same ‘Global Admin’ (or lower permission account) credential used in Task 1 (or re-use the same ‘Global Admin’ session from Task 1).

Step 2: Click on the ‘Azure Active Directory’ shortcut on the left pane of the Azure Portal.

Step 3: Click on the ‘All Users’ tab and select the ‘Service Account’ Guest User.

(I’m using ‘demo.microsoft.com’ pre-canned identities in the screenshot below; any similarity to real persons is purely coincidental – an image for ‘serviceaccount@invitedtenancy’ used as the example in Task 1 could not be reproduced)

Screenshot 2017-09-19 09.44.36

Step 4: Once the ‘Service Account’ user is selected, click on ‘Directory Role’ on the left pane. Click to change their ‘Directory Role’ type to ‘Limited administrator’ and select ‘Guest Inviter’ below that radio button. Click the ‘Save’ button.

Screenshot 2017-09-19 09.43.53

Step 5: The next step is to test that the ‘Service Account’ Guest User can invite users from the same ‘UPN/Domain suffix’. Click on the ‘Azure Active Directory’ link on the left pane of the main Azure Portal.

Step 6: Click ‘Users and groups’ and click ‘Add a guest user’ on the right:

Screenshot 2017-09-19 09.36.02

Step 7: On the ‘Invite a guest’ screen, send an email invitation to a user from the same Azure AD as the ‘Service Account’ Guest User. For example, if your ‘Service Account’ Guest User UPN / Domain Suffix is ‘serviceaccount@remotetenant.com’, then invite a user from the same UPN/domain suffix, e.g. ‘jim@remotetenant.com’ (again, only an example – any coincidence with current or future email addresses is purely coincidental).

Screenshot 2017-09-19 09.36.03

Step 8: When the user receives the invitation email, ensure that the following text appears at the bottom of the email: ‘There is no action required from you at this time’:

image002

Step 9: If that works, then PowerShell can automate that invitation process, bypassing the need for emails to be sent out. Automatic Guest Account creation can now leverage the ‘Service Account’ Guest User.

NOTE: If you try to invite a user with a UPN/Domain suffix that does not match the ‘Service Account’ Guest User, the invitation will still be sent, but it will require the user to accept the invitation. The invitation will remain in a ‘pending acceptance’ state, and the Guest User object will not be created, until that is done.

Task 3:  Automatically Provision new Guest User accounts using PowerShell

Step 1: Open Windows PowerShell (or re-use an existing PowerShell session that has rights to the ‘Resource Tenancy’).

Step 2: Type the following example PowerShell commands to send an invitation out, and authenticate when prompted using the ‘Invited Tenancy’ credentials of the ‘Service Account’ Guest User.

In the script, again be sure to specify the ‘Tenant ID’ for the -TenantId switch of the ‘Resource Tenancy’, not the ‘Invited Tenancy’.

 

#Connect to Azure AD

$Creds = Get-Credential

Connect-AzureAD -Credential $Creds -TenantId "aaaaa-bbbbb-ccccc-ddddd"

$messageInfo = New-Object Microsoft.Open.MSGraph.Model.InvitedUserMessageInfo

$messageInfo.CustomizedMessageBody = "Hey there! Check this out. I created and approved my own invitation through PowerShell"

New-AzureADMSInvitation -InvitedUserEmailAddress "ted@invitedtenancy.com" -InvitedUserDisplayName "Ted at Invited Tenancy" -InviteRedirectUrl "https://myapps.microsoft.com" -InvitedUserMessageInfo $messageInfo -SendInvitationMessage $false

 

Compared to using the Azure portal, this time no email will be sent (the display name and message body will never be seen by the invited user; they’re just required for the command to complete). To send a confirmation email to the user, change the switch -SendInvitationMessage to $true.

 

Step 3: The output of the PowerShell command should show ‘Accepted’ next to ‘Status’ at the end of the text:

image001

This means the Guest User object has automatically been created and approved by the ‘Resource Tenancy’. That Guest User object will be associated with the actual Azure AD user object from the ‘Invited Tenancy’.
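
If you want to double-check from the same PowerShell session, a query along these lines should confirm the new object (the mail value is just the example address used above):

# Verify the Guest User object now exists in the Resource Tenancy
Get-AzureADUser -Filter "userType eq 'Guest'" | Where-Object { $_.Mail -eq "ted@invitedtenancy.com" }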

The next steps for this invited Guest User will be then to assign them a Dynamics 365 license and then a Dynamics 365 role in the ‘Resource Tenancy’ (which might be topics of future blogs).

Hope this blog has proven useful.

 

Easier portability of the FIMAutomation powershell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use 3rd party or open source code, or other tools may expect exports to be in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out, no internet access, just working with what is currently there. I have limited access on the servers and management VM, with access only to 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0).

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted the following files, and renamed them to match the list above:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

# Work from the new module directory
Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation

# Generate a manifest that loads the snap-in DLL as a binary module
New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)

Import-Module .\FIMAutomation.psd1

Pop-Location

# The snap-in's cmdlets are now available without Add-PSSnapin
Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see every day, but it is also a useful approach to other legacy PSSnapin problems.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is:

  • An Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverages the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build a report with the required information
  • A NodeJS WebApp that calls the Azure PowerShell Function App on load to generate the report and display it
  • The NodeJS WebApp embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites for this are covered in other posts. In concept, as described above, it is similar to the User Object Report, which has the same prerequisites; I did a pretty good job of detailing those here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
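
The embedded script isn’t reproduced here, but in outline it does something like the sketch below. The app setting names, credential handling and the properties selected are all placeholders, and it assumes the Lithnet module is installed on the Sync Server; treat it as a shape, not the original.

# Sketch of the Function App body: remote into the MIM Sync Server and
# enumerate Management Agents via the Lithnet module. App setting names
# ($env:*) and the selected properties are placeholders.
$password = ConvertTo-SecureString $env:MIMSyncPassword -AsPlainText -Force
$creds    = New-Object System.Management.Automation.PSCredential ($env:MIMSyncUser, $password)

$session = New-PSSession -ComputerName $env:MIMSyncServer -Credential $creds
$maStats = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMIISAutomation
    # Select whatever statistics your report needs from each MA
    Get-ManagementAgent | Select-Object Name, Category
}
Remove-PSSession $session

# Return JSON to the caller; $res is the output binding in the classic
# (experimental) PowerShell Functions model
$maStats | ConvertTo-Json | Out-File -Encoding ascii -FilePath $res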

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is jQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained it all in a single HTML file using jQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you name your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform-as-a-Service services opens up a world of functionality, including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing: it works straight out of the box when you’re in the same domain. Getting it working across Forests, though, can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM, this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewall(s) in between then you’ll have additional hoops to jump through.

GALSync

This article is the most definitive I’ve found about what is required, but it isn’t easily found even when you know it exists. In the last few months I’ve had to set up GALSync with FIM/MIM a number of times, and I know I’ll be needing to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it’ll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites sorted. They are made pretty clear in the linked article above: a one-way cross-forest trust, and connectivity on the necessary ports if there are firewalls between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details of the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Give that account (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to, or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services session to the Exchange CAS Server in the Remote Forest.

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be open from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos:
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true
  • Then, on the MIM Sync Server, perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must also have Kerberos authentication enabled. Run the following commands in an elevated (Administrator) PowerShell ISE/Shell session. The first enables Kerberos:
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server:
    • set-item wsman:\localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic:
    • set-item wsman:\localhost\client\AllowUnencrypted -value true
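
Before testing a full remote session, a quick connectivity check from the MIM Sync Server can confirm WinRM and Kerberos are happy (ExchangeCASFQDN as above):

# Quick WinRM/Kerberos sanity check against the remote CAS server
Test-WSMan -ComputerName ExchangeCASFQDN -Authentication Kerberos -Credential (Get-Credential)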

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE, select File => New Remote PowerShell Tab
  2. Enter the ExchangeCASFQDN in the Computer field
  3. Enter the Service Account that you specified on your GALSync Active Directory Management Agent that connects to the Remote Forest as the User name, in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote PowerShell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and that your AD MA Service Account has the necessary permissions in Exchange), run the following script line-by-line. If you have configured Remote PowerShell correctly and met all the prerequisites, you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

MIM2016 Upgrade Hanging on Custom Action – SetPermissionEval

I was upgrading a client’s environment from FIM 2010 R2 to MIM 2016. During the upgrade of the Synchronization Service, the installer appeared stuck; I waited for over an hour with no activity and no progress update. I checked the MSI installation log and found the last activity was CustomAction = SetPermissionEval, ActionType=3073. Other than this, there were no errors or any indication of failure.

msilog

According to this TechNet article, SetPermissionEval sets access permissions (ACLs) for file folders, the registry, DCOM launch/access permissions and WMI.

ExtensionsCache

So I opened Process Monitor and discovered the reason: the hidden folder Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache. This directory contained over 260,000 folders with approximately 2 million objects, and the SetPermissionEval custom action was applying ACLs to each of them!

procmon

I couldn’t find the exact purpose of the ExtensionsCache; there is no Microsoft documentation on it, nor any mention in the official upgrade guidance or best practices. However, looking at the contents of the folder, I suspect FIM/MIM creates these folders when running synchronisation or export using custom code-based extension rules.

Based on an earlier forum post, I decided to delete the contents of this folder.

delete

Once all the items were deleted, I restarted the Synchronization Service upgrade; it continued and finished without delay.

I still don’t understand why the installer should try to set file permissions in the cache directory when the whole directory’s contents can be removed without problem. Why bother?

Anyway, if you are upgrading or patching your FIM or MIM instance, it might be worthwhile checking your ExtensionsCache hidden directory. If you have too many folders there, stop the Synchronization Service and delete those cache folders to avoid this problem.
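
A quick way to check, and if necessary clear, the cache before an upgrade is sketched below; the path is the default install location, so adjust for your environment.

# Count the cached folders first (the folder is hidden, hence -Force)
$cache = "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache"
(Get-ChildItem -Path $cache -Directory -Force).Count

# If the count is huge, stop the sync service and clear the cache contents
Stop-Service -Name FIMSynchronizationService
Get-ChildItem -Path $cache -Force | Remove-Item -Recurse -Force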

Receive Push Notifications from Microsoft Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment, a daily automated process was executing, but the task it was performing depended on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say, FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me, you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. Push Notifications, however, I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway, it got me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up, you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google account that your team can connect to with their devices. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test you can access the Push Bullet API using your Access Token and PowerShell. Update the following script for your Access Token in line 3 and execute. You should see information returned associated with your new Push Bullet account.
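
The embedded script isn’t shown here, but a minimal version looks like the following; the Push Bullet API takes the token in an Access-Token header, and your token goes on line 3.

# Base URI and auth header for the Push Bullet API
$apiURI = "https://api.pushbullet.com/"
$accessToken = "YOUR-ACCESS-TOKEN-HERE"
$header = @{ "Access-Token" = $accessToken }

# Returns the details of your Push Bullet account
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")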

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query the devices connected to the account. In the same PowerShell session you tested the API with above, run this API call:

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Devices

If we want a notification to target a particular device, we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will go to all devices. In my example above, with two devices registered, my iPhone was the second. So I could get the target Iden with:

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one I’ll be using. More info on the other types is available here.

Sending Test Notification

To perform a notification test, update the following script for your Access Token (line 3). I’ve omitted the Device Identifier so the message goes to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
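
Again, the embedded script isn’t shown; a minimal note push looks like this, reusing $apiURI and $header from the earlier snippet (add a device_iden property, e.g. $iphoneIden, to target one device).

# Send a 'note' push to all devices registered on the account
$notification = @{
    type  = "note"
    title = "MIM Notification Test"
    body  = "If you can read this, push notifications are working."
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $notification -ContentType "application/json"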

Success. I received the notification on my iPhone and also in my Chrome browser.

IMG-5194

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected: I updated my PowerShell Function App, which is based off this blog post here, to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it was based on full failure, not partial (which is what I was experiencing, whereby only one part of the process wasn’t returning data).
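
As a sketch, the evaluation boils down to something like this; the drop path, file filter and expected count are placeholders for what your own feed should deliver.

# After the FTP retrieval, compare what arrived against what was expected
$localDropPath = "D:\MIMFeeds"   # placeholder: where the exports land
$expectedFiles = 3               # placeholder: files the feed should deliver
$retrieved = (Get-ChildItem -Path $localDropPath -Filter *.csv).Count

if ($retrieved -lt $expectedFiles) {
    # Partial (or full) failure: send a push rather than failing silently
    $alert = @{ type = "note"; title = "MIM feed incomplete"; body = "Expected $expectedFiles exports, got $retrieved." } | ConvertTo-Json
    Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $alert -ContentType "application/json"
}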

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error below when sending the notification via the Function App. I didn’t get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or will be sending Push Notifications via Azure Function Apps), insert this line before your Invoke-RestMethod call:

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications, and this post is food for thought on what can easily be enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Set your eyes on the Target!

1015red_F1CoverStory.jpg

So in my previous posts I’ve discussed a couple of key points in what I define as the basic principles of Identity and Access Management.

Now that we have all the information needed, we can start to look at your target systems. In the simplest terms this could be your local Active Directory (Authentication Domain), but it could be anything; with the adoption of cloud services, these target systems are often what drives the need for robust IAM services.

Something that we are often asked as IAM consultants is why: why should corporate applications be integrated with an IAM service? These are valid questions. Sometimes, depending on what the system is and what it does, integrating with an IAM system isn’t a practical solution, but more often there are many benefits to having your applications integrated with an IAM system. These benefits include:

  1. Automated account provisioning
  2. Data consistency
  3. Central Authentication services (if supported)

Requirements

With any target system, much like the IAM system itself, the one thing you must know before you go into any detail is the requirements. Every target system will have individual requirements. Some could be as simple as just needing basic information: first name, last name and date of birth. But for most applications there is a lot more to it, and the requirements will be driven largely by the application vendor, and to a lesser extent by the application owners and business requirements.

IAM systems are for the most part extremely flexible in what they can do; they are built to be customised to an enormous degree, and the target systems used by the business will play a large part in determining the amount of customisation within the IAM system.

This could be as simple as requiring additional attributes that are not standard in either the IAM system or your source systems, or it could be the way in which you want the IAM system to interact with the application, i.e. utilising web services and building custom Management Agents to connect and synchronise data sets between them.

The root of all this is that when using an IAM system you have a constant flow of data that is all stored within the “Vault”. This helps ensure that any change to a user flows to all systems, not just the phone book; it also ensures that changes are tracked through governance processes that have been established and implemented as part of the IAM system. Changes made to a user’s identity information within a target application can be easily identified, to the point of saying this change was made on this date/time because a change to this person’s data occurred in the HR system at that time.

Integration

Most IAM systems will have management agents or connectors (the terms vary depending on the vendor you use) built for the typical “out of box” systems, and these will for the most part satisfy the requirements of many, so you don’t tend to have to worry so much about that. But if you have “bespoke” systems that have been developed and built up over the years for your business, this is where custom management agents play a key part, and how they are built will depend on the applications themselves. In a Microsoft IAM service, custom management agents are built using an Extensible Connectivity Management Agent (ECMA). How you would build and develop management agents for FIM or MIM is quite an extensive discussion and better suited to a separate post.

One of the “sticky” points here is that most of the time, in order to integrate applications, you need elevated access to the application’s back end to be able to push data to and pull data from the application. In any IAM system, though, this is done through specific service accounts that are restricted to performing only the functions the application requires.

Authentication and SSO

Application integration tightens the security of data, with access to applications controlled through various mechanisms; authentication plays a large part in the IAM process.

During the provisioning process, passwords are usually set when an account is created, either using random password generators (preferred) or by setting a specific temporary password. When doing this though, it’s always done with the intent that the user resets their password when they first log on. The Self Service functionality that can be introduced to do this enables the user to reset their password without ever having to know what the initial password was.

Depending on the application, separate passwords might be created that need to be managed. In most cases IAM consultants/architects will try to minimise this to not being required at all, but this isn’t always possible. In these situations, the IAM system has methods to manage this as well. In the Microsoft space this is something that can be controlled through password synchronisation using the “Password Change Notification Service” (PCNS). This basically means that if a user changes their main password, that change can be propagated to all the systems that have separate passwords.

Most applications today use standard LDAP authentication to provide access to their application services, which makes the password management process much simpler. Cloud services, however, generally need to be set up to do one of two things:

  1. Store local passwords
  2. Utilise Single Sign-On Services (SSO)

SSO uses standards-based protocols to allow users to authenticate to applications with managed accounts and credentials which you control. Examples of these standard protocols are SAML, OAuth, WS-Fed/WS-Trust and many more.

There is a growing shift in the industry for these to be cloud services, the likes of Microsoft Azure Active Directory or any number of other services available today.
The obvious benefit of SSO is that you have a single username and password to remember. This also greatly reduces security risk: from an auditing and compliance perspective, having a single authentication directory can help reduce the overall exposure your business has to compromise from external or internal threats.

Well, that about wraps it up. IAM is for the most part an enabler: it enables your business to be adequately prepared for the consumption of cloud services and cloud enablement, which can help reduce your overall IT spend over the coming years. But one thing I think I’ve highlighted throughout this particular series is requirements, requirements, requirements… repetitive I know, but for IAM so crucially important.

If you have any questions about this post or any of my others please feel free to drop a comment or contact me directly.

 

What’s a DEA?

In my last post I made a reference to a “Data Exchange Agreement” or DEA, and I’ve since been asked a couple of times about it. So I thought it would be worthwhile writing a post about what it is and why it’s of value to you and your business.

So what’s a DEA? Well, in simple terms it’s exactly what the name states: an agreement that defines the parameters by which data is exchanged between Service A and Service B – Service A being the producer of Attributes X and Service B the consumer. Now, I’ve intentionally used a vague example here, as a DEA is used amongst many services in business and/or government and is not specifically related to IT or IAM services. But if your business adopts a controlled data governance process, it can play a pivotal role in how IAM services are implemented and adopted throughout the entire enterprise.

So what does a DEA look like? Well, in an IAM service it’s quite simple: you specify your “Source” and your “Target” services. An example could be the following:

Source
  • ServiceNow
  • AurionHR
  • PROD Active Directory
  • Microsoft Exchange

Target
  • PROD Active Directory
  • Resource Active Directory Domain
  • Microsoft Online Services (Office 365)
  • ServiceNow

As you can see, this only tells you where the data is coming from and where it’s going to; it doesn’t go into any of the details around what data is being transported and in which direction. A separate section in the DEA details this; an example is provided below:

MIM Attribute       Flow   ServiceNow Attribute   Source               User Types    Notes
accountName         ->     useraccountname        MIM                  All
employeeID          ->     employeeid             AurionHR             All
employeeType        ->     employeetype           AurionHR             All
mail                <-     email                  Microsoft Exchange   All
department          ->     department             AurionHR             All
telephoneNumber     ->     phone                  PROD AD              All
o365SourceAnchor    ->     ImmutableID            Resource Domain      All
employeeStatus      ->     status                 AurionHR             All
dateOfBirth *       ->     dob                    AurionHR             CORP Staff    yyyy-MM-dd
division            ->     region                 AurionHR             CORP Staff
firstName           ->     preferredName          AurionHR             CORP Staff
jobTitle            ->     jobtitle               AurionHR             CORP Staff
positionNumber      ->     positionNumber         AurionHR             CORP Staff
legalGivenNames *   <-     firstname              ServiceNow           Contractors
locationCode        <-     location               ServiceNow           Contractors
ManagerID           <-     manager                ServiceNow           Contractors
personalTitle       <-     title                  ServiceNow           Contractors
sn                  <-     sn                     ServiceNow           Contractors
department          <-     department             ServiceNow           Contractors
employeeID          <-     employeeid             ServiceNow           Contractors
employeeType        <-     employeetype           ServiceNow           Contractors

This might seem like a lot of detail, but it is actually only a small section of what would be included in a DEA of this type. The whole purpose of this agreement is to define which attributes are managed by which systems and which target systems they go to; as many IAM consultants can tell you, a real one would be substantially more than what’s provided in this example. And this is just an example for a single system; this is something that’s done for all applications that consume data related to your organisation’s staff members.

One thing that you might also notice is that I’ve marked two attributes in the sample above with an asterisk. Why, might you ask? The point of including this was to highlight data sets that are considered “Sensitive”; within the DEA you would classify these as sensitive data, with specific conditions around that data set. This is something your business would define and word appropriately, but it could be as simple as a section stating the following:

“Two attributes are classed as sensitive data in this list and cannot be reproduced, presented or distributed under any circumstances”

One challenge often confronted within any business is application owners wanting “ownership” of the data they consume. Utilising a DEA provides clarity over who owns the data and what your applications can do with the data they consume, removing any uncertainty.

To summarise: the point of this post wasn’t to provide you with a template or example DEA to use; it was to help explain what a DEA is, what it’s used for, and what its parts can look like. No two DEAs are the same, and providing you with a full example DEA would only make you end up recreating it from scratch anyway. But I hope it helps you understand what is needed.

As with any of my posts, if you have any questions please pop a comment or reach out to me directly.