Integration of Microsoft Identity Manager with Azure Platform-as-a-Service Services

Overview

This isn’t an out-of-the-box solution; it’s a bespoke solution that takes a number of elements and puts them together in a unique way. I’m not expecting anyone to implement this specific solution (though you’re more than welcome to), but rather to take inspiration from it to implement solutions relevant to your environment(s). This post supports a presentation I did to The MIM Team User Group on 14 June 2017.

This post describes a solution that:

  • Leverages an Azure WebApp (NodeJS) to present a simple website. That site can be integrated easily in the FIM/MIM Portal
  • The NodeJS website leverages an Azure Function App to get a list of users from the FIM/MIM Synchronization Server and allows the user to use typeahead functionality to find the user they want to generate a FIM/MIM object report on
  • On selection of a user, a request will be sent to another Azure Function App to generate and return the report to the user in a new browser window

This is shown graphically below.

 

Report Request UI

The NodeJS WebApp is integrated into the FIM/MIM portal. Bootstrap Typeahead is used to find the user to generate a report on. The Typeahead userlist is fulfilled by an Azure Function that queries the MIM Sync Metaverse. The Generate Report button fires off a call to FIM/MIM via another Azure Function into the MIM Sync and MIM Service to generate the report.

The returned report opens in a new tab in the user’s browser. The report contains details of the FIM/MIM connectors the user is represented on.

The values of all attributes for the user’s hologram from the Metaverse are displayed, along with the MA the value came from and the last modified date.

Finally, the report includes the metadata from the MIM Service MA Connector Space and the MIM Service itself.

Prerequisites

These are numerous, but I’ve previously posted about them. You will need:

I encourage you to digest those posts to understand how to configure the prerequisites for this solution.

Additional Solution Requirements

To bring all the individual components together, there are a few additional tasks to enable this solution.

  • Enable CORS on your Azure Function App Configuration (see details further below)
  • If you want to display User Object Photos as part of the report, you will likely need to synchronize them into FIM/MIM from an authoritative source (e.g. Office365/Exchange Online). Check out this post and the additional details further below
  • In order to embed the NodeJS WebApp into the FIM/MIM Portal, this post provides the details. Change the target URL from the PowerBI URL to your NodeJS site
  • Object Report Request WebApp (see below for sample site)

Azure Functions Cross Origin Resource Sharing (CORS)

You will need to configure CORS to allow the NodeJS WebApp to access the Azure Functions (from both local development and Azure). Reflect your port number if it is different from 3000, and use the DNS name of your Azure WebApp.
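
For example, the Allowed Origins list on each Function App would contain entries like the following (the azurewebsites.net name below is a placeholder for your WebApp):

http://localhost:3000
https://yourwebapp.azurewebsites.net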

Sample UI NodeJS HTML

Here is a sample HTML file for your NodeJS WebApp, with the UI that provides the LoginID input, fulfilled by the NodeJS JavaScript file further below.

Sample UI NodeJS JavaScript

The following NodeJS JavaScript supports the HTML UI above. It populates the LoginID typeahead box and wires up the Submit Report button to fulfil the report for the desired object(s). If you use the UI to select (individually) multiple different objects, each will be returned in its own output window.

As the HTML file above indicates, you will need to obtain the typeahead.bundle.js library and make it available as part of your NodeJS project.

Azure PowerShell Trigger Function App for AccountNames Lookup

The following Azure Function takes the call from the load of the NodeJS WebApp to populate the typeahead userlist.
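
That function is embedded in the original post. As a hedged sketch of its structure (the server name, credentials and the Metaverse query are placeholders to adapt for your environment), a v1 PowerShell HTTP Trigger Function that remotes into the MIM Sync server and returns the userlist looks something like this:

# Azure Functions v1 PowerShell HTTP trigger; $req and $res are the request and
# response file paths provided by the runtime
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# Credentials for the MIM Sync server - store these in App Settings, not inline
$password = ConvertTo-SecureString $env:MIMSyncPassword -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ($env:MIMSyncUser, $password)

$accountNames = Invoke-Command -ComputerName $env:MIMSyncServer -Credential $creds -ScriptBlock {
    Import-Module LithnetMiisAutomation
    # Placeholder query: replace with one that enumerates the user objects you
    # want in the typeahead list (see the Get-MVObject query post further below)
    $mvObjects = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
    $mvObjects | ForEach-Object { $_.Attributes.accountName.Values.ValueString }
}

# Return the userlist as JSON for the typeahead control
Out-File -Encoding Ascii -FilePath $res -InputObject ($accountNames | ConvertTo-Json)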

Azure PowerShell Trigger Function App for User Object Report

Similar in structure to the Username List Lookup Azure Function above, but in the ScriptBlock you embed the Report Generation Script that is detailed here. Modify for what you want to report on.

Photos in the Report

If you want to display images in your report, you will need to determine if the user has an image during the MV metadata report generation part of the script. Add the following lines (updating for the name of your image attribute; mine is named EXOPhoto) after the Try {} Catch {} in the section that starts $obj = @() ; foreach ($attr in $attributes.Keys)

# Display the object's photo rather than the Base64 string
if ($attr.equals("EXOPhoto")){
    $objectphoto = "<img src=$([char]0x22)data:image/jpeg;base64,$($attributes.$attr.Values.ValueString)$([char]0x22)>"
    $val = "System.Byte[]"
}

Then in the output of the HTML report at the end of the report generation insert the $objectphoto variable into the HTML stream.

# Output MIM Service Object Data 
$MIMServiceObjOut = $MIMServiceObjectMetaData | Sort-Object -Property Attribute | ConvertTo-Html -Fragment 
$htmlreport = ConvertTo-HTML -Body "$htmlcss<h1>Microsoft Identity Manager User Object Report</h1><h2>Query</h2>$sourcequery</br><b><center>$objectphoto</br>NOTE: Only attributes with values are displayed.</center></b><h2>Connector(s) Summary</h2>$connectorsummary<h2>MetaVerse Data</h2>$objectmetadata <h2>MIM Service CS Object Data</h2>$MIMServiceCSobjectmetadata <h2>MIM Service Object Data</h2>$MIMServiceObjOut" -Title "MIM Object Report" 

 

As you can see above I’ve also injected the CSS ($htmlcss) into the output stream at the beginning of the Body section.  Somewhere in your script block you will need to define your CSS values. e.g.

 # StyleSheet for nice pretty output 
$htmlcss = "<style> 
   h1, h2, th { text-align: center; } 
   table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; } 
   th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; } 
   td { font-size: 11px; padding: 5px 20px; color: #000; } 
   tr { background: #b8d1f3; } 
   tr:nth-child(even) { background: #dae5f4; } 
   tr:nth-child(odd) { background: #b8d1f3; } 
</style>"

Summary

An interesting solution integrating Azure PaaS Services with Microsoft Identity Manager via PowerShell and the extremely versatile Lithnet FIM/MIM PowerShell Modules.

Please share your implementations enhancing your FIM/MIM Solution.

Migrating ‘SourceAnchor’ from ‘ObjectGUID’ using new AAD Connect 1.1.524.0

I count myself lucky every now and again, for many reasons.  I have my health.  I have my wonderful family.

Today, however, it’s finding out the latest version of AAD Connect (v1.1.524.0) will probably give me back a few more months of my life.

The reason?  My customer’s AAD Connect was configured with the default value of ‘ObjectGUID’ for the ‘SourceAnchor’.

Now, for most organizations with a single AD forest, you’re laughing.  No reason to keep reading.  Log off, go outside, enjoy the sunshine (or have a coffee if you’re in Melbourne).

But no, my customer has TWO AD forests, synchronizing to a single Azure AD tenancy.

OK? What’s the big deal?  That’s been a supported configuration for many years now.

Well…… when they configured their AAD Connect they chose to use ‘ObjectGUID’ as their ‘SourceAnchor’ value:

AADConnect.PNG

Why is this an issue? 

I’m trying to MIGRATE a user from one forest to another.   Has the penny dropped yet?

No?…

OK, if not, let me extract and BOLD these scary dot points from this Microsoft Support Article (https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-design-concepts#sourceanchor):

  • The sourceAnchor attribute can only be set during initial installation. If you rerun the installation wizard, this option is read-only. If you need to change this setting, then you must uninstall and reinstall.
  • If you install another Azure AD Connect server, then you must select the same sourceAnchor attribute as previously used. If you have earlier been using DirSync and move to Azure AD Connect, then you must use objectGUID since that is the attribute used by DirSync.
  • If the value for sourceAnchor is changed after the object has been exported to Azure AD, then Azure AD Connect sync throws an error and does not allow any more changes on that object before the issue has been fixed and the sourceAnchor is changed back in the source directory.

Okay….

Ok another link:

https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-design-concepts#using-msds-consistencyguid-as-sourceanchor

By default, Azure AD Connect (version 1.1.486.0 and older) uses objectGUID as the sourceAnchor attribute. ObjectGUID is system-generated. You cannot specify its value when creating on-premises AD objects.

OK, just un-install and re-install AAD Connect.   No big deal.  Change Window over a weekend.  Get it done.

No, no, no.  Keep reading.

https://blogs.technet.microsoft.com/markrenoden/2017/02/20/choosing-a-sourceanchor-for-multi-forest-sync-with-aad-connect-part-1-introduction/

If you browse to page 6 of this very helpful (and, I’ll admit, downright scary) migration blog, you’ll see this text:

You need to delete your users from Azure Active Directory and you need to start again.

Come again?! OK. In the words of the great ‘Hitchhiker’s Guide to the Galaxy’:  DON’T PANIC.

Yes. So. That is one option. However, the MS blog does go into detail (albeit not tested by me) of another method: changing the ‘SourceAnchor’ value away from ‘objectGUID’ in a new installation of AAD Connect by changing all your users’ UPN values to ‘onmicrosoft.com’ values, removing then installing a version of AAD Connect, then changing their UPN values back to their original values.

But yeah, scary stuff. Doing this for all users in a very large organization? Positively terrifying (hence the start of this article), with an Azure AD that integrates with Exchange, Skype for Business and a basically 24×7 global user base. Well….you get my drift.

So good news?  Well, the new version supports the migration of ‘SourceAnchor’ values to the use of the positively joyous: msDS-ConsistencyGuid

https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-design-concepts#using-msds-consistencyguid-as-sourceanchor

So back to my original context, why is this important?  Well, looky here …I can see you msDS-ConsistencyGuid  (using ADSIEdit.msc):

ADSIEDIT1.PNG

The reason I’m excited – it’s a ‘writeable attribute’.

So forward sailing boys.  Let slip the anchor.  Let’s get sailing while the tide is high.

(In other words):

I’m going to:

  1. Upgrade my customer’s AAD Connect
  2. During the upgrade, select the option in the AAD Connect wizard to migrate ‘SourceAnchor’ to the new msDS-ConsistencyGuid value in AD
  3. Ensure all users (in both AD forests) have a new and unique value after AAD Connect performs a full sync and export to both forests
  4. Ensure my Active Directory Migration Tool (or PowerShell migration script) moves the user’s msDS-ConsistencyGuid value from one forest to another, as well as retaining SIDHistory and passwords (a hedged sketch of this step follows the list)
  5. And always: test, test, test – to ensure I don’t lose their Azure AD account in the process
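
For step 4, copying the attribute with the ActiveDirectory PowerShell module could look something like this hedged sketch (the user and server names are hypothetical; ADMT handles SIDHistory and passwords separately):

# Read the msDS-ConsistencyGuid that AAD Connect stamped in the source forest
$sourceUser = Get-ADUser -Identity jdoe -Server dc01.source.forest.local -Properties 'mS-DS-ConsistencyGuid'

# Stamp the same value on the migrated account in the target forest so the
# Azure AD join survives the migration
Set-ADUser -Identity jdoe -Server dc01.target.forest.local -Replace @{'mS-DS-ConsistencyGuid' = $sourceUser.'mS-DS-ConsistencyGuid'}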

Cross fingers this all works of course. There’s very little guidance out there that combines ADMT guidance with this latest AAD Connect versioning. It’s not explicitly stated in the AAD Connect online documentation, but it suggests that Microsoft has made changes on the Azure AD ‘cloud’ side of the equation to also migrate unique joins to use this new value during the upgrade.

So upgrading AAD Connect and selecting to use msDS-ConsistencyGuid as your new ‘SourceAnchor’ SHOULD also trigger some back end changes to the tenancy as well (I’m hoping).

As you know, there’s nothing worse than a good plan and design spoiled by one little bug in implementation. So come back for a future blog or two on my perilous journey, argh me maties (er, customer project friends).

 

MIM/FIM Full Sync of select objects only

As I detailed in my previous blog here, sometimes there is a need to perform a full synchronization of just a select set of objects in the MIM/FIM Synchronization Service. In my case it was on all the Synchronization Rules, which helped resolve my issue that required a selective Full Synchronization to be performed. For this customer’s FIM environment I manually performed the Preview/Full Synchronization on 51 objects, as I just needed it done. My colleague Darren Robinson suggested I look at scripting it using the ‘Lithnet PowerShell Module for FIM/MIM Synchronization Service’ located here. Up until this point I hadn’t used this module in anger, and I dearly wish I had, as it would have saved plenty of time!

Ryan Newington has done a tremendous job with his modules released to the community to make our lives easier. He even enhanced the tool recently to make this particular task easier (note: make sure you grab the latest version, at least v1.0.6351, as the script below will not work with an older version). I now regularly use this script on my customer’s large FIM environment after making changes to Synchronization Rules, since, as I mentioned in my previous blog, we struggle for the opportunity to perform a Full Synchronization of the entire sync engine rule base. It serves my purpose for Synchronization Rules, but you can easily modify the search criteria to perform a Full Synchronization of whatever objects you require. Here is the script; leave a comment if it has helped you.
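
The script itself is embedded in the original post. As a hedged sketch of the approach (the cmdlet parameter names here are from memory, so verify them with Get-Help Sync-CSObject; the MA name and DN list are placeholders), committing a full synchronization preview on a known set of connector space objects looks like this:

# Requires Lithnet MIIS Automation v1.0.6351 or later
Import-Module LithnetMiisAutomation

$maName = "MIM Service"
# One connector space object DN per line, e.g. the Synchronization Rules to resync
$dns = Get-Content "C:\Scripts\ObjectsToFullSync.txt"

foreach ($dn in $dns) {
    # Run a full synchronization preview on the object and commit the result
    Sync-CSObject -MA $maName -DN $dn -Commit
}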

How to access Microsoft Identity Manager Hybrid Report data using PowerShell, Graph API and oAuth2

Hybrid Reporting is a great little feature of Microsoft Identity Manager. A small agent installed on the MIM Sync Server will send reporting data to Azure for MIM SSPR and MIM Group activities. See how to install and configure it here.

But what if you want to get the reporting data without going to the Azure Portal and looking at the Audit Reports? Enter the Azure AD Reports and Events REST API that is currently in preview. It took me a couple of cracks to get this working, because the documentation is a little vague, especially around accessing it via PowerShell and oAuth2. So I’ve written it up and hope it helps anyone else looking to go down this route.

Gotchas

Accessing the Reports via the API has a couple of caveats that I had to work through:

  • Having the correct permissions to access the report data. Pretty much everything you read tells you that you need to be a Global Admin. Once I had my oAuth tokens I messed around a little and was able to get the following back from the API when purposely using an identity that didn’t have the right permissions. The key piece is “Api request is not from global admin or security admin or security reader role”. I authorized the WebApp using an account in the Security Reader role, and can successfully access the report data. The three MIM Hybrid Reporting reports are:

"Name":  "mimSsgmGroupActivityEvents",
"Name":  "mimSsprActivityEvents",
"Name":  "mimSsprRegistrationActivityEvents",

Here is the full list of Reports available as of 24 May 2017.

{
    "Name":  "b2cAuthenticationCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationEvent",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cAuthenticationCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cMfaRequestCountSummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "tenantUserCount",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageDetailEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummaryEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "b2cUserJourneySummaryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "b2cUserJourneyEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "cloudAppDiscoveryEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "mimSsgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssgmGroupActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "mimSsprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "ssprRegistrationActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "threatenedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "compromisedCredentials",
    "LicenseRequired":  "False"
}
{
    "Name":  "auditEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "accountProvisioningEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromUnknownSourcesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromIPAddressesWithSuspiciousActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsFromMultipleGeographiesEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "signInsFromPossiblyInfectedDevicesEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "irregularSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "allUsersWithAnomalousSignInActivityEvents",
    "LicenseRequired":  "True"
}
{
    "Name":  "signInsAfterMultipleFailuresEvents",
    "LicenseRequired":  "False"
}
{
    "Name":  "applicationUsageSummary",
    "LicenseRequired":  "True"
}
{
    "Name":  "userActivitySummary",
    "LicenseRequired":  "False"
}
{
    "Name":  "groupActivitySummary",
    "LicenseRequired":  "True"
}

How to Access the Reporting API using PowerShell

What you need to do is:

  • Register a WebApp
  • Get an oAuth2 Authentication Code using an account that is either Global Admin or in the Security Admin or Security Reader Azure Roles
  • Use your Bearer and Refresh tokens to query for the reports you’re interested in

Register your WebApp

In the Azure Portal create a new Web app/API app and assign it https://localhost as the Reply URL. Record the Application ID for use in the PowerShell script.

Assign the Read Directory data permission as shown below

Obtain a key from the Keys option on your new Web App.  Record it for use in the PowerShell script.

Generate an Authentication Code, get a Bearer and Refresh Token

Update the following script with the ApplicationID/ClientId and Client Secret for the WebApp you created above.

Run the script and you will be prompted to authenticate. Use an account in the tenant where you created the Web App that is a Global Admin or in the Security Admin or Security Reader Azure Roles. You will also need to change the location where the refresh.token is stored.
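
The script is embedded in the original post; a condensed, hedged equivalent using the oAuth2 authorization code grant against the v1 endpoint looks like this (the placeholders and the refresh.token path are yours to update):

Add-Type -AssemblyName System.Web

$clientId     = "<ApplicationID/ClientId of your WebApp>"
$clientSecret = "<Key/Client Secret of your WebApp>"
$redirectUri  = "https://localhost"
$resource     = "https://graph.windows.net"

# Step 1: browse to the authorize endpoint, sign in, then paste the ?code= value
# from the redirected localhost URL
$authUrl = "https://login.microsoftonline.com/common/oauth2/authorize?response_type=code&client_id=$clientId&redirect_uri=$([System.Web.HttpUtility]::UrlEncode($redirectUri))&resource=$([System.Web.HttpUtility]::UrlEncode($resource))"
Start-Process $authUrl
$authCode = Read-Host "Paste the authorization code"

# Step 2: exchange the AuthCode for Bearer and Refresh tokens
$body = @{
    grant_type    = "authorization_code"
    client_id     = $clientId
    client_secret = $clientSecret
    code          = $authCode
    redirect_uri  = $redirectUri
    resource      = $resource
}
$tokens = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
$Authorization = "Bearer $($tokens.access_token)"

# Persist the Refresh Token so it can be used to renew the Bearer token later
$tokens.refresh_token | Out-File "C:\temp\refresh.token"

function Get-NewTokens {
    # Renew the Bearer token using the stored Refresh Token
    $refreshToken = Get-Content "C:\temp\refresh.token"
    $body = @{
        grant_type    = "refresh_token"
        client_id     = $clientId
        client_secret = $clientSecret
        refresh_token = $refreshToken
        resource      = $resource
    }
    $tokens = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
    $script:Authorization = "Bearer $($tokens.access_token)"
}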

If you’ve done everything correctly you will have authenticated and received an AuthCode, which was then used to get your Authorization Tokens. The $Authorization variable will hold a Bearer token string.

Now you can use the Refresh token to generate new Authorization Tokens when they time out, simply by calling the Get-NewTokens function included in the script above.

Querying the Reporting API

Now that you have the necessary prerequisites sorted you can query the Reporting API.

Here are a couple of simple queries to return some data to get you started. Update the script for the tenant name of your AzureAD. With the $Authorization values from the script above you can get data for the MIM Hybrid Reports.
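
A hedged example, using the endpoint as it stood at the time of writing (the tenant name is a placeholder; verify the URI and api-version against the current documentation):

$tenant  = "yourtenant.onmicrosoft.com"
$headers = @{ "Authorization" = $Authorization }

# List the reports available to you
$reports = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/reports?api-version=beta" -Headers $headers
$reports.value | Select-Object Name, LicenseRequired

# Retrieve the MIM SSPR activity events
$ssprEvents = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/reports/mimSsprActivityEvents?api-version=beta" -Headers $headers
$ssprEvents.value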

Synchronizing Exchange Online/Office 365 User Profile Photos with FIM/MIM

Introduction

This is Part Two in the two-part blog post on managing users’ profile photos with Microsoft FIM/MIM. Part one here detailed managing users’ Azure AD/Active Directory profile photos. This post delves deeper into photos, specifically around Office 365 and the reasons why you may want to manage these via FIM/MIM.

Background

User profile photos should be simple to manage. But in a rapidly moving hybrid cloud world it can be a lot more complex than it needs to be. The best summary I’ve found of this evolving moving target is from Paul Ryan here.

Using Paul’s sound advice, we too are advising our customers to let users manage their profile photo (within corporate guidelines) via Exchange Online. However, as described in this article, photos managed in on-premises Active Directory are synchronized to Azure AD and on to other Office365 services only once. And of course we want them to be consistent across AD DS, Azure AD, Exchange Online and all other Office365 services.

This post details synchronizing user profile photos from Exchange Online to MIM for further synchronization to other systems. The approach uses a combination of Azure GraphAPI and Exchange Remote PowerShell to manage Exchange Online User Profile Photos.

The following graphic depicts what the end goal is:

Current State

  • Users historically had a photo in Active Directory. DirSync/ADSync/AzureADConnect then synchronized that to Azure AD (and once only into Office 365).
  • Users update their photo in Office365 (via Exchange Online and Outlook Web Access)
    • the photo is synchronized across Office365 Services

Desired State

  • An extension of the Current State is the requirement to be able to take the image uploaded by users in Exchange Online, and synchronize it back to the OnPremise AD, and any other relevant services that leverage a profile photo
  • Have AzureADConnect keep AzureAD consistent with the new photo obtained from Office365 that is synchronized to the OnPrem Active Directory
  • Sync the current photo to the MIM Portal

Synchronizing Office365 Profile Photos

Whilst Part-one dealt with the AzureAD side of profile photos as an extension to an existing AzureAD PowerShell Management Agent for FIM/MIM, I’ve separated out the Office365 side to streamline it and make it as efficient as possible. More on that later. As such I’ve created a new PowerShell Management Agent specifically for Office365 User Profile Photos.

I’m storing the Exchange Online photo in the MIM Metaverse as a binary object just as I did for the AzureAD photo (but in a different attribute). I’m also storing a checksum of the photos (as I did for the AzureAD photo, but also in a different attribute) to make it easier to compare what is in Azure AD and Exchange Online, which can then be used to determine if changes have been made (e.g. the user updated their profile photo).

Photo Checksum

For generating the hash of the profile photos I’m using Get-Hash from the Powershell Community Extensions.  Whilst PowerShell has Get-FileHash I don’t want to write the profile photos out to disk and read them back in just to get the checksum. That slows the process up by 25%. You can get the checksum using a number of different methods and algorithms. Just be consistent and use the same method across both profile photos and you’ll be comparing apples with apples and the comparison logic will work.

Some notes on Photos and Exchange Online (and MFA)

This is where things went off on a number of tangents. Initially I tried accessing the photos using Exchange Online Remote PowerShell.

CAVEAT 1: If your Office365 Tenant is enabled for Multi-Factor Authentication (which it should be) you will need to get the Exchange Online Remote PowerShell Module as detailed here. Chances are you won’t have full Office365 Admin access though, so as long as the account you will be using is in the Recipient Management Role you should be able to go to the Exchange Control Panel using a URL like https://outlook.office365.com/ecp/?realm=<tenantname>&wa=wsignin1.0 where tenantname is something like customer.com.au. From the Hybrid menu in the right-hand pane you will then be able to download the Microsoft.Online.CSE.PSModule.Client.application. I had to use Internet Explorer to download the file and get it installed successfully. Once installed I used a few lines from this script here to load the Function and start my RPS session from within PowerShell ISE during solution development.

CAVEAT 2: The EXO RPS MFA PS Function doesn’t allow you to pass it your account password. You can pass it the identity you want to use, but not the password. That makes scheduled process automation with it impossible.

CAVEAT 3: The RPS session exposes the Get-UserPhoto cmdlet, which is great. But the RPS session leverages the GraphAPI, and the RPS PS Module doesn’t refresh its tokens, so if the import takes longer than 60 minutes then using this method you’re a bit stuffed.

CAVEAT 4: Using the Get-UserPhoto cmdlet detailed above, the syncing of photos is slow. As in I was only getting ~4 profile photos per minute slow. This also goes back to the token refresh issue as for pretty much any environment of the size I deal with, this is too slow and will timeout.

CAVEAT 5: You can whitelist the IP Address (or subnet) of your host so MFA is not required, using Contextual IP Addressing Whitelisting. At that point there isn’t really a need to use the MFA-enabled PREVIEW EXO RPS function anyway. That said, I still needed to whitelist my MIM Sync Server(s) from MFA to allow integration into the Graph API. I configured just the single host. The whitelist takes CIDR format, so a single host looks like /32 (e.g. 11.2.33.4/32)

Performance Considerations

As I mentioned above,

  • using the Get-UserPhoto cmdlet was slow. ~4 per minute slow
  • using the GraphAPI into Exchange Online and looking at each user and determining if they had a photo then downloading it, was also slow. Slow because at this customer only ~50% of their users have a photo on their mailbox. As such I was only able to retrieve ~145 photos in 25 minutes. *Note: all timings listed above were during development and actually outputting the images to disk to verify functionality. 

Implemented Solution

After all my trial and error on this, here is my final approach and working solution:

  1. Use the Exchange Online Remote PowerShell (non-MFA version) to query and return a collection of all mailboxes with an image *Note, add an exception for your MIM Sync host to the white-listed hosts for MFA (if your Office365 Tenant is enabled for MFA) so the process can be automated
  2. Use the Graph API to obtain those photos
    • with this I was able to retrieve ~1100 profile photos in ~17* minutes (after ~2 minutes to query and get the list of mailboxes with a profile photo)

Pre-requisites

There’s a lot of info above, so let me summarize the pre-requisites:

  • The Granfeldt PowerShell MA
  • Whitelist your FIM/MIM Sync Server from MFA (if your Office 365 environment is enabled for MFA)
  • Add the account you will run the MA as (which will in turn connect to EXO via RPS) to the Recipient Management Role
  • Create a WebApp for the PS MA to use to access users Profile Photos via the Graph API (fastest method)
  • Powershell Community Extensions to generate the image checksum

Creating the WebApp to access Office365 User Profile Photos

Go to your Azure Portal and select the Azure Active Directory blade from the Resource Menu bar on the left. Then select App Registrations from the Manage section of the Azure Active Directory menu, and finally, from the top of the main pane, select “New Application Registration“.

Give it a name and select Web app/API as the type of app. Make the sign-in URL https://localhost and then select Create.

Record the ApplicationID that you see in the Registered App Essentials window. You’ll need this soon.

Now select All Settings => Required Permissions. Select Read all users’ basic profiles in addition to Sign in and read user profile. Select Save.

Under Required Permissions select Add, then select “1 Select an API”, choose Office 365 Exchange Online and click Select.

Choose “2 Select Permissions”, then select Read user profiles and Read all users’ basic profiles. Click Select.

Select Grant Permissions

From Settings select Keys, give your key a Description, choose a key lifetime and select Save. RECORD the key value. You’ll need this along with the WebApp ApplicationID/ClientID for the Import.ps1 script.

Using the information from your newly registered WebApp, we need to perform the first authentication (and authorization of the WebApp) to the Graph API. Take your ApplicationID, Key (Client Secret) and the account you will use on the Management Agent (that you have assigned the Recipient Management Role in Exchange Online) and run the script detailed in this post here. It will authenticate you to your new WebApp via the GraphAPI, after asking you to provide the account you will use on the MA and authorizing the permissions you selected when registering the app. It will also create a refresh.token file which we will give to the MA to automate our connection. The Authorization dialog looks like this.

Creating the Management Agent

Now we can create our Management Agent using the Granfeldt PowerShell Management Agent. If you haven’t created one before, check out a post like this one, which further down shows the creation of a Granfeldt PSMA. Don’t forget to provide blank export.ps1 and password.ps1 files in the directory where you place the PSMA scripts.

PowerShell Management Agent Schema.ps1
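
The schema script is embedded in the original post. A minimal hedged sketch, following the Granfeldt PSMA schema convention and the attribute names used later in this post (extend with whatever other attributes you import), would be:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-Id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "EXOUser"
$obj | Add-Member -Type NoteProperty -Name "accountName|String" -Value "drobinson"
$obj | Add-Member -Type NoteProperty -Name "EXOPhoto|Binary" -Value 0x20
$obj | Add-Member -Type NoteProperty -Name "EXOPhotoChecksum|String" -Value "23973abc382373"
$obj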

PowerShell Management Agent Import.ps1

As detailed above the PSMA will leverage the WebApp to read users Exchange Profile Photos via the Graph API. The Import script also leverages Remote Powershell into Exchange Online (for reasons also detailed above). The account you run the Management Agent as will need to be added to the Recipient Management Role Group in order to use Remote PowerShell into Exchange Online and get the information required.

Take the Import.ps1 script below and update:

  • Update lines 11, 24 and 42 for the path to where you have put your PSMA. Mine is under the Extensions directory in a directory named EXOPhotos.
  • Copy the refresh.token generated when authenticating and authorizing the WebApp earlier into the directory you specified in line 42 above.
  • Create a Debug directory under the directory you specified in lines 11, 24 and 42 above so you can see what the MA is doing as you implement and debug it the first few times.
  • I’ve written the Import to use Paged Imports, so make sure you tick the Paged Imports checkbox on the configuration of the MA
  • Update lines 79 and 80 with the ApplicationID and Client Secret that you recorded when creating your WebApp
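
The full Import.ps1 is embedded in the original post; the core of the photo retrieval is of this shape (a hedged fragment: the Graph photo endpoint shown is an assumption, $authHeader comes from the token logic that uses the refresh.token, and the checksum is shown with plain .NET MD5 as a stand-in for the Pscx Get-Hash call):

# For each mailbox returned by the EXO RPS query, fetch the photo as binary via
# a WebClient (fast, and avoids string conversion)
$wc = New-Object System.Net.WebClient
$wc.Headers.Add("Authorization", $authHeader)
[byte[]]$photoBytes = $wc.DownloadData("https://graph.microsoft.com/v1.0/users/$($user.UserPrincipalName)/photo/`$value")
$obj.Add("EXOPhoto", $photoBytes)

# Checksum of the image for change detection
$md5 = [System.Security.Cryptography.MD5]::Create()
$obj.Add("EXOPhotoChecksum", [System.BitConverter]::ToString($md5.ComputeHash($photoBytes)).Replace("-",""))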

Running the Exchange User Profile Photos MA

Now that you have created the MA, select the EXOUser ObjectClass and the attributes defined in the schema. You should also create the EXOPhoto (as Binary) and EXOPhotoChecksum (as String) attributes in the Metaverse on the person ObjectType (assuming you are using the built-in person ObjectType).

Configure your flow rules to flow the EXOPhoto and EXOPhotoChecksum on the MA to their respective attributes in the MV.

Create a Stage Only run profile and run it. If you have done everything correctly you will see photos come into the Connector Space.

Looking at the Connector Space, I can see EXOPhoto and EXOPhotoChecksum have been imported.

After performing a Synchronization to get the data from the Connector Space into the Metaverse it is time to test the image that lands in the Metaverse. That is quick and easy via PowerShell and the Lithnet MIIS Automation PowerShell Module.

$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
$me.Attributes.EXOPhoto.Values.ValueBinary
[System.Io.File]::WriteAllBytes("c:\temp\myOutlookphoto.jpg" ,$me.Attributes.EXOPhoto.Values.ValueBinary )

The file is output to the directory with the filename specified.

Opening the file correctly reveals my Profile Photo.

Summary

In Part one we got the AzureAD/Active Directory photo. In this post we got the Office365 photo.

Now that we have the images from Office365, we need to synchronize any photo updates to Active Directory (and in turn, via AADConnect, to Azure AD). Keep in mind the image size limits for Active Directory, and that we retrieved the largest photo available from Office365 when synchronizing the photo on. There are a number of PowerShell modules for photo manipulation that will allow you to resize accordingly.

How to Synchronize users Active Directory/Azure Active Directory Photo using Microsoft Identity Manager

Introduction

Whilst Microsoft FIM/MIM can be used to do pretty much anything your requirements dictate, dealing with object types other than text and references can be a little tricky when manipulating them the first time. User Profile Photos fall into that category as they are stored in the directory as binary objects. Throw in Azure AD and obtaining and synchronizing photos can seem like adding a double back-flip to the scenario.

This post is Part 1 of a two-part post. Part two is here. This is essentially the introduction to the how-to piece before extending the solution past a user’s Active Directory Profile Photo to their Office 365 Profile Photo. Underneath, the synchronization and method for dealing with the binary image data is the same, but the APIs and methods used are different when you are looking to implement the solution at any scale.

As for why you would want to do this, refer to Part two here; it details the reasons.

Overview

As always I’m using my favourite PowerShell Management Agent (the Granfeldt PSMA). I’ve updated an existing Management Agent I had for Azure AD that is described here. I highly recommend you use that as the basis for the extra photo functionality that I describe in this post. Keep in mind the AzureADPreview, now AzureAD, PowerShell Module has changed the ADAL Helper Libraries. I detail the changes here so you can get AuthN to work with the new libraries.

Therefore the changes to my previous Azure AD PowerShell MA are to add two additional attributes to the Schema script, and to include the logic to import a user’s profile photo (if they have one) in the Import script.

Schema.ps1

Take the schema.ps1 from my Azure AD PSMA here and add the following two lines to the bottom, before the $obj in the last line (where I’ve left an empty line).

$obj | Add-Member -Type NoteProperty -Name "AADPhoto|Binary" -Value 0x20 
$obj | Add-Member -Type NoteProperty -Name "AADPhotoChecksum|String" -Value "23973abc382373"

The AADPhoto attribute of type Binary is where we will store the photo. The AADPhotoChecksum attribute of type String is where we will store a checksum of the photo for use in logic if we need to determine if images have changed easily during imports.

Import.ps1

Take the import.ps1 from my Azure AD PSMA here and make the following additions;

  • On your MIM Sync Server download/install the Pscx PowerShell Module.
    • The Pscx Powershell Module is required for Get-Hash (to calculate Image checksum) based on variables vs a file on the local disk
    • You can get the module from the Gallery using Install-Module Pscx -Force
    • Add these two lines up the top of the import.ps1 script. Around line 26 is a good spot
# Powershell Module required for Get-Hash (to calculate Image checksum)
Import-Module Pscx
  • Add the following lines into the Import.ps1 in the section where we are creating the object to pass to the MA. After the $obj.Add("AADCity",$user.city) line is a good spot.
  • What the script below does is create a WebClient, rather than using Invoke-RestMethod or Invoke-WebRequest, to get the user’s Azure AD Profile image, and only if the ‘thumbnailPhoto@odata.mediaContentType’ attribute exists, which indicates the user has a profile photo. I’m using the WebClient over the PowerShell Invoke-RestMethod or Invoke-WebRequest functions so that the returned object is in binary format (rather than being returned as a string), which saves having to convert it to binary or output it to a file and read it back in. The WebClient is also faster for transferring images/data.
  • Once the image has been returned (line 8 below) the image is added to the object as the attribute AADPhoto to be passed to the MA (line 11)
  • Line 14 gets the checksum for the image and adds that to the AADPhotoChecksum attribute in line 16.
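
The snippet itself is embedded in the original post; a hedged reconstruction of the logic described above (the AAD Graph endpoint and api-version are assumptions, and Get-Hash usage may vary by Pscx version) looks like this:

# Only fetch the photo if the user has one
if ($user.'thumbnailPhoto@odata.mediaContentType') {
    # WebClient returns the image in binary format - no string conversion needed
    $wc = New-Object System.Net.WebClient
    $wc.Headers.Add("Authorization", $authHeader)
    [byte[]]$photo = $wc.DownloadData("https://graph.windows.net/$tenant/users/$($user.objectId)/thumbnailPhoto?api-version=1.6")
    # Add the image to the object passed to the MA
    $obj.Add("AADPhoto", $photo)
    # Checksum of the image for change detection during subsequent imports
    $checksum = [System.Convert]::ToBase64String($photo) | Get-Hash -Algorithm MD5
    $obj.Add("AADPhotoChecksum", $checksum.HashString)
}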

Other changes

Now that you’ve updated the Schema and Import scripts, you will need to;

  • Refresh your schema on your Azure AD PSMA to get the new attributes (AADPhoto and AADPhotoChecksum) added
  • Select the two new attributes in the Attributes section of your Azure AD PSMA
  • Create in your MetaVerse via the MetaVerse Designer two new attributes on the person (or whatever ObjectType you are using for users), for AADPhoto and AADPhotoChecksum. Make sure that AADPhoto is of type Binary and AADPhotoChecksum is of type string.

  • Configure your Attribute Flow on your Azure AD PSMA to import the AADPhoto and AADPhotoChecksum attributes into the Metaverse. Once done and you’ve performed an Import and Sync you will have Azure AD Photos in your MV.

  • How do you know they are correct? Let’s extract one from the MV, write it to a file and have a look at it. This small script using the Lithnet MIIS Automation PowerShell Module makes it easy. First I get my user object from the MV. I then have a look at the text string version of the image (to make sure it is there), then output the binary version to a file in the C:\Temp directory.
$me = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"
[string]$myphoto = $me.Attributes.AADPhoto.Values.ValueString
[System.Io.File]::WriteAllBytes("c:\temp\UserPhoto.jpg" ,$me.Attributes.AADPhoto.Values.ValueBinary )
  • Sure enough. The image is valid.

Conclusion

Photos are still just bits of data. Once you know how to get them and manipulate them you can do what ever you need to with them. See Part two that takes this concept and extends it to Office 365.

Understanding Password Sync and Write-back

For anyone who has worked with Office 365/Azure AD and AADConnect, you will of course be aware that we can now sync passwords both ways between our on-premises AD and Azure AD. This is obviously a very handy thing to do for myriad reasons, and an obvious suggestion for a business intending to utilise Office 365. The conversation with the security bod, however, might be a different kettle of fish. In this post, I aim to explain how the password sync and write-back features work, and hopefully arm you with enough information to have that chat with the security guys.

Password Sync to AAD

If you’ve been half-listening to any talks around password sync, the term ‘it’s not the password, it’s a hash of a hash’ is probably the line you walked away with, so let’s break down what that actually means. First up, a quick explanation of what it actually means to hash a value. Hashing a value is applying a cryptographic algorithm to a string to irreversibly convert that string into another string of a fixed length. This differs from encryption in that with encryption we intend to be able to retrieve the data we have stored, so must hold a decryption key – with hashing, we are unable to derive the original string from the hashed string, and nor do we want to. If it makes no sense to you that we wouldn’t want to reverse a hashed value, it will by the end of this post.

So, in Active Directory when a user sets their password, the value stored is not actually the password itself, it’s an MD4 hash of the password once it’s been converted to Unicode Little Endian format. Using this process, my password “nicedog” ends up in Active Directory as 2993E081C67D79B9D8D3D620A7CD058E. So, this is the first part of the “hash of a hash”.

The second part of the “hash of a hash” is going to go buck wild with the hashing. Once the password has been securely transmitted to AADConnect (I won’t detail that process, because we end up with our original MD4 hash anyway), the MD4 hash is then:

  1. Converted to a 32-byte hexadecimal string
  2. Converted into binary with UTF-16 encoding
  3. Salted with 10 additional bytes
  4. Resultant string hashed and re-hashed 1000 times with HMAC-SHA256 via PBKDF2
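
To make those four steps concrete, here is an illustrative PowerShell rendering (not AADConnect’s actual code; Rfc2898DeriveBytes with a selectable hash algorithm needs .NET Framework 4.7.2 or later):

# Step 1: the MD4 hash as a 32-character hexadecimal string
$md4Hex = "2993E081C67D79B9D8D3D620A7CD058E"
# Step 2: convert to binary with UTF-16 encoding
$passwordBytes = [System.Text.Encoding]::Unicode.GetBytes($md4Hex)
# Step 3: a 10-byte salt
$salt = New-Object byte[] 10
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($salt)
# Step 4: 1000 iterations of HMAC-SHA256 via PBKDF2
$pbkdf2 = New-Object System.Security.Cryptography.Rfc2898DeriveBytes($passwordBytes, $salt, 1000, [System.Security.Cryptography.HashAlgorithmName]::SHA256)
$finalHash = [System.BitConverter]::ToString($pbkdf2.GetBytes(32)).Replace("-","")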

So, as you can see, we’re actually hashing the hash enough to make Snoop Dogg blush. Now we finally have the hash which is securely sent to Azure AD via an SSL channel. At this point, the hashed value which we send is so far removed from the original hash that even the most pony-tailed, black tee-shirt and sandal wearing security guy would have to agree it’s probably pretty secure. Most importantly, if Azure AD was ever compromised and an attacker did obtain your users’ password hashes, they are unable to be used to attack any on-premises infrastructure with a “pass the hash” attack, which would be possible if AD’s ntds.dit was compromised and password hashes extracted.

So now Azure AD has these hashes, but doesn’t know your passwords – how are users’ passwords validated? Well, it’s just a matter of doing the whole thing again with a user-provided password when that user logs on. The password is converted, salted, hashed and rehashed 1000 times in the exact same manner, meaning the final hash is exactly the same as the one stored in Azure AD – the two hashes are compared, and if we have a match, we’re in. Easy. Now let’s go the other way.

Password Write-back to AD

So how does it work going back the other way? Differently. Going back the other way we can’t write a hash ourselves to Active Directory, so we need to get the actual password back from AAD to AD and essentially perform a password reset on-prem, allowing AD to then hash and store the password. There are a couple of things which happen when we install AADConnect and enable Password Write-back to allow us to do this:

  1. Symmetric keys are created by AAD and shared with AADConnect
  2. A tenant specific Service Bus is created, and a password for it shared with AADConnect

The scenario in which we are writing a password back from AAD to AD is obviously a password reset event. This is the only applicable scenario where we would need to, because a password is only set in AAD if that user is “cloud only”, so they would of course not have a directory to write back to. Only synced users need password write-back, and only upon password reset. So AAD gets the password back on-premises by doing the following:

  1. User’s submitted password is encrypted with the 2048-bit RSA Key generated when you setup write-back
  2. Some metadata is added to the package, and it is re-encrypted with AES-GCM
  3. Message sent to the Service Bus via an SSL/TLS channel
  4. On-premises AADConnect agent wakes up and authenticates to Service Bus
  5. On-premises agent decrypts package
  6. AADConnect traces the cloudanchor attribute back to the connected AD account via the AADConnect sync engine
  7. Password reset is performed for that user against a Primary Domain Controller

So, that’s pretty much it. That’s how we’re able to securely write passwords back and forth between our on-premises environment and AAD without Microsoft ever needing to store a user’s password. It’s worth noting for that extra paranoid security guy though that if he’s worried about any of the encryption keys or shared passwords being compromised, you can re-generate the lot simply by disabling and re-enabling password writeback.

Hopefully this post has gone some way towards helping you have this particular security conversation, no matter what year the security guy’s Metallica tour tee-shirt is from.

Using the Lithnet PowerShell Modules to generate full object metadata FIM/MIM HTML Reports

How many times have you wanted a consolidated report out of FIM/MIM for an object? What connectors does it have, what are the values of the attributes, which Management Agent contributed the value(s) and when? Individually of course you can get that info using the Metaverse Search and looking at the object in MIM Portal. But what if you wanted it all with a single query? This blog post provides an approach to doing just that. The graphic above shows a screenshot of a sample output. Click this Sample Report for full resolution version of the screenshot above. Note: The updated version of the script below outputs DisplayName for the ExpectedRulesList attribute so it actually provides valuable information. 

Overview

The approach is quite simple. It is:

  • Query the FIM/MIM Metaverse for an object
  • Take the response from the Metaverse to build the Connectors and Metaverse Hologram reports
  • Use the connector information to query the MIM Service MA (this example assumes it is on the same server; if not, add the following line into the script with the appropriate values) and get the object’s MIM Service Connector Space info
    Set-ResourceManagementClient -BaseAddress http://fimsvc:5727;
  • Take information retrieved above to then query the MIM Service and return the information for the object.
  • Format all the output for HTML, apply a simple style sheet, output to file and display in the default browser

NOTE: If you combine this with the Get-MVObject query building script detailed here it can be a relatively simple solution. That script even uses the same variables $queries and $query as outputs from the search and input into the HTML Report.

NOTE: You could possibly run it remotely from the MIM Sync Server too, if you leverage Remote Powershell to your FIM/MIM Sync server as detailed here.

The Script

Here it is. Lines 23 and 24 contain a hard-coded query. Update for your search criteria, or, as detailed above, combine this with the Get-MVObject query building script detailed here. The Output directory specified in Line 7 is where the stylesheet and the resultant HTML file will be placed. Update for your needs.
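
The full script is embedded in the original post; the hard-coded query it refers to is of this shape (a hedged example using the same Get-MVObject form shown later in this document):

# Hard-coded query (lines 23 and 24) - update for your search criteria, or feed
# in $queries/$query from the Get-MVObject query building script
$query = Get-MVObject -ObjectType person -Attribute accountName -Value "drobinson"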

For the Expected Rules List (unlike the screenshot as I’ve modified the script afterwards), the script gets the DisplayName for them and puts that in the report. DisplayName is more valuable than an ERE ObjectID.

Scripting queries for Lithnet Get-MVObject searches into the Microsoft Identity Manager Metaverse

It probably seems obvious by now, but I seem to live in PowerShell and Microsoft Identity Manager. I’m forever looking into the Microsoft Identity Manager Metaverse for objects.

However, sometimes I get tripped up by the differences in Object Classes between the FIM/MIM Service and the Metaverse, the names of the Object Classes (obviously not Person, Group and Contact) and in situations where they are case-sensitive.  If you’re using the Sync Service Manager Metaverse Search function though you get a pick list. But getting the data out to do something else with isn’t an option.

Solution

I’ve looked to quickly provide a similar function to the pick lists in the Metaverse Search GUI via PowerShell, with the resulting query then executed using the Lithnet Get-MVObject cmdlet.

UPDATE: 17 May 2017 The Lithnet MIIS Automation PowerShell Module has been updated for Get-MVObject to support the ObjectType Scope. I’ve updated the script to include the scope parameter based on the ObjectClass selected at the beginning of the script. 

I’ve defaulted the ObjectClass to Person so you can just press enter. But if you have custom ObjectClasses in your Metaverse you may need to change the index number in Line 48 from 5 to whichever index Person appears at in your environment. The same goes for the default attribute of AccountName in the attribute list: it appears at index 5 (Line 77) in my attribute list.

Process

Basically just run the PowerShell script and choose your options. The script needs interaction with the FIM/MIM Sync server, so you run it from the FIM/MIM Sync server. If you want to run it remotely (of course you do), then Remote PowerShell is your friend. Checkout how to do that to the FIM/MIM Server in this post here.

The script itself will query the FIM/MIM MV Schema and return a list of Object Classes. As detailed above, in Line 48 of the script I have index 5 as the default, which in my environment is Person, and as such you can just hit enter if that is the Object Class you want to choose attributes from in the next step. Otherwise type the name of the ObjectClass you want. You don’t have to worry about case sensitivity, as the script handles that. You can only choose a single ObjectClass obviously, but the menu UI I’ve used allows for multiple selections. Just press enter when prompted for another option for ObjectClass.

You’ll then be presented with a list of attributes from the chosen Object Class above. Again, as detailed above, I have it defaulting to ‘accountName’, which is index 5 in my list. Change Line 77 for the default you want. This means you can just hit enter if accountName is what you’re querying on (which is common), or choose another option. This also allows you to choose multiple attributes (which will be added to an array), meaning you can use this for complex queries such as:

accountName startsWith 'dar'
sn startsWith 'rob'
mail contains '@kloud'

If you want to choose multiple attributes for your query and one of them is the default option, make sure you specify one of the attributes that is not the default first so that you get the option to specify more. When you’ve chosen all the attributes you are going to use in your query hit enter and the script will take an empty response as the end of your choices.

Now for each attribute chosen you will be prompted for an Operator. Pretty simple. Just choose from the available options. Note: all operators are shown but not all operators can be used for all attribute types. e.g. Don’t select ‘EndsWith’ for a Boolean attribute type and expect it to work. If you choose an operator other than the default (equals in my example) hit enter when prompted for the second time and the script will take an empty response as the end of your choices.

Finally, provide the value for the search term for the attribute. If the value has spaces, don’t worry about putting the value in quotes; the script takes care of that.

The last two steps will iterate through, for queries where you have chosen multiple attributes.

And you’re done. $query is the variable that contains the results. In line 115 I’m using Show-Object from the PowershellCookBook PSM. That then gives you a GUI representation of the result as shown below. If the query returns multiple results this will only show the last.

Line 114 outputs the value of the attributes ($query.attributes) to the console as well. If you have multiple objects returned $query will show them as shown below.

Finally if you want to run the query again, or just make a subtle change, you shouldn’t have to go through that again. Get the value of $querytxt and you’ll get the query and the command to execute it. $querytxt is also output to the console as shown below. Copy and paste it into Powershell ISE, update and execute.

The Script

Here is the raw script. Hardly any error handling etc, but enough to get you started and tailor it for your requirements. Enjoy.

MfaSettings.xml updates not taking effect

First published at https://nivleshc.wordpress.com

Last week, I was at a client site, extending their Microsoft Identity Manager (MIM) 2016 Self Service Password Reset Solution so that it could use Azure MultiFactor Authentication (MFA). This is an elegant solution since instead of using Questions and Answers to authenticate yourself when trying to reset your password, you can use One Time Passwords (OTP), sent as a security code via a text message to your registered mobile device.

I followed the steps as outlined in https://github.com/Microsoft/MIMDocs/blob/master/MIMDocs/DeployUse/working-with-self-service-password-reset.md to enable Azure MFA, and everything went smoothly.

I then proceeded to testing the solution.

Using the Password Registration Portal, I registered my mobile number against my test user account.

I then opened the Password Reset Portal, entered my test user username and proceeded to wait for the text message from Microsoft Azure with the security code, so that I could enter it in the next screen.

MIM_Verify_MobilePhoneVerification

I waited and waited (for at least 5 min), unfortunately the text message didn’t arrive 😦

Ok, troubleshooting time.

On my Microsoft Identity Manager 2016 Service Server, I opened the Windows EventLogs viewer and then expanded the section for Forefront Identity Manager event logs. Aha, I was on the right track as I saw a lot of errors reported.

MIMServiceServer_EventLogs

I went through the event log entries and found one which looked a bit odd. The error essentially said that the certificate path contained illegal characters.

MIMServiceServer_Error_CertificatePath

I couldn’t make much sense of this error, so I opened the MfaSettings.xml file to check, and I quickly realised my mistake. I had included the certificate file path within quotes!

I quickly removed the unnecessary quotes, saved the MfaSettings.xml file and restarted my testing process.

I went through the password reset process again, and yet again, I didn’t receive any text message from Microsoft Azure with the security code 😦

I re-checked the event logs and noticed the same Exception: Illegal characters in path for the Certificate File Path error. Thinking that I might have forgotten to save the previous modification to the MfaSettings.xml file, I opened the file to confirm. The quotes were nowhere to be seen! Alas, the plot thickens, my dear Watson!

I couldn’t find any explanation for this behaviour. Then, thinking that maybe the MIM server was having issues accessing the long filepath for the Certificate file, I moved the certificate file to a folder that was closer to the root of the C:\ drive, updated the MfaSettings.xml file appropriately and repeated my testing.

Again, no text message 😦

Checking the event logs, I noticed the same dreaded Exception: Illegal characters in path for the Certificate File Path error again.

However, looking closer at the error, I realised that the file path was reported as C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\MfaCerts\cert_key.p12, which wasn’t correct since I had moved the certificate file to another folder and updated the MfaSettings.xml file accordingly!

Suddenly I had that light bulb moment 😉 Updates to the MfaSettings.xml file were not being read by the MIM Server! This could only mean that it wasn’t monitoring this file for any changes, quite the opposite to what I had initially assumed!

To force MIM to re-read the MfaSettings.xml file, I restarted the Forefront Identity Manager Service service and went through my password reset testing process again.
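
(If you need to do the same, it’s a one-liner; the service name is from memory, so verify it with Get-Service first.)

# Restart the MIM/FIM Service so MfaSettings.xml is re-read
Restart-Service -Name FIMService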

Eureka! This time around, I received the text message from Microsoft Azure with the security code! Checking the Event logs, I couldn’t find any new occurrences of the Exception: Illegal characters in path for the Certificate File Path error. Hurray!

I completed the password reset process and confirmed that the password for my test account had indeed been changed.

I hope this post helps others.

BTW, below is a sample of the MfaSettings.xml file (for security reasons the keys have been scrambled; however, as seen below, none of the values need quotes)

<?xml version="1.0" encoding="utf-8" ?>
<SubscriberKeys>
<LICENSE_KEY>1A3FED2C1BZA</LICENSE_KEY>
<GROUP_KEY>a1b234c567890e123456gh1234567eij</GROUP_KEY>
<CERT_PASSWORD>1ABDCDEF1GHDAVWA</CERT_PASSWORD>
<CertFilePath>C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\MfaCert\cert_key.p12</CertFilePath>
<Username>john.doe</Username>
<DefaultCountryCode>61</DefaultCountryCode>
</SubscriberKeys>