Building a Microsoft Identity Manager PowerShell Management Agent for Workday HR

Before I even get started with this post, let me state that the integration I describe here is not a standalone solution. Integrating with Workday for any organisation of significant size will require multiple integration points, each providing coverage for the scenarios of your implementation. I list a few in this post, but Alexander Filipin has already done an awesome job here.

You may point out that there is, of course, the Azure Active Directory Provisioning Service for Workday. But what if you need more granular customisation than that provides, or you have requirements to get that data to a number of other systems and want connectivity to the authoritative source? Those were my requirements, and why I built a Management Agent for Workday to consume Workday HR data directly.

As the title implies it uses the ever versatile Granfeldt PowerShell Management Agent. The other key component is a PowerShell Module that eases integration with Workday's SOAP API: specifically, the Workday API PowerShell Module available here.

Enabling the Workday (Get_Workers) API

In order to access the Workday API you need to have an API account created. I pointed the Workday Support team to the Microsoft Azure Inbound Workday Provisioning documentation, specifically the 'Configure a system integration user in Workday' section in that link.

Once enabled they were able to give me a Service and Tenant name along with a Username and Password.

  • when using this information your Username is a combination of the username and the tenant. So if the username is 'API User' and the Tenant is 'Identity_Corp' then the loginID for our purposes is API User@Identity_Corp
  • the URL you are provided will combine the Service and Tenant names. It will look something like this for the Human Resources Endpoint https://wd3-impl-services1.workday.com/ccx/service/TENANTNAME/Human_Resources/v30.2
    • where wd3-impl-services1 is the Service Name

Install the WorkdayAPI PowerShell Module

On your FIM/MIM Sync Server you will need to install the Workday API PowerShell Module available here. You will need to install it using an Elevated PowerShell session.
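Assuming the module is published to the PowerShell Gallery under the name WorkdayApi (verify the module name for the version you intend to use), installation is a one-liner from that elevated session:

# Run from an elevated PowerShell session on the FIM/MIM Sync Server
# Assumes the module is available from the PowerShell Gallery as 'WorkdayApi'
Install-Module -Name WorkdayApi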

Unblock the PowerShell Module and Scripts

After installing the Workday API PowerShell Module it should be located in ‘C:\Program Files\WindowsPowerShell\Modules\WorkdayApi’. You will need to unblock the module and scripts. Run the following two commands in an elevated PowerShell session.

Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi' | Unblock-File
Get-ChildItem 'C:\Program Files\WindowsPowerShell\Modules\WorkdayApi\scripts' | Unblock-File

Verify your Execution Policy

As the PowerShell Module is unsigned you might need to do something similar to the following. The Get-ExecutionPolicy -List command will show you what the Execution Policy settings currently are.

Set-ExecutionPolicy "Unrestricted" -Scope Process -Confirm:$false
Set-ExecutionPolicy "Unrestricted" -Scope LocalMachine -Confirm:$false

Import Analytics

50k records with just the base profile (no -include work or -include personal options) takes ~7 minutes to ‘stage’ into the connector space. 50k records WITH work and personal metadata takes ~32 hours at a pretty consistent rate of ~20 mins/500 user records.

If you are retrieving just the Base record then the network receive bandwidth consumption is ~240kbps. When retrieving the full records as a batch process the network receive bandwidth consumption is ~3Mbps, as shown below.

Full Object Network Graph

Why is this important?

How many records you have in Workday will dictate the approach you need to take for the first “FULL” Sync in order to obtain them all. I found that trying to retrieve full records in one call became inconsistent for anything over ~5,000 records: I wouldn't get the full dataset, and the machine running it would start to run out of resources (processing power and memory). If you only have a few thousand records, requesting full records in one call will probably suffice.

Now, I have ~100k records to return. What I found worked best is to get just the base record for all users, then the full record for each user, using pagination (via PSMA Paged Imports; I have mine set to 500). The PSMA Paged Imports feature then processes the objects through the MA 500 at a time. That way you're not stressing the host running the Sync Engine to the maximum, and you don't have to wait an hour or more to see objects being processed on the MA. A sketch of that two-phase retrieval is shown below.
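Here is a minimal sketch of the two-phase retrieval, assuming the WorkdayApi module's Get-WorkdayWorker cmdlet and its -IncludeWork/-IncludePersonal switches (check the parameter names against the module version you installed); the endpoint and credential values are placeholders:

Import-Module WorkdayApi

# Placeholder connection details - populate these from your MA configuration
$hrUri    = 'https://wd3-impl-services1.workday.com/ccx/service/TENANTNAME/Human_Resources/v30.2'
$username = 'API User@Identity_Corp'
$password = 'YourAPIAccountPassword'

# Phase 1: base profile for every worker (fast, low bandwidth) - used to enumerate the population
$allWorkers = Get-WorkdayWorker -Human_ResourcesUri $hrUri -Username $username -Password $password

# Phase 2: full record per worker, processed by the PSMA a page (e.g. 500 workers) at a time
# $pageOfWorkers represents the slice of $allWorkers for the current import page
foreach ($worker in $pageOfWorkers) {
    $full = Get-WorkdayWorker -WorkerId $worker.WorkerId -IncludeWork -IncludePersonal `
        -Human_ResourcesUri $hrUri -Username $username -Password $password
    # build the connector space object from $full here
}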

Once you have completed a Full Sync, if you are of any significant scale you will want to perform Delta Syncs for the objects that have changed since your last sync. I'm not going to cover that in this post, but in a separate one in the future.

Here is a screenshot showing the time taken for a Stage (Import) of 50k objects: just under 33 hours.

Import - Stage Only.PNG

Other Options for Scale

If you are a large organisation this solution isn't necessarily a valid one in isolation, as I indicated in the opening paragraph. Consider it ancillary augmentation to a multi-pronged implementation (as described nicely by Alexander Filipin here). Potentially something like;

  • Azure Active Directory Inbound Provisioning for object creation
  • A Management Agent such as the one I describe in this post for certain aspects
    • and a modification or two to identify new accounts from a Base Workday discovery and only import the full object for them on weekdays, with a full sync on the weekends, or
    • delta syncs using the Workday Transaction Log Criteria Data and Transaction_Date_Range_Data
      • I’ll cover this in a future post, but essentially on every sync I store a cookie-file with a watermark of the time of the sync. On the next delta sync I retrieve the cookie-file with the timestamp and make a call to get all objects changed since the previous sync up to the current time (a sketch of the watermark handling is shown below)
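Purely as an illustration, the watermark handling could look something like the following; the file path is a placeholder and the actual Workday call that filters on Transaction_Date_Range_Data is not shown here:

# Hypothetical location for the watermark cookie-file
$watermarkFile = 'C:\MAData\Workday\lastSync.xml'

# At the start of a delta import, read the previous watermark (default to 1 day back if none exists)
$lastSync = if (Test-Path $watermarkFile) { Import-Clixml $watermarkFile } else { (Get-Date).AddDays(-1) }
$thisSync = Get-Date

# ... request all objects changed between $lastSync and $thisSync from Workday here ...

# After a successful import, persist the new watermark for the next delta run
$thisSync | Export-Clixml $watermarkFile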

PSMA Workday Management Agent Script Files

Wow, what a lot of caveats and clarifications. With all that said, below are base Schema and Import Script examples for the Granfeldt PowerShell Management Agent that leverage the Workday API PowerShell module.

Schema.ps1

The schema is the base schema for my tenant. You shouldn't have to change anything here unless you are retrieving additional attributes you want in MIM.
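For reference, a cut-down sketch of what a Workday Schema.ps1 can look like, using the attribute|type naming convention the PSMA expects; the anchor choice and objectClass name here are illustrative and the attribute list should match what you import:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-WorkerId|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "WorkdayWorker"
$obj | Add-Member -Type NoteProperty -Name "FirstName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PreferredName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "BusinessTitle|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "JobProfileName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Location|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserId|string" -Value "string"
$obj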

Import.ps1

The import script leverages AuthN credentials from the MA config. Make sure the Username is in the format UserID@TenantName. Also update;

  • Line 10 for the location you put your extension as well as the 8.3 format path to the MA Debug folder
  • Line 30 for the correct Service and Tenant info
  • Make sure you have Paged Imports selected on the Global Parameters screen of the MA Configuration

Export.ps1

I haven’t provided an example. The Workday API PowerShell Module has examples for updating Email, Phone and Photos. You can implement what you require.

Summary

The sample Workday MA Config in this post will give you a base integration with Workday. It is unlikely to give you everything you need, and no single solution probably will unless your organisation is quite small. There are other options as mentioned in this post, and also the Workday Reports REST API. But those are topics for future posts.

Automate the Generation of a Granfeldt PowerShell Management Agent Schema Definition File

Generating Schema.ps1 for the Granfeldt FIM/MIM PowerShell Management Agent

Getting started writing your first Forefront/Microsoft Identity Manager Granfeldt PowerShell Management Agent can be a bit daunting. Before you can do pretty much anything you need to define the schema for the PSMA. Likewise if you have written many, the generation of the schema file often seems to take longer than it should and can be a little tedious when all you want to do is write the logic for the Import and Export scripts.

After a few chats with Soren around enhancements for the PSMA I suggested it would be awesome if the generation of the schema.ps1 file could be (semi)automated. So here is my first stab at doing just that.

My approach is;

  • Using PowerShell get an object that represents an object that will be managed on the PSMA
  • Enumerate the Properties of the PSObject and generate the Schema script accordingly
  • All that is left to do afterwards is;
    • define your anchor
    • define the name of the ObjectType
    • combine multiples if your MA will have multiple ObjectClasses
      • Update $obj to $obj2 etc for any additional object classes residing in the same schema file

Below I provide four examples covering the script that generates the schema definition along with the output. The four examples cover;

  • Azure AD User
  • Azure AD Group
  • Workday User
  • Flat File CSV

Example 1: Azure Active Directory User

The example below utilises the AzureAD PowerShell Module to connect to Azure AD. It then gets a User Object (update line 7 for a user to retrieve) and enumerates the properties of the User to generate the Schema file.
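The generation script itself looks something like the following sketch; it assumes you are already able to connect with Connect-AzureAD, the UPN is a placeholder, and the type mapping is deliberately simplistic (adjust it if your object exposes other types):

Import-Module AzureAD
Connect-AzureAD

# Update for a user to retrieve (placeholder UPN)
$user = Get-AzureADUser -ObjectId "someuser@yourtenant.com"

# Start the schema definition with the Anchor and objectClass placeholders
$output = @()
$output += '$obj = New-Object -Type PSCustomObject'
$output += '$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""'
$output += '$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"'

# Enumerate the properties of the returned object and emit a schema line per attribute
foreach ($prop in $user.PSObject.Properties) {
    switch -Wildcard ($prop.TypeNameOfValue) {
        "*Boolean*" { $output += ('$obj | Add-Member -Type NoteProperty -Name "' + $prop.Name + '|boolean" -Value $true') }
        "*List*"    { $output += ('$obj | Add-Member -Type NoteProperty -Name "' + $prop.Name + '|String[]" -Value ("","")') }
        default     { $output += ('$obj | Add-Member -Type NoteProperty -Name "' + $prop.Name + '|string" -Value "string"') }
    }
}
$output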

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "AccountEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "AgeGroup|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "AssignedLicenses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "AssignedPlans|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "City|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "CompanyName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ConsentProvidedForMinor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Country|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "CreationType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DeletionTimestamp|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Department|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DirSyncEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "DisplayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ExtensionProperty|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "FacsimileTelephoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "GivenName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ImmutableId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "IsCompromised|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "JobTitle|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastDirSyncTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LegalAgeGroupClassification|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "MailNickName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mobile|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OnPremisesSecurityIdentifier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OtherMails|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "PasswordPolicies|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PhysicalDeliveryOfficeName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PostalCode|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PreferredLanguage|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ProvisionedPlans|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProvisioningErrors|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProxyAddresses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "RefreshTokensValidFromDateTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ShowInAddressList|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "SignInNames|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "SipProxyAddress|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "State|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "StreetAddress|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Surname|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "TelephoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UsageLocation|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserPrincipalName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserType|string" -Value "string"

Update the Anchor for the attribute you'd like to use (I recommend ObjectId), give the ObjectClass a name for how you'd like it represented on your MA (User, AADUser or similar), save it as something like schema.ps1 in your MA folder and you can get started.

Example 2: Azure Active Directory Group

The example below utilises the AzureAD PowerShell Module to connect to Azure AD. It then gets a Group Object (update line 7 for a group to retrieve) and enumerates the properties of the Group to generate the Schema file.

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "DeletionTimestamp|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Description|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "DirSyncEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "DisplayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastDirSyncTime|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Mail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "MailEnabled|boolean" -Value $true
$obj | Add-Member -Type NoteProperty -Name "MailNickName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ObjectType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OnPremisesSecurityIdentifier|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "ProvisioningErrors|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "ProxyAddresses|String[]" -Value ("","")
$obj | Add-Member -Type NoteProperty -Name "SecurityEnabled|boolean" -Value $true

Update the Anchor for the attribute you'd like to use (I recommend ObjectId), give the ObjectClass a name for how you'd like it represented on your MA (Group, AADGroup or similar), save it as something like schema.ps1 in your MA folder and you can get started.

Example 3: Workday User

The example below utilises the Workday PowerShell Module to connect to Workday. It then gets a User Object (update line 7 for a user to retrieve) and enumerates the properties of the User to generate the Schema file.

Update

  • Line 6 for your ServiceName and Tenant.
  • Line 13 for an object to retrieve

This script differs from the AAD User and Group examples above in that the PowerShell object returned uses NoteProperty as the member type, so I updated Line 14 for that. Also, the attribute when parsed by Get-Member includes a value, so I had to take a substring of the result to get the attribute name. That is what this change does:

$d[1].substring(0,$d[1].indexof("="))

The output looks like this:

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "BusinessTitle|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "FirstName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "JobProfileName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "LastName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "Location|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "PreferredName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "UserId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerDescriptor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerType|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerTypeReference|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkSpace|string" -Value "string"

Example 4: Flat File CSV

The example below utilises a sample CSV file with headers. It uses the Header row to generate the Schema file. It defaults all columns to strings.

Update;

  • Line 2 for your CSV File Name
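As a rough sketch, the CSV variant only needs the header row, so the generation loop can be as simple as the following (the file path is a placeholder):

# Placeholder path - update for your CSV file
$csv = Import-Csv -Path 'C:\Temp\users.csv'

$output = @()
$output += '$obj = New-Object -Type PSCustomObject'
$output += '$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""'
$output += '$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"'

# Every CSV column becomes a string attribute in the schema
foreach ($column in $csv[0].PSObject.Properties.Name) {
    $output += ('$obj | Add-Member -Type NoteProperty -Name "' + $column + '|string" -Value "string"')
}
$output += '$obj'
$output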

The output looks like this (for my CSV File):

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-YourAnchorATTR|String" -Value ""
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "YourObjectClass"
$obj | Add-Member -Type NoteProperty -Name "id|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "name|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "displayName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "comments|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "created|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "endDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "lastLogon|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "modified|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "startDate|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "status|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "type|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "groups|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "costCenter|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "country|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "department|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "division|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "email|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "employeeNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "familyName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "givenName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "honorificPrefix|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "honorificSuffix|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "locale|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "location|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "manager|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "middleName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "organization|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "phoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "preferredLanguage|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "preferredName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "secondaryEmail|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "secondaryPhoneNumber|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "timezone|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "title|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "risk|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerWid|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerDescriptor|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "OtherId|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "JobProfileName|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkSpace|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerTypeReference|string" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "WorkerType|string" -Value "string"
$obj

Summary

Using a simple script and an example object we can quickly create the basis for a Granfeldt PSMA Schema Definition script.
As shown with the Workday example a minor tweak was required, but it was still a lot quicker than generating manually.

Hopefully this helps you get started quickly with your first, or next PSMA that you are building.

Automating Azure AD B2B Guest Invitations using Microsoft Identity Manager

Introduction

Earlier this year Microsoft released the Microsoft Identity Manager Azure AD B2B Management Agent. I wrote about using it to write to Azure AD in this post here. As detailed in that post my goal was to write to Azure AD using the MA, and I provided an incomplete example of doing that for Guests. This post fills in the gap and, unlike what the note preceding that post indicates, I've updated my MA to use the Graph API rather than the Azure AD PowerShell Module. It does though work in unison with the Microsoft Azure AD B2B Management Agent.

Overview

The process is;

  • Using the Microsoft Azure B2B Management Agent connect to an Azure AD Tenant that contains users that you want to invite as Guests to your Tenant. Flow in the naming information for users and their email address and any other metadata that you need to drive the logic for who you wish to invite
  • Use my Azure AD B2B Invitation Management Agent to automate the invitation of users to your Azure AD Tenant using Azure AD B2B

My Azure AD B2B Invitation Management Agent works in two phases;

  1. Invitation of Users as Guests
  2. Update of Guests with naming information (Firstname, Lastname, DisplayName)

The Azure AD B2B Invite Management Agent uses my favorite PowerShell Management Agent (the Granfeldt PSMA). I’ve posted many times on how to configure it. See these posts here if you are new to it.

Prerequisites

Setting up an Azure AD User Account for the Management Agent

In your Azure AD create a New User that will be used by the Management Agent to invite users to your Azure AD. I named mine B2B Inviter as shown below.

Inviter Account.PNG

You then want to assign them the Guest inviter role as shown below. This will be enough permissions to invite users to the Azure AD.

Inviter Role.PNG

However, depending on how you want these invitees to look, you probably also want their names kept consistent with their home Azure AD. To enable the Management Agent to do that you need to also assign the User administrator role as shown below.

Add User Admin Role.PNG

Now log in to Azure AD using that account and change the password. The account is now ready to go.

Management Agent Scripts

The Management Agent uses the Granfeldt PowerShell Management Agent. This is a cut down version of my MIM Azure AD Management Agent. 

Schema Script

I’ve kept the schema small with just enough interesting info to facilitate the functionality required. Expand it if you need additional attributes and update the import.ps1 accordingly.

Import.ps1

The Import script imports users from the Azure AD Tenant that you will be inviting remote Azure AD users to (as Guests).

  • Change line 10 for your file path
  • Change line 24 for the version of an AzureAD or AzureADPreview PowerShell Module that you have installed on the MIM Sync Server so that the AuthN Helper Lib can be used. Note if using a recent version you will also need to change the AuthN calls as well as the modules change. See this post here for details.
  • Change line 27 for your tenant name
  • Change line 47/48 for a sync watermark file
  • The Import script also contains an attribute from the MA Schema named AADGuestUser that is a boolean attribute. I create the corresponding attribute in the MetaVerse and MIM Service Schemas for the Person/User objectClasses. This is used to determine when a Guest has been successfully created so their naming attributes can then be updated (using a second synchronisation rule).

Export.ps1

The Export script handles the creation (invitation) of users from another Azure AD Tenant as Guests, as well as synchronising their naming data. It doesn't include deletion logic, but it is simple enough to include a deletion API call based on your MA Deprovisioning logic and requirements. A sketch of the Graph invitation call is shown after the list below.

  • By default I’m not sending invitation notifications. If you want to send invitation notifications change “sendInvitationMessage“= $false to $true on Line 129. You should then also change the Invitation Reply URL on line 55 to your Tenant/Application.
  • Change Line 10 for the path for the debug logging
  • Change Line 24 as per the Import Script if you are using a different version of the help lib
  • Change Line 27 for your Azure AD Tenant Name
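As a rough guide only (not the full Export.ps1), the invitation itself is a POST to the Microsoft Graph invitations endpoint; $user represents the object being exported, $accessToken is assumed to have been acquired earlier in the script, and the redirect URL is a placeholder:

# Build the invitation body from the object MIM is exporting
$invitation = @{
    invitedUserEmailAddress = $user.mail
    invitedUserDisplayName  = $user.displayName
    inviteRedirectUrl       = "https://myapps.microsoft.com"   # placeholder reply URL
    sendInvitationMessage   = $false
} | ConvertTo-Json

$headers = @{ Authorization = "Bearer $accessToken"; "Content-Type" = "application/json" }

# Invite the user as a Guest via Microsoft Graph
$result = Invoke-RestMethod -Method Post -Uri "https://graph.microsoft.com/v1.0/invitations" -Headers $headers -Body $invitation

# $result.invitedUser.id holds the ObjectId of the new Guest in the target tenant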

Declarative Sync Rules

I’m not going to cover import flow configurations on the MS Azure AD B2B MA here. See here for that. Below details how I’ve configured my Invitation MA for the Creation/Export functions. My join rule (configured in the Sync Engine Invitation MA Config) is email address as shown below. Not the best anchor as it isn’t immutable. But as we don’t know what the DN is going to be until after it is created this is the next best thing.

Join Rule.PNG

Creation Sync Rule

Here are the three attributes I’m syncing to the B2B Invite Management Agent to perform the invitation. I’m using the mail attribute for the DN as it matches the anchor for the schema. We don’t know what objectID will be assigned until the directory service does it. By using email/upn once created we will get the join and won’t end up with two objects on the MA for the same user.

Outbound Flow for Create 2.PNG

For Inbound I’m flowing in the AADGuestUser boolean value. You will need to create this attribute in the MetaVerse and then the MIM Service. In the MIM Service allow the Sync Service Account to manage the attribute and change the MIM Service Filter Permissions to allow Admins to use the attribute in Sets. Also on the MIM Service MA add an Export flow from the MV to the MIM Service for AADGuestUser.

Inbound Flow Create.PNG

Naming Update Sync Rule

The second Sync Rule is to update the guest's GivenName, Surname and DisplayName. This sync rule has a dependency on the creation sync rule and has a corresponding Set, Workflow and MPR associated with the value of the AADGuestUser boolean attribute populated by the Import script. If True (which it will be after successful creation and the confirming import) the second synchronization rule will be applied.

Sync Naming Synchronisation Rule.PNG

This will trigger an export flow for the three naming attributes.

Outbound Flow for Naming.PNG

Example of Inviting Guests

In this example Rick Sanchez is a member of a guest organisation and meets the criteria of my rules to be invited as a guest to our Tenant. We then see an Add for Rick on the B2B Invite MA.

Create Rick Sanchez.PNG

On export Rick is created in our Azure AD as a Guest User

Rick Created Sync Engine.PNG

Rick appears in Azure AD as a Guest via the Azure Portal.

Rick Created AzureAD.PNG

Following the confirming import our second sync rule fires and flows through an update to DisplayName and adds GivenName and Surname.

Update Rick.PNG

The naming attributes are then successfully exported.

Success Export.PNG

Going to the Azure AD Portal we see that Rick has indeed been updated.

Rick Updated.PNG

Notification Emails

If you enable notification emails a generic notification email is sent, like the one shown below. The export.ps1 MA script has them disabled by default.

Email Invite Notification2.PNG

Summary

Using a combination of the Microsoft Azure AD B2B Management Agent and my Azure AD B2B Invitation Management Agent you can automate the invitation of Guest users to your Azure AD Tenant.

How to use the FIM/MIM Azure Graph Management Agent for B2B Member/Guest Sync between Azure Tenants

Introduction

UPDATE: August 2018
As promised below, I've finally written up my Azure AD B2B Invitation Management Agent. You can find it in this post here.

UPDATE: June 2018
When I originally wrote this post the intent was to test the ability of the Graph MA to export to Azure AD. That works.

That then extended to messing with an identity type other than member (which works to an extent), but I detailed guests. However that is incomplete. I do have a working solution that utilises the Graph Invitation API via my favourite PowerShell MA (Granfeldt PS MA) and the PowerShell cmdlet New-AzureADMSInvitation from the Azure AD PowerShell Module.

That solution involves the MS Graph MA connected to a Partner tenant to get visibility of partner users, then creating relevant users in the home tenant via a PowerShell MA and the New-AzureADMSInvitation cmdlet. Another MS Graph MA connected to the home tenant provides easy visibility of additional guest user attributes for ongoing functions such as reporting and de-provisioning.

I will write that up later in July. Stay tuned and keep the above in mind when reading this post.

Just landed from the Microsoft Identity Manager Engineering Team is a new Management Agent built specifically for managing Azure AD Users, Groups and Contacts.

Microsoft have documented a number of scenarios for implementing the management agent. The scenarios the MA has been built for are valid and I have customers that will benefit from the new MA immediately. There is however another scenario I’m seeing from a number of customers that is possible but not detailed in the release notes. That is B2B Sync between Azure Tenants; using Microsoft Identity Manager to automate the creation of Guests in an Azure Tenant.

This could be one-way or two-way depending on what you are looking to achieve. Essentially this is the Azure equivalent of using FIM/MIM for Global Address List Sync.

B2B MA.png

Overview

The changes relative to the documentation provided with the Management Agent are minimal. Essentially;

  • ensure you enable Write Permissions to the Application you create in the AAD Tenant you will be writing to
  • Enable Invite Guest to the Organization permission on the AAD Application
  • Create an Outbound Sync Rule to an AAD Tenant with the necessary mandatory attributes
  • Configure the Management Agent for Export Sync Profiles

In the scenario I’m detailing here I’m showing taking a number of users from Org2 and provisioning them as Guests in Org1.

What I’m detailing here supplements the Microsoft documentation. For configuring the base MA definitely checkout their documentation here.

Microsoft Graph Permissions

When setting up the Graph Permissions you will need to have Write permissions to the Target Azure AD for at least Users. If you plan to also synchronize Groups or Contacts you’ll need to have Write permissions for those too.

Graph Permissions 1

In addition as we will be automating the invitation of users from one Tenant to another we will need to have the permission ‘Invite guest users to the organization’.

Graph Permissions 2

With those permissions selected and while authenticated as an Administrator select the Grant Permissions button to assign those permissions to the Application.

Grant Permissions 1
Grant Permissions 2

Repeat this in both Azure AD Tenants if you are going to do bi-directional sync.  If not you only need write and invite permissions on the Tenant you will be creating Guest accounts in.

Creating the Import/Inbound Sync Rules for the Azure Tenants

Here is an example of my Import Sync Rules to get Members (Users) in from an Azure Tenant. I have an inbound sync rule for both Azure Tenants.

Sync Rules.PNG

Make sure you have ‘Create Resource in FIM‘ configured on the source (or both if doing bi-directional) Graph Connector.

Sync Rule Relationship.PNG

The attribute flow rules I’ve used are below. They are a combination of the necessary attributes to create the corresponding Guest account on the associated management agent and enough to be used as logic for scoping who gets created as a Guest in the other Tenant. I’ve also used existing attributes negating the need to create any new ones.

Inbound SyncRule Flow.PNG

Creating the Export/Outbound Sync Rule to a Partner B2B Tenant

For your Export/Outbound rule make sure you have ‘Create resource in external system’ configured.

Export Relationship.PNG

There are a number of mandatory attributes that need to be flowed out in order to create Guests in Azure AD. The key attributes are;

  • userType = Guest
  • accountEnabled = True
  • displayName is required
  • password is required (and not export_password as normally required on AD style MA’s in FIM/MIM)
  • mailNickname is required
  • for dn and id initially I’m using the id (flowed in import to employeeID) from the source tenant. This needs to be provided to the MA to get the object created. Azure will generate new values on export so we’ll see a rename come back in on the confirming import
  • userPrincipalName is in the format of
    • SOURCEUPN (with the @ replaced with _ ) followed by #EXT#@ and the destination UPN suffix
    • e.g. user1_org2.com#EXT#@org1.com

Export Attributes.PNG

Here is an example of building a UPN.

UPN Rule.PNG
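Expressed as a small PowerShell snippet purely for illustration (the actual transformation is done declaratively in the outbound sync rule), the UPN construction looks like this:

# Placeholder values - the source UPN comes from the partner tenant, the suffix from the target tenant
$sourceUPN = 'user1@org2.com'
$destinationSuffix = 'org1.com'

# Replace the @ with _ and append the external user marker plus the destination suffix
$guestUPN = ($sourceUPN -replace '@', '_') + '#EXT#@' + $destinationSuffix
# Result: user1_org2.com#EXT#@org1.com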

Sets, Workflows and MPR’s

I didn’t need to do anything special here. I just created a Set based on attributes coming in from the source Azure Tenant to scope who gets created in the target Tenant. An MPR that looks for transition into the Set and applies the Workflow that associates the Sync Rule.

End to End

After synchronizing in from the source (B2B Org 2) the provisioning rules trigger and create the Users as Guests on B2B Org 1.

Prov to Org1 1.PNG

Looking at the Pending Export we can see our rules have applied.

Pending Export.PNG

On Export the Guest accounts are successfully created.

Export Success.PNG

On the confirming import we get the rename as Azure has generated a new CN and therefore DN for the Guest user.

Rename on Import 2.PNG

Looking into Azure AD we can see one of our new Guest users.

User in AAD.PNG

Summary

Using the Microsoft Azure AD B2B Graph Management Agent we can leverage it to create users from one Tenant as Azure AD Guests in another Tenant. Stay tuned for another post detailing the solution described in the Update in the Introduction.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, and the final part, in a three-part post on configuring FIM/MIM to synchronise users' passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post wraps it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process;

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS Config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
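The script itself isn't reproduced here, but as a hedged outline it does something along these lines via the IBM Notes client COM API (Lotus.NotesSession); server, database and credential values below are placeholders, and the item names (server, username, password) are assumed to match what the Part 2 trigger agent and test form expect:

# Hedged outline only - values are placeholders
$session = New-Object -ComObject Lotus.NotesSession
$session.Initialize("NotesIDPasswordForTheMIMSyncAccount")

# Open the ID Vault PWD Sync database created in Part 2
$db = $session.GetDatabase("xxxNotes1/Org-Aus", "IDVaultPWDSync.nsf", $false)

# Create a document containing the password change details
$doc = $db.CreateDocument()
[void]$doc.ReplaceItemValue("server", "xxxNotes1/Org-Aus")
[void]$doc.ReplaceItemValue("username", $displayName)    # e.g. Jane Doe/OrgU/Org-Aus from the MA
[void]$doc.ReplaceItemValue("password", $newPassword)    # the new password passed in by the Sync Engine
[void]$doc.Save($true, $false)

# Tell the trigger agent to process the document into the ID Vault
$agent = $db.GetAgent("MIMPwdTrigger")
[void]$agent.RunOnServer()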

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screen shots show me tracing through the logs for a password change as it makes its way from the AD Domain Controller to MIM Sync, to the MA, to the MA Password script, to the Notes DB as a document triggered to be processed by the Notes Agent, and finally to the user being updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

PCNS provides all the details for the password change.

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Our PS Notes MA logged the process.

Notes MA LOG

=============================================================

Display Name: Jane XXX/xxx/xxxxx-Aus

Action: Set

Old pwd:

New pwd: Password123456

Unlock: False

Force change: False

Validate: False

Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password …

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from ‘xxxNotes1/xxxxx-Aus’

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Summary

This three-part blog post has shown how to get passwords from Active Directory via the MIM Sync connected source across to IBM Domino and into the ID Vault, using the Granfeldt PowerShell Management Agent and some configuration with a Database in Domino and two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user's password is in the pwned password dataset or not. If you haven't read that post this won't make a lot of sense, so read that then come back.

The solution when receiving a new password for a user (via Microsoft Password Change Notification Service) was checking against the Have I Been Pwned API. The disclaimer at the start of the blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially taking the Pwned Password Datasets available here and loading them into a local network SQL Server and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

use PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO
SET ANSI_PADDING ON
GO
CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]
( [password_id] ASC )
INCLUDE ( [passwords] )
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I’m far from a DBA. I’m an identity guy. So using tools I was most familiar with (PowerShell) I created a simple script to open the password dump files as a stream (as Get-Content wasn’t going to handle the file sizes), read in the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000 and I performed it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved as I only spent about 20-30 minutes on it. It is opening and closing a connection to the SQL DB each time it inserts 1000 rows. That could be moved outside the Insert2DB function and maybe the batch size increased. Either way it is a starting point and I used it to write millions of rows into the DB successfully.
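Rather than the Out-DataTable approach referenced above, the following is a simplified sketch of the same idea (streamed reads and batched inserts); the file path and connection string are placeholders, and the flush of the final partial batch is omitted for brevity:

$connectionString = 'Server=localhost;Database=PwnedPasswords;Integrated Security=True'
$reader = New-Object System.IO.StreamReader('D:\PwnedPasswords\pwned-passwords-1.0.txt')

$batch = New-Object System.Collections.Generic.List[string]
while (($line = $reader.ReadLine()) -ne $null) {
    # each line of the dump file is a SHA1 hash of a pwned password
    $batch.Add($line.Trim())

    if ($batch.Count -ge 1000) {
        # build a single multi-row INSERT for the batch and execute it
        $values = ($batch | ForEach-Object { "('$_')" }) -join ','
        $conn = New-Object System.Data.SqlClient.SqlConnection($connectionString)
        $conn.Open()
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = "INSERT INTO dbo.pwnedPasswords (passwords) VALUES $values"
        [void]$cmd.ExecuteNonQuery()
        $conn.Close()
        $batch.Clear()
    }
}
$reader.Close()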

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This then is the only other change to the solution. The Password.ps1 script rather than querying the PwnedPasswords API queries the SQL DB and sets the pwned boolean flag accordingly.

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory users' passwords.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 2

Introduction

As the title suggests this is Part 2 of a three-part post on configuring FIM/MIM to synchronise users passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.

This post details the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

Part 3 here  details calling the Domino Agents on password sync events (from PCNS via MIM)

Creating a New Domino Application

As mentioned above and in Part 1 we need to create Domino Agents to process password change events into the ID Vault. Domino Agents are required as Domino security will not allow password change events (called using the resetUserPassword method) to be run remotely.  The resetUserPassword method is only supported using the RunOnServer method.

In order to create a Domino Agent we need to install and run the IBM Domino Designer.

With that installed we can start with our first Domino Agent. We will create two Agents. The first will be the one that will perform the execution of the resetUserPassword method. The second will be the trigger that will retrieve the details of the user to change the password for and pass it to the first agent to execute.

In IBM Domino Designer select File => New => Application

Specify the Server to create the new Application on (and subsequently where it will run) and give the Application a name. I used ID Vault PWD Sync.

Create the MIM Password Sync Domino Agent

With the New Application created we can navigate to Code => Agents and select New Agent

Give the Agent a name. I named this one MIMPasswordSync and make sure the type is Java

With the Agent created we need to give it the script that will perform the password changes. Double click on the agent then in the Agent Contents double-click on JavaAgent.java and paste in the script (from Github further below). The only change you may need to make is the location where you want the logging to go to. You will need to create that path if it doesn’t exist as well.

Selecting the Agent Tab in the main pane, locate the Agent Properties and configure as per the screenshot below.

Select the Security Tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can’t select them, close the agent and re-open it and you will be able to configure this option.

Create the MIM Password Trigger Domino Agent

Create the MIM Password Trigger Domino Agent just as you did the MIM Password Sync Domino Agent. Name it MIMPwdTrigger and make sure the type is also Java. Double click on the Agent and then double-click on JavaAgent.java in the Agent Contents. Use the following script. Note it calls the MIMPasswordSync Agent so if you called yours something different you will need to change it in this script (line 12).

Select the MIMPwdTrigger Agent in the main pane and look at the Properties. Set the Runtime trigger to On event and After documents are created or modified.

Select the Security Tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can’t select them, close the agent and re-open it and you will be able to configure this option

Configuring the ID Vault Password Reset Authority

In order for our MIMPasswordSync Agent to actually change users passwords in the ID Vault we need to configure the ID Vault to allow the account that created and signs the Agents and the Server that the Agents will run on to be Trusted Password Reset Authorities.

Using the IBM Domino Administrator select the Administration menu item and then Configuration. Expand Security from the left hand pane and select ID Vaults.

Having selected the ID Vault from the main pane you will be changing passwords in, on the right hand menu pane expand ID Vaults and double-click on Password Reset Authority.

From the Password reset authority by organisation box select the OrgU/Org you will be sync'ing passwords to. If you have many you will need to complete this step for each one. You will need to do one OrgU/Org at a time if they have different certifiers.

From the Available users, groups and servers box select the Server that you run the Agents with, and select Add. Repeat for selecting the user that you created the Agents with and that will sign the Agents. Then select the user you just added in the Password reset authority by organisation and then select the Password reset agent authority checkbox.  That will put the red @ symbol on the user which identifies it as a Password Reset Agent Authority.

Select Next/Configure, locate the certifier ID for that OrgU/Org, provide the password and complete the process. Repeat for each OrgU/Org.

Signing the Agents

Back using the Domino Designer double-click Agents in the left menu pane. Select each Agent and then click Sign.

Creating a Form to test the Agents

Now we will create a form to allow us to create a document in the DB easily and test that our agents work. In Domino Designer, right-click on Forms and select New Form.

Give the Form a name and an alias. It doesn’t matter what you call it. We’re just using it to test the agents.

Double Click on your new Form. Click in the empty pane and then from the Create menu select Field. Name it server. Repeat for another field and name it username.

Repeat for the third text field, but name it password and select the Type Password.

Finally from the Create menu select Hotspot => Button. Name it Submit and then select it. In the Properties of the button for Run select Client. For Formula enter the formula below.

@Command([FileSave]);
@Command([FileCloseWindow]);
@Command([ToolsRunMacro];"MIMPwdTrigger")

Testing the Agents

In the Domino Designer right-click on the form and select Preview in Notes.

The format for the fields is;

  • server: Server/Org
  • user: Joe Smith/OrgU/Org
  • password: P@SSw0rd

Enter valid input for your environment.

And click on the Submit button. If you have everything correct the document you just created will be processed by the Trigger Agent and then the MIM Password Sync Agent.

If there is an error you will likely have a document still in the IDVault PWD Sync database as shown below. Check the document to make sure you got the details for the user and server correct.

Also check the log file. C:\PWDSync\AgentLog.txt by default as per the script path. When working correctly you will see an entry as per below. If it wasn’t successful the error message should point you to where you have gone wrong. More than likely different names for the Agents, or incorrect format or name for the user and/or server. Or Trusted Password Authority not set for the account the agent was signed with to the OrgU/Org containing the user you are trying to change a password for.

MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Reseting password ...
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Server: XXXNotes1 User:Jane Doe/OrgU/Org-Aus
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Return value: true
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Removed User ID Vault change document from 'XXXNotes1'

Summary

Now that we have our Agents built and working we need to be able to call them from our MIM Sync Server. That will be covered in the third and final post in this series.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 1

Introduction

Recently I wrote about getting started with the latest IBM/Lotus Notes/Domino Management Agent for Microsoft Identity Manager. In a recent engagement we are using that MA to provision and manage identities into Domino. We are also using the MA to synchronise passwords via PCNS and MIM to the Notes users’ Internet (HTTP) password.

What you may or may not be aware of is that IBM introduced a new feature with Domino 8.5 called the ID Vault. The ID Vault is a Domino based application that holds protected copies of Notes user IDs. Now here’s the twist. The Microsoft Domino MA only supports password synchronisation to the HTTP password, not to the ID Vault.

My customer is using the ID Vault and naturally we need to synchronise password changes to both the HTTP Password and the ID Vault (for users Notes IDs). This post is the first in a series that details how I recently accomplished synchronising passwords to the Domino ID Vault.

  1. This post provides the introduction and the creation of a PowerShell Management Agent into Domino to join identities into the MIM Metaverse
  2. Post two details Creating Domino Agents that will handle taking requests from the MIM PS MA to change users ID Vault password
  3. Post three will detail calling the Domino Agents on password sync events (from PCNS via MIM)

Overview

The following diagram shows a high-level overview of password synchronisation using FIM/MIM from AD to Domino. Password changes/resets can be initiated using a number of methods. The FIM/MIM Self Service Password Reset functionality, users changing their password via their domain joined workstations as defined by AD Group Password Polic(y)ies, using the AD FS Password Change function, or even on behalf of users by a Service Desk/Administrator. In each scenario implementing Microsoft’s Password Change Notification Service will get the password change to FIM/MIM. I’m not going to cover PCNS as it is out of the box and straight forward to install and configure. This MS PFE PCNS implementation document covers it quite well.

Likewise I’m not going to go into any detail about password sync to the HTTP Password. That’s out of the box functionality, that is pretty much the same as any other MA configured as a Password Sync Target. That said in my environment I did have to configure the MS Domino MA like this to get password events out to Domino.

ID Vault FIM/MIM PowerShell Management Agent

First up, we are going to need a Management Agent to join Notes users to our users from Active Directory in the Metaverse. I’ve gone to my favourite PowerShell Management Agent (Granfeldt) for this.

The Granfeldt PS MA will be configured to;

  • Import and join Domino Users to the Metaverse. The MA will be slimline in the number of attributes it brings in; enough to perform the join and have enough information about the user's context in Domino to be able to perform the password sync event
  • be a target for Password Synchronisation
  • send the password change event to the Domino Agent we will build to perform the password change. A Domino Agent is required as the ID Vault will only accept password changes from a process run on the Domino Server(s). More on this in parts 2 and 3

The integration of the MIM Sync Engine with Domino with the PowerShell Management Agent is done using LDAP. The Name and Address Book is easily accessed via LDAP.

To get started I looked up the Server Document for the Domino Server I wanted to connect to that had the Name and Address Book. Selecting the Directory tab I could see that LDAP(S) 389/636 was enabled.

I then went to the Name and Address Book and looked up my Admin Notes ID to get its context so I could translate it to LDAP format. Darren Kloud Robinson/OrgU/Org-AUS translates to CN=Darren Kloud Robinson,OU=OrgU,O=Org-AUS.

Knowing my Notes ID Password I used LDP from Windows Server to bind to the Domino Directory. You could use any LDAP Browser/Tool.

Once I validated I could connect and browse the tree I knew I had my connection details sorted, and I translated that to PowerShell.

That looked something like this;
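The connection code isn't reproduced here, but a minimal sketch of binding to the Domino Directory over LDAP from PowerShell looks something like the following; the server, credentials and search base are placeholders, and the objectClass filter may need adjusting for your Domino directory schema:

# Placeholder connection details
$dominoServer = 'dominoserver.yourdomain.com'
$ldapPath = "LDAP://$($dominoServer):389/O=Org-AUS"
$bindUser = 'CN=Darren Kloud Robinson,OU=OrgU,O=Org-AUS'
$bindPassword = 'NotesIDPassword'

# Simple bind to the Domino Directory (Name and Address Book)
$directoryEntry = New-Object System.DirectoryServices.DirectoryEntry($ldapPath, $bindUser, $bindPassword, [System.DirectoryServices.AuthenticationTypes]::None)

# Search for person entries and bring back just enough attributes to join on
$searcher = New-Object System.DirectoryServices.DirectorySearcher($directoryEntry)
$searcher.Filter = '(objectClass=dominoPerson)'
$searcher.PageSize = 500
[void]$searcher.PropertiesToLoad.Add('mail')
[void]$searcher.PropertiesToLoad.Add('displayname')
$results = $searcher.FindAll()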

I then wrapped this into a FIM/MIM PowerShell Management Agent. The Granfeldt PS MA Scripts are below.

Domino PowerShell Management Agent Schema Script

As described above the Schema Script is very light on the number of attributes specified. Basically the minimum required to get a join and give us the context of the user to process password sync events.

Domino PowerShell Management Agent Import Script

As detailed above the Import brings through enough metadata to perform the join and give us the attributes needed for the user context to be able to sync passwords through.

Domino PowerShell Management Agent Export Script

File just needs to exist. Not used in this scenario.

Domino PowerShell Management Agent Password Script

See Part 3 in this series for the Password.ps1 script. But if you are following sequentially, copy the empty Export.ps1 script for now and name it Password.ps1 and have it located in the same directory as the other PS MA scripts.

Wiring it all together

As for creating the PS MA, I’ve detailed this in-depth many times. Check out this post (or the many other similar I’ve posted) and the Getting Started section if you are new to the Granfeldt PowerShell Management Agent. Copy the above scripts to the directory you create, and when creating the MA provide the paths to the scripts (in 8.3 format).

A key item though is to configure the PS MA as a Password Sync Target as per the screenshot below. You will also need to configure where passwords are coming from to send to this new MA. If it is Active Directory, open the Properties of your AD MA select Configure Directory Partitions then under Password Synchronization enable the checkbox Enable this partition as a password synchronization source. Select Targets and select your newly created Notes ID Vault Password MA. Select Ok then Ok again.

After creating a Run Profile and doing a Stage and Import, based on your Join rule (probably email address) you should have a heap of connectors. In my environment displayName contains the context of the user, e.g. Full Name/OrgU/Org. We'll need this to send the password change event to the ID Vault.

Summary

Through the PowerShell MA as detailed above we have been able to enumerate users from Domino and join them to existing users in the MIM Sync Metaverse. We can now set about creating Domino Agents to take password sync events from this MA and change users' passwords in the Domino ID Vault.

Part 2 in this series here details creating the Domino Agents and configuring Domino to accept the changes.

How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. The majority, though, have not been of huge scale (<50k objects), and the import of the managed entities into the Connector Space/Metaverse completes in a reasonably timely fashion.

However, this week I’ve been working on an Azure AD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing Group Memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the Import is up to (other than debug logging), and stopping the MA requires a restart of the Sync Engine Server.

I’ve wanted to mess with Paging the Imports for some time, but it hadn’t been a necessity. Now it is, so I set about working out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However, there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation), I worked it out for myself. I detail how to implement Paged Imports in this blog post.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the Azure AD Differential Queries. I’m not going to cover those elements, or setting up the MA, in this post.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The second is in your Run Profile. By default the Page Size will be 100. For my “let’s figure this out” process I dropped the Page Size to 50 on one Run Profile and 10 on another.

Import Script

With Paged Imports set up on the MA, the rest of the logic goes into your Import Script. In the param section at the start of the script, $usepagedimport and $pagesize are the variables that reflect the two items you enabled and configured above.

$usepagedimport is either True or False. Your Import.ps1 script can check whether it is set and process accordingly. In this example I’m not checking it at all and am doing Paged Imports regardless. For completeness, in a production implementation you should check what the intention of the MA is.

$pagesize is the Page Size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,   # True when Paged Imports are enabled on the MA
    $pagesize                 # Page Size from the Run Profile step
)

An important consideration to keep in mind is that the Import.ps1 will be called multiple times (i.e. number of calls = number of objects / page size).

So anything that, in any other MA, you would normally expect to be processed only once when the Import.ps1 runs, you now need to explicitly limit to running only once.

Essentially the way I’ve approached it is to retrieve all the objects that will be processed and put them in a Global variable. If the variable does not have any values/data, then it is the first run, so we go and get our source data. If the Global variable has values/data in it, then we must be on a subsequent loop, so there's no need to process that part again; we just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full Import.ps1 script below, there are more lines that could be moved into this section so they don’t get processed each time. In a production implementation I would do that.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to in processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile so we know when we’ve finished.

When we reach the page size but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29, your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.
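Condensed down to just the paging pattern (the Azure AD differential query logic and debug logging from the full script are omitted, the AzureAD PowerShell module is assumed, and the group attributes are illustrative), the shape of the script is;

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,
    $pagesize
)

# One-time tasks: only performed on the first call of this import run
if (!$global:tenantObjects) {
    Import-Module AzureAD
    Connect-AzureAD -Credential $Credentials | Out-Null

    # Retrieve everything we are going to process and keep it in a global variable
    # so subsequent calls page through the same dataset
    $global:tenantObjects   = Get-AzureADGroup -All $true
    $global:objectsImported = 0
}

$objectsThisPage = 0

# Continue from where the previous page finished
for ($i = $global:objectsImported; $i -lt $global:tenantObjects.Count; $i++) {
    $group = $global:tenantObjects[$i]

    # Emit one hashtable per object; keys line up with the Schema.ps1 attributes
    $obj = @{}
    $obj.Add("objectClass", "group")
    $obj.Add("ObjectId", $group.ObjectId)
    $obj.Add("DisplayName", $group.DisplayName)
    $obj

    $global:objectsImported++
    $objectsThisPage++

    # Page is full but there is more data; tell the Sync Engine to call us again
    if ($objectsThisPage -ge $pagesize -and $global:objectsImported -lt $global:tenantObjects.Count) {
        $global:MoreToImport = $true
        break
    }
}

# Everything processed; signal completion and clear the cache ready for the next run
if ($global:objectsImported -ge $global:tenantObjects.Count) {
    $global:MoreToImport = $false
    $global:tenantObjects = $null
}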

Summary

When managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can stop the MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

Provision Users for Exchange with FIM/MIM 2016 using the Granfeldt PowerShell MA, avoiding the AD MA (no-start-ma) error

Forefront / Microsoft Identity Manager provides Exchange Mailbox provisioning out of the box on the Active Directory Management Agent. I’ve used it in many, many implementations over the years. However, in my first MIM 2016 implementation in late 2015, I ran into issues with something I’d done successfully many times before.

I was getting “no-start-ma” on the AD MA on export to AD, at the point at which the MA sets up its connection to the Exchange environment. After some searching I found Thomas’s blog detailing the problem and a solution: in short, update the MIM Sync Server to .NET 4.6. For me this was no joy. However, when Microsoft released the first rollup update for MIM in December, everything fired up and worked as normal.

Step forward a month: as I was finalising development of the MIM solution I was building for my customer, my “no-start-ma” error was back when I re-enabled mailbox provisioning. Deselect the Exchange Provisioning option on the AD MA and all is good; re-enable it and it fails. With one week of development left and mailbox provisioning needed, it was time for a workaround whilst I lodged a Premier Support ticket.

So how can I get mailbox provisioning working reliably and quickly? I was already using Søren Granfeldt’s PowerShell MA for managing users’ Terminal Services configuration, Home Directories and Lync/Skype for Business. What’s one more? Look out for blog posts on using the PS MA to perform those other functions in the coming weeks.

Using the Granfeldt PowerShell Management Agent to Provision Exchange Mailboxes

In this blog post I’ll document how you can enable Mailbox Provisioning in Exchange utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you how to do the minimum: enabling a user with a mailbox. Once you understand how this is done, you can easily extend the functionality for lifecycle management (e.g. changing account settings for POP/IMAP/ActiveSync, and de-provisioning).

My Exchange PS MA is used in conjunction with an Active Directory MA and Declarative Provisioning Rules in the MIM Portal. Essentially all the AD MA does when you enable Exchange Provisioning (when it works) is call the ‘Update-Recipient’ cmdlet to finish off the mailbox provisioning. My Exchange PS MA does the same thing.

Overview

There are three attributes you need to supply values for in order to provision a user with a mailbox (on top of them having an Active Directory account, of course);

  • mailNickName
  • homeMDB
  • homeExchangeServerName

For the latter two I’m flowing the appropriate values using my Active Directory MA. I’m setting those attributes on the AD MA because I’m provisioning the AD account on that MA, which lets me set those two attributes as initial flow only. I’m doing that because, over time, it is highly likely that those attribute values will change with normal business-as-usual messaging admin tasks, and I don’t want my Exchange MA stomping all over them.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc. are assumed knowledge, and if not, are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’, NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the management agent account to the appropriate Exchange role group for user management)
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Exchange

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing mailbox provisioning, the schema only consists of the attributes needed to know the state of the user with respect to enablement and the attributes associated with enabling and confirming a user for a mailbox.

https://gist.github.com/darrenjrobinson/ae46cdfccb825dce69b3
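Purely for illustration (the full schema is in the gist above), a Granfeldt PSMA schema script follows this shape; the attribute names and types here are assumptions based on the description above and the join/flow rules below, and the values are just samples;

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-AccountName|String" -Value "sample.user"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "mailNickName|String" -Value "firstname.lastname"
$obj | Add-Member -Type NoteProperty -Name "mailboxGUID|String" -Value "00000000-0000-0000-0000-000000000000"
$obj | Add-Member -Type NoteProperty -Name "mailboxEnabled|Boolean" -Value $true
$obj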

Password Script (password.ps1)

Empty as described above.

Import Script (Import.ps1)

Import values for attributes defined in the schema.
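A hedged sketch of that pattern, assuming the ActiveDirectory module and the illustrative schema attributes above (the search base is a placeholder);

param ($Username, $Password, $Credentials, $OperationType)

Import-Module ActiveDirectory

# Pull the users in scope along with the attributes the schema defines
$users = Get-ADUser -Filter * -SearchBase "OU=Staff,DC=yourdomain,DC=com" -Properties sAMAccountName, mailNickname, msExchMailboxGuid

foreach ($user in $users) {
    $obj = @{}
    $obj.Add("objectClass", "user")
    $obj.Add("AccountName", $user.sAMAccountName)      # aligned with sAMAccountName for the join
    $obj.Add("mailNickName", $user.mailNickname)
    # The mailbox GUID (a byte array in AD) is only present once a mailbox exists; use it to flag mailbox state
    if ($user.msExchMailboxGuid) {
        $obj.Add("mailboxGUID", [string]([guid]$user.msExchMailboxGuid))
        $obj.Add("mailboxEnabled", $true)
    }
    else {
        $obj.Add("mailboxEnabled", $false)
    }
    $obj
}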

Export Script (Export.ps1)

The business part of the MA. Take the mailNickName attribute value flowed from FIM (the other required attributes are populated via the AD MA) and call Update-Recipient to provision the mailbox.
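A minimal sketch of that Export.ps1, assuming a remote Exchange session (the ConnectionUri is a placeholder) and that the object piped in by the PSMA exposes the user's DN; the exact property names on the piped object follow the PSMA export conventions and should be adjusted to suit your MA;

param ($Username, $Password, $Credentials, $ExportType)

BEGIN {
    # Remote session into Exchange so Update-Recipient is available on the Sync Server
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "http://exchangeserver.yourdomain.com/PowerShell/" -Authentication Kerberos -Credential $Credentials
    Import-PSSession $session -CommandName Update-Recipient -AllowClobber | Out-Null
}

PROCESS {
    $csObject = $_
    # Property name below is an assumption; adjust to how your PSMA exposes the DN
    $identity = $csObject.DN

    # homeMDB and the Exchange server attribute have already been flowed by the AD MA;
    # Update-Recipient completes the mailbox provisioning
    Update-Recipient -Identity $identity
}

END {
    Remove-PSSession $session
}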

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically:

  • create the PS MA
  • import attributes from the PS MA
  • add any additional attributes to the Portal Schema
  • update the Portal Filter to allow Administrators to use the attribute
  • update the Synchronisation MPR to allow the Sync Engine to flow in the new attribute
  • create the Set used for the transition
  • create your Synchronisation Rule
  • create your Mailbox Workflow
  • create your Mailbox MPR
  • create your MA Run Profiles
  • and let it loose

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

The Password script must be specified, but as we’re not doing password management it’s empty, as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple: AccountName (which, as you’ll see in the Import.ps1, is aligned with sAMAccountName) to AccountName in the Metaverse.

My import flows are a combination of logic used for other parts of my solution, plus a Boolean flag and the Mailbox GUID to determine whether the user has a mailbox or not (used for my Transition Set and my Export script).

Below is my rules extension that sets a Boolean value in the MV, which is then flowed to the MIM Portal and used in my Transition Set to trigger my Synchronisation Rule.

Synchronisation Rules

My Exchange Outbound Sync Rule isn’t complex. All it is doing is syncing out the mailNickName attribute and applying the rule based on an MPR, Set and Workflow.

For this implementation my outbound attribute flow for mailNickName is a simple firstname.lastname format.

Set

I have a Set that I use as a ‘transition set’ to trigger provisioning of the Exchange mailbox. My Set looks to see if the user account exists in AD (I flow the AD DN into an attribute in the Portal) and checks the mailbox status (set by the Advanced Flow Rule shown above). I also have (not shown in the screenshot) a Boolean attribute in the MIM Portal that is set based on an advanced flow rule on the AD MA, with some logic to determine whether the employment date sourced from my HR Management Agent is current and whether the user should be active or not.

Workflow

An action-based workflow that will trigger the Synchronisation Rule for Exchange Mailbox creation.

MPR

Finally my MPR for provisioning mailboxes is based on the transition set,

and my Mailbox Workflow.

Summary

Using the Granfeldt PowerShell MA I was able to quickly abstract Mailbox Provisioning from the AD Management Agent and perform the functionality on its own MA.

 

Follow Darren on Twitter @darrenjrobinson