How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I need visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

To make this efficient I need to use the Differential Query function of the AzureAD Graph API, which I've detailed in this post: How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups, I also needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA), which I've detailed here: How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above; they detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell Modules, the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.

Configuration

Now that you're up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That's also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

There are two object classes on the MA because we need the users that are members of the groups on the same MA, as membership is a reference attribute. When you bring the Groups into the MetaVerse, and assuming you have an Azure AD Users MA using the same anchor attribute, you'll get the reference link for the members along with their full object details.
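As a guide, a minimal Schema.ps1 for this MA could look like the following (a sketch only; attribute names are illustrative, and per the PSMA schema conventions an array sample value marks an attribute as multi-valued):

# Group object class, with a multi-valued reference attribute for members
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-ID|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Group"
$obj | Add-Member -Type NoteProperty -Name "DisplayName|String" -Value "Display Name"
$obj | Add-Member -Type NoteProperty -Name "Members|Reference" -Value @("guid1","guid2")
$obj

# User object class on the same MA so the member references can resolve
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-ID|String" -Value "2"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
$obj | Add-Member -Type NoteProperty -Name "UserPrincipalName|String" -Value "user@tenant.onmicrosoft.com"
$obj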

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups, and import the active ones along with their group membership.
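The key part is the shape of what gets emitted to the Sync Engine for each group: a hashtable whose keys line up with Schema.ps1. An illustrative fragment (the $group and $groupMembers variables are assumed to come from the Graph API calls):

# Build a hashtable per group and emit it to the Sync Engine
$obj = @{}
$obj.Add("objectClass", "Group")
$obj.Add("ID", $group.objectId)
$obj.Add("DisplayName", $group.displayName)
# Reference attribute: member objectIds resolve to the user objects staged on this MA
$obj.Add("Members", @($groupMembers))
$obj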

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA. Paged Imports show progress as the import runs, and Differential Queries allow for quick subsequent delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

Introduction to MIM Advanced Workflows with MIMWAL

Introduction

Microsoft late last year introduced the ‘MIMWAL’, or to say it in full: (inhales) ‘Microsoft Identity Manager Workflow Activity Library’ – an open source project that extends the default workflows & functions that come with MIM.

Personally I’ve been using a version of MIMWAL for a number of years, as have my colleagues, in working on MIM projects with Microsoft Consulting.   This is the first time however it’s been available publicly to all MIM customers, so I thought it’d be a good idea to introduce how to source it, install it and work with it.

Microsoft (I believe for legal reasons) don't host a compiled version of MIMWAL, instead hosting the source code on GitHub for customers to source, compile and potentially extend. The front page of Microsoft's MIMWAL GitHub library can be found here: http://microsoft.github.io/MIMWAL/

Compile and Deploy

Now, the official deployment page is fine (github) but I personally found Matthew’s blog to be an excellent process to use (ithinkthereforeidam.com).  Ordinarily, when it comes to installing complex software, I usually combine multiple public and private sources and write my own process but this blog is so well done I couldn’t fault it.

…however, some minor notes and comments about the overall process:

  • I found that I needed to copy the gacutil.exe and sn.exe utilities you extract from the old FIM patch into the 'Solution Output' folder as well. The process mentions they need to be in 'src\Scripts' (Step 6), but they also need to be in the 'Solution Output' folder, which you can see in the last screenshot of that Explorer folder in Step 8 (of the 'Configure Build/Developer Computer' process).
  • I found the slowest tasks in the entire process was sourcing and installing Visual Studio, and extracting the required FIM files from the patch download.  I’d suggest keeping a saved Windows Server VM somewhere once you’ve completed these tasks so you don’t have to repeat them in case you want to compile the latest version of MIMWAL in the future (preferably with MIM installed so you can perform the verification as well).
  • Be sure to download the ‘AMD 64’ version of the FIM patch file if you’re installing MIMWAL onto a Windows Server 64-bit O/S (which pretty much everyone is).  I had forgotten that old 64 bit patches used to be titled after the AMD 64-bit chipset, and I instead wasted time looking for the newer ‘x64’ title of the patch which doesn’t exist for this FIM patch.

 

‘Bread and Butter’ MIMWAL Workflows

I’ll go through two examples of MIMWAL based Action Workflows here that I use for almost every FIM/MIM implementation.

These action workflows have been part of previous versions of the Workflow Activity Library, and you can find them in the MIMWAL Action Workflow templates.

I’ll now run through real world examples in using both Workflow templates.

 

Update Resource Workflow

I use the Update Resource MIMWAL action workflow all the time to link two different objects together – many times linking a user object with a new and custom 'location' object.

For new users, I execute this MIMWAL workflow when a user first ‘Transitions In’ to a Set whose dynamic membership is “User has Location Code”.

For users changing location, I also execute this workflow using a Request-based MPR triggered by the Synchronization Engine changing the "Location Code" for a user.

This workflow looks like the following:

[Screenshot: Update Resource workflow configuration]

The XPath Filter is: /Location[LocationCode = '[//Target/LocationCode]']

When you target the Workflow at the User object, it will use the Location Code stored in the User object to find the equivalent Location object and store it in a temporary ‘Query’ object (referenced by calling [//Queries]):

[Screenshot: workflow value expressions]

The full value expression used above, for example, sending the value of the ‘City’ attribute stored in the Location object into the User object is:

IIF(IsPresent([//Queries/Location/City]),[//Queries/Location/City],Null())

This custom expression determines if there is a value stored in the '[//Queries]' object (i.e. a copy of the Location object found earlier in the query). If there is a value, it sends it to the City attribute of the user object, i.e. the 'target' of the Workflow. If there is no value, it sends a 'null' value to wipe out the existing value (in case a user changes location but the new location doesn't have a value for one of the attributes).

It is also a good idea (not seen in this example) to send the Location’s Location Code to the User object and store it in a ‘Reference’ attribute (‘LocationReference’).  That way in future, you can directly access the Location object attributes via the User object using an example XPath:  [//Person/LocationReference/City].

 

Generate Unique Value from AD (e.g. for sAMAccountName, CN, mailnickname)

I’ve previously worked in complex Active Directory and Exchange environments, where there can often be a lot of conflict when it comes to the following attributes:

  • sAMAccountName (used progressively less and less these days)
  • User Principal Name (used progressively more and more these days, although communicated to the end user as ’email address’)
  • CN (or 'container' value), which forms part of the LDAP Distinguished Name (DN) value. Side note: this is the attribute most commonly mistaken for the 'Display Name' by admins viewing it in AD Users & Computers.
  • Mailnickname (used by some Exchange environments to generate a primary SMTP address or ‘mail’ attribute values)

All AD environments require a unique sAMAccountName for any AD account to be created (otherwise you'll get a MIM export error into AD if an account with that value already exists). Each object also requires a CN value that is unique within its OU, otherwise the object cannot be created. CN conflicts are much more likely if you export all user accounts for a large organization to the same OU.

UPNs are generally unique if you copy a person’s email address, but sometimes not – sometimes it’s best to combine a unique mailnickname, append a suffix and send that value to the UPN value.  Again, it depends on the structure and naming of your AD, and the applications that integrate with it (Exchange, Office 365 etc.).

Note: the default MIMWAL Generate Unique Value template assumes the FIM Service account has the permissions required to perform LDAP lookups against the LDAP path you specify.  There are ways to enhance the MIMWAL to add in an authentication username/password field in case there is an ‘air gap’ between the FIM server’s joined domain and the target AD you’re querying (a future blog post).

In this example of using the 'Generate Unique Value' MIMWAL workflow, I tend to execute it as part of a multi-step workflow, such as the one below (Step 2 of 3):

[Screenshot: multi-step workflow]

I use the workflow to generate an LDAP query to look for existing accounts, and then send the resulting value to the [//WorkflowData/AccountName] attribute.

The LDAP filter used in this example looks at all existing sAMAccountNames across the entire domain to look for an existing account:   (&(objectClass=user)(objectCategory=person)(sAMAccountName=[//Value]))

The workflow will also query the FIM Service database for existing user accounts (that may not have been provisioned yet to AD) using the XPath filter: /Person[AccountName = '[//Value]']

The Uniqueness Key Seed in this example is '2', which essentially means that if you cannot resolve a conflict using other attribute values (such as a user's middle name, or more letters of a first or last name) then you can use this 'seed' number to break the conflict as a last resort. This number increments by 1 for each conflict, so if there's a 'michael.pearn' and a 'michael.pearn2' for example, the next one to test will be 'michael.pearn3' etc.

[Screenshot: Generate Unique Value workflow configuration]

The second half of the workflow shows the rules used to generate sAMAccountName values, and the order in which to break conflicts. In this (deliberately simple) example, I use an employee's 'ID number' to generate an AD account. If there is already an account for that ID number, the workflow will generate a new account with the string '-2' appended:

Value Expression 1 (highest priority): NormalizeString([//Target/EmployeeID])

Value Expression 2 (lowest priority):  NormalizeString([//Target/EmployeeID] + "-" + [//UniquenessKey])

NOTE: The 'NormalizeString' function is a new MIMWAL function that is also used to strip out any diacritic characters. More information can be found here: https://github.com/Microsoft/MIMWAL/wiki/NormalizeString-Function


Microsoft have posted other examples of Value Expressions to use that you could follow here: https://github.com/Microsoft/MIMWAL/wiki/Generate-Unique-Value-Activity

My preference is to use as many value expressions as you can to break the conflict before having to fall back to the uniqueness key. Note: sAMAccountName has a default 20 character limit, so the 'Left' function is often used to trim the number of characters taken from a person's name, e.g. the left 8 characters of a first name combined with the left 11 characters of a last name (not forgetting to save a character for the seed value deadlock breaker!).
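For example, name-based expressions along these lines (illustrative only, using the Left function):

NormalizeString(Left([//Target/FirstName],8) + Left([//Target/LastName],11))

NormalizeString(Left([//Target/FirstName],8) + Left([//Target/LastName],11) + [//UniquenessKey])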

Once the Workflow step is executed, I send the value (using [//WorkflowData/AccountName]) to the outbound 'AccountName –> sAMAccountName' AD sync rule flow:

[Screenshot: outbound sync rule attribute flow]

 

More ideas for using MIMWAL

In my research on MIMWAL, I’ve found some very useful links to sample complex workflow chains that use the MIMWAL ‘building block’ action workflows and combine them to do complex tasks.

Some of those ideas, from Microsoft's own consultants, can be found here on MSDN: https://blogs.msdn.microsoft.com/connector_space/2016/01/15/the-mimwal-custom-workflow-activity-library/

These include:

  • Create Employee IDs
  • Create Home Directories
  • Create Admin Accounts

I particularly like the idea of using the ‘Create Employee ID’ example workflow, something that I’ve only previously done outside of FIM/MIM, for example with a SQL Trigger that updates a SQL database with a unique number.

 

 

Setting up your SP 2013 Web App for MIM SP1 & Kerberos SSO

I confess: getting a Microsoft product based website working with Kerberos and Single Sign On (i.e. without authentication prompts from a domain joined workstation or server) feels somewhat like a 'black art' to me.

I'm generally ok with registering SPNs, SSL certificates, working with load balanced IPs etc, but when the final Internet Explorer test fails and I see an NTLM style auth. prompt, it's enough to send me into a deep rage (or depression, or both).

So, recently, I’ve had a chance to review the latest guidance on getting the Microsoft Identity Manager (MIM) SP1 Portal setup on Windows Server 2012 R2 and SharePoint Foundation 2013 SP1 for both of the following customer requirements:

  • SSL (port 443)
  • Single Sign On from domain joined workstations / servers

The official MIM guidance here is a good place to start if you're building out a lab (https://docs.microsoft.com/en-us/microsoft-identity-manager/deploy-use/prepare-server-sharepoint). There's a major flaw in this guidance for SSL & Kerberos SSO however: it'll work, but you'll still get the NTLM style auth. prompt if you configure the SharePoint Web Application initially under port 82 (as you will if, like me, you follow the guidance strictly) and then, in the words of the article, "Initially, SSL will not be configured. Be sure to configure SSL or equivalent before enabling access to this portal."

Unfortunately, this article doesn’t elaborate on how to configure Kerberos and SSL post FIM portal installation, and to then get SSO working across it.

To further my understanding of the root cause, I built out two MIM servers in the same AD:

  • MIM server #1: FIM Portal installed onto a Web Application on port 82, with SSL configured post-installation via SSL bindings in IIS Manager and a new 'Intranet' Alternate Access Mapping configured in SharePoint Central Administration
  • MIM server #2: FIM Portal installed onto a Web Application built on port 443 (no Alternate Access Mappings specified), with SSL bindings configured in IIS Manager.

After completion, I found MIM Server #1 was working with Kerberos and SSO under port 82, but each time I accessed it using the SSL URL I configured post installation, I would get the NTLM style auth. prompt regardless of workstation or server used to access it.

With MIM server #2, I built the web application purely into port 443 using this command:

New-SPWebApplication -Name "MIM Portal" -ApplicationPool "MIMAppPool" -ApplicationPoolAccount $dbManagedAccount -AuthenticationMethod "Kerberos" -SecureSocketsLayer:$true -Port 443 -URL https://<snip>.mimportal.com.au
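For reference, the $dbManagedAccount variable in that command is simply a SharePoint managed account object retrieved beforehand, along these lines (the account name is illustrative):

# Retrieve the managed account registered for the MIM app pool
$dbManagedAccount = Get-SPManagedAccount -Identity "MYDOMAIN\svc-MIMSPPool"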


The key switches are:

  • -SecureSocketsLayer:$true
  • -Port 443
  • -URL (with URL starting with https://)

I then configured the SSL binding for this SharePoint Web Application in IIS Manager, similar to this:

[Screenshot: IIS SSL binding]

A crucial check is to test the MIM Portal FQDN (without the /identitymanagement path) you're intending to use after you configure the SharePoint Web Application and bind the SSL certificate in IIS Manager, but BEFORE you install the FIM Service and Portal.

So in summary: verify the root site URL is working with SSO first, then install the FIM Portal and verify the /identitymanagement URL is working.

The first test should appear as a generic ‘Team Site’ in your browser without authentication prompt from a domain joined workstation or server if it’s working correctly.

The other item to take note of is that I've seen other guidance saying this won't work from a browser locally on the MIM server – something that I haven't seen in any of my tests. All test results that I've seen are consistent whether using a browser from a domain joined workstation, a remote domain joined server or the domain joined MIM server itself. There's no difference in results in terms of SSO in my experience. Be sure to add the MIM Portal URL to the browser's 'Intranet' zone as well for your testing.

Also, I never had to configure 'Require Kerberos = True' in the web.config, which used to be part of the guidance for FIM and previous versions of SharePoint. This might work as well, but wouldn't explain the port 82/443 differences for MIM Server #1 (i.e. why would SSO work on port 82 but not 443?).

I've seen other MIM expert peers configure their MIM sites using custom PowerShell installations of SharePoint Foundation to configure the MIM Portal under port 80 (overwriting the default SharePoint Foundation 2013 behaviour of taking over port 80 during its wizard based installation). I'm sure that might be a valid strategy as well, and SSO may then work with SSL after further configuration, but I personally can't attest to that working.

Good luck!


Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh shit moments where I thought I'd lost a couple of weeks of work: a couple of weeks of development across multiple Management Agents, MV Schema changes etc. Luckily for me I was just connecting to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements (MAExports, ServerExport, MAExtensions, PortalExport)
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports folder, and to export the FIM/MIM Sync Server configuration to the ServerExport folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service configuration to the PortalExport folder

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server.
  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *
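(That's the Azure Functions six-field NCRONTAB format of {second} {minute} {hour} {day} {month} {day-of-week}, so the schedule above fires at 23:30:00 every day.)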

  • I set the Azure Function App timezone so that the nightly backup happens at the correct local time. I got my timezone index from here, and set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server change line 59 for the hostname. You’ll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too. Details on how to achieve that are here.
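For reference, a minimal sketch of the approach the script takes is below. This is not the original script: the host name and key file path are illustrative, the credential decryption follows the pattern from the linked posts, and I've shelled out to the Sync Server's built-in maexport.exe/svrexport.exe utilities for the exports (the Lithnet module is used here only to enumerate the MAs).

# Decrypt the credentials stored in the Function App settings (paths illustrative)
$keyFile = "D:\home\site\wwwroot\MIMBackupTimer\mimsync.key"
$securePassword = $env:MIMSyncCredPassword | ConvertTo-SecureString -Key (Get-Content $keyFile)
$cred = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, $securePassword)

# Remote into the Dev MIM Sync Server (hostname illustrative)
$session = New-PSSession -ComputerName "mimsyncdev.mydomain.com" -Credential $cred
Invoke-Command -Session $session -ScriptBlock {
    $stamp = Get-Date -Format "dd-MM-yyyy-hh-mm"
    $root = "c:\backup\$stamp"
    "MAExports","ServerExport","MAExtensions","PortalExport" |
        ForEach-Object { New-Item -ItemType Directory -Path "$root\$_" -Force | Out-Null }

    $syncDir = "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service"

    # Enumerate the MAs via the Lithnet MIIS Automation module, export each one
    Import-Module LithnetMiisAutomation
    foreach ($ma in Get-ManagementAgent) {
        & "$syncDir\Bin\maexport.exe" $ma.Name "$root\MAExports\$($ma.Name).xml"
    }

    # Full Sync Server configuration export
    & "$syncDir\Bin\svrexport.exe" "$root\ServerExport"

    # Copy the Extensions folder (PowerShell MA scripts etc.)
    Copy-Item "$syncDir\Extensions" -Destination "$root\MAExtensions" -Recurse

    # Export the FIM/MIM Service configuration
    Add-PSSnapin FIMAutomation
    $policy = Export-FIMConfig -Uri "http://localhost:5725/resourcemanagementservice" -policyConfig -portalConfig -allLocales
    $policy | ConvertFrom-FIMResource -file "$root\PortalExport\MIMServiceConfig.xml"
}
Remove-PSSession $session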

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig is what takes the majority of the time. That said, it's less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.

 

Summary

The script itself can be run standalone, and you could implement it as a Scheduled Task on your FIM/MIM Server. However I'm using Azure Functions for a number of things, and having something that is easily portable, repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson


How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I've lost count of the number of PowerShell Management Agents I've written to integrate Microsoft Identity Manager with a plethora of environments. The majority though have not been of huge scale (<50k objects) and the import of the managed entities into the Connector Space/Metaverse runs through pretty quickly.

However this week I've been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn't that large, but when you start processing Group Memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don't have any visibility of where the Import is up to (other than debug logging), and stopping the MA requires a restart of the Sync Engine Server.

I've wanted to mess with paging the imports for some time, but it hadn't been a necessity. Now it is, so I worked out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation) I worked it out for myself, and I detail how to implement Paged Imports in this blogpost.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements in this post or setting up the MA at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The 2nd is in your Run Profile. By default the Page Size will be 100. For my "let's figure this out" process I dropped the Page Size to 50 on one Run Profile and 10 on another.

 

Import Script

With Paged Imports set up on the MA, the rest of the logic goes into your Import script. In the param section at the start of the script, $usepagedimport and $pagesize are the variables that reflect the two enablement components above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I'm not even checking if it is set and am doing Paged Imports anyway. For completeness, in a production example you should check what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,
    $pagesize
 )

 

An important consideration to keep in mind is that the Import.ps1 will be called multiple times (ie. #of_times = #ofObjects / pagesize).

So anything that you would normally expect in any other MA to only process once when the Import.ps1 runs you need to limit to only running once.

Essentially the way I've approached it is: retrieve all the objects that will be processed and put them in a global variable. If the variable does not have any values/data then it is the first run, so go and get our source data. If the global variable has values/data in it then we must be on a subsequent loop, so there's no need to process that part again – just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full import.ps1 script below there are more lines that could be added into this section so they don’t get processed each time. In a production implementation I would.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile so we know when we’ve finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
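A minimal skeleton of that paging logic is below (sketched here as a for loop over the global array; the $global:objectsImported counter and the group attribute names are illustrative, and the full script also adds debug logging):

$pagedCount = 0
for ($i = $global:objectsImported; $i -lt $global:tenantObjects.Count; $i++) {
    $group = $global:tenantObjects[$i]

    # Build the object to pass to the Sync Engine / Connector Space
    $obj = @{}
    $obj.Add("objectClass", "Group")
    $obj.Add("ID", $group.objectId)
    $obj.Add("DisplayName", $group.displayName)
    $obj

    $global:objectsImported++
    $pagedCount++

    # Page is full but objects remain: ask the Sync Engine to call us again
    if (($pagedCount -eq $pagesize) -and ($global:objectsImported -lt $global:tenantObjects.Count)) {
        $global:MoreToImport = $true
        break
    }
}

# Everything processed: signal that the import is complete
if ($global:objectsImported -ge $global:tenantObjects.Count) {
    $global:MoreToImport = $false
}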

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29, your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson


How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries

Introduction

In August 2016 I wrote this post on how to use PowerShell to leverage the Microsoft GraphAPI and use Differential Queries. The premise behind that post was that I required a Microsoft Identity Manager Management Agent to synchronize identity information from AzureAD into Microsoft Identity Manager. However the environment it was intended for has a large AzureAD implementation, and performing a Full Sync every time is just too time consuming. Even more so with this limitation that still exists today in MIM 2016 with SP1.

In this blog post I’ll detail how to implement a PowerShell Management Agent for FIM/MIM to use the MS GraphAPI to synchronize objects into FIM/MIM, supporting Delta and Full Synchronization run profiles. I’m also using my favourite PowerShell Management Agent, the Granfeldt PowerShell Management Agent.

Pre-requisites

I'm using the ADAL helper library from the AzureADPreview PowerShell Module. Install that module on your MIM Sync Server via PowerShell (WMF5 or later) using the PowerShell command;

Install-Module AzureADPreview

Getting Started with the Granfeldt PowerShell Management Agent

If you don't already have it, what are you waiting for? Go get it from here. Søren's documentation is pretty good, but it does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we're not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn't need to have any logic/script inside it; it just needs to be present.
  • The credentials you give the MA are the credentials for the account that has permissions to the AzureAD/Office365 tenant. A normal account is enough to enumerate it, but you'll need additional permissions if you intend to write back to AzureAD.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD

Schema.ps1

My Schema is based around enumerating and managing users from AzureAD. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. Use the Schema info below for a base set of attributes that will get you started. You can add more as required. I’ve prefixed most of them with AAD for my implementation.

If you want to manage Groups or Contacts or a combination of object types, you will need to update the Schema.ps1 script accordingly.

Import.ps1

The logic that the Import.ps1 implements is the same as detailed here in my post using Differential Queries. Essentially, perform a full import and create a file with the cookie/watermark. Allow Delta Sync run profiles to be performed by requesting the GraphAPI to return only changes since the cookie/watermark.
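Sketched out, that logic looks something like this (the tenant name, watermark path and the ADAL-acquired $authHeader are illustrative; the aad.nextLink/aad.deltaLink handling follows the AzureAD Graph differential query contract covered in the earlier post):

$tenant = "customer.onmicrosoft.com"
$deltaFile = "c:\temp\users.deltalink"
$baseUri = "https://graph.windows.net/$tenant/users?api-version=1.6"
# $authHeader obtained via the ADAL helper library from AzureADPreview

if (($OperationType -eq "Delta") -and (Test-Path $deltaFile)) {
    # Delta run: replay the deltaLink saved by the previous import
    $uri = Get-Content $deltaFile
} else {
    # Full run: an empty deltaLink returns everything plus a fresh watermark
    $uri = "$baseUri&deltaLink="
}

do {
    $results = Invoke-RestMethod -Method Get -Uri $uri -Headers @{ Authorization = $authHeader }
    foreach ($user in $results.value) {
        $obj = @{}
        $obj.Add("objectClass", "user")
        $obj.Add("ID", $user.objectId)
        $obj.Add("AADUserPrincipalName", $user.userPrincipalName)
        $obj    # emit to the Sync Engine
    }
    if ($results.'aad.nextLink') { $uri = $results.'aad.nextLink' + "&api-version=1.6" }
} while ($results.'aad.nextLink')

# Persist the watermark for the next Delta Import
$results.'aad.deltaLink' | Out-File $deltaFile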

You'll need to update the script for your AzureAD Tenant name on line 28, and for the paths to the cookie file and the debug file if yours differ from mine (lines 11, 46 and 47).

Importing of the attributes is based around the names in the Schema.ps1 scripts. Any changes you made there will need to be reflected in the import.ps1.

Password Script (password.ps1)

Empty as not implemented

Export.ps1

Empty as not implemented in this example. If you are going to write information back to AzureAD you’ll need to put the appropriate logic into this script.

Management Agent Configuration

With the Granfeldt PowerShell Management Agent installed on your FIM/MIM Synchronisation Server, in the Synchronisation Server Manager select Create Management Agent and choose “PowerShell” from the list of Management Agents to create.

As this example is for Users, I’ve named my MA accordingly.

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to AzureAD/Office365 and import the user data.

Paths to the Import, Export and Password scripts. Note: the Export and Password PS1 scripts files exist but are empty.

Object Type as configured in the Schema.ps1 file.

Attributes as configured in the Schema.ps1 file.

Anchor as per the Schema.ps1 file.

The rest of the MA configuration is up to your implementation. What you join on and which attributes flow into the MV will vary based on your needs and solution. At a minimum you'd probably be looking to join on immutableID (after some manipulation) or UPN, and flow in attributes such as AADAccountEnabled etc.
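For reference, the immutableID 'manipulation' is typically the Base64 encoding of the on-premises objectGUID, e.g.:

# AD objectGUID (e.g. from Get-ADUser) -> AzureAD immutableID (Base64), and back again
$immutableId = [System.Convert]::ToBase64String($adUser.objectGUID.ToByteArray())
$objectGuid = New-Object Guid (,[System.Convert]::FromBase64String($immutableId))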

Completing the Configuration

To finalise the MA you'll need to do the usual tasks of creating run profiles, staging the connector space from AzureAD/Office365 and syncing into the Metaverse. Once you've done your initial Stage/Full Sync you can perform Delta Syncs.

Summary

A “Full Import” on a small AzureAD (~8500 Users) took 2 minutes.
A subsequent “Delta Import” with no changes took 6 seconds.

A similar implementation of the logic, but for Groups gives similar results/performance.
A "Full Import" on a small AzureAD (~9800 Groups) took 5 minutes.
A subsequent “Delta Import” with 7 Adds (new Groups) and 157 Updates took 1 minute.

 

Follow Darren on Twitter @darrenjrobinson


How to embed Power BI Reports into the Microsoft Identity Manager Portal

About seven years ago, at a conference I attended in Los Angeles, I remember a session where a consultant from Oxford Computer Group gave a presentation on integrating Quest Identity Manager (now Dell One Identity Manager) with the Forefront Identity Manager Portal. I've recently had a requirement to do something similar, and Carol pointed me in the direction of her experiments based on inspiration from that same presentation/session.

Well, it is now 2017 and FIM and SharePoint have both moved through a few versions, so doing something similar has changed. Now that I've got it working I thought I'd share how I've done it, and also solicit any improvements. I've done this with SharePoint 2013.

Overview

In this post I’ll detail;

  • Publishing a Power BI Report
  • Creating new Microsoft Identity Manager Navigation Bar Resources
  • Embedding as an IFrame the published Power BI Report in the Microsoft Identity Manager Portal so that it appears like below

Pre-requisites

Obviously to follow this verbatim you are going to need to have a Power BI workspace and a Power BI Report. But you could embed any page you want to test it out.

You'll also need SharePoint Designer 2013 (a free download from Microsoft).

Publish a Power BI Report

In Power BI select a Report you are looking to embed in the MIM Portal. I selected License Plans under Reports from my Power BI Workspace.

From the File menu select Publish to Web.

Select Create embed code.

Copy the link to your report somewhere where you can retrieve it easily later. Don’t worry about the HTML line or the size.

 

SharePoint Designer

Download SharePoint Designer 2013 from the link above and install it with the defaults. I'm using the 64-bit version, and I installed it on my development MIM Portal server. I'm using the 2013 version as my MIM Portal is using SharePoint 2013 Foundation (with SP1).

Once installed start SharePoint Designer and select Open Site.

Enter the URL for your MIM Portal and select Open.

Note: In order for SharePoint Designer to successfully load your MIM Portal Site, the URL you provide above must be in your SharePoint Alternate Access Mappings. If it isn’t you will probably get the error “The server could not complete your request. For more specific information, click the Details button.”

And in your Windows Application Event Log Event ID 3 – WebHost

WebHost failed to process a request.

 Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/42194754

 Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/client.svc' cannot be activated due to an exception during compilation.

 

Select Microsoft Identity Management, then All Files. You should then see a list of all the files in the MIM Portal website.

Locate the aspx folder, right click on it and select New => Folder. Create a new folder under the aspx directory named ‘reports’.

Right click on your new Reports Folder and select New => ASPX. Create an aspx file named reports.aspx.

Repeat to create another aspx file named report.aspx.

 

Click on the reports.aspx file from the main pane and put the following contents in it, overwriting everything else. Select Save.

<%@ Page Language="C#" %>
<html dir="ltr">

<head runat="server">
<meta name="WebPartPageExpansion" content="full" />
<title>Reports</title>
<script type="text/javascript">
    // Redirect straight to the default report page
    window.open("report.aspx", "_self");
</script>
</head>
<body/>
</html>

Click on the report.aspx file and replace the contents with the following and select Save.

Replace <yourreportlink> in https://app.powerbi.com/view?r=<yourreportlink> with your Power BI link.

<%@ Page masterpagefile="~masterurl/custom.master" Title="Reports" language="C#" inherits="Microsoft.SharePoint.WebPartPages.WebPartPage, Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" meta:progid="SharePoint.WebPartPage.Document" UICulture="auto" Culture="auto" meta:webpartpageexpansion="full" %>
<%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> <%@ Register Tagprefix="Utilities" Namespace="Microsoft.SharePoint.Utilities" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> <%@ Import Namespace="Microsoft.SharePoint" %> <%@ Register Tagprefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<asp:Content ContentPlaceHolderID="PlaceHolderTitleBar" Visible="true" runat="server">
</asp:Content>

<asp:Content id="content1" ContentPlaceHolderID="PlaceHolderMain" runat="server">
<iframe width="100%" height="800" src="https://app.powerbi.com/view?r=<yourreportlink>" frameborder="0" allowfullscreen="true"></iframe>
</asp:Content>

MIM Portal Navigation Resources

Now we need to create the MIM Portal Navigation Resources to link to our new files.

In the MIM Portal Select Navigation Bar Resources. Select New.

 

Provide a Display Name, Description and select Next. Ignore Usage Keyword for now. More on that later.

Make the Parent Order 8 to have it at the bottom of the Left Nav bar. Order is 0 as this is going to be our Group header. Select Next.

Provide the path to the reports.aspx file: ~/IdentityManagement/aspx/reports/reports.aspx. Select Next.

Provide the Localised Display name, select Finish and then Submit.

Repeat, this time linking to ~/IdentityManagement/aspx/reports/report.aspx, and name it Licensing Report or whatever makes sense for your report. Also make the Order 1 so it nests under Reports.

Perform an IISReset.

Refresh your MIM Portal page and you should see your new menu items at the bottom of the left Navigation Bar.

Click on Reports and your Licensing Report will auto-magically load, same as if you click on Licensing Report. You can now add as many reports as you need, and change which report is the default by updating the reports.aspx file in SharePoint Designer.

You will probably also want to limit who sees which reports. You can do that through Usage Keywords and Sets etc. By default, as described here, the reports will only be visible to Administrators. Details to get you started on changing who can see what can be found here.

Let me know if you have any improvements.

 

Follow Darren Robinson on Twitter


How to configure a Graphical PowerShell Dev/Admin/Support User Interface for Azure/Office365/Microsoft Identity Manager

During the development of an identity management solution I find myself with multiple PowerShell/RDP sessions connected to multiple environments using different credentials often to obtain trivial data/information. It is easy to trip yourself up as well with remote powershell sessions to differing environments. If only there was a simple UI that could front-end a set of PowerShell modules and make those simple queries quick and painless. Likewise to allow support staff to execute a canned set of queries without providing them elevated permissions.

I figured someone would have already solved this problem, and after some searching with the right keywords I found the powershell-command-executor-ui from bitsofinfo. Looking into it, he had solved a lot of the issues with building a UI front-end for PowerShell with the powershell-command-executor and the stateful-process-command-proxy. That solution provided the framework for what I was thinking. The ability to provide a UI for PowerShell using PowerShell modules, including remote PowerShell, was exactly what I was after. And it was built on NodeJS and AngularJS, so simple enough for some customization.

Introduction

In this blog post I'll detail how I've leveraged the projects listed above for integration with AzureAD/Office 365 and Microsoft Identity Manager.

Initially I had a vision of serving up the UI from an Azure WebApp. NodeJS on Azure WebApp’s is supported, however with all the solution dependencies I just couldn’t get it working.

My fallback was to serve up the UI from a Windows Server 2016 Nano Server. I learnt from my efforts that a number of the PowerShell modules I was looking to provide a UI for have .NET Framework dependencies, and Nano Server does not have full .NET Framework support; Microsoft state that adding it would mean the server would no longer be Nano.

For now I’ve deployed an Azure Windows Server 2016 Server secured by an Azure NSG to only allow my machine to access it. More on security later.

Overview

Simply put, the details in GitHub for the powershell-command-executor provide the architecture and integration. What I will detail is the modifications I've made to utilize the more recent AzureADPreview PowerShell Module over the MSOL PowerShell Module. I also updated the solution's dependencies to the latest versions and hooked it into Microsoft Identity Manager, and made a few changes to allow different credentials to be used for Azure and Microsoft Identity Manager.

Getting Started

I highly recommend you start with your implementation on a local development workstation/development virtual machine. When you have a working version you’re happy with you can then look at other ways of presenting and securing it.

NodeJS

NodeJS is the webserver for this solution. Download NodeJS for your Windows host here. I’m using the 64-bit version, but have also implemented the solution on 32-bit. Install NodeJS on your local development workstation/development virtual machine.

You can accept all the defaults.

Following the installation of NodeJS, download the powershell-command-executor-ui from GitHub. Select Clone or Download, Download ZIP, and save it to your machine.

Right click the download when it has finished and select Extract All. Select Browse and create a folder at the root of C:\ named nodejs. Extract powershell-command-executor-ui.

Locate the c:\nodejs\powershell-command-executor-ui-master\package.json file.

Using an editor such as Notepad++ update the package.json file ……

…… so that it looks like the following. This will utilise the latest versions of the dependencies for the solution.

From an elevated (Administrator) command prompt in the c:\nodejs\powershell-command-executor-ui-master directory run "c:\Program Files\nodejs\npm" install. This will read the package.json file you edited and download the dependencies for the solution.

You can see in the screenshot below NodeJS has downloaded all the items in package.json including the powershell-command-executor and stateful-process-command-proxy.

When you now list the directories under C:\nodejs\powershell-command-executor-ui-master\node_modules you will see those packages and all their dependencies.

We can now test that we have a working PowerShell UI NodeJS website. From an elevated command prompt whilst still in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\node.exe” bin\www

Open a browser on the same host and go to http://localhost:3000. You should see the default UI.

Configuration and Customization

Now it is time to configure and customize the PowerShell UI for our needs.

The files we are going to edit are:

  • C:\nodejs\powershell-command-executor-ui-master\routes\index.js
    • Update Paths to the encrypted credentials files used to connect to Azure, MIM. We’ll create the encrypted credentials files soon.
  • C:\nodejs\powershell-command-executor-ui-master\public\console.html
    • Update for your customizations for CSS etc.
  • C:\nodejs\powershell-command-executor-ui-master\node_modules\powershell-command-executor\O365Utils.js
    • Update for PowerShell Modules to Import
    • Update for Commands to make available in the UI

We also need to get a couple of PowerShell Modules installed on the host so they are available to the site. The two I'm using I've mentioned earlier. With WMF5 installed, using PowerShell we can simply install them as per the commands below.

Install-Module AzureADPreview
Install-Module LithnetRMA

In order to connect to our Microsoft Identity Manager Synchronization Server we are going to need to enable Remote Powershell on our Microsoft Identity Manager Synchronization Server. This post I wrote here details all the setup tasks to make that work. Test that you can connect via RPS to your MIM Sync Server before updating the scripts below.
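A quick sanity check along these lines (hostname illustrative) will confirm RPS connectivity:

# Trivial remote command to validate the Remote PowerShell configuration
$cred = Get-Credential
Invoke-Command -ComputerName "mimsyncdev.mydomain.com" -Credential $cred -ScriptBlock { $env:COMPUTERNAME }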

Likewise for the Microsoft Identity Manager Service Server. Make sure after installing the LithnetRMA Powershell Module you can connect to the MIM Service using something similar to:

# Import LithnetRMA PS Module
import-module lithnetrma

# MIM AD User Admin
$username = "mimadmin@mim.mydomain.com"
# Password 
$password = "Secr3tSq1rr3l!" | convertto-securestring -AsPlainText -Force
# PS Creds
$credentials = New-Object System.Management.Automation.PSCredential $Username,$password

# Connect to the FIM service instance
# Will require an inbound rule for TCP 5725 (or your MIM Service Server Port) in you Resource Group Network Security Group Config
Set-ResourceManagementClient -BaseAddress http://mymimportalserver:5725 -Credentials $credentials

 

\routes\index.js

This file details the encrypted credentials the site uses. You will need to generate the encrypted credentials for your environment. You can do this using the powershell-credentials-encryption-tools. Download that script to your workstation and unzip it. Open the credentialEncryptor.ps1 script using an Administrator PowerShell ISE session.

I've changed the index.js to accept two sets of credentials, because your Azure admin credentials will differ from your MIM administrator credentials (both in name and password). The username for my Azure account looks something like myname@mycompany.com, whereas for MIM it is DomainName\Username.

Provide an account name for your Azure environment and the associated password.

The tool will create the encrypted credential files.

Rename the encrypted.credentials file to whatever makes sense for your environment. I’ve renamed it creds1.encrypted.credentials.

Now we re-run the script to create another set of encrypted credentials. This time for Microsoft Identity Manager. Once created, rename the encrypted.credentials file to something that makes sense in your environment. I’ve renamed the second set to creds2.encrypted.credentials.

We now need to copy the following files to your UI Website C:\nodejs\powershell-command-executor-ui-master directory:

  • creds1.encrypted.credentials
  • creds2.encrypted.credentials
  • decryptUtil.ps1
  • secret.key

Navigate back to routes\index.js and open the file in an editor such as Notepad++.

Update the index.js file with the paths to your credentials files. We also need to add in the additional credentials file.

The changes to the file are the paths to the files we just copied above, along with the addition of var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE for the second set of credentials used for Microsoft Identity Manager.

var PATH_TO_DECRYPT_UTILS_SCRIPT = "C:\\nodejs\\powershell-command-executor-ui-master\\decryptUtil.ps1";
var PATH_TO_ENCRYPTED_CREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds1.encrypted.credentials";
var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds2.encrypted.credentials";
var PATH_TO_SECRET_KEY = "C:\\nodejs\\powershell-command-executor-ui-master\\secret.key";


Also update initCommands to pass through the additional credentials file:


initCommands: o365Utils.getO365PSInitCommands(
 PATH_TO_DECRYPT_UTILS_SCRIPT,
 PATH_TO_ENCRYPTED_CREDENTIALS_FILE,
 PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE,
 PATH_TO_SECRET_KEY,
 10000,30000,3600000),

Here is the full index.js file for reference.

 

public/console.html

The public/console.html file is for formatting and associated UI components. The key things I’ve updated are the Bootstrap and AngularJS versions. Those are contained in the top of the html document. A summary is below.

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.1/angular.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.6.1/angular-resource.min.js"></script>
<script src="javascripts/ui-bootstrap-tpls-2.4.0.min.js"></script>
<script src="javascripts/console.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap-theme.min.css">

You will also need to download the updated Bootstrap UI (ui-bootstrap-tpls-2.4.0.min.js). I’m using v2.4.0 which you can download from here. Copy it to the javascripts directory.

I’ve also updated the table types, buttons, colours, header, logo etc in the appropriate locations (CSS, Tables, Div’s etc). Here is my full file for reference. You’ll need to update for your colours, branding etc.

powershell-command-executor\O365Utils.js

Finally the O365Utils.js file. This contains the commands that will be displayed along with their options, as well as the connection information for your Microsoft Identity Manager environment.

You will need to change:

  • Line 52 for the address of your MIM Sync Server
  • Line 55 for the addresses of your MIM Service Server
  • Line 141 on-wards for what commands and parameters for those commands you want to make available in the UI

Here is an example with a couple of AzureAD commands, a MIM Sync and a MIM Service command.

Show me my PowerShell UI Website

Now that we have everything configured, let's start the site and browse to it. If you haven't stopped the NodeJS site from earlier, go to the command window and press Ctrl+C a couple of times. Run "c:\Program Files\nodejs\node.exe" bin\www again from the C:\nodejs\powershell-command-executor-ui-master directory, unless you have restarted the host and now have NodeJS in your environment path.

In a browser on the same host go to http://localhost:3000 again and you should see the site as it is below.

Branding and styling come from the console.html, menu options from the o365Utils.js, and when you select a command and execute it, data comes from the associated service…

… you can see results. From the screenshot below a Get-AzureADUser command for the associated search string executed in milliseconds.

 

Summary

The powershell-command-executor-ui from bitsofinfo is a very extensible and powerful NodeJS website as a front-end to PowerShell.

With a few tweaks and updates the look and feel can be easily changed along with the addition of any powershell commands that you wish to have a UI for.

As it sits though keep in mind you have a UI with hard-coded credentials that can do whatever commands you expose.

Personally I am running one for my use only, and I have it hosted in Azure in its own Resource Group with an NSG allowing outgoing traffic to Azure and my MIM environment. Incoming traffic is only allowed from my personal management workstation's IP address. I also needed to allow port 3000 into the server on the NSG as well as on the firewall on the host. I did that quickly using the command below.

# Enable the WebPort NodeJS is using on the firewall 
netsh advfirewall firewall add rule name="NodeJS WebPort 3000" dir=in action=allow protocol=TCP localport=3000

Follow Darren on Twitter @darrenjrobinson


Resolving “The Microsoft Identity Manager server database could not be successfully populated” installation error

Here is yet another of those Microsoft Identity Manager installation errors that doesn't give you much information, and when looking for a resolution you can't find an exact match through Dr Google.

Nearing the end of the Microsoft Identity Manager Service and Portal installation you receive the “The Microsoft Identity Manager server database could not be successfully populated” error.

Looking into the installation log (which I'm in the good practice of always generating when doing an installation of the MIM Service/Portal these days, e.g. msiexec /i "e:\Service and Portal\Service and Portal.msi" /l*v c:\temp\install.log) didn't give up much information at all. Fatal Error. Dialog created.

Looking at the server that the installation was being done on, I could see that it was being spanked. This server is for a customer's development environment, hosted in Azure but also done rather frugally (my Virtual Machine was running SQL, MIM Sync, all the dependencies for the MIM Portal and then the MIM Service/Portal itself). FWIW the initially provisioned VM was an Azure DS1v2 server. It seems a character may have got lost in the VM request, where an Azure DS11v2 server would have been more appropriate.

I re-sized the Azure VM and actually chose to go with an Azure DS3v2 VM size. I kicked off the Microsoft Identity Manager Service & Portal installation again and ….

….. SUCCESS.

Hope this helps someone else who may find themselves in a similar position.

Follow Darren on Twitter @darrenjrobinson


Microsoft Identity Manager installation error “Internal Error 2337. 0, Microsoft.MetadirectoryServices.host.dll”

Today I was doing a fresh installation of Microsoft Identity Manager 2016 with Service Pack 1 into a new development environment. The exact binary is "en_microsoft_identity_manager_2016_with_service_pack_1_x64_dvd_9270854".

Not too far into the installation of the Microsoft Identity Manager Synchronization Server I got the “Internal Error 2337. 0, Microsoft.MetadirectoryServices.host.dll” error as shown below.

Doing a few searches didn’t throw me any bones. I could see that the installation had added the MIM Sync Server Service Account to the Logins on the SQL Server.

I then recalled that there was an updated version that was released just before Xmas (14 Dec 2016). The exact binary is “en_microsoft_identity_manager_2016_with_service_pack_1_x64_dvd_9656597” and you can get the updated Microsoft Identity Manager 2016 with SP1 media here.

Re-running the installation and ….

….. SUCCESS.

Now I'm not going to spend any time trying to figure out what is bad with the base MIM 2016 SP1 media; I'm just going to pretend it never existed and use the latest. Onwards and upwards.

Follow Darren on Twitter @darrenjrobinson