How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I need visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

In order to make this efficient I need to use the Differential Query function of the AzureAD Graph API. I’ve detailed that in this post: How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups, I needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA). I’ve previously detailed that here: How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above. They detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell modules: the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.

Configuration

Now that you’re up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That’s also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

Two object classes are defined on the MA because we need the users that are members of the groups on the same MA, as membership is a reference attribute. When you bring the Groups through into the MetaVerse, and assuming you have an Azure AD Users MA using the same anchor attribute, you’ll get the reference link for the members and their full object details.
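The original Schema.ps1 was embedded here. If it doesn’t render for you, a Granfeldt Schema.ps1 covering both object classes looks roughly like the sketch below; the attribute set is illustrative rather than the complete one from my MA, and the "Type[]" multivalue notation should be verified against the PSMA documentation.

# Group object class, with member as a multivalued reference attribute
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-ID|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "Group"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "Group Display Name"
$obj | Add-Member -Type NoteProperty -Name "member|Reference[]" -Value "member-objectId"
$obj

# User object class, so that member references resolve on the same MA
$user = New-Object -Type PSCustomObject
$user | Add-Member -Type NoteProperty -Name "Anchor-ID|String" -Value "2"
$user | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "User"
$user | Add-Member -Type NoteProperty -Name "userPrincipalName|String" -Value "user@tenant.onmicrosoft.com"
$user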

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups, and import the active ones along with their group membership.
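The full script was embedded here. The part specific to groups is emitting each group to the Sync Engine as a hashtable, with the membership passed as an array so it surfaces as the multivalued reference attribute. A minimal sketch (property names are illustrative; $group and $groupMembers come from the Graph API calls):

# Emit a group object to the Sync Engine
$obj = @{}
$obj.Add("ID", $group.objectId)
$obj.Add("objectClass", "Group")
$obj.Add("displayName", $group.displayName)

# Membership: pass the member objectIds as an array for the reference attribute
$obj.Add("member", @($groupMembers | Select-Object -ExpandProperty objectId))
$obj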

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA. Paged imports show progress as the import runs, and differential sync then allows for quick subsequent delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

Introduction to MIM Advanced Workflows with MIMWAL

Introduction

Microsoft late last year introduced the ‘MIMWAL’, or to say it in full: (inhales) ‘Microsoft Identity Manager Workflow Activity Library’ – an open source project that extends the default workflows & functions that come with MIM.

Personally I’ve been using a version of MIMWAL for a number of years, as have my colleagues, working on MIM projects with Microsoft Consulting. This is the first time however that it’s been available publicly to all MIM customers, so I thought it’d be a good idea to introduce how to source it, install it and work with it.

Microsoft (I believe for legal reasons) don’t host a compiled version of MIMWAL; instead they host the source code on GitHub for customers to source, compile and potentially extend. The front page of Microsoft’s MIMWAL GitHub library can be found here: http://microsoft.github.io/MIMWAL/

Compile and Deploy

Now, the official deployment page is fine (github) but I personally found Matthew’s blog to be an excellent process to use (ithinkthereforeidam.com). Ordinarily, when it comes to installing complex software, I usually combine multiple public and private sources and write my own process, but this blog is so well done I couldn’t fault it.

…however, some minor notes and comments about the overall process:

  • I found that I needed to copy the gacutil.exe and sn.exe utilities you extract from the old FIM patch into the ‘Solution Output’ folder. The process mentions they need to be in ‘src\Scripts’ (Step 6), but they need to be in the ‘Solution Output’ folder as well, which you can see in the last screenshot of that Explorer folder in Step 8 (of the ‘Configure Build/Developer Computer’ process).
  • I found the slowest tasks in the entire process were sourcing and installing Visual Studio, and extracting the required FIM files from the patch download. I’d suggest keeping a saved Windows Server VM somewhere once you’ve completed these tasks so you don’t have to repeat them in case you want to compile the latest version of MIMWAL in the future (preferably with MIM installed so you can perform the verification as well).
  • Be sure to download the ‘AMD 64’ version of the FIM patch file if you’re installing MIMWAL onto a Windows Server 64-bit O/S (which pretty much everyone is). I had forgotten that old 64-bit patches used to be titled after the AMD 64-bit chipset, and I instead wasted time looking for the newer ‘x64’ title of the patch, which doesn’t exist for this FIM patch.

 

‘Bread and Butter’ MIMWAL Workflows

I’ll go through two examples of MIMWAL based Action Workflows here that I use for almost every FIM/MIM implementation.

These action workflows have been part of previous versions of the Workflow Activity Library, and you can find them in the MIMWAL Action Workflow templates:

I’ll now run through real world examples in using both Workflow templates.

 

Update Resource Workflow

The Update Resource MIMWAL action workflow I use all the time to link two different objects together – many times linking a user object with a new and custom ‘location’ object.

For new users, I execute this MIMWAL workflow when a user first ‘Transitions In’ to a Set whose dynamic membership is “User has Location Code”.

For users changing location, I also execute this workflow using a Request-based MPR that fires when the Synchronization Engine changes the “Location Code” for a user.

This workflow looks like the following:

location1

The XPath Filter is: /Location[LocationCode = '[//Target/LocationCode]']

When you target the Workflow at the User object, it will use the Location Code stored in the User object to find the equivalent Location object and store it in a temporary ‘Query’ object (referenced by calling [//Queries]):

Location2.jpg

The full value expression used above, for example, sending the value of the ‘City’ attribute stored in the Location object into the User object is:

IIF(IsPresent([//Queries/Location/City]),[//Queries/Location/City],Null())

This custom expression determines if there is a value stored in the ‘[//Queries]’ object (ie. the copy of the Location object found earlier in the query). If there is a value, it sends it to the City attribute of the user object, ie. the ‘target’ of the Workflow. If there is no value, it sends a ‘null’ value to wipe out the existing value (in case a user changes location but the new location doesn’t have a value for one of the attributes).

It is also a good idea (not seen in this example) to send the Location’s Location Code to the User object and store it in a ‘Reference’ attribute (‘LocationReference’).  That way in future, you can directly access the Location object attributes via the User object using an example XPath:  [//Person/LocationReference/City].

 

Generate Unique Value from AD (e.g. for sAMAccountName, CN, mailnickname)

I’ve previously worked in complex Active Directory and Exchange environments, where there can often be a lot of conflict when it comes to the following attributes:

  • sAMAccountName (used progressively less and less these days)
  • User Principal Name (used progressively more and more these days, although communicated to the end user as ’email address’)
  • CN (or ‘common name’ value, which forms part of the LDAP Distinguished Name (DN) value). Side note: this is the attribute most commonly mistaken for the ‘Display Name’ by admins viewing it in AD Users & Computers.
  • Mailnickname (used by some Exchange environments to generate a primary SMTP address or ‘mail’ attribute values)

All AD environments require a unique sAMAccountName for any AD account to be created (otherwise you’ll get a MIM export error into AD if there’s already an account with it). An object also requires a CN value that is unique within its OU, otherwise the object cannot be created – a conflict that becomes far more likely if you export all user accounts for a large organization to the same OU.

UPNs are generally unique if you copy a person’s email address, but sometimes not – sometimes it’s best to combine a unique mailnickname, append a suffix and send that value to the UPN value.  Again, it depends on the structure and naming of your AD, and the applications that integrate with it (Exchange, Office 365 etc.).

Note: the default MIMWAL Generate Unique Value template assumes the FIM Service account has the permissions required to perform LDAP lookups against the LDAP path you specify.  There are ways to enhance the MIMWAL to add in an authentication username/password field in case there is an ‘air gap’ between the FIM server’s joined domain and the target AD you’re querying (a future blog post).

In this example of using the ‘Generate Unique Value’ MIMWAL workflow, I tend to execute it as part of a multi-step workflow, such as the one below (Step 2 of 3):

sam1

I use the workflow to query LDAP for existing accounts, and then send the generated unique value to the [//WorkflowData/AccountName] attribute.

The LDAP filter used in this example checks all existing sAMAccountNames across the entire domain for an existing account: (&(objectClass=user)(objectCategory=person)(sAMAccountName=[//Value]))

The workflow will also query the FIM Service database for existing user accounts (that may not have been provisioned yet to AD) using the XPath filter: /Person[AccountName = '[//Value]']

The Uniqueness Key Seed in this example is ‘2’, which essentially means that if you cannot resolve a conflict using other attribute values (such as a user’s middle name, or using more letters of a first or last name) then you can use this ‘seed’ number to break the conflict as a last resort. This number increments by 1 for each conflict, so if there’s a ‘michael.pearn’ and a ‘michael.pearn2’ for example, the next one to test will be ‘michael.pearn3’, etc.

sam2

The second half of the workflow shows the rules used to generate sAMAccountName values, and the order in which they are tried to break a conflict. In this example (a very simple one), I use an employee’s ‘ID number’ to generate an AD account. If there is already an account for that ID number, then this workflow will generate a new account with the string ‘-2’ appended to the end of it:

Value Expression 1 (highest priority): NormalizeString([//Target/EmployeeID])

Value Expression 2 (lowest priority): NormalizeString([//Target/EmployeeID] + "-" + [//UniquenessKey])

NOTE: The function ‘NormalizeString’ is a new MIMWAL function that is also used to strip out any diacritic characters. More information can be found here: https://github.com/Microsoft/MIMWAL/wiki/NormalizeString-Function

sam3

Microsoft have posted other examples of Value Expressions to use that you could follow here: https://github.com/Microsoft/MIMWAL/wiki/Generate-Unique-Value-Activity

My preference is to use as many value expressions as you can to break the conflict before having to use the uniqueness key. Note: the sAMAccountName attribute has a 20 character limit, so often the ‘Left’ function is used to trim the number of characters taken from a person’s name, e.g. ‘left 8 characters’ of a person’s first name combined with ‘left 11 characters’ of a person’s last name (and not forgetting to save a character for the seed value deadlock breaker!).
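As a purely hypothetical illustration of that pattern (attribute names will vary with your schema), a pair of value expressions might look like:

Value Expression 1: NormalizeString(LowerCase(Left([//Target/FirstName], 8) + "." + Left([//Target/LastName], 11)))

Value Expression 2: NormalizeString(LowerCase(Left([//Target/FirstName], 8) + "." + Left([//Target/LastName], 10) + [//UniquenessKey]))

The second expression takes one character less of the last name so the uniqueness key still fits within the 20 character limit.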

Once the Workflow step is executed, I then send the value (via [//WorkflowData/AccountName]) to the outbound ‘AccountName –> sAMAccountName’ AD Sync Rule flow:

sam4

 

More ideas for using MIMWAL

In my research on MIMWAL, I’ve found some very useful links to sample complex workflow chains that use the MIMWAL ‘building block’ action workflows and combine them to do complex tasks.

Some of those ideas can be found here, from Microsoft’s own people on MSDN: https://blogs.msdn.microsoft.com/connector_space/2016/01/15/the-mimwal-custom-workflow-activity-library/

These include:

  • Create Employee IDs
  • Create Home Directories
  • Create Admin Accounts

I particularly like the idea of using the ‘Create Employee ID’ example workflow, something that I’ve only previously done outside of FIM/MIM, for example with a SQL Trigger that updates a SQL database with a unique number.


Setting up your SP 2013 Web App for MIM SP1 & Kerberos SSO

I confess: getting a Microsoft product based website working with Kerberos and Single Sign On (i.e. without authentication prompts from a domain joined workstation or server) feels somewhat of a ‘black art’ for me.

I’m generally ok with registering SPNs, SSLs, working with load balancing IPs etc, but when it comes to the final Internet Explorer test, and it fails and I see an NTLM style auth. prompt, it’s enough to send me into a deep rage (or depression or both).

So, recently, I’ve had a chance to review the latest guidance on getting the Microsoft Identity Manager (MIM) SP1 Portal set up on Windows Server 2012 R2 and SharePoint Foundation 2013 SP1 for both of the following customer requirements:

  • SSL (port 443)
  • Single Sign On from domain joined workstations / servers

The official MIM guidance here is a good place to start if you’re building out a lab (https://docs.microsoft.com/en-us/microsoft-identity-manager/deploy-use/prepare-server-sharepoint). There’s a major flaw in this guidance for SSL & Kerberos SSO however – it’ll work, but you’ll still get your NTLM style auth. prompt if you configure the SharePoint Web Application initially under port 82 (following this guidance strictly, as I did) and then, in the words of the article: “Initially, SSL will not be configured. Be sure to configure SSL or equivalent before enabling access to this portal.”

Unfortunately, this article doesn’t elaborate on how to configure Kerberos and SSL post FIM portal installation, and to then get SSO working across it.

To further my understanding of the root cause, I built out two MIM servers in the same AD:

  • MIM server #1: FIM portal installed onto the Web Application on port 82, with SSL configured post installation via SSL bindings in IIS Manager and a new ‘Intranet’ Alternate Access Mapping configured in SharePoint Central Administration
  • MIM server #2: FIM portal installed onto the Web Application built on port 443 (no Alternate Access Mappings specified) and SSL bindings configured in IIS Manager.

After completion, I found MIM Server #1 was working with Kerberos and SSO under port 82, but each time I accessed it using the SSL URL I configured post installation, I would get the NTLM style auth. prompt regardless of workstation or server used to access it.

With MIM server #2, I built the web application purely into port 443 using this command:

New-SPWebApplication -Name "MIM Portal" -ApplicationPool "MIMAppPool" -ApplicationPoolAccount $dbManagedAccount -AuthenticationMethod "Kerberos" -SecureSocketsLayer:$true -Port 443 -URL https://<snip>.mimportal.com.au

Untitled.jpg

The key switches are:

  • -SecureSocketsLayer:$true
  • -Port 443
  • -URL (with URL starting with https://)
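One thing the command above assumes is that $dbManagedAccount already holds a SharePoint managed account object for the application pool. That would have been populated beforehand with something like the following (the account name is hypothetical):

$dbManagedAccount = Get-SPManagedAccount -Identity "DOMAIN\svc-mimpool"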

After running the New-SPWebApplication command, I then configured SSL in IIS Manager with a binding similar to this:

ssl1

A crucial check is to test the MIM Portal FQDN (without the /identitymanagement path) you’re intending to use, after you configure the SharePoint Web Application and bind the SSL certificate in IIS Manager but BEFORE you install the FIM Service and Portal.

So in summary: verify the base SharePoint site URL is working with SSO first, then install the FIM Portal and test the /identitymanagement URL. The first test should appear as a generic ‘Team Site’ in your browser, without an authentication prompt, from a domain joined workstation or server if it’s working correctly.
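If that first test still throws the NTLM style auth. prompt, it’s worth confirming the HTTP SPN for the portal FQDN is registered against the SharePoint application pool account. A quick sanity check from an elevated command prompt (account name and FQDN are placeholders):

setspn -L DOMAIN\svc-mimpool
setspn -S HTTP/mimportal.yourdomain.com DOMAIN\svc-mimpool

The first command lists the SPNs currently registered to the account; the second registers the missing SPN, with -S checking for duplicates before adding it.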

The other item to take note of is that I’ve seen other guidance saying this won’t work from a browser locally on the MIM server – something that I haven’t seen in any of my tests. All test results I’ve seen are consistent whether using a browser from a domain joined workstation, a remote domain joined server, or the domain joined MIM server itself. There’s no difference in results in terms of SSO in my opinion. Be sure to add the MIM portal to the ‘Intranet’ sites zone as well for your testing.

Also, I never had to configure ‘Require Kerberos = True’ in the web.config, which used to be part of the guidance for FIM and previous versions of SharePoint. This might work as well, but it wouldn’t explain the port 82/443 differences for MIM Server #1 (ie. why would that work for 443 and not 82?).

I’ve seen other MIM expert peers configure their MIM sites using custom PowerShell installations of SharePoint Foundation to configure the MIM portal under port 80 (overriding the default SharePoint Foundation 2013 behaviour of taking over port 80 during its wizard based installation). I’m sure that might be a valid strategy as well, and SSO may then work with SSL with further configuration, but I personally can’t attest to that working.

Good luck!


ADFS v 2.0 Migration to ADFS 2016

Introduction

Some organisations may still have ADFS v2 or ADFS v2.1 running in their environment, and haven’t yet moved to ADFS v3. In this blog, we will discuss how you can move away from ADFS v2 or ADFS v2.1 and migrate or upgrade to ADFS 2016.

In previous posts, Part 1 and Part 2 we have covered the migration of ADFS v3.0 to ADFS 2016. I have received some messages on LinkedIn to cover the migration process from ADFS v2 to ADFS 2016 as there currently isn’t much information about this.

As you may have noticed from the previous posts, upgrading to ADFS 2016 couldn’t be any easier. Upgrading from ADFS v2 or ADFS v2.1 is just as simple, albeit a different process than upgrading from ADFS v3 to ADFS 2016.

Migration Process

Before we begin however, this post assumes you already have a running ADFS v2 or ADFS v2.1 environment. This blog post will not go into a step-by-step installation of ADFSv2/ADFSv2.1, and will only cover the migration or upgrade to ADFS 2016.

This blog post assumes you already have a running Windows Server 2016 with the ADFS 2016 role installed, if not, please follow the procedures outlined in part 2.

Prerequisites

Prior to commencing the upgrade, there are a few tasks you need to perform.

  1. Take a note of your ADFS Server Display Name
  2. Take a note of your Federation Service Name
  3. Take a note of your Federation Service Identifier
  4. Export the service communication certificate (if you don’t already have a copy of it)
  5. Install/Import the service communication certificate onto the ADFS 2016 server
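For items 1 to 3, the values can be pulled from the existing ADFS v2/v2.1 server with the ADFS snap-in. For items 4 and 5, the PFX cmdlets in the sketch below exist on Windows Server 2012 and later (on 2008 R2 use the Certificates MMC to export instead); the certificate subject and file paths are placeholders:

Add-PSSnapin Microsoft.Adfs.PowerShell
Get-AdfsProperties | Select-Object DisplayName, HostName, Identifier

# Export the service communication certificate (adjust the subject match to your cert)
$cert = Get-ChildItem Cert:\LocalMachine\My | Where-Object { $_.Subject -like "*adfs.yourdomain.com*" }
Export-PfxCertificate -Cert $cert -FilePath C:\temp\adfs-comm.pfx -Password (Read-Host -AsSecureString "PFX password")

# Then import it on the ADFS 2016 server
Import-PfxCertificate -FilePath C:\temp\adfs-comm.pfx -CertStoreLocation Cert:\LocalMachine\My -Password (Read-Host -AsSecureString "PFX password")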

Notes:

  • There is no need to make your ADFS 2016 server primary, since this should be a new installation – you can’t add a Windows Server 2016 server to an ADFS v2/ADFS v2.1 farm anyway.
  • There is no need to raise the farm behavior level, since this server is not a member of the existing farm, unlike when migrating from ADFS v3 to ADFS 2016.
  • However, you will still need to upgrade the schema as outlined in part 2.

Before you begin, you will need to download the PowerShell scripts found here.

Those scripts can also be found in the support\adfs folder of the Windows Server 2016 installation media. Those scripts are provided by Microsoft, not Kloud.

The two main functions are the export and import of the federation configuration.

Let’s begin

Firstly, we will need to export the federation configuration by running the “export-federationconfiguration.ps1”.

Here are the current relying party trusts I have in ADFS v2.1.

The “Claims Provider Trust” is Active Directory; this too will be exported and imported.

1

1- Navigate to the folder you have just downloaded:

Then, type

.\export-federationconfiguration.ps1 -path "c:\users\username\desktop\exported adfs configuration"


3

Once successful, you will see the same results as in the picture above.

Open your folder, and you should see the extracted configuration in .xml files.

4

2- Head over to your ADFS 2016 Server and copy across both the folder to which you exported your federation configuration and the downloaded folder containing the scripts.

Then open PowerShell and run:

.\import-federationconfiguration.ps1 -path "c:\users\username\desktop\exported adfs configuration"

 

6

When successful, you will see a message similar to the one above.

And now when you open the ADFS management console you should see the same relying party trusts as you had in ADFS v2/ADFS v2.1.

7

Basically, by now you have completed the move from ADFSv2/ADFSv2.1 to ADFS 2016.

Notice how the token-signing and token-decrypting certificates are the same. The screenshots below show only the token-signing certificates, for the purpose of this blog.

ADFS v2.1:

8

ADFS 2016:

9

You can also check the certificates through PowerShell:

Get-AdfsCertificate

Last thing: make sure that the service account that was running Active Directory Federation Services in ADFS v2/ADFS v2.1 is the same one running in ADFS 2016.

Notice the message in the exported results:

Ensure that the destination farm has the farm name ‘adfs.iknowtech.com.au’ and uses service account ‘IKNOWTECH\svc-adfs’.

If this is not set up, then head over to your services and select the account that you were originally using to run the ADFS service.

In addition, make sure that the service account has read-only access to the certificate private key.

10

Conclusion

This is a very straightforward process; all you need to be sure of is having the right components in place – service certificate installed, service account set up, etc.

After you have completed the steps, follow the steps here to configure your Web Application Proxy. Although that post covers a migration, it also helps in configuring a new deployment.

I hope you’ve found this informative. Should you have any question or feedback please leave a comment below.


Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh shit moments where I thought I’d lost a couple of weeks of work. A couple of weeks of development across multiple Management Agents, MV Schema changes etc. Luckily for me I was just connected to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements
      • MAExports
      • ServerExport
      • MAExtensions
      • PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to;
      • Enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports Folder
      • Export the FIM/MIM Sync Server Configuration to the ServerExport Folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions Folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service Configuration to the PortalExport Folder

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App timezone to my timezone so that the nightly backup happens at the correct time relative to my timezone. I got my timezone index from here. I set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server change line 59 for the hostname. You’ll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too. Details on how to achieve that are here.
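If the embedded script doesn’t render for you, the body of what runs in the remote session is approximately the sketch below. Treat the Lithnet cmdlet names and parameters as assumptions to verify against the Lithnet MIIS Automation documentation; the paths and MIM Service URI are the defaults described above:

Import-Module LithnetMIISAutomation

# Timestamped backup folder under C:\backup (as per the pre-requisites)
$dateTime = Get-Date -Format "dd-MM-yyyy-hh-mm"
$backupRoot = "C:\backup\$dateTime"
"MAExports","ServerExport","MAExtensions","PortalExport" | ForEach-Object {
    New-Item -ItemType Directory -Path "$backupRoot\$_" -Force | Out-Null
}

# Export each Management Agent, then the Sync Server configuration
# (Lithnet MIIS Automation cmdlets - verify the names against the module docs)
foreach ($ma in Get-ManagementAgent) {
    Export-ManagementAgent -Name $ma.Name -File "$backupRoot\MAExports\$($ma.Name).xml"
}
Export-MIMSyncConfiguration -Path "$backupRoot\ServerExport"

# Copy the Extensions folder (where my PowerShell MA scripts live)
Copy-Item "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions" "$backupRoot\MAExtensions" -Recurse

# Export the MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation
Export-FIMConfig -Uri "http://localhost:5725" -PolicyConfig -PortalConfig -SchemaConfig |
    ConvertFrom-FIMResource -File "$backupRoot\PortalExport\MIMServiceExport.xml"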

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig is what takes the majority of the time. That said, it’s less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.

 

Summary

The script itself can be run standalone and you could implement it as a Scheduled Task on your FIM/MIM Server. However I’m using Azure Functions for a number of things and having something that is easily portable and repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson


How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. The majority though have not been of huge scale (<50k objects) and the import of the managed entities into the Connector Space/Metaverse runs through pretty quickly.

However this week I’ve been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing Group Memberships as well, the Import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the Import is up to (other than debug logging). And stopping the MA requires a restart of the Sync Engine Server.

I’ve wanted to mess with paging the Imports for some time, but it hadn’t been a necessity. Now it is, so I looked at working out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation) I therefore worked it out for myself. I detail how to implement Paged Imports in this blogpost.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements in this post or setting up the MA at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The 2nd is in your Run Profile. By default the Page Size will be 100. For my “let’s figure this out” process I dropped it to 50 on one Run Profile and 10 on another.

 

Import Script

With Paged Imports setup on the MA the rest of the logic goes into your Import Script. In your param section at the start of the script $usepagedimport and $pagesize are the variables that reflect the items from the two enablement components you did above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I’m not even checking if it is set and am doing Paged Imports anyway. For completeness, in a production example you should check to see what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,
    [bool] $usepagedimport,
    $pagesize
 )

 

An important consideration to keep in mind is that the Import.ps1 will be called multiple times (ie. #of_times = #ofObjects / pagesize).

So anything that you would normally expect to be processed only once per import run in any other MA, you need to explicitly limit to running only once here.

Essentially the way I’ve approached it is: retrieve all the objects that will be processed and put them in a global variable. If the variable does not have any values/data then it is the first run, so go and get the source data. If the global variable has values/data in it then we must be on a subsequent loop, so there’s no need to go and process that part again – just page through the import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full import.ps1 script below there are more lines that could be added into this section so they don’t get processed each time. In a production implementation I would.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile so we know when we’ve finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
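Distilled to its essentials, the paging logic looks like the sketch below ($global:tenantObjects and $global:nextIndex are illustrative names for the global variables set up in the one-time block):

# Process up to $pagesize objects, resuming from where the last invocation stopped
$objectsProcessed = 0
foreach ($group in $global:tenantObjects[$global:nextIndex..($global:tenantObjects.Count - 1)]) {
    # Build the hashtable the Sync Engine expects and emit it to the pipeline
    $obj = @{}
    $obj.Add("ID", $group.objectId)
    $obj.Add("objectClass", "Group")
    $obj.Add("displayName", $group.displayName)
    $obj

    $global:nextIndex++
    $objectsProcessed++

    if ($objectsProcessed -ge $pagesize) { break }
}

# Tell the MA whether to invoke the script again for another page
$global:MoreToImport = ($global:nextIndex -lt $global:tenantObjects.Count)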

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29, your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson


WAP (2012 R2) Migration to WAP (2016)

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In Part 3 we discussed the integration of Azure MFA with ADFS 2016. In this post (technically part 4) we will cover the migration, or better yet the upgrade, of WAP 2012 R2 to WAP 2016.

Again, this blog assumes you have already installed the Web Application Proxy feature while adding the Remote Access role, and have prepared the WAP server to be able to establish a trust with the Federation Service.

In addition, a friendly reminder once again that the domain name and federation service name have changed from swayit.com.au to iknowtech.com.au. The certificate of swayit.com.au expired before completing the lab, hence the change.

Before we begin however, the current WAP servers (WAP01, WAP02) are the only connected servers that are part of the cluster:

1

To install the WebApplicationProxy, run the following cmdlet in PowerShell:

Install-WindowsFeature Web-Application-Proxy -IncludeManagementTools

Once installed, follow these steps:

Step 1: Specify the federation service name, and provide the local Administrator account for your ADFS server.

2

Step 2: Select your certificate

3

Step 3: Confirm the Configuration

4

Do this for both the WAP servers you’re adding to the cluster.

Alternatively, you could do so via PowerShell:


$credential = Get-Credential
Install-WebApplicationProxy -FederationServiceName "sts.swayit.com.au" -FederationServiceTrustCredential $credential -CertificateThumbprint "071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" 

Once complete, the WAP servers will be added to the cluster.

Run the following cmdlet to get the WebApplication Configuration:

Get-WebApplicationProxyConfiguration

You will notice that we now have all four WAP servers in ConnectedServersName, all part of the cluster.

You will also notice that the ConfigurationVersion is Windows Server 2012 R2, which we will need to upgrade to Windows Server 2016.

5

Head back to one of the previous WAP servers running Windows Server 2012 R2, and run the following cmdlet to remove the old servers from the cluster, keeping only the WAP 2016 servers:

 Set-WebApplicationProxyConfiguration -ConnectedServersName WAP03, WAP04 

Once complete, check the WebApplicationProxyConfiguration by running the Get-WebApplicationProxyConfiguration cmdlet.

Notice the change in the ConnectedServersName (this information is obtained from the WAP 2012 R2 server).

6

If you run Get-WebApplicationProxyConfiguration from WAP 2016, you will get more detailed information.

6-1

The last remaining step before publishing a Web Application (in my case) was to upgrade the ConfigurationVersion, as it was still in Windows Server 2012 R2 mode.

If you have already published Web Applications, you can do this at any time.

Set-WebApplicationProxyConfiguration -UpgradeConfigurationVersion

When successful, check again your WebApplicationProxyConfiguration by running

Get-WebApplicationProxyConfiguration

Notice the ConfigurationVersion:

8

You have now completed the upgrade and migration of your Web Application Proxy servers.

If this is a new deployment, of course you don’t need to go through this whole process. Your WAP 2016 servers would already be on a Windows Server 2016 ConfigurationVersion.

Assuming this was a new deployment or if you simply need to publish a new Web Application, continue reading and follow the steps below.

Publishing a Web Application

There’s nothing new or different about publishing a Web Application in WAP 2016. It’s pretty similar to WAP 2012 R2. The only addition from Microsoft is a redirection from HTTP to HTTPS.

Step 1: Publish a New Web Application

9

Step 2: Once complete, it will appear in the published Web Applications. Also notice that we only have WAP03 and WAP04 as the WAP servers in the cluster, as we previously removed WAP01 and WAP02 running Windows Server 2012 R2.

7

There you have it, you have now upgraded your WAP servers that were previously running WAP 2012 R2 to WAP 2016.

By now, you have completed migrating from ADFS v3 to ADFS 2016, integrated Azure MFA with ADFS 2016, and upgraded WAP 2012 R2 to WAP 2016. No additional configuration is required; we have reached the end of our series, and this concludes the migration and upgrade of your SSO environment.

I hope you’ve enjoyed those posts and found them helpful. For any feedback or questions, please leave a comment below.

How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries

Introduction

In August 2016 I wrote this post on how to use PowerShell to leverage the Microsoft GraphAPI and use Differential Queries. The premise behind that post was that I required a Microsoft Identity Manager Management Agent to synchronize identity information from AzureAD into Microsoft Identity Manager. However, the environment it was intended for has a large AzureAD implementation, and performing a Full Sync every time is just too time consuming. Even more so with this limitation that still exists today in MIM 2016 with SP1.

In this blog post I’ll detail how to implement a PowerShell Management Agent for FIM/MIM to use the MS GraphAPI to synchronize objects into FIM/MIM, supporting Delta and Full Synchronization run profiles. I’m also using my favourite PowerShell Management Agent, the Granfeldt PowerShell Management Agent.

Pre-requisites

I’m using the ADAL helper library from the AzureADPreview PowerShell Module. Install that module on your MIM Sync Server via PowerShell (WMF5 or later) using the PowerShell command;

Install-Module AzureADPreview
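For context, the way the ADAL helper typically gets used looks something like the sketch below. It assumes the ADAL 3.x assembly that ships with the AzureADPreview module; the tenant name is a placeholder, and the client id is the well-known Azure AD PowerShell application id:

# Load the ADAL assembly shipped with AzureADPreview
$module = Get-Module -ListAvailable AzureADPreview | Select-Object -First 1
Add-Type -Path (Join-Path $module.ModuleBase "Microsoft.IdentityModel.Clients.ActiveDirectory.dll")

$authority = "https://login.microsoftonline.com/yourtenant.onmicrosoft.com"
$resource = "https://graph.windows.net"
$clientId = "1b730954-1685-4b74-9bfd-dac224a7b894"  # well-known Azure AD PowerShell client id

# Acquire a token with the MA's username/password (ADAL 3.x username/password flow)
$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext($authority)
$userCred = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserPasswordCredential($Username, $Password)
$authResult = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContextIntegratedAuthExtensions]::AcquireTokenAsync($authContext, $resource, $clientId, $userCred).Result
$authHeader = "Bearer " + $authResult.AccessToken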

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it, what are you waiting for? Go get it from here. Søren’s documentation is pretty good, but it does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it; it just needs to be present.
  • The credentials you give the MA are the credentials for the account that has permissions to the AzureAD/Office365 Tenant. Just a normal account is enough to enumerate it, but you’ll need additional permissions if you intend to write back to AzureAD.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD

Schema.ps1

My Schema is based around enumerating and managing users from AzureAD. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. Use the Schema info below for a base set of attributes that will get you started. You can add more as required. I’ve prefixed most of them with AAD for my implementation.

If you want to manage Groups or Contacts or a combination of object types, you will need to update the Schema.ps1 script accordingly.

Import.ps1

The logic that the Import.ps1 implements is the same as detailed here in my post using Differential Queries. Essentially: perform a full import and create a file with the cookie/watermark, then allow Delta Sync run profiles to be performed by requesting the GraphAPI to return only changes since that cookie/watermark.
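Stripped of the MA plumbing, the Full versus Delta decision looks something like this sketch (the tenant name and file path are placeholders, $authHeader comes from the ADAL token acquisition above, and paging via aad.nextLink is omitted for brevity):

# Delta run with a saved watermark: replay the stored deltaLink URL
$cookieFile = "C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD\deltaLink.txt"
if (($OperationType -eq "Delta") -and (Test-Path $cookieFile)) {
    $url = Get-Content $cookieFile
} else {
    # Full run: start a fresh differential query enumeration
    $url = "https://graph.windows.net/yourtenant.onmicrosoft.com/users?api-version=1.6&deltaLink="
}
$results = Invoke-RestMethod -Uri $url -Headers @{ Authorization = $authHeader }
# ... emit $results.value to the Sync Engine, then persist the new watermark
$results.'aad.deltaLink' | Set-Content $cookieFile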

You’ll need to update the script for your AzureAD Tenant name on Line 28. Also the path to where the cookie file will go and the debug file if your path is different to mine. Lines 11, 46 and 47.

Importing of the attributes is based around the names in the Schema.ps1 scripts. Any changes you made there will need to be reflected in the import.ps1.

Password Script (password.ps1)

Empty, as password management is not implemented in this example.

Export.ps1

Empty as not implemented in this example. If you are going to write information back to AzureAD you’ll need to put the appropriate logic into this script.

Management Agent Configuration

With the Granfeldt PowerShell Management Agent installed on your FIM/MIM Synchronisation Server, in the Synchronisation Server Manager select Create Management Agent and choose “PowerShell” from the list of Management Agents to create.

As this example is for Users, I’ve named my MA accordingly.

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to AzureAD/Office365 and import the user data.

Paths to the Import, Export and Password scripts. Note: the Export and Password PS1 scripts files exist but are empty.

Object Type as configured in the Schema.ps1 file.

Attributes as configured in the Schema.ps1 file.

Anchor as per the Schema.ps1 file.

The rest of the MA configuration is up to your implementation. What you are going to join on and what attributes flow into the MV will vary based on your needs and solution. At a minimum you’d probably be looking to do a join on immutableID (after some manipulation) or UPN and flow in attributes such as AADAccountEnabled etc.

Completing the Configuration

To finalise the MA you’ll need to do the usual tasks of creating run profiles, staging the connector space from AzureAD/Office365 and syncing into the Metaverse. Once you’ve done your initial Stage/Full Sync you can perform Delta Syncs.

Summary

A “Full Import” on a small AzureAD (~8500 Users) took 2 minutes.
A subsequent “Delta Import” with no changes took 6 seconds.

A similar implementation of the logic, but for Groups gives similar results/performance.
A “Full Import” on a small AzureAD (~9800 Groups) took 5 minutes.
A subsequent “Delta Import” with 7 Adds (new Groups) and 157 Updates took 1 minute.

 

Follow Darren on Twitter @darrenjrobinson

ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 3 – Azure MFA Integration

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In this post we continue our venture by configuring Azure MFA in ADFS 2016.

Azure MFA – What is it about?

It is a bit confusing when we mention that we need to enable Azure MFA on ADFS. Technically, this method is actually integrating Azure MFA with ADFS. MFA itself is authenticating on Azure AD; ADFS prompts you to enter an MFA code, which is verified with Azure AD to sign you in.

In theory, this by itself is not multi-factor authentication. When users choose to login with multi-factor authentication on ADFS, they’re not prompted to enter a password; they simply login with the six digit code they receive on their mobile devices.

Is this secure enough? Arguably. Of course users had to previously set up their MFA to be able to login by choosing this method, but if someone has control or possession of your device they could simply login with the six digit code. Assuming the device is not locked, or MFA is set up to receive calls or messages (on some phones message notifications will appear on the main display), almost anyone could login.

Technically, this is how Azure MFA will look once integrated with the ADFS server. I will outline the process below, and show you how we got this far.

7

Once you select Azure Multi-Factor Authentication you will be redirected to another page

8

And when you click on “Sign In” you will simply sign in to the Office or Azure Portal, without any other prompt.

The whole idea here is not much about security as much as it is about simplicity.

Integrating Azure MFA on ADFS 2016

Important note before you begin: Before integrating Azure MFA on ADFS 2016, please be aware that users should already have set up MFA using the Microsoft Authenticator mobile app. Or they can do it while first signing in, after being redirected to the ADFS page. The aim of this post is to use the six digit code generated by the mobile app.

If users have MFA set up to receive calls or texts, the configuration in this blog (making Azure MFA primary) will not support that. To continue using SMS or a call whilst using Azure MFA, “Azure MFA” needs to be configured as a secondary authentication method under “Multi-Factor”, and “Azure MFA” under “Primary” should be disabled.

Integrating Azure MFA on ADFS 2016 couldn’t be any easier. All that is required is running a few PowerShell cmdlets and enabling the authentication method.

Before we do so however, let’s have a look at our current authentication methods.

0

As you may have noticed, we couldn’t enable Azure MFA without first configuring the Azure AD Tenant.

The steps below are intended to be performed on all AD FS servers in the farm.

Step 1: Open PowerShell and connect to your tenant by running the following:

Connect-MsolService

Step 2: Once connected, you need to run the following cmdlets to configure the AAD tenant:

$cert = New-AdfsAzureMfaTenantCertificate -TenantID swayit.onmicrosoft.com

When successful, head to the Certificates snap-in, and check that a certificate with the name of your tenant has been added to the Personal folder.

2a22

Step 3: In order to enable the AD FS servers to communicate with the Azure Multi-Factor Auth Client, you need to add the credentials to the SPN for the Azure Multi-Factor Auth Client. The certificate that we generated in the previous step will serve as these credentials.

To do so run the following cmdlet:

New-MsolServicePrincipalCredential -AppPrincipalId 981f26a1-7f43-403b-a875-f8b09b8cd720 -Type asymmetric -Usage verify -Value $cert

3

Note that the GUID 981f26a1-7f43-403b-a875-f8b09b8cd720 is not made up; it is the GUID for the Azure Multi-Factor Authentication client. So you can basically copy/paste the cmdlet as is.

Step 4: When you have completed the previous steps, you can now configure the ADFS Farm by running the following cmdlet:

Set-AdfsAzureMfaTenant -TenantId swayit.onmicrosoft.com -ClientId 981f26a1-7f43-403b-a875-f8b09b8cd720

Note how we used the same GUID from the previous step.

4

When that is complete, restart the ADFS service on all your ADFS farm servers.

net stop adfssrv

net start adfssrv

Head back to your ADFS Management Console and open the Authentication methods, and you will notice that Azure MFA has been enabled and the message prompt has disappeared.

5

6

If the Azure MFA Authentication methods were not enabled, then enable them manually and restart the services again (on all your ADFS servers in the farm).

Now that you have completed all the steps, when you try and access Office 365 or the Azure Portal you will be redirected to the pages posted above.

Choose Azure Multi-Factor Authentication

7

Enter the six digit code you have received.

8

And then you’re signed in.

10

By now you have completed migrating from ADFS v3 to ADFS 2016, and in addition have integrated Azure MFA authentication with your ADFS farm.

The last part in this series will be about the WAP 2012 R2 upgrade to WAP 2016. So please make sure to come back tomorrow to check the upgrade process in detail.

I hope you’ve enjoyed the posts so far. For any feedback or questions, please leave a comment below.


How to embed Power BI Reports into the Microsoft Identity Manager Portal

About seven years ago, at a conference in Los Angeles, I remember attending a session where a consultant from Oxford Computer Group gave a presentation on integrating Quest Identity Manager (now Dell One Identity Manager) with the Forefront Identity Manager Portal. I’ve recently had a requirement to do something similar, and Carol pointed me in the direction of her experiments based off inspiration from that same presentation/session.

Well, it is now 2017, FIM and SharePoint have both moved through a few versions, and doing something similar has changed accordingly. So now that I’ve got it working I thought I’d share how I’ve done it, and also solicit any improvements. I’ve done this with SharePoint 2013.

Overview

In this post I’ll detail;

  • Publishing a Power BI Report
  • Creating new Microsoft Identity Manager Navigation Bar Resources
  • Embedding as an IFrame the published Power BI Report in the Microsoft Identity Manager Portal so that it appears like below

Pre-requisites

Obviously to follow this verbatim you are going to need to have a Power BI workspace and a Power BI Report. But you could embed any page you want to test it out.

You’ll also need;

Publish a Power BI Report

In Power BI select a Report you are looking to embed in the MIM Portal. I selected License Plans under Reports from my Power BI Workspace.

From the File menu select Publish to Web.

Select Create embed code.

Copy the link to your report somewhere where you can retrieve it easily later. Don’t worry about the HTML line or the size.

 

SharePoint Designer

Download and install SharePoint Designer 2013 (with the defaults) from the link above. I’m using the 64-bit version. I installed it on my Development MIM Portal Server. I’m using 2013 as my MIM Portal is using SharePoint 2013 Foundation (with SP1).

Once installed start SharePoint Designer and select Open Site.

Enter the URL for your MIM Portal and select Open.

Note: In order for SharePoint Designer to successfully load your MIM Portal Site, the URL you provide above must be in your SharePoint Alternate Access Mappings. If it isn’t you will probably get the error “The server could not complete your request. For more specific information, click the Details button.”

And in your Windows Application Event Log Event ID 3 – WebHost

WebHost failed to process a request.

 Sender Information: System.ServiceModel.ServiceHostingEnvironment+HostingManager/42194754

 Exception: System.ServiceModel.ServiceActivationException: The service '/_vti_bin/client.svc' cannot be activated due to an exception during compilation.

 

Select Microsoft Identity Management, then All Files. You should then see a list of all the files in the MIM Portal website.

Locate the aspx folder, right click on it and select New => Folder. Create a new folder under the aspx directory named ‘reports’.

Right click on your new Reports Folder and select New => ASPX. Create an aspx file named reports.aspx.

Repeat to create another aspx file named report.aspx.

 

Click on the reports.aspx file from the main pane and put the following contents in it, overwriting everything else. Select Save.

<%@ Page Language="C#" %>
<html dir="ltr">

<head runat="server">
<meta name="WebPartPageExpansion" content="full" />
<title>Reports</title>
<script type="text/javascript">
    window.open("report.aspx", "_self");
</script>
</head>
<body/>
</html>

Click on the report.aspx file and replace the contents with the following and select Save.

Replace <yourreportlink> in https://app.powerbi.com/view?r=<yourreportlink> with your Power BI link.

<%@ Page masterpagefile="~masterurl/custom.master" Title="Reports" language="C#" inherits="Microsoft.SharePoint.WebPartPages.WebPartPage, Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" meta:progid="SharePoint.WebPartPage.Document" UICulture="auto" Culture="auto" meta:webpartpageexpansion="full" %>
<%@ Register Tagprefix="SharePoint" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> <%@ Register Tagprefix="Utilities" Namespace="Microsoft.SharePoint.Utilities" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> <%@ Import Namespace="Microsoft.SharePoint" %> <%@ Register Tagprefix="WebPartPages" Namespace="Microsoft.SharePoint.WebPartPages" Assembly="Microsoft.SharePoint, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

<asp:Content ContentPlaceHolderID="PlaceHolderTitleBar" Visible="true" runat="server">
</asp:Content>

<asp:Content id="content1" ContentPlaceHolderID="PlaceHolderMain" runat="server">
https://app.powerbi.com/view?r=

</asp:Content>

MIM Portal Navigation Resources

Now we need to create the MIM Portal Navigation Resources to link to our new files.

In the MIM Portal Select Navigation Bar Resources. Select New.

 

Provide a Display Name, Description and select Next. Ignore Usage Keyword for now. More on that later.

Make the Parent Order 8 to have it at the bottom of the Left Nav bar. Order is 0 as this is going to be our Group header. Select Next.

Provide the path to the reports.aspx file: ~/IdentityManagement/aspx/reports/reports.aspx. Select Next.

Provide the Localised Display name, select Finish and then Submit.

Repeat, this time linking to ~/IdentityManagement/aspx/reports/report.aspx, and name it Licensing Report or whatever makes sense for your report. Also make the Order 1 so it nests under Reports.

Perform an IISReset.

Refresh your MIM Portal page and you should see your new menu items at the bottom of the left Navigation Bar.

Click on Reports and your Licensing Report will auto-magically load. Same as if you click on Licensing Report. You can now add as many reports as you need, and change which report is the default by updating the reports.aspx file in SharePoint Designer.

You will probably also want to limit who sees which reports. You can do that through Usage Keywords and Sets etc. By default, as described here, the reports will only be visible to Administrators. Details to get you started on changing who can see what can be found here.

Let me know if you have any improvements.

 

Follow Darren Robinson on Twitter