Send mail to Office 365 via an Exchange Server hosted in Azure

Those of you who have attempted to send mail to Office 365 from Azure know that sending outbound mail directly from an email server hosted in Azure is not supported, due to the elastic nature of public cloud service IPs and the potential for abuse. As a result, the Azure IP address blocks are added to public block lists, with no exceptions to this policy.

To send mail from an Azure hosted email server to Office 365 you need to send mail via an SMTP relay. There are a number of different SMTP relays you can utilise, including Exchange Online Protection; more information can be found here: https://blogs.msdn.microsoft.com/mast/2016/04/04/sending-e-mail-from-azure-compute-resource-to-external-domains

To configure Exchange Server 2016 hosted in Azure to send mail to Office 365 via SMTP relay to Exchange Online Protection you need to do the following;

  1. Create a connector in your Office 365 tenant
  2. Configure accepted domains on your Exchange Server in Azure
  3. Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

Create a connector in your Office 365 tenant

  1. Log in to the Exchange Online Admin Center
  2. Click mail flow | connectors
  3. Click +
  4. Select from: “Your organisation’s email server” to: “Office 365”
  5. Enter in a Name for the Connector | Click Next
  6. Select “By verifying that the IP address of the sending server matches one of these IP addresses that belong to your organization”
  7. Add the public IP address of your Exchange Server in Azure

Configure accepted domains on your Exchange Server in Azure

  1. Open Exchange Management Shell
  2. Execute the following PowerShell command for each domain you want to send mail to in Office 365;
  3. New-AcceptedDomain -DomainName Contoso.com -DomainType InternalRelay -Name Contoso
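If you have several domains to cover, a short loop saves repetition. A minimal sketch (the domain names are placeholders; the -Name value is simply derived from the domain):

# Create an InternalRelay accepted domain for each Office 365 domain
"contoso.com","fabrikam.com" | ForEach-Object {
    New-AcceptedDomain -DomainName $_ -DomainType InternalRelay -Name ($_ -replace "\..*$","")
}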

Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

  1. Execute the following PowerShell command;
  2. New-SendConnector -Name “My company to Office 365” -AddressSpaces * -CloudServicesMailEnabled $true -RequireTLS $true -SmartHosts yourdomain-com.mail.protection.outlook.com -TlsAuthLevel CertificateValidation

How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I’ve been working on a project where I must have visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

To make this efficient I need to use the Differential Query function of the AzureAD Graph API. I’ve detailed that in this post: How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups I also needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA). I’ve previously detailed that here: How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above. They detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell modules: the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.
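If you don’t have the Lithnet module yet, it’s available from the PowerShell Gallery; this one-liner should do it (assuming PowerShell 5 or PowerShellGet):

Install-Module -Name LithnetMiisAutomation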

Configuration

Now that you’re up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That’s also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

There are two object classes on the MA, as we need to have the users that are members of the groups on the same MA (membership is a reference attribute). When you bring the Groups through into the MetaVerse, and assuming you have an Azure AD Users MA using the same anchor attribute, you’ll get the reference link for the members and their full object details.
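As a rough guide, a Granfeldt PSMA Schema.ps1 with the two object classes looks something like the sketch below. The attribute names are illustrative only; align them with the attributes you actually import.

# Sketch only - attribute names must match what your Import.ps1 emits
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "group"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "string"
$obj | Add-Member -Type NoteProperty -Name "member|Reference" -Value "string"
$obj

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-id|String" -Value "1"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "userPrincipalName|String" -Value "string"
$obj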

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups and import the active ones along with their group membership.

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA. Paged Imports show progress as the import runs, and Differential Queries allow for quick subsequent delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and APIs it’s an almost forgotten world where FTP servers exist. Most recently, however, I’ve had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I’d use an Azure Function for obtaining the FTP files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session

Pre-requisites

From the overview above there are a number of pre-requisites. Other blog posts I’ve written detail the steps involved to set these up and configure them appropriately, so I’m going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Check out the WEBSITE_TIME_ZONE note in this post.

  • You’ll need to configure the server that you are going to put the files onto for Remote PowerShell. Follow the Enable PowerShell Remoting on the FIM/MIM Sync Server section of this blog post.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 10:30 every day using the following CRON configuration
      0 30 10 * * *
  • On the server you’ll be connecting to in order to run the FTP processes, you’ll need to copy the PSFTP Module and files to the following directories (see the example after this list). I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules
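For example (assuming you unzipped PSFTP into your Downloads folder):

Copy-Item "$env:USERPROFILE\Downloads\PSFTP" "C:\Program Files\WindowsPowerShell\Modules" -Recurse
Copy-Item "$env:USERPROFILE\Downloads\PSFTP" "C:\Windows\System32\WindowsPowerShell\v1.0\Modules" -Recurse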


Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)


  • You’ll also need Application Settings for a Username and Password associated with a user that exists on the server that you’ll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword.

Function App Script

Finally, here is the raw script. You’ll need to add appropriate error handling for your environment. You’ll also want to change lines 48 and 51 for the naming of the files you are looking to acquire, and line 59 for the server name you’ll be executing the process on.
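If you’re writing yours from scratch, the skeleton below shows the overall shape of the Function. It’s a minimal sketch assuming the Application Settings above and the PSFTP module on the target server; the hostname is a placeholder and error handling is omitted, so treat it as a starting point rather than the exact script referenced here.

$secPass = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$cred    = New-Object System.Management.Automation.PSCredential ($env:MIMSyncCredUser, $secPass)

$ftpServer = $env:FTPServer
$ftpUser   = $env:FTPUsername
$ftpPass   = $env:FTPPassword
$ftpSource = $env:FTPSourceDirectory
$ftpTarget = $env:FTPTargetDirectory

# Start a Remote PowerShell session to the MIM Sync Server (placeholder hostname;
# add -UseSSL etc. to match your Remote PowerShell configuration)
$session = New-PSSession -ComputerName "MIMSYNCSERVER" -Credential $cred

Invoke-Command -Session $session -ScriptBlock {
    Import-Module PSFTP

    # Create a dated local directory to put the files into
    $localDir = Join-Path $using:ftpTarget (Get-Date -Format "dd-MM-yyyy")
    New-Item -ItemType Directory -Path $localDir -Force | Out-Null

    # Connect to the FTP Server
    $ftpSecPass = ConvertTo-SecureString $using:ftpPass -AsPlainText -Force
    $ftpCred    = New-Object System.Management.Automation.PSCredential ($using:ftpUser, $ftpSecPass)
    Set-FTPConnection -Credentials $ftpCred -Server $using:ftpServer -Session FTPSession -UsePassive | Out-Null
    $ftp = Get-FTPConnection -Session FTPSession

    # Get the files and put them into the local directory
    Get-FTPChildItem -Session $ftp -Path $using:ftpSource |
        ForEach-Object { Get-FTPItem -Session $ftp -Path "$($using:ftpSource)/$($_.Name)" -LocalPath $localDir }
}

Remove-PSSession $session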

Summary

A pretty quick and simple little Azure Function App that runs each day and obtains daily/nightly extracts from an FTP server. I’m handling cleanup of the resulting folders and files with other on-box processes.


This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

Monitor SharePoint Changelog in Azure Function

Azure Functions have officially reached ‘hammer’ status

I’ve been enjoying the ease with which we can now respond to events in SharePoint and perform automation tasks, thanks to the magic of Azure Functions. So many nails, so little time!

The seminal blog post that started so many of us down that road was of course John Liu’s Build your PnP Site Provisioning with PowerShell in Azure Functions and run it from Flow, and that pattern is fantastic for many event-driven scenarios.

One place where it currently (at time of writing) falls down is in dealing with a list item delete event. MS Flow can’t respond to this, and nor can a SharePoint Designer workflow.

Without wanting to get into Remote Event Receivers (errgh…), the other way to deal with this is after the fact via the SharePoint change log (if the delete isn’t time sensitive). In my use case it wasn’t – I just needed to be able to clean up some associated items in other lists.

SharePoint Change Logs

SharePoint has had an API for getting a log of changes of certain types, against certain objects, in a certain time window since the dawn of time. The best post for showing how to query it from the client side is (in my experience) Paul Schaeflin’s Reading the SharePoint change log from CSOM, and it was my primary reference for the PowerShell-based Function below.

In my case, I am only interested in items deleted from a single list, but this could easily be scoped to an entire site and capture more/different event types (see Paul’s post for the specifics).

The biggest challenge in getting this working was persisting the change token to Azure Storage. This wasn’t that difficult in and of itself – it’s just that the PowerShell bindings for Azure Functions are as yet woefully under-documented (TIP: Get-Content and Set-Content are the key to the PowerShell bindings… easy when you know how). In my case I have an input and an output binding to a single Blob Storage blob (to persist the change token for reference the next time the Function runs) and another output to Queue Storage to trigger another Function that actually does the cleanup of the other list items linked to the one now sitting in the recycle bin. The whole thing is triggered by an hourly timer. If nothing has been deleted, then no action is taken (other than the persisted token blob update).
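For anyone wrestling with the same under-documentation, the binding pattern boils down to something like this sketch (binding names are hypothetical and would be defined in function.json):

$changeToken = Get-Content $inputBlob -Raw      # input binding: the change token persisted last run

# ... CSOM query of the change log for deletes since $changeToken goes here
# (see Paul's post), yielding $deletedItemIds and $newChangeToken ...

if ($deletedItemIds) {
    # output binding to Queue Storage: one message carrying all deletes since the last run
    Set-Content -Path $outputQueueItem -Value ($deletedItemIds -join ",")
}

# output binding back to the same blob: persist the new change token for the next run
Set-Content -Path $outputBlob -Value $newChangeToken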

A Note on Scaling

Note that if multiple delete events occurred since the last check, these are all deposited in one message. This won’t cause a problem in my use case (there will never be more than a handful of items deleted in one pass of the Function), but it obviously doesn’t scale well, as too many deletes being handled by the receiving Function would threaten to bump up against the 5 minute execution time limit. I wanted to use the OOTB message queue binding for simplicity, but if you needed to push multiple messages, you could simply use the Azure Storage PowerShell cmdlets instead of an output binding.

Code Now Please

Here’s the Function code (following the PnP PowerShell Azure Functions implementation as per John’s article above and liberally stealing from Paul’s guide above).

Going Further

This is obviously a simple example with a single objective, but you could take this pattern and ramp it up as high as you like. By targeting the web instead of a single list, you could push a lot of automation through this single pipeline, perhaps ramping up the execution recurrence to every 5 mins or less if you needed that level of reduced latency. Although watch out for your Functions consumption if you turn up the executions too high!

Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those oh shit moments where I thought I’d lost a couple of weeks of work: a couple of weeks of development around multiple Management Agents, MV Schema changes etc. Luckily for me I was just connecting to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements;
      1. MAExports
      2. ServerExport
      3. MAExtensions
      4. PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to;
      1. Enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports folder
      2. Export the FIM/MIM Sync Server Configuration to the ServerExport folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service configuration to the PortalExport folder

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App timezone to my timezone so that the nightly backup happens at the correct time relative to my timezone. I got my timezone index from here, and set the WEBSITE_TIME_ZONE variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share).
If your MIM Service Server is not on the same host as your MIM Sync Server, change line 59 to the hostname. You’ll need to get the FIM/MIM Automation PS modules onto your MIM Sync Server too. Details on how to achieve that are here.
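As a rough illustration of the core of the remote session’s logic, here’s a sketch. The paths are examples only, and the Lithnet cmdlet calls for the MA and Sync Server exports are omitted; check the module’s documentation for those.

$stamp     = Get-Date -Format "dd-MM-yyyy-hh-mm"
$backupDir = "C:\backup\$stamp"

# Subfolders for each of the backup elements
"MAExports","ServerExport","MAExtensions","PortalExport" |
    ForEach-Object { New-Item -ItemType Directory -Path (Join-Path $backupDir $_) -Force | Out-Null }

# Copy the Extensions folder (PowerShell Management Agent scripts etc.)
Copy-Item "C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Extensions" `
    -Destination (Join-Path $backupDir "MAExtensions") -Recurse

# Export the FIM/MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation
$config = Export-FIMConfig -PolicyConfig -PortalConfig -SchemaConfig
$config | ConvertFrom-FIMResource -File (Join-Path $backupDir "PortalExport\PortalExport.xml")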

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig is what takes the majority of the time. That said, it’s less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.


Summary

The script itself can be run standalone and you could implement it as a Scheduled Task on your FIM/MIM Server. However I’m using Azure Functions for a number of things and having something that is easily portable and repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson


How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent

Introduction

In the last 12 months I’ve lost count of the number of PowerShell Management Agents I’ve written to integrate Microsoft Identity Manager with a plethora of environments. Most though have not been of huge scale (<50k objects), and the import of the managed entities into the Connector Space/Metaverse runs through in a pretty timely fashion.

However, this week I’ve been working on an AzureAD Groups PS MA for an environment with 40k+ groups. That in itself isn’t that large, but when you start processing group memberships as well, the import process can take an hour for a Full Sync. During this time, before the results are passed to the Sync Engine, you don’t have any visibility of where the import is up to (other than debug logging), and stopping the MA requires a restart of the Sync Engine server.

I’ve wanted to mess with paging the imports for some time, but it hadn’t been a necessity. Now it is, so I looked at working out how to achieve it. The background information on Paged Imports is available at the bottom of the PSMA documentation page here. However, there are no working examples. I contacted Søren and he had misplaced his demo scripts for the time being. With some time at hand (in between coats of paint on the long weekend renovation) I therefore worked it out for myself. I detail how to implement Paged Imports in this blog post.

This post uses an almost identical Management Agent to the one described in this post. Review that post to get an understanding of the AzureAD Differential Queries. I’m not going to cover those elements, or setting up the MA, in this post at all.

Getting Started

There are two things you need to do in preparation for enabling Paged Imports on your PowerShell Management Agent;

  1. Enable Paged Imports (if your Import.ps1 is checking for this setting)
  2. Configure Page Size on your Import Run Profiles

The first is as simple as clicking the checkbox on the Global Parameters tab on your PS MA as shown below.

The second is in your Run Profile. By default the page size will be 100. For my “let’s figure this out” process I dropped the page size to 50 on one Run Profile and 10 on another.


Import Script

With Paged Imports setup on the MA the rest of the logic goes into your Import Script. In your param section at the start of the script $usepagedimport and $pagesize are the variables that reflect the items from the two enablement components you did above.

$usepagedimport is either True or False. Your Import.ps1 script can check to see if it is set and process accordingly. In this example I’m not even checking if it is set and am doing Paged Imports anyway. For completeness, in a production example you should check to see what the intention of the MA is.

$pagesize is the page size from the Run Profile (100 by default, or whatever you changed yours to).

param (
    $Username,
    $Password,
    $Credentials,
    $OperationType,          # Full or Delta
    [bool] $usepagedimport,  # True when the MA is configured for Paged Imports
    $pagesize                # page size from the Run Profile
 )


An important consideration to keep in mind is that the Import.ps1 will be called multiple times (i.e. number of calls = number of objects / page size).

So anything that you would normally expect to process only once when the Import.ps1 runs in any other MA, you need to limit to running only once.

Essentially, the way I’ve approached it is to retrieve all the objects that will be processed and put them in a Global variable. If the variable does not have any values/data then it is the first run, so go and get our source data. If the Global variable has values/data in it then we must be on a subsequent loop, so there’s no need to process that part again; just page through our import.

In my example below this appears as;

if (!$global:tenantObjects) {
    # Authenticate
    # Search and get the users
    # Do some rationalisation on the results (if required)
    # setup some global variables so we know where we are with processing the data
} # Finish the one time tasks

As you’ll see in the full Import.ps1 script below, there are more lines that could be moved into this section so they don’t get processed each time. In a production implementation I would do that.

For the rest of the Import.ps1 script we are expecting it to run multiple times. This is where we do our logic and process our objects to send through to the Sync Engine/Connector Space. We need to keep track of where we are up to in processing the dataset and continue on from where we left off. We also need to know how many objects we have processed in relation to the ‘pagesize’ we get from the Run Profile, so we know when we’ve finished.

When we reach the pagesize but know we have more objects to process, we set $global:MoreToImport to $true and break out of the foreach loop.

When we have processed all our objects we set $global:MoreToImport = $false and break out of the foreach loop to finish.
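Stripped of the logging and the source-data retrieval, the paging logic itself looks something like this sketch (variable and attribute names are illustrative and must match your Schema.ps1):

if (-not $global:nextIndex) { $global:nextIndex = 0 }
$pageCount = 0

while (($global:nextIndex -lt $global:tenantObjects.Count) -and ($pageCount -lt $pagesize)) {
    $group = $global:tenantObjects[$global:nextIndex]

    # Hashtable for the Sync Engine; keys must match the attributes in Schema.ps1
    $obj = @{}
    $obj.Add("objectClass", "group")
    $obj.Add("id", $group.objectId)
    $obj.Add("displayName", $group.displayName)
    $obj    # emit the object to the connector space

    $global:nextIndex++
    $pageCount++
}

# Tell the Sync Engine whether to call the script again for another page
$global:MoreToImport = ($global:nextIndex -lt $global:tenantObjects.Count)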

With that explanation out of the way here is a working example. I’ve left in debugging output to a log file so you can see what is going on.

You can get the associated relevant Schema.ps1 from the Management Agent described in this post. You’ll need to update your Tenant name on line 29, your directory paths on lines 10 and 47. If you are using a different version of the AzureADPreview PowerShell Module you’ll need to change line 26 as well.

Everything else is in the comments within the example script below and should make sense.

Summary

For managing a large number of objects on a PS MA we can now see progress as the import processes the objects, and we can now stop an MA if required.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson


WAP (2012 R2) Migration to WAP (2016)

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In Part 3 we discussed the integration of Azure MFA with ADFS 2016, and in this post (technically Part 4) we will cover the migration, or better yet the upgrade, of WAP 2012 R2 to WAP 2016.

Again, this blog assumes you have already installed the Web Application Proxy feature while adding the Remote Access role, and have prepared the WAP server to be able to establish a trust with the Federation Service.

In addition, a friendly reminder once again that the domain name and federation service name have changed from swayit.com.au to iknowtech.com.au. The certificate for swayit.com.au expired before completing the lab, hence the change.

Before we begin however, the current WAP servers (WAP01, WAP02) are the only connected servers that are part of the cluster:


To install the Web Application Proxy, run the following cmdlet in PowerShell:

Install-WindowsFeature Web-Application-Proxy -IncludeManagementTools

Once installed, follow these steps:

Step 1: Specify the federation service name, and provide the local Administrator account for your ADFS server.

2

Step 2: Select your certificate

3

Step 3: Confirm the Configuration

Do this for both the WAP servers you’re adding to the cluster.

Alternatively, you could do so via PowerShell:


$credential = Get-Credential
Install-WebApplicationProxy -FederationServiceName "sts.swayit.com.au" -FederationServiceTrustCredential $credential -CertificateThumbprint "071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" 

Once complete, the WAP servers will be added to the cluster.

Run the following cmdlet to get the Web Application Proxy configuration:

Get-WebApplicationProxyConfiguration

You will notice that we now have all four WAP servers in ConnectedServersName, as part of the cluster.

You will also notice that the ConfigurationVersion is Windows Server 2012 R2, which we will need to upgrade to Windows Server 2016.


Head back to one of the previous WAP servers running Windows Server 2012 R2, and run the following cmdlet to remove the old servers from the cluster, keeping only the WAP 2016 servers:

 Set-WebApplicationProxyConfiguration -ConnectedServersName WAP03, WAP04 

Once complete, check the WebApplicationProxyConfiguration by running the Get-WebApplicationProxyConfiguration cmdlet.

Notice the change in ConnectedServersName (this information is obtained from the WAP 2012 R2 server).


If you run Get-WebApplicationProxyConfiguration from WAP 2016, you will get more detailed information.


The last remaining step before publishing a Web Application (in my case) was to upgrade the ConfigurationVersion, as it was still in Windows Server 2012 R2 mode.

If you already have published Web Applications, you can do this at any time.

Set-WebApplicationProxyConfiguration -UpgradeConfigurationVersion

When successful, check your WebApplicationProxyConfiguration again by running

Get-WebApplicationProxyConfiguration

Notice the ConfigurationVersion:


You have now completed the upgrade and migration of your Web Application Proxy servers.

If this is a new deployment, of course you don’t need to go through this whole process. Your WAP 2016 servers would already be on the Windows Server 2016 ConfigurationVersion.

Assuming this was a new deployment or if you simply need to publish a new Web Application, continue reading and follow the steps below.

Publishing a Web Application

There’s nothing new or different about publishing a Web Application in WAP 2016. It’s pretty similar to WAP 2012 R2. The only addition from Microsoft is a redirection from HTTP to HTTPS.
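By way of example, publishing a pass-through application with the new redirect from PowerShell might look like this (the name, URLs and thumbprint are placeholders):

Add-WebApplicationProxyApplication `
    -Name "Intranet" `
    -ExternalPreauthentication PassThrough `
    -ExternalUrl "https://intranet.iknowtech.com.au/" `
    -ExternalCertificateThumbprint "071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
    -BackendServerUrl "https://intranet.iknowtech.com.au/" `
    -EnableHTTPRedirect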

Step 1: Publishing a New Web Application


Step 2: Once complete, it will appear in the published Web Applications. Also notice that we only have WAP03 and WAP04 as WAP servers in the cluster, as we previously removed WAP01 and WAP02 running Windows Server 2012 R2.


There you have it, you have now upgraded your WAP servers that were previously running WAP 2012 R2 to WAP 2016.

By now, you have completed migrating from ADFS v3 to ADFS 2016, integrated Azure MFA with ADFS 2016, and upgraded WAP 2012 R2 to WAP 2016. No additional configuration is required; we have reached the end of our series, and this concludes the migration and upgrade of your SSO environment.

I hope you’ve enjoyed those posts and found them helpful. For any feedback or questions, please leave a comment below.

How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries

Introduction

In August 2016 I wrote this post on how to use PowerShell to leverage the Microsoft GraphAPI and use Differential Queries. The premise behind that post was that I required a Microsoft Identity Manager Management Agent to synchronise identity information from AzureAD into Microsoft Identity Manager. However, the environment it was intended for has a large AzureAD implementation, and performing a Full Sync every time is just too time consuming. Even more so with this limitation that still exists today in MIM 2016 with SP1.

In this blog post I’ll detail how to implement a PowerShell Management Agent for FIM/MIM to use the MS GraphAPI to synchronize objects into FIM/MIM, supporting Delta and Full Synchronization run profiles. I’m also using my favourite PowerShell Management Agent, the Granfeldt PowerShell Management Agent.

Pre-requisites

I’m using the ADAL helper library from the AzureADPreview PowerShell Module. Install that module on your MIM Sync Server via PowerShell (WMF5 or later) using the PowerShell command;

Install-Module AzureADPreview

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it, what are you waiting for? Go get it from here. Søren’s documentation is pretty good, but does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are;

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it; it just needs to be present.
  • The credentials you give the MA are the credentials for the account that has permissions to the AzureAD/Office365 tenant. Just a normal account is enough to enumerate it, but you’ll need additional permissions if you intend to write back to AzureAD.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\AzureAD

Schema.ps1

My Schema is based around enumerating and managing users from AzureAD. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. Use the Schema info below for a base set of attributes that will get you started. You can add more as required. I’ve prefixed most of them with AAD for my implementation.

If you want to manage Groups or Contacts or a combination of object types, you will need to update the Schema.ps1 script accordingly.

Import.ps1

The logic that the Import.ps1 implements is the same as detailed here in my post using Differential Queries. Essentially: perform a full import and create a file with the cookie/watermark, then allow Delta Sync run profiles to be performed by requesting the GraphAPI to return only changes since the cookie/watermark.
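Reduced to its essentials, that cookie/watermark pattern looks something like the sketch below. The tenant, paths and property handling are illustrative, and acquiring $authHeader via the ADAL helper library is omitted:

$tenant     = "mytenant.onmicrosoft.com"            # your AzureAD tenant
$cookieFile = "C:\Temp\AzureADUsersDeltaLink.txt"   # where the watermark is persisted

if (($OperationType -eq "Delta") -and (Test-Path $cookieFile)) {
    $url = Get-Content $cookieFile      # deltaLink saved from the previous run
} else {
    $url = "https://graph.windows.net/$tenant/users?api-version=1.6&deltaLink="
}

do {
    $result = Invoke-RestMethod -Uri $url -Headers $authHeader
    foreach ($user in $result.value) {
        # ... build and emit the PSMA hashtable for each returned user/change ...
    }
    $url = $result.'aad.nextLink'       # more pages of changes to fetch
    # (append &api-version=1.6 to the link if your tenant returns it without one)
} while ($result.'aad.nextLink')

# Persist the new watermark so the next Delta run only gets changes from here on
$result.'aad.deltaLink' | Out-File $cookieFile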

You’ll need to update the script with your AzureAD tenant name on line 28. Also update the paths to where the cookie file and the debug file will go, if your paths are different to mine (lines 11, 46 and 47).

Importing of the attributes is based around the names in the Schema.ps1 scripts. Any changes you made there will need to be reflected in the import.ps1.

Password Script (password.ps1)

Empty as not implemented

Export.ps1

Empty as not implemented in this example. If you are going to write information back to AzureAD you’ll need to put the appropriate logic into this script.

Management Agent Configuration

With the Granfeldt PowerShell Management Agent installed on your FIM/MIM Synchronisation Server, in the Synchronisation Server Manager select Create Management Agent and choose “PowerShell” from the list of Management Agents to create.

As this example is for Users, I’ve named my MA accordingly.

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to AzureAD/Office365 and import the user data.

Paths to the Import, Export and Password scripts. Note: the Export and Password PS1 scripts files exist but are empty.

Object Type as configured in the Schema.ps1 file.

Attributes as configured in the Schema.ps1 file.

Anchor as per the Schema.ps1 file.

The rest of the MA configuration is up to your implementation. What you are going to join on and what attributes flow into the MV will vary based on your needs and solution. At a minimum you’d probably be looking to do a join on immutableID (after some manipulation) or UPN and flow in attributes such as AADAccountEnabled etc.

Completing the Configuration

To finalise the MA you’ll need to do the usual tasks of creating run profiles, staging the connector space from AzureAD/Office365 and syncing into the Metaverse. Once you’ve done your initial Stage/Full Sync you can perform Delta Syncs.

Summary

A “Full Import” on a small AzureAD (~8500 Users) took 2 minutes.
A subsequent “Delta Import” with no changes took 6 seconds.

A similar implementation of the logic, but for Groups gives similar results/performance.
A “Full Import” on a small AzureAD (~9800 Groups) took 5 minutes.
A subsequent “Delta Import” with 7 Adds (new Groups) and 157 Updates took 1 minute.


Follow Darren on Twitter @darrenjrobinson

ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 3 – Azure MFA Integration

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In this post we will continue our venture by configuring Azure MFA in ADFS 2016.

Azure MFA – What is it about?

It is a bit confusing when we mention that we need to enable Azure MFA on ADFS. Technically, this method is actually integrating Azure MFA with ADFS. MFA itself authenticates with Azure AD; ADFS prompts you to enter an MFA code, which is then verified with Azure AD to sign you in.

In theory, this by itself is not multi-factor authentication. When users choose to log in with multi-factor authentication on ADFS, they’re not prompted to enter a password; they simply log in with the six digit code they receive on their mobile devices.

Is this secure enough? Arguably. Of course users have to have previously set up their MFA to be able to log in by choosing this method, but if someone has control or possession of your device they could simply log in with the six digit code. Assuming the device is not locked, or MFA is set up to receive calls or messages (on some phones message notifications will appear on the main display), almost anyone could log in.

Technically, this is how Azure MFA will look once integrated with the ADFS server. I will outline the process below, and show you how we got this far.


Once you select Azure Multi-Factor Authentication you will be redirected to another page


And when you click on “Sign In” you will simply sign in to the Office or Azure Portal, without any other prompt.

The whole idea here is not so much about security as it is about simplicity.

Integrating Azure MFA on ADFS 2016

Important note before you begin: before integrating Azure MFA on ADFS 2016, please be aware that users should already have set up MFA using the Microsoft Authenticator mobile app, or they can do it while first signing in, after being redirected to the ADFS page. The aim of this post is to use the six digit code generated by the mobile app.

If users have MFA set up to receive calls or texts, the configuration in this blog (making Azure MFA primary) will not support that. To continue using SMS or a call whilst using Azure MFA, “Azure MFA” needs to be configured as a secondary authentication method under “Multi-Factor”, and “Azure MFA” under “Primary” should be disabled.

Integrating Azure MFA on ADFS 2016 couldn’t be any easier. All that is required is running a few PowerShell cmdlets and enabling the authentication method.

Before we do so however, let’s have a look at our current authentication methods.


As you may have noticed, we couldn’t enable Azure MFA without first configuring the Azure AD tenant.

The steps below are intended to be performed on all AD FS servers in the farm.

Step 1: Open PowerShell and connect to your tenant by running the following:

Connect-MsolService

Step 2: Once connected, run the following cmdlet to configure the AAD tenant:

$cert = New-AdfsAzureMfaTenantCertificate -TenantID swayit.onmicrosoft.com

When successful, head to the Certificates snap-in and check that a certificate with the name of your tenant has been added to the Personal folder.
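You can also eyeball it from PowerShell:

Get-ChildItem Cert:\LocalMachine\My | Select-Object Subject, NotAfter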


Step 3: In order to enable the AD FS servers to communicate with the Azure Multi-Factor Auth Client, you need to add the credentials to the SPN for the Azure Multi-Factor Auth Client. The certificate that we generated in the previous step will serve as these credentials.

To do so run the following cmdlet:

New-MsolServicePrincipalCredential -AppPrincipalId 981f26a1-7f43-403b-a875-f8b09b8cd720 -Type asymmetric -Usage verify -Value $cert


Note that the GUID 981f26a1-7f43-403b-a875-f8b09b8cd720 is not made up; it is the GUID for the Azure Multi-Factor Authentication client. So you can basically copy/paste the cmdlet as is.

Step 4: When you have completed the previous steps, you can now configure the ADFS Farm by running the following cmdlet:

Set-AdfsAzureMfaTenant -TenantId swayit.onmicrosoft.com -ClientId 981f26a1-7f43-403b-a875-f8b09b8cd720

Note how we used the same GUID from the previous step.


When that is complete, restart the ADFS service on all your ADFS farm servers.

net stop adfssrv

net start adfssrv

Head back to your ADFS Management Console and open the Authentication Methods, and you will notice that Azure MFA has been enabled and the message prompt has disappeared.



If the Azure MFA authentication methods were not enabled, enable them manually and restart the services again (on all the ADFS servers in your farm).

Now that you have completed all the steps, when you try and access Office 365 or the Azure Portal you will be redirected to the pages posted above.

Choose Azure Multi-Factor Authentication


Enter the six digit code you have received.


And then you’re signed in.


By now you have completed migrating from ADFS v3 to ADFS 2016, and in addition have integrated Azure MFA authentication with your ADFS farm.

The last part in this series will be about the WAP 2012 R2 upgrade to WAP 2016, so please make sure to come back tomorrow and check out the upgrade process in detail.

I hope you’ve enjoyed the posts so far. For any feedback or questions, please leave a comment below.


ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 2

In Part 1 of this series we got ready for our ADFS v3.0 migration to ADFS v4.0 (ADFS 2016).

In Part 2 we will cover the migration process step-by-step. However, a friendly reminder that this series does not cover the installation of ADFS and federation from scratch. This post assumes you already have a federated domain and Single Sign On (SSO) for your applications.

You may notice the domain change and federation service name change from swayit.com.au to iknowtech.com.au. This doesn’t impact our migration; the certificate for swayit.com.au expired before completing the lab. : )

Migration Process – ADFS – Phase 1:

This assumes you have already installed Active Directory Federation Services on your new ADFS 2016 servers; if not, you can do so through PowerShell:

Install-WindowsFeature ADFS-Federation -IncludeManagementTools

Once complete, follow these steps:

Step 1: Add the new ADFS 2016 server to the existing farm


Step 2: Connect to AD

Step 3: Specify the primary Federation server (or federation service).

Step 4: Select your certificate

Step 5: Select your service account. For the sake of this lab, I created a user and gave it permission to run the ADFS service. It is advisable however, to use a group managed service account (gMSA).

Step 6: Complete.

The warnings below are irrelevant to the ADFS 2016 server being added to the farm.

Alternatively, you could do so through PowerShell:

If you’re using Windows Internal Database:

 Import-Module ADFS

#Get the credential used for the federation service account

$serviceAccountCredential = Get-Credential -Message "Enter the credential for the Federation Service Account."

Add-AdfsFarmNode `
-CertificateThumbprint:"071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
-PrimaryComputerName:"sts.swayit.com.au" `
-ServiceAccountCredential:$serviceAccountCredential 

or, if you’re using a group managed service account (gMSA):

Import-Module ADFS

# Add the node to the existing farm using the gMSA
Add-AdfsFarmNode `
-CertificateThumbprint:"071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
-PrimaryComputerName:"sts.swayit.com.au" `
-GroupServiceAccountIdentifier:"SWAYIT\ADFSgMSA`$"
Once the machine has restarted, open the ADFS Management Console and you’ll notice it’s not the primary federation server in the farm. You now need to make the newly added ADFS 2016 server primary. Follow the steps below.

On the newly added ADFS 2016 server, run the following cmdlet:

Step 1:

Set-AdfsSyncProperties -Role PrimaryComputer


Open the ADFS Management Console and you’ll notice that ADFS03 (our ADFS2016 server) is now primary:


Step 2: Run the following cmdlet on all the other federation servers in the farm:

Set-AdfsSyncProperties -Role SecondaryComputer -PrimaryComputerName ADFS03.swayit.com.au 

Step 3: Run the following cmdlet on the secondary server to confirm that the ADFS Properties are correct.

Get-AdfsSyncProperties


Migration Process – ADPREP – Phase 2

Now that we’ve made our new ADFS 2016 server primary, it is time to upgrade the schema.

This assumes you have already downloaded the Windows Server 2016 ISO file; if not, you can obtain a copy from the TechNet Evaluation Centre.

I performed these steps on the ADFS2016 server:

  1. Open a command prompt and navigate to the support\adprep directory.
  2. Type in: adprep /forestprep


Once the first step is complete, you will get “The command has completed successfully.”

Next run: adprep /domainprep

Migration Process – ADFS – Phase 3:

At this stage we had already completed:

  • Adding ADFS 2016 to the existing farm
  • Promoting one of the new ADFS 2016 servers to primary
  • Pointing all secondary servers to the primary server
  • Upgrading the schema

The next phase is to remove the existing ADFS v3 (ADFS 2012 R2) servers from the Azure Load Balancer (or any load balancer you have in place).

After you have removed ADFS v3 from the load balancer, and possibly from the farm (or simply turned the servers off), you will need to raise the Farm Behavior Level (FBL).

When raising the FBL, any ADFS v3 servers will be removed from the farm automatically, so you don’t have to remove them yourself.

When the ADFS v3 servers are no longer part of the farm, I recommend keeping them turned off. Should anything go wrong, you can simply turn the ADFS v3 servers back on, make one primary, and in this case you may avoid impacting the business.

If you find yourself in this situation, just make sure everything else is pointing to ADFS v3.

When you’re ready again, just start from the beginning in adding ADFS 2016 back to the farm.

Here are the steps:

  1. On the Primary ADFS 2016 server open an elevated PowerShell and run the following cmdlet:
Invoke-AdfsFarmBehaviorLevelRaise


As you may have noticed, it automatically detected which ADFS servers the operation will be performed on. Both ADFS03 and ADFS04 are ADFS 2016 versions.

During the process, you will see the usual PowerShell execution message:


Once complete, you will see a successful message:


If the service account failed to be added to the Enterprise Key Admins group, add it manually.

In order to confirm the Farm Behavior Level, run the following cmdlet:

Get-AdfsFarmInformation


If you go to https://portal.office.com, and enter the email address of a federated domain, you should be redirected to your ADFS login page:


And this is it. You have successfully migrated from ADFS v3.0 to ADFS 2016.

The next post in our series is on Azure MFA integration with ADFS 2016, so please make sure to come back tomorrow to check out the configuration process in detail.

I hope you’ve enjoyed this post. For any feedback or questions, please leave a comment below.