Provision Users for Exchange with FIM/MIM 2016 using the Granfeldt PowerShell MA, avoiding the AD MA (no-start-ma) error

Forefront / Microsoft Identity Manager provides Exchange mailbox provisioning out of the box on the Active Directory Management Agent. I’ve used it in many, many implementations over the years. However, in my first MIM 2016 implementation in late 2015 I ran into issues with something I’d done successfully many times before.

I was getting a “no-start-ma” error on the AD MA on export to AD, the point at which the MA sets up its connection to the Exchange environment. After some searching I found Thomas’s blog detailing the problem and a solution: in short, update the MIM Sync Server to .NET 4.6. For me this was no joy. However, when Microsoft released the first rollup update for MIM in December, everything fired up and worked as normal.

Step forward a month. As I was finalising development of the MIM solution I was building for my customer, my “no-start-ma” error was back when I re-enabled mailbox provisioning. Deselect the Exchange Provisioning option on the AD MA and all is good; re-enable it and it fails. With one week of development left and mailbox provisioning still needed, it was time for a workaround whilst I lodged a Premier Support ticket.

So how could I get mailbox provisioning working reliably and quickly? I was already using Søren Granfeldt’s PowerShell MA for managing users’ Terminal Services configuration, Home Directories and Lync/Skype for Business, so what’s one more? Look out for blog posts on using the PS MA to perform those other functions in the coming weeks.

Using the Granfeldt PowerShell Management Agent to Provision Exchange Mailboxes

In this blog post I’ll document how you can enable mailbox provisioning in Exchange utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you the minimum: enabling a user with a mailbox. Once you understand how this is done, you can easily extend the functionality for lifecycle management (e.g. changing account settings for POP/IMAP/ActiveSync, and de-provisioning).

My Exchange PS MA is used in conjunction with an Active Directory MA and Declarative Provisioning Rules in the MIM Portal. Essentially, when you enable Exchange Provisioning, all the AD MA does (when it works) is call the ‘Update-Recipient’ cmdlet to finish off the mailbox provisioning. My Exchange PS MA does the same thing.


There are three attributes you need to supply values for in order to provision a user with a mailbox (on top of having an Active Directory account, of course):

  • mailNickName
  • homeMDB
  • homeExchangeServerName

The latter two I’m flowing the appropriate values for using my Active Directory MA. I’m setting those attributes on the AD MA, as I’m provisioning the AD account on that MA, which lets me set those two attributes as initial flow only. I’m doing that because over time those attribute values are likely to change through business-as-usual messaging admin tasks, and I don’t want my Exchange MA stomping all over them.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User object class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc. are assumed knowledge, and if not, are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it; it just needs to be present.
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’, NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the management agent account to the appropriate Exchange role group for user management).
  • The path to the scripts in the PS MA configuration must not contain spaces and must be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Exchange

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing mailbox provisioning, the schema only consists of the attributes needed to know the state of the user with respect to enablement and the attributes associated with enabling and confirming a user for a mailbox.
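As a sketch of what that looks like (the attribute names below are illustrative, not my full production schema), a Granfeldt PS MA schema script emits a template object whose note properties are declared in the MA’s ‘name|type’ format:

[code language=”PowerShell”]
# Minimal schema.ps1 sketch for the Granfeldt PS MA.
# Attributes are declared as "name|type"; this subset is illustrative only.
$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-Id|String" -Value 1
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "AccountName|String" -Value "accountName"
$obj | Add-Member -Type NoteProperty -Name "mailNickName|String" -Value "mailNickName"
$obj | Add-Member -Type NoteProperty -Name "mailboxGUID|String" -Value "mailboxGUID"
$obj
[/code]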

Password Script (password.ps1)

Empty as described above.

Import Script (Import.ps1)

Import values for attributes defined in the schema.
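A minimal import sketch looks something like the following. It assumes the ActiveDirectory module is available to the MA’s service account, and the attribute names are illustrative placeholders matching the schema; treat it as a starting point rather than my production script:

[code language=”PowerShell”]
# Minimal Import.ps1 sketch for the Granfeldt PS MA (attribute names are illustrative).
# The MA expects one hashtable per object, keyed on "[DN]", "objectClass" and the schema attributes.
Import-Module ActiveDirectory

$users = Get-ADUser -Filter * -Properties mailNickName, msExchMailboxGuid
foreach ($user in $users) {
    $obj = @{}
    $obj.Add("[DN]", $user.DistinguishedName)
    $obj.Add("objectClass", "user")
    $obj.Add("AccountName", $user.SamAccountName)
    $obj.Add("mailNickName", $user.mailNickName)
    if ($user.msExchMailboxGuid) { $obj.Add("mailboxGUID", [string]$user.msExchMailboxGuid) }
    $obj   # emit the hashtable to the pipeline for the MA to pick up
}
[/code]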

Export Script (Export.ps1)

The business part of the MA. Take the mailNickName attribute value flowed from FIM (the other required attributes are populated via the AD MA) and call Update-Recipient to provision the mailbox.
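A sketch of the core of it follows. The Exchange connection URI is a placeholder for your environment, and the object property names follow the PS MA’s export conventions; treat this as illustrative rather than my production script:

[code language=”PowerShell”]
# Export.ps1 sketch: call Update-Recipient for each exported object.
# The ConnectionUri is a placeholder; $Username/$Password are passed in by the PS MA.
param ($Username, $Password)

begin {
    $secpwd = ConvertTo-SecureString $Password -AsPlainText -Force
    $cred = New-Object System.Management.Automation.PSCredential ($Username, $secpwd)
    # Remote session into Exchange so Update-Recipient is available
    $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "http://exchangeserver.yourdomain.com/PowerShell/" -Credential $cred -Authentication Kerberos
    Import-PSSession $session -CommandName Update-Recipient -AllowClobber | Out-Null
}

process {
    # Each exported object arrives on the pipeline; [DN] holds the AD distinguished name
    $dn = $_."[DN]"
    if ($_."[ObjectModificationType]" -ne "Delete") {
        Update-Recipient -Identity $dn
    }
}

end {
    Remove-PSSession $session
}
[/code]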

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically:

  • create the PS MA
  • import attributes from the PS MA
  • add any additional attributes to the Portal Schema
  • update the Portal Filter to allow Administrators to use the attribute
  • update the Synchronisation MPR to allow the Sync Engine to flow in the new attribute
  • create the Set used for the transition
  • create your Synchronisation Rule
  • create your Mailbox Workflow
  • create your Mailbox MPR
  • create your MA Run Profiles
  • and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

The password script must be specified, but as we’re not doing password management it’s empty, as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flows are a combination of logic used for other parts of my solution, a Boolean flag & Mailbox GUID to determine if the user has a mailbox or not (used for my Transition Set and my Export script).

Below is my rules extension that sets a Boolean value in the MV, which is then flowed to the MIM Portal and used in my Transition Set to trigger my Synchronisation Rule.

Synchronisation Rules

My Exchange Outbound Sync Rule isn’t complex. All it is doing is syncing out the mailNickName attribute and applying the rule based on an MPR, Set and Workflow.

For this implementation my outbound attribute flow for mailnickName is a simple firstname.lastname format.


I have a Set that I use as a ‘transition set’ to trigger mailbox provisioning. My Set looks to see if the user account exists in AD (I flow the AD DN into an attribute in the Portal) and the mailbox status (set by the Advanced Flow Rule shown above). I also have (not shown in the screenshot) a Boolean attribute in the MIM Portal that is set based on an advanced flow rule on the AD MA, which has some logic to determine whether the employment date, as sourced from my HR Management Agent, is current and whether the user should be active or not.


An action-based workflow that will trigger the Synchronisation Rule for Exchange mailbox creation.


Finally my MPR for provisioning mailboxes is based on the transition set,

and my Mailbox Workflow.


Using the Granfeldt PowerShell MA I was able to quickly abstract Mailbox Provisioning from the AD Management Agent and perform the functionality on its own MA.


Follow Darren on Twitter @darrenjrobinson

Connected data source error code: 8344: insufficient access rights to perform the operation.

Originally blogged @ Lucian.Blog. Follow Lucian on Twitter @LucianFrango.

I’m in the final stages of a long running Exchange migration from two on-premises ADDS forests and Exchange organisations to Exchange Online. The infrastructure foundations were laid out by some Kloudie colleagues some time ago. The environment has been running great for a while now, however, recently when trying to do some remote move migration batches to Exchange Online, I’ve been running into failures.

A few months ago I had the same issue, and at that time I quickly found it to be related to DirSync. This project has an older deployment of DirSync with some customisation specific to this environment. That time I managed to find some duplicate attributes between the DirSync metaverse and the on-premises Active Directory for the problematic users, which, for the most part, was the cause of the problems. Fast forward to this week and I’m again running into some migration failures.


Azure Active Directory Connect high-availability using ‘Staging Mode’

With the Azure Active Directory Connect product (AAD Connect) being announced as generally available to the market (more here, download here), there is a new feature available that will provide a greater speed of recovery of the AAD Sync component. This feature was not available with the previous AAD Sync or DirSync tools and there is little information about it available in the community, so hopefully this model can be considered for your synchronisation design.

Even though the AAD Sync component within the AAD Connect product is based on the Forefront Identity Manager (FIM) synchronisation service, it does not take on the same recovery techniques as FIM. For AAD Sync, you prepare two servers (ideally in different data centres) and install AAD Connect. Your primary server is configured to perform the main role of synchronising identities between your Active Directory and Azure Active Directory, and the second server is installed in much the same way but with the ‘enable staging mode’ setting selected. Both servers are independent and do not share any components such as SQL Server, and the second server performs the same tasks as the primary except for the following:

  • No exports occur to your on-premises Active Directory
  • No exports occur to Azure Active Directory
  • Password synchronisation and password writeback are disabled.

Should the primary server go offline for a long period of time or become unrecoverable, you can enable the second server by simply running the installation wizard again and disabling staging mode. When the task schedule next runs, it will perform a delta import and synchronisation and identify any differences between the state of the previous primary server and the current server.

Here are some other items you might want to consider with this design:

  • Configure the task schedule on the second server so that it runs soon after the primary server completes. By default the task schedule runs every 3 hours and launches at a time which is based on when it was installed, therefore the second server task schedule can launch up to a few hours after the primary server runs. Based on the average amount of work the primary server takes, configure the task schedule on the second server to launch say 5-10 minutes later
  • AAD Sync includes a tool called CSExportAnalyzer which displays changes that are staged to be exported to Active Directory or Azure Active Directory. This tool is useful to report on pending exports while the server is in ‘staging mode’
  • Consider using ‘custom sync groups’ which are located in your Active Directory domain. The default installation of the AAD Sync component will create the following groups locally on the server: ADSyncAdmins, ADSyncOperators, ADSyncBrowse and ADSyncPasswordSet. With more than one AAD Sync server, these groups need to be managed on the servers and kept in sync manually. Having the groups in your Active Directory will simplify this administration.

    NOTE: This feature is not yet working with the current AAD Connect download and this blog will be updated when working 100%.

The last two items will be detailed in future blogs.
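As an illustration of the CSExportAnalyzer item above, pending exports on the staging server can be dumped and reported on like this (the connector name and install path are placeholders for your environment):

[code language=”PowerShell”]
# Sketch: report pending exports on the staging-mode server.
# Connector name and Bin path are placeholders - adjust for your install.
cd "C:\Program Files\Microsoft Azure AD Sync\Bin"
# csexport serialises the connector space; /f:x keeps only objects with a pending export
.\csexport.exe "contoso.onmicrosoft.com - AAD" C:\Temp\pending.xml /f:x
# CSExportAnalyzer turns the pending exports into CSV for review
.\CSExportAnalyzer.exe C:\Temp\pending.xml > C:\Temp\pending.csv
[/code]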

Using a Proxy with Azure AD Sync Services

In this blog I am going to cover some tips and tricks for using Azure AD Sync Services with a proxy… including the specific URLs required for whitelisting, the proxy settings used during the installation, configuration and running of the tool, and a workaround for apps that do not support authenticating proxies.

URL Whitelisting

It is generally recommended to whitelist all the Office 365 URLs to bypass proxy infrastructure, as this provides the best performance and avoids issues with applications that are not compatible with authenticating proxies (OneDrive for Business client installations, Exchange Hybrid services, Azure AD Sync Services and so on). Although this is the easiest path to adoption and the least likely to encounter technical issues, it is not always possible. This is particularly true for security-conscious organisations, where whitelisting wildcard addresses may be undesirable.

If you want to be specific with the URLs required for Azure AD Sync Services, the following URLs must bypass proxy authentication:


Proxy Settings

When you run through the DirectorySyncTool.exe wizard to install and configure Azure AD Sync Services, at the point where you first enter your Azure AD credentials the wizard will use the proxy settings defined for the current logged on Windows user. In this instance, make sure you’ve configured your proxy settings in Internet Options (inetcpl.cpl) for the user running the installation.

In step 8 (Configure), the installation wizard connects to and configures Azure Active Directory. This step of the wizard attempts an outbound HTTPS connection using the proxy settings defined for the Azure AD Sync Services service account. This service account is either the one you specified during the installation (if you ran DirectorySyncTool.exe with the /serviceAccount parameter), or the one that was automatically created by the wizard.

I’ve previously written about my recommendations to specify a service account for the installation so that you know the credentials. In this case you can easily configure the proxy settings by launching inetcpl.cpl with the service account. For example:

runas /user:<domain>\<AADSync Service Account> "control.exe inetcpl.cpl"

Once the Azure AD Sync Services installation is complete, all synchronisation events are going to run under the context of the Azure AD Sync Services service account and will rely on the proxy settings defined in inetcpl.cpl.

AADSync with an authenticating Proxy

If for some reason you can’t bypass an authenticating proxy for AADSync, or you’re desperate to get AADSync up and running while you wait for the proxy admin to add the URLs to a whitelist (my scenario), CNTLM to the rescue! I used this recently to get Azure AD Sync Services working with an authenticating proxy and it’s as easy as:

  1. Download and install CNTLM on the AADSync server
  2. Configure the cntlm.ini with the proxy server and authentication details (you can save the account password or an NTLM hash, for those that are concerned about saving credentials in plain text)
  3. Start the CNTLM service
  4. Configure CNTLM as your proxy in Internet Settings (by default CNTLM listens locally on port 3128)
  5. Install and Configure AADSync
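For reference, a minimal cntlm.ini for step 2 looks something like this (all values are placeholders for your environment; Listen 3128 is CNTLM’s default local port):

[code language=”text”]
# cntlm.ini sketch - all values are placeholders
Username    svc-aadsync
Domain      CORP
Password    YourPassword      # or store an NTLM hash (PassNTLMv2) generated with 'cntlm -H'
Proxy       proxy.corp.local:8080
Listen      3128
[/code]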

ADSync Cmdlets

I really enjoyed the later versions of DirSync, which included a native PowerShell module to execute sync engine tasks and show some global configuration settings. Now that we are looking at moving over to the new tool, AADSync, there is a new module installed, but with very little reference material available on the web at the time of writing. I’ve outlined the names of the cmdlets below, but ‘Get-Help’ doesn’t offer any descriptions or examples as yet, so I’ve included some in this post.
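If you want to see the full list on your own server, the module can be enumerated directly:

[code language=”PowerShell”]
# List the cmdlets shipped in the new ADSync module
Import-Module ADSync
Get-Command -Module ADSync | Sort-Object Name | Select-Object Name
[/code]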



From browsing over these cmdlets we can see that there is much more functionality available to use than there was in the DirSync module equivalent. If we take nothing else away from this list, it’s that we can now not just run the engine but configure the tool itself.


Here are some nice examples of what we can achieve now that the ADSync module is available.

Example 1

Scenario: Create a custom rule to not sync users with X121Address=NoSync

[code language=”PowerShell”]
# Get the AD Connector
$ADConnector = (Get-ADSyncConnector | ? {$_.Type -eq "AD"})
# Create the Scope Filter Object
$scopefilter = New-Object Microsoft.IdentityManagement.PowerShell.ObjectModel.ScopeCondition
$scopefilter.Attribute = "x121Address"
$scopefilter.ComparisonValue = "NoSync"
$scopefilter.ComparisonOperator = "EQUAL"
# Create the Attribute Flow (constant "True" into cloudFiltered)
$AttributeFlowMappings = New-Object Microsoft.IdentityManagement.PowerShell.ObjectModel.AttributeFlowMapping
$AttributeFlowMappings.Source = "True"
$AttributeFlowMappings.Destination = "cloudFiltered"
$AttributeFlowMappings.FlowType = "constant"
$AttributeFlowMappings.ExecuteOnce = $False
$AttributeFlowMappings.ValueMergeType = "Update"
# Add the Scope Filter to a Scope Group
$scopefiltergroup = New-Object Microsoft.IdentityManagement.PowerShell.ObjectModel.ScopeConditionGroup
$scopefiltergroup.ScopeConditionList.Add($scopefilter)
# Create the Rule
$GUID = $ADConnector.Identifier.Guid
Add-ADSyncRule -Connector $GUID -Name "In from AD – User DoNotSyncFilter" -SourceObjectType user -TargetObjectType person -Direction inbound -AttributeFlowMappings $AttributeFlowMappings -LinkType Join -Precedence "1" -ScopeFilter $scopefiltergroup
[/code]

Example 2

Scenario: Add Additional Attributes to be imported from Active Directory

[code language=”PowerShell”]Get-ADSyncConnector | ? {$_.Type -eq "AD"} | Add-ADSyncConnectorAttributeInclusion -AttributeTypes employeeID[/code]

Example 3

Scenario: Adjust the attribute flow of UPN so that AD Mail Attribute flows to UPN in Office 365

[code language=”PowerShell”]
#Define the Flow Mapping
$Mapping = New-object Microsoft.IdentityManagement.PowerShell.ObjectModel.AttributeFlowMapping
$Mapping.Source = "mail"
$Mapping.Destination = "userPrincipalName"
$Mapping.FlowType = "Direct"
$Mapping.ExecuteOnce = $false
$Mapping.Expression = $null
$Mapping.ValueMergeType = "update"
#Get the AD Connector
$ADConnector = (Get-ADSyncConnector | ? {$_.Type -eq "AD"})
$GUID = $ADConnector.Identifier.Guid
#Create the Rule with higher precedence
Add-ADSyncRule -Connector $GUID -Direction Inbound -Name "In From AD – User UPN Flow" -SourceObjectType user -TargetObjectType person -AttributeFlowMappings $Mapping -Description "Map Mail to UPN in the Metaverse" -LinkType Join -Precedence 99
[/code]

Here is the user sync’d to the Metaverse without the attribute flow transformation (note the UPN value).


We run the PowerShell and preview the results


Commit the change and check with a Metaverse search.


Let’s check Azure Active Directory. It has the prefix.

Happy days!

Some of these cmdlets seem to still need a little TLC. I found that they didn’t always give the desired results even after committing the changes in the shell. We all love agile, so give it time and they should get fixed up, and there is always the GUI if you really have to.

Extending Yammer SSO to Support Users Without an Email Address


Yammer Enterprise is offered through the Microsoft Office 365 Enterprise plan. Deployment of Yammer Single Sign-On (SSO) for Office 365 users with a valid primary email address is a relatively simple and well documented process.

One of our customers had a requirement for Yammer as a social platform, however a large percentage of their workforce are not enabled for email services. In the ‘SSO Implementation FAQ‘ published by Microsoft, it suggests that it is possible to configure SSO support for user accounts that do not have an email address associated with them, however there isn’t any supporting documentation to go with it.

The process outlined here assumes that Yammer SSO has already been enabled for users with a valid primary email address, and that all user accounts have been configured with a publicly routable UserPrincipalName (UPN) suffix for logon. This blog post provides guidance for extending Yammer SSO to support users without an email address, which requires a custom claim configuration on ADFS and the Office 365 tenant.

ADFS Configuration

As in the image below, you should have an existing ‘Relying Party Trust’ configuration on ADFS if Yammer SSO is enabled for ordinary email enabled users.

Note: The ‘E-Mail Address’ in the right-hand ‘Outgoing Claim Type‘ column should be replaced with ‘SAML_SUBJECT’.

In order to extend support to users without a primary email address, the ‘samAccountName’ attribute will be used for the claim rule (you could also use the UserPrincipalName). The following four custom claim rules therefore need to be created and configured on the ‘Issuance Transform Rules‘ tab under the ‘Relying Party Trusts‘ node of the ADFS management console.

1. Remove the existing rule for ‘E-Mail-Addresses‘ under ‘Issuance Transform Rules‘
2. Add the following custom rules in the order specified below to ensure the logic flows

Rule 1: Check for Email Address
– Click on Add Rules and select custom rule
– Insert the following text and save

[code language=”text”]
@RuleName = "Check for Email"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
=> add(store = "Active Directory", types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"), query = ";mail;{0}", param = c.Value);
[/code]

Rule 2: Check for No Email Address
– Click on Add Rules and select custom rule
– Insert the following text and save

[code language=”text”]
@RuleName = "No email"
NOT EXISTS([Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress"])
=> add(Type = "http://emailCheck", Value = "NoEmail");
[/code]

Rule 3: If No Email Address Exists Use samAccountName Attribute
– Click on Add Rules and select custom rule
– Insert the following text and save

[code language=”text”]
@RuleName = "Send samAccountName for users without email"
c1:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
&& c2:[Type == "http://emailCheck", Value == "NoEmail"]
=> issue(store = "Active Directory", types = ("SAML_SUBJECT"), query = ";samAccountName;{0}", param = c1.Value);
[/code]

Rule 4: Use Primary Email Address if email address exists
– Click on Add Rules and select custom rule
– Insert the following text and save

[code language=”text”]
@RuleTemplate = "LdapClaims"
@RuleName = "Send email to Yammer"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
=> issue(store = "Active Directory", types = ("SAML_SUBJECT"), query = ";mail;{0}", param = c.Value);
[/code]

The custom rules will be listed in the order that they were created, as shown below:

Office 365 Tenant Configuration

You will need to raise a support request with Microsoft to set the ‘Allow Fake Email‘ option on the email domain being used for Yammer SSO. For all user accounts without a valid email address, the ‘Fake Email: true‘ flag will be set after authentication by ADFS, and the Microsoft Office 365 Support Engineer will be able to validate this for you.

Yammer Directory Synchronization Tool

Yammer DirSync is typically used for synchronising user account information between your Active Directory and Office 365 Yammer. Yammer DirSync does not officially support user accounts without a valid primary email address, as stated in the Yammer Directory Synchronization FAQ.

As such, the recommended way to do this would be to manually synchronise your user list to Yammer by using a CSV. To automate the synchronisation for user accounts without an email address, custom coding through the Yammer REST API would be required.

As documented in the Yammer configuration guide, Yammer DirSync only requires two attributes, GUID and mail, to be set on the user accounts for it to work. As a workaround it would be possible to populate the mail attribute in Active Directory with the ‘fake’ email address for the user accounts you would like to synchronise; however, this may not be a suitable approach for every environment.
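As a sketch of the CSV approach (the column headings and domain below are assumptions; verify them against the current Yammer bulk update CSV template before use), you could generate the user list from AD like this:

[code language=”PowerShell”]
# Sketch: export users without a mail attribute to CSV for a Yammer bulk update.
# Column names and the UPN-style address are assumptions - check Yammer's CSV template.
Import-Module ActiveDirectory
Get-ADUser -Filter {mail -notlike "*"} -Properties givenName, sn |
    Select-Object @{n='Email Address';e={"$($_.SamAccountName)@contoso.com"}},
                  @{n='Full Name';e={$_.Name}} |
    Export-Csv -Path .\YammerUsers.csv -NoTypeInformation
[/code]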

Failure Upgrading DirSync with a Remote SQL Instance

I’ve just recently come across an issue when performing the upgrade procedure for the Microsoft Azure Directory Sync tool with a remote SQL database. The procedure seems simple enough at first glance and is documented here.

To break down the process, it is only a few simple steps:

  1. Install the new DirSync:

Dirsync.exe /fullsql

  2. Click Next on the upgrade wizard until complete
  3. Run PowerShell:

Import-Module DirSync

  4. Run the following PowerShell cmdlet to update the backend database:

Install-OnlineCoexistenceTool -UseSQLServer -SqlServer <ServerName> -Upgrade -Verbose -ServiceCredential (Get-Credential)

The Issue

This particular issue will occur during the upgrade procedure at the Install-OnlineCoexistenceTool PowerShell step, with the following error:

[code language=”text”]
VERBOSE: Running InstallOnlineCoexistenceTool in Upgrade mode.
Install-OnlineCoexistenceTool : The SQL Server Instance specified during an upgrade must match the previously
configured SQL Server Instance. Expected SQL parameter for upgrade were Server: Instance:
At line:1 char:1
+ Install-OnlineCoexistenceTool -UseSQLServer -SqlServer servername …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Microsoft.Onlin…CoexistenceTool:InstallOnlineCoexistenceTool) [Install-OnlineCoexistenceTool], DirectorySyncInstallException
    + FullyQualifiedErrorId : 201,Microsoft.Online.Coexistence.PS.Install.InstallOnlineCoexistenceTool
[/code]

The first time I got this error, I assumed that I had provided incorrect syntax for the cmdlet and proceeded to try every variant possible. Nothing seemed to satisfy the shell, so I started to look elsewhere. The next step was to look at the FIM configuration settings listed in the registry that I knew of:


I found the two keys that I presumed the cmdlet was using for verification:

Server = <Server FQDN>

SQLInstance = <blank>

Based on these two key values I went back to my shell and tried to enter the syntax exactly as I could see it. I thought that maybe because my ‘SQLInstance’ value was empty, PowerShell was having trouble processing a null value in the cmdlet. To cut a long troubleshooting story short, it didn’t matter. I had stared at the cmdlet long enough and resigned myself to the fact that it wasn’t happy about values stored elsewhere, and I wasn’t going to find them any time soon.


There was an issue in previous versions of DirSync where the following two registry keys were not written when installed using the /FullSQL flag –



DirSync attempts to read these keys when performing the in-place upgrade to verify the SQL Server and Instance name, and then the upgrade fails when it cannot find them.


Copy value data from key –


Create a new string value of –


Paste the value data from above

Copy value data from key –


Create a new string value of –


Paste the value data from above (if any)

Note: For me this key was blank as the default instance was being used not a named instance.

Re-run cmdlet –

Install-OnlineCoexistenceTool -UseSQLServer –SqlServer <ServerName> -Upgrade -Verbose -ServiceCredential (Get-Credential)

Expected Output –

PS C:\Windows\system32> Install-OnlineCoexistenceTool -UseSQLServer -SqlServer <SQL Server NAME> -Upgrade -Verbose -ServiceCredential (Get-Credential)

cmdlet Get-Credential at command pipeline position 1

Supply values for the following parameters:


VERBOSE: Running InstallOnlineCoexistenceTool in Upgrade mode.

VERBOSE: Skipping Microsoft SQL Server 2012 SP1 Express installation.

VERBOSE: Upgrading Microsoft Forefront Identity Manager

VERBOSE: AbandonKeys: C:\Program Files\Windows Azure Active Directory Sync\SYNCBUS\Synchronization Service\Bin\miiskmu.exe /q /a

VERBOSE: AbandonKeys: C:\Program Files\Windows Azure Active Directory Sync\SYNCBUS\Synchronization Service\Bin\miiskmu.exe /q /a ExitCode:0

VERBOSE: Uninstalling msiexec.exe /quiet /l miisUninstall.log /x {C9139DEA-F758-4177-8E0F-AA5B09628136}


VERBOSE: Please wait while the Synchronization Engine is uninstalled.

VERBOSE: The Synchronization Engine was successfully removed. Msiexec /x returned 0.

VERBOSE: Installing msiexec.exe /quiet /l “C:\Program Files\Windows Azure Active Directory Sync\miissetup.log” /i “C:\Program Files\Windows Azure Active Directory Sync\SynchronizationService.msi” INSTALLDIR=”C:\Program Files\Windows Azure Active Directory Sync\SYNCBUS” storeserver=<servername> serviceaccount=<credentials> servicedomain=<domain> groupadmins=FIMSyncAdmins groupoperators=FIMSyncOperators groupbrowse=FIMSyncBrowse groupaccountjoiners=FIMSyncJoiners grouppasswordset=FIMSyncPasswordSet servicepassword=<Hidden>


VERBOSE: Please wait while the Synchronization Engine is installed.

VERBOSE: The installation of the Synchronization Engine was successful. Setup returned 0.

VERBOSE: The Synchronization Engine was installed successfully.

VERBOSE: Installing msiexec.exe /quiet /lv “C:\Program Files\Windows Azure Active Directory Sync\MSOIDCRLSetup.log” /i “C:\Program Files\Windows Azure Active Directory Sync\Msoidcli.msi” REBOOT=ReallySuppress…

VERBOSE: Please wait while the Microsoft Online Services Sign-in Assistant service is being installed.

VERBOSE: The Microsoft Online Services Sign-in Assistant service installation succeeded. Setup returned 0.

VERBOSE: The Microsoft Online Services Sign-in Assistant service was installed successfully.

VERBOSE: Installing msiexec.exe /quiet /lv “C:\Program Files\Windows Azure Active Directory Sync\dirsyncUpgrade.log” /i “C:\Program Files\Windows Azure Active Directory Sync\DirectorySync.msi” TARGETDIR=”C:\Program Files\Windows Azure Active Directory Sync\” REBOOT=ReallySuppress…

VERBOSE: Please wait while the Directory Sync tool is installed.

VERBOSE: The Directory Synchronization tool install succeeded. Setup returned 0.

VERBOSE: The Directory Synchronization tool was installed successfully.


Once again, a big thank you to Yaran at Microsoft PSS for helping me resolve this issue.

Claims-Based Federation Service using Microsoft Azure

In this post I will discuss how you can set up Microsoft Azure to provide federation services with claims authentication in the same way that an Active Directory Federation Services (ADFS) farm would on-premises. This can be achieved with an Azure subscription, Access Control Services (ACS) and an Azure Active Directory (AAD) instance. The key benefit of using Azure SaaS is that Microsoft has taken care of all the high availability and load scaling configuration, therefore you have no need to manage multiple ADFS servers to gain the same desired functionality.

If you don’t have an Azure subscription then signup for a free trial.

To have this process work in Azure we are going to need two functions:

  1. A service that supports claims-based protocols and acts as our token issuer – ACS
  2. A synchronised directory with connectivity to our claims issuer service – DirSync/AAD.

Azure Access Control Services (ACS) is a cloud-based federated identity provider which currently supports tokens issued from social identities such as Windows Live ID, Facebook, Google and Yahoo, as well as the often overlooked support for enterprise identities like Active Directory. ACS can do some great things with transitions between protocols and transformation of claims as it issues secure tokens from the identity provider to the relying party applications.

Azure Active Directory (AAD) will house the synchronized identities from the on-premises Active Directory and provide claims-based authentication as it sends secure tokens with embedded claims to ACS.

The Solution

This scenario will apply to the majority of organizations that want to map the identity attributes from a source Active Directory (LDAP) to the outgoing claims type for a single sign on (SSO) experience.

Step 1 – Get the Identity in the Cloud

We need either a new directory in Azure, or an existing Office 365 directory if you already have a tenant syncing (if so, skip to Step 2) –

  • Turn on Synchronization by selecting Directory Integration > ACTIVATE
  • Create a user account to authenticate from your directory synchronization tool to AAD
  • Download the directory sync tool
  • Install and configure the directory sync tool on a server that is joined to your local Active Directory domain, and then run an initial sync (for more information, go here)
  • Remember to enable password sync in the configuration wizard

Step 2 – Create an ACS Namespace

  • Select > New > App Services > Access Control > Quick Create
  • Give it a useful namespace prefix, for example ‘kloudfed’
  • Once it has finished creating, Select > Manage
  • Select > Application Integration and see the Endpoint References

Next we need to create the Azure Active Directory as the Identity Provider

Step 3 – Create AAD Endpoint Mapping

Currently ACS has no way of connecting to the information in AAD. To provide one, we create an application endpoint –

  • Select Active Directory > Federated Identity Instance > Applications > ADD AN APPLICATION
  • Select Add an application my organization is developing
  • Give it a name, for example ‘Access Control Services’

  • In APP Properties:
    • Sign-ON URL = < ACS Management Portal >
    • APP ID URI = < ACS Management Portal >

  • In Azure Management Portal > Open your newly created APP
  • Select > View Endpoints

  • Copy Federation Metadata document URL to add to ACS

Step 4 – Add AAD as an Identity Provider in ACS

With the Federation Metadata Endpoint configured this can be the Identity Provider defined in ACS –

  • In ACS Portal Under Trust Relationships Select > Identity Providers
  • Select Add
  • In the Add Identity Provider Page Select > WS-Federation identity provider (e.g. Microsoft AD FS 2.0) > Select Next
  • Give it a name and paste the Federation Metadata URL from the previous step

  • Click Save

Now we are ready to add a claims-aware application in ACS that requires federated identity authentication.

Step 5 – Create a Relying Party Application

I’m not a developer, but this is the quickest way I know to make a claims-aware application using my copy of Visual Studio Express –

  • Select File > New Project > Web > ASP.NET Web Application

  • Click OK
  • Click > Change Authentication

  • Select Organization Account > Select On-Premises
  • Enter the ACS WS-Federation Metadata URL and make up an Application Unique ID
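Behind the scenes, choosing the On-Premises option wires WS-Federation settings into the project’s web.config. As a rough sketch of what gets generated (the exact markup varies by Visual Studio version, and the namespace and realm below are placeholders for the values you entered):

```xml
<configuration>
  <system.identityModel>
    <identityConfiguration>
      <audienceUris>
        <!-- the Application Unique ID you made up above -->
        <add value="https://kloudfed/sampleapp" />
      </audienceUris>
    </identityConfiguration>
  </system.identityModel>
  <system.identityModel.services>
    <federationConfiguration>
      <!-- issuer is the ACS namespace WS-Federation endpoint; realm must match the audience URI -->
      <wsFederation passiveRedirectEnabled="true"
                    issuer="https://kloudfed.accesscontrol.windows.net/v2/wsfederation"
                    realm="https://kloudfed/sampleapp"
                    requireHttps="true" />
    </federationConfiguration>
  </system.identityModel.services>
</configuration>
```

The realm value here is what ACS will later need to recognise as the relying party application.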

Step 6 – Add Relying Party Application information to ACS

The ACS namespace won’t issue tokens until it trusts the application. In the ACS Management Portal –

  • Select Relying Party Applications > New
  • Important – the Realm is the App ID URI entered in the steps above

  • Generate a default claim transformation rule

Step 7 – Run a claims-aware application

Here is my web application, which redirects from the default URL (localhost) because it requires authentication from Azure Active Directory –

  • The Redirect takes me to Azure Active Directory login

  • Enter Username & Password
  • Then, after successfully authenticating, I’m taken to the trusted application redirect URL and we can already see claims information highlighted in yellow. Success!


Fiddler2 Capture

Let’s look at the web requests in a Fiddler2 capture to see what’s happening behind the scenes of my browser. Here are some condensed capture snippets –


  • 302 Redirect – localhost to …
  • 302 – to service: …
  • 302 – to …
  • 200 Result – token response with claims returned from …

Filtering through the decrypted capture (#40 above) we find the claims information. This is where we can validate whether the web application is receiving the expected information in the token (I’ve removed the double-encoded values from the capture for readability) –

We can see the token is SAML 2.0:


One example of a claim attribute:
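For illustration (this is a generic mock-up, not the actual capture), a SAML 2.0 assertion carries each claim as an Attribute element, roughly like this:

```xml
<Assertion xmlns="urn:oasis:names:tc:SAML:2.0:assertion" Version="2.0">
  <!-- AAD issues tokens with an sts.windows.net issuer for the tenant -->
  <Issuer>https://sts.windows.net/contoso.onmicrosoft.com/</Issuer>
  <AttributeStatement>
    <!-- claim names are URIs; the value below is made up for illustration -->
    <Attribute Name="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name">
      <AttributeValue>user@contoso.onmicrosoft.com</AttributeValue>
    </Attribute>
  </AttributeStatement>
</Assertion>
```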



If you’re after a claims-based federation service for SSO, and installing a bunch of new servers in your existing infrastructure is something you’re not keen on undertaking, then maybe Azure gets a look. In an industry where everything must be called by its acronym to be thought of as a serious entity, I hereby label Microsoft Azure Federation Services “MAFS”.

Through reading Kloud blog posts you now have solutions for creating a claims-based federation service in the cloud (MAFS) or an on-premises ADFS farm with Server 2012 R2 (both of which should only take you about 10 minutes to install!).

DirSync and Distribution Group Self Service Management

If you’re an Office 365 Exchange Online customer currently utilizing Directory Synchronization (DirSync) to synchronize an on-premises Active Directory with Azure Active Directory, you’ll be all too familiar with the limitations imposed around the management of distribution group membership. Namely, an Exchange Online user specified as the owner of a distribution group will not be able to manage the membership of that group through the standard Outlook Address Book interface, as detailed here.

In the background, if we think about this in relation to DirSync functionality, the group is being pushed from the on-premises Active Directory to Azure Active Directory in a one-way sync. Consequently the group object in Azure Active Directory is read-only, which explains the limitations that exist around group modification.

As distribution group self-service functionality has been in place for quite some time in the Exchange landscape, it often comes as a significant blow to businesses when they realize this functionality isn’t available by default in Exchange Online. The net result is either an accepted loss of functionality or the prospect of a significant increase in helpdesk calls to facilitate the modification of group membership in the on-premises Active Directory.

There are, however, a variety of ways to work around this issue. This is by no means an exhaustive list, but it gives some guidance and ideas around what’s possible:

Use the ‘Find Users, Contacts and Groups’ tool to allow group modification

For domain-joined computers it is possible to run the following command to fire up the built-in ‘Find Users, Contacts and Groups’ tool:

%systemroot%\system32\rundll32.exe dsquery.dll,OpenQueryWindow

This allows an on-premises Active Directory group to be searched for and modified by an end user, with the change synchronized back up to Azure Active Directory via DirSync.
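The same on-premises change can of course also be scripted. A minimal sketch using the ActiveDirectory RSAT module (the group and user names here are hypothetical):

```powershell
# Requires the RSAT ActiveDirectory module on a domain-joined machine
Import-Module ActiveDirectory

# Add a user to the on-premises distribution group; DirSync will push
# the updated membership up to Azure AD on its next synchronization cycle
Add-ADGroupMember -Identity 'DL-Sales' -Members 'jsmith'
```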


Pros:

  • No changes required to the Exchange/Office 365 configuration
  • The source of authority for all directory information remains on-premises

Cons:

  • Only works from domain-joined machines
  • Represents a change to the existing manner in which distribution group objects are modified using the Outlook Address Book interface

Move the Distribution Group Objects to the Azure Active Directory

It is possible to delete distribution groups from the on-premises Active Directory and recreate them in Azure Active Directory. By doing so, the groups created in Azure AD are writable and can consequently be modified using the standard Outlook Address Book functionality by an Exchange Online mailbox user.


Pros:

  • Allows distribution group membership to be modified using the existing Outlook Address Book functionality, and consequently means zero change to the way end users are used to working

Cons:

  • Requires distribution group objects to be moved to Azure Active Directory. This involves a level of change, risk and impact, which I will touch on in the points of consideration section below
  • Requires a change in the way that groups are managed moving forward; namely, security groups are managed in the on-premises Active Directory and distribution groups in Azure Active Directory

Points of Consideration

  • In a typical DirSync deployment the source of authority for all directory information is on-premises. If distribution groups are moved to Azure Active Directory, the source of authority is split between the on-premises Active Directory and Azure Active Directory. Whilst this isn’t necessarily a negative aspect of this option, it is a worthy point of consideration, not least because administration of groups will now be performed in both locations.
  • Consideration needs to be given to the implementation timing of this option. Whilst deleting and recreating distribution groups is a relatively straightforward change which can easily be scripted using PowerShell, there’s likely to be a period, albeit potentially small, between when the on-premises group is deleted and when it is recreated in Azure Active Directory. During this time any email addressed to the group will obviously NDR. This can potentially be mitigated if your organization uses a third-party solution for AV/AS which can hold inbound email for a short period until the distribution groups have been recreated in Azure Active Directory.
  • Consideration needs to be given as to when during the migration the distribution groups are moved. Typically this should be performed at the end of a migration, after all mailboxes have been migrated to Exchange Online. Obviously any remaining on-premises Exchange mailbox users will no longer be able to see or utilize distribution groups once they have been deleted and recreated in Azure Active Directory.
  • Changes made to Azure Active Directory are throttled by Microsoft, which could potentially impact the speed at which distribution groups can be created. Whilst it is possible to ask Microsoft to amend the throttle policy, this is still likely to be a factor.
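As a sketch of how the delete/recreate approach above could be scripted (this assumes remote PowerShell sessions to both on-premises Exchange and Exchange Online, and the group name is illustrative):

```powershell
# Capture the on-premises group and its membership first
$group   = Get-DistributionGroup -Identity 'DL-Sales'
$members = Get-DistributionGroupMember -Identity 'DL-Sales'

# Remove the on-premises group - this is the point at which
# mail addressed to the group will start to NDR
Remove-DistributionGroup -Identity 'DL-Sales' -Confirm:$false

# Recreate the group directly in Exchange Online (Azure AD) and restore membership
New-DistributionGroup -Name $group.Name -PrimarySmtpAddress $group.PrimarySmtpAddress
$members | ForEach-Object {
    Add-DistributionGroupMember -Identity $group.Name -Member $_.Name
}
```

Running the capture steps for all groups before deleting anything keeps the window in which mail NDRs as short as possible.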

Use a Forefront Identity Manager Management Agent for Office 365 to synchronize Distribution Groups

Use the FIM Management Agent (MA) for Office 365 to manage the provisioning and synchronization of groups between the on-premises Active Directory and Azure Active Directory.


Pros:

  • Allows distribution group membership to be modified using the existing Outlook Address Book functionality, and consequently means zero change to the way end users are used to working

Cons:

  • Requires a FIM sync engine license
  • Involves a level of complexity regarding the implementation and management of FIM


The option you choose here is very much dependent on your organization’s requirements, and each option carries with it a different set of pros and cons. In scenarios where an organization wants to retain the standard self-service functionality provided by the Outlook Address Book, the second option provides a good balance of functionality and low cost of implementation and ongoing management. The points of consideration for this option should, however, be weighed carefully prior to any attempt at implementation.



