Security Vulnerability Revealed in Azure Active Directory Connect


The existence of a new and potentially serious privilege escalation and password reset vulnerability in Azure Active Directory Connect (AADC) was recently made public by Microsoft.

https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnectsync-whatis

The problem can be fixed by upgrading to the latest available release of AADC, 1.1.553.0:

https://www.microsoft.com/en-us/download/details.aspx?id=47594

The Microsoft security advisory rates the issue as Important and was published on TechNet under reference number 4033453:

https://technet.microsoft.com/library/security/4033453.aspx#ID0EN

Azure Active Directory Connect, as we know, takes care of all operations related to the synchronization of identity information between on-premises environments and Azure Active Directory in the cloud. The tool is also the recommended successor to Azure AD Sync and DirSync.

Microsoft were quoted as saying…

The update addresses a vulnerability that could allow elevation of privilege if Azure AD Connect Password writeback is mis-configured during enablement. An attacker who successfully exploited this vulnerability could reset passwords and gain unauthorized access to arbitrary on-premises AD privileged user accounts.

When setting up the permission, an on-premises AD Administrator may have inadvertently granted Azure AD Connect with Reset Password permission over on-premises AD privileged accounts (including Enterprise and Domain Administrator accounts).

In this case, as stated by Microsoft, the risk is that a malicious administrator could reset the password of an Active Directory user using password writeback, allowing the administrator in question to gain privileged access to a customer’s on-premises Active Directory environment.

Password writeback allows Azure Active Directory to write passwords back to an on-premises Active Directory environment. It helps simplify the process of setting up and managing complicated on-premises self-service password reset solutions, and it also provides a rather convenient cloud-based means for users to reset their on-premises passwords.

Users can confirm their exposure to this vulnerability by checking whether the feature in question (password writeback) is enabled and whether AADC has been granted Reset Password permission over on-premises AD privileged accounts.

A further statement from Microsoft on this issue read…

If the AD DS account is a member of one or more on-premises AD privileged groups, consider removing the AD DS account from the groups.
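That mitigation can be sketched with the ActiveDirectory module as follows. This is a hypothetical sketch only: the connector account name and the group list are placeholders, and the real AD DS connector account should be confirmed in the AADC Synchronization Service Manager first.

```powershell
# Placeholder name for the AADC AD DS connector account -- confirm the real
# one in Synchronization Service Manager before running anything like this
$adDsAccount = "MSOL_0a1b2c3d4e5f"

foreach ($group in "Domain Admins", "Enterprise Admins") {
    # Only attempt removal if the connector account is actually a member
    $isMember = Get-ADGroupMember -Identity $group |
        Where-Object { $_.SamAccountName -eq $adDsAccount }
    if ($isMember) {
        Remove-ADGroupMember -Identity $group -Members $adDsAccount -Confirm:$false
    }
}
```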

CVE reference number CVE-2017-8613 was attributed to the vulnerability.

https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-8613

Resolving the ‘Double Auth’ prompt issue in ADFS with Azure AD Conditional Access MFA

As mentioned in my previous post, Using ADFS on-premises MFA with Azure AD Conditional Access, if you have implemented Azure AD Conditional Access to enforce MFA for all your Cloud Apps, and you are using the SupportsMFA=true parameter to direct MFA execution to your ADFS on-premises MFA server, you may have encountered what I call the ‘Double Auth’ prompt issue.

While this doesn’t happen across all Cloud Apps, you will see it on the odd occasion (in particular with the Intune Company Portal and the Azure AD PowerShell cmdlets), and it has the following symptoms:

  1. User signs into an Azure AD app (e.g. Azure AD PowerShell with Modern Auth support)
  2. User sees an auth prompt and enters their username, which redirects to ADFS
  3. User enters credentials and presses Enter
  4. It looks like the sign-in succeeds, but then ADFS reappears and the user is prompted to enter credentials again
  5. After the second successful attempt, the user is prompted for MFA as expected

 

DoubleAuth

Understanding the reason behind why this happens is reliant on two things:

  1. The background I provided in the blog post I referenced above, specifically that when SupportsMFA is being used, two requests to ADFS are sent by Azure AD instead of one as part of the authentication process when MFA is involved.
  2. The configuration of Azure AD’s prompt=login behaviour, which is discussed in this Microsoft Docs article.

So to delve into this, let’s crack out our trusty Fiddler tool to look at what’s happening:

DoubleAuth3a.png

Highlighted in the image above is the culprit. You’ll see in the request strings that the user is being sent to ADFS with two key parameters: wauth=… and wfresh=0. What is happening here is that this particular Azure AD application has decided that, as part of sign-in, it wants to ensure that ‘fresh credentials’ are provided (say, to ensure the correct user credentials are used). It does this by telling Azure AD to generate a request with prompt=login; however, as noted in the article referenced, because some legacy ADFS systems don’t understand this ‘modern’ parameter, the default behaviour is for Azure AD to pre-emptively translate the request into two ‘WS-Fed’ parameters that they can understand. In particular, wfresh=0 as per the WS-Fed specification means:

…  If specified as “0” it indicates a request for the IP/STS to re-prompt the user for authentication before issuing the token….

The problem of course is that ADFS sees the wfresh=0 parameter in both requests and will abide by that behaviour by prompting the user for credentials each time!

So, the fix for this is fairly simple and is in fact (very vaguely) called out in the Microsoft Docs article I’ve referenced above: ensure that Azure AD uses the NativeSupport configuration so that it sends the parameter as-is for ADFS to interpret, instead of pre-emptively translating it.

The specific command to run is:

Set-MsolDomainFederationSettings -DomainName yourFederatedDomain.com -PromptLoginBehavior NativeSupport

The prerequisite to this fix is to ensure that you are either running:

  • ADFS 2016
  • ADFS 2012 R2 with the July 2016 update rollup

Once this update is applied (remember that these DomainFederationSettings changes can take up to 15-30 mins), you’ll be able to see the difference via Fiddler: ADFS now receives a prompt=login parameter instead, and only for the first request, so the overall experience is that the credential prompt only occurs once.
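To confirm the change has propagated, the setting can be read back with the MSOnline module (a sketch; the domain name is a placeholder, and older MSOnline versions may not return the PromptLoginBehavior property):

```powershell
# Requires the MSOnline module and an authenticated session
Connect-MsolService

Get-MsolDomainFederationSettings -DomainName yourFederatedDomain.com |
    Select-Object PromptLoginBehavior, SupportsMfa
```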

DoubleAuth4.png

Hope that saves a few hairs for anyone out there who’s come across this issue!

[UPDATE 12/09/17]  Looks like there’s a Microsoft KB article around this issue now!  Helpful for those who need official references:  https://support.microsoft.com/en-us/help/4037806/federated-users-in-azure-active-directory-may-have-to-sign-in-two-time

Using ADFS on-premises MFA with Azure AD Conditional Access

With the recent announcement of General Availability of the Azure AD Conditional Access policies in the Azure Portal, it is a good time to reassess your current MFA policies particularly if you are utilising ADFS with on-premises MFA; either via a third party provider or with something like Azure MFA Server.

Prior to conditional MFA policies being possible, when utilising on-premises MFA with Office 365 and/or Azure AD, the MFA rules were generally enabled on the ADFS relying party trust itself. The main limitation with this of course is the inability to define different MFA behaviours for the various services behind that relying party trust. That is, within Office 365 (Exchange Online, SharePoint Online, Skype for Business Online etc.) or through different Azure AD Apps that may have been added via the app gallery (e.g. ServiceNow, SalesForce etc.). In some circumstances you may have been able to define some level of granularity utilising custom authorisation claims, such as bypassing MFA for ActiveSync and legacy authentication scenarios, but that method was reliant on special client headers or the authentication endpoints that were being used and hence was quite limited in its use.

Now with Azure AD Conditional Access policies, the definition and logic of when to trigger MFA can, and should, be driven from the Azure AD side given the high level of granularity and varying conditions you can define. This doesn’t mean though that you can’t keep using your on-premises ADFS server to perform the MFA, you’re simply letting Azure AD decide when this should be done.

In this article I’ll show you the method I like to use to ‘migrate’ from on-premises MFA rules to Azure AD Conditional Access.  Note that this is only applicable for the MFA rules for your Azure AD/Office 365 relying party trust.  If you are using ADFS MFA for other SAML apps on your ADFS farm, they will remain as is.

Summary

At a high level, the process is as follows:

  1. Configure Azure AD to pass ‘MFA execution’ to ADFS using the SupportsMFA parameter
  2. Port your existing ADFS MFA rules to an Azure AD Conditional Access (CA) Policy
  3. Configure ADFS to send the relevant claims
  4. “Cutover” the MFA execution by disabling the ADFS MFA rules and enabling the Azure AD CA policy

The ordering here is important, as by doing it like this, you can avoid accidentally forcing users with a ‘double MFA’ prompt.

Step 1:  Using the SupportsMFA parameter

The crux of this configuration is the use of the SupportsMFA parameter within your MSOLDomainFederationSettings configuration.

Setting this parameter to True will tell Azure AD that your federated domain is running an on-premises MFA capability and that whenever it determines a need to perform MFA, it is to send that request to your STS IDP (i.e. ADFS) to execute, instead of triggering its own ‘Azure Cloud MFA’.

To perform this step is a simple MSOL PowerShell command:

Set-MsolDomainFederationSettings -DomainName yourFederatedDomain.com -SupportsMFA $true

Pro Tip: This setting can take up to 15-30 mins to take effect, so make sure you factor this into your change plan. If you don’t wait for it to kick in before cutting over, your users will get ‘double MFA’ prompts.
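You can confirm the setting has taken effect by reading it back (a sketch using the MSOnline module; the domain name is a placeholder):

```powershell
# Requires the MSOnline module and an authenticated session
Connect-MsolService

# Should return True once the change has propagated
(Get-MsolDomainFederationSettings -DomainName yourFederatedDomain.com).SupportsMfa
```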

Step 2:  Porting your existing MFA Rules to Azure AD Conditional Access Policies

There’s a whole article in itself talking about what Azure AD CA policies can do nowadays, but for our purposes let’s use the two most common examples of MFA rules:

  1. Bypass MFA for users that are a member of a group
  2. Bypass MFA for users on the internal network*

Item 1 is pretty straightforward; just ensure your Azure AD CA policy has the following:

  • Assignment – Users and Groups:
    • Include:  All Users
    • Exclude:  Bypass MFA Security Group  (simply reuse the one used for ADFS if it is synced to Azure AD)

MFABypass1

Item 2 requires the use of the Trusted Locations feature. Note that at the time of writing, this feature is still the ‘old’ MFA Trusted IPs feature hosted in the Azure Classic Portal. Note*: if you are using Windows 10 Azure AD Joined machines this feature doesn’t work. Why this is the case will be an article in itself, so I’ll add a link here when I’ve written that up.

So within your Azure AD CA policy do the following:

  • Conditions – Locations:
    • Include:  All Locations
    • Exclude:  All Trusted IPs

MFABypass2.png

Then make sure you click on Configure all trusted locations to be taken to the Azure Classic Portal. From there you must set Skip multi-factor authentication for requests from federated users on my intranet.

MFABypass3.png

This effectively tells Azure AD that a ‘trusted location’ is any authentication request that comes in with an InsideCorporateNetwork claim.

Note:  If you don’t use ADFS or an IDP that can send that claim, you can always use the actual ‘Trusted IP addresses’ method.

Now you can define exactly which Azure AD apps you want MFA to be enabled for, instead of all of them as you had originally.

MFABypass7.png

Pro Tip: If you are going to enable MFA on All Cloud Apps to start off with, check the end of this article for some extra caveats you should consider, else you’ll start breaking things.

Finally, to make this Azure AD CA policy actually perform MFA, set the access controls:

MFABypass8.png

For now, don’t enable the policy just yet as there is more prep work to be done.

Step 3:  Configure ADFS to send all the relevant claims

So now that Azure AD is ready for us, we have to configure ADFS to actually send the appropriate claims across to ‘inform’ it of what is happening or what it is doing.

The first step is to make sure we send the InsideCorporateNetwork claim so Azure AD can apply the ‘bypass for all internal users’ rule. This is well documented everywhere, but the short version is: within your Microsoft Office 365 Identity Platform relying party trust in ADFS, add a new Issuance Transform Rule to pass through the Inside Corporate Network claim:

MFABypass4
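Created with the ‘Pass Through or Filter an Incoming Claim’ rule template, the resulting rule looks roughly like this in ADFS claim rule language:

```
@RuleTemplate = "PassThroughClaims"
@RuleName = "Pass through Inside Corporate Network claim"
c:[Type == "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork"]
 => issue(claim = c);
```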

Fun fact: the Inside Corporate Network claim is automatically generated by ADFS when it detects that the authentication was performed on the internal ADFS server, rather than through the external ADFS proxy (i.e. WAP). This is why it’s a good idea to always use an ADFS proxy as opposed to simply reverse proxying your ADFS. Without it you can’t easily tell whether it was an ‘internal’ or ‘external’ authentication request (plus it’s more secure).

The other important claim to send through is the authnmethodsreferences claim.  Now you may already have this if you were following some online Microsoft Technet documentation when setting up ADFS MFA.  If so, you can skip this step.

This claim is what is generated when ADFS successfully performs MFA.  So think of it as a way for ADFS to tell Azure AD that it has performed MFA for the user.

MFABypass6
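In claim rule language, that pass-through rule is roughly:

```
@RuleTemplate = "PassThroughClaims"
@RuleName = "Pass through authnmethodsreferences claim"
c:[Type == "http://schemas.microsoft.com/claims/authnmethodsreferences"]
 => issue(claim = c);
```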

Step 4: “Cutover” the MFA execution

So now that everything is prepared, the ‘cutover’ can be performed by doing the following:

  1. Disable the MFA rules on the ADFS Relying Party Trust
    Set-AdfsRelyingPartyTrust -TargetName "Microsoft Office 365 Identity Platform" -AdditionalAuthenticationRules $null
  2. Enable the Azure AD CA Policy
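Before clearing the rules in step 1, it’s worth capturing the existing ADFS MFA rules so the change can be rolled back if needed (a sketch; the backup path is a placeholder):

```powershell
# Save the current additional authentication rules to a file first
$rpt = Get-AdfsRelyingPartyTrust -TargetName "Microsoft Office 365 Identity Platform"
$rpt.AdditionalAuthenticationRules | Out-File C:\Temp\O365-MFA-Rules-Backup.txt

# To roll back later:
# Set-AdfsRelyingPartyTrust -TargetName "Microsoft Office 365 Identity Platform" `
#     -AdditionalAuthenticationRules (Get-Content C:\Temp\O365-MFA-Rules-Backup.txt -Raw)
```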

Now if it all goes as planned, what should happen is this:

  1. User attempts sign into an Azure AD application.  Since their domain is federated, they are redirected to ADFS to sign in.
  2. User will perform standard username/password authentication.
    • If internal, this is generally ‘SSO’ with Windows Integrated Auth (WIA). Most importantly, this user will get an ‘InsideCorporateNetwork’ = true claim
    • If external, this is generally a Forms Based credential prompt
  3. Once successfully authenticated, they will be redirected back to Azure AD with a SAML token.  Now is actually when Azure AD will assess the CA policy rules and determines whether the user requires MFA or not.
  4. If they do, Azure AD actually generates a new ADFS sign in request, this time specifically stating via the wauth parameter to use multipleauthn. This will effectively tell ADFS to execute MFA using its configured providers
  5. Once the user successfully completes MFA, they will go back to Azure AD with this new SAML token that contains a claim telling Azure AD that MFA has now been performed and subsequently lets the user through

This is what the above flow looks like in Fiddler:

MFABypass9.png

This is what your end-state SAML token should look like as well:

MFABypass10

The main takeaway is that Step 4 is the new auth flow that is introduced by moving MFA evaluation into Azure AD. Prior to this, step 2 would have simply performed both username/password authentication and MFA in the same instance rather than over two requests.

Extra Considerations when enabling MFA on All Cloud Apps

If you decide to take an ‘exclusion’ approach to MFA enforcement for Cloud Apps, be very careful with this. In fact you’ll even see Microsoft giving you a little extra warning about it.

MFABypass12

The main difference with taking this approach compared to just doing MFA enforcement at the ADFS level is that you are now enforcing MFA on all cloud identities as well!  This may very well unintentionally break some things, particularly if you’re using ‘cloud identity’ service accounts (e.g. for provisioning scripts or the like).  One thing that will definitely break is the AADConnect account that is created for directory synchronisation.

So at a very minimum, make sure you remember to add the On-Premises Directory Synchronization Service Account(s) into the exclusion list for your Azure AD MFA CA policy.
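To build that exclusion list, the sync account(s) can be located in Azure AD (a sketch using the MSOnline module; the display-name filter is AADConnect’s default naming, which may differ in your tenant):

```powershell
Connect-MsolService

# AADConnect creates its cloud sync account with this display name by default
Get-MsolUser -All |
    Where-Object { $_.DisplayName -like "On-Premises Directory Synchronization*" } |
    Select-Object UserPrincipalName, DisplayName
```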

The very last thing to call out is that some Azure AD applications, such as the Intune Company Portal and the Azure AD PowerShell cmdlets, can cause a ‘double ADFS prompt’ when MFA evaluation is being done in Azure AD. The reason for this and the fix are covered in my next article, Resolving the ‘double auth’ prompt issue with Azure AD Conditional Access MFA and ADFS, so make sure you check that out as well.

 

Automate Secondary ADFS Node Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

Additional nodes in an ADFS farm are required to provide redundancy in case your primary ADFS node goes offline. This ensures your ADFS service is still up and servicing all incoming requests. Additional nodes also help in load balancing the incoming traffic, which provides a better user experience in cases of high authentication traffic.

Overview

Once an ADFS farm has been created, adding additional nodes is quite simple and mostly relies on the same concepts for creating the ADFS farm. I would suggest reading my previous blog Automate ADFS Farm Installation and Configuration as some of the steps we will use in this blog were documented in it.

In this blog, I will show how to automatically provision a secondary ADFS node to an existing ADFS farm. The learnings in this blog can be easily used to deploy more ADFS nodes automatically, if needed.

Install ADFS Role

After provisioning a new Azure virtual machine, we need to install the Active Directory Federation Services role on it.  To do this, we will use the same Desired State Configuration (DSC) script that was used in the blog Automate ADFS Farm Installation and Configuration. Please refer to the section Install ADFS Role in the above blog for the steps to create the DSC script file InstallADFS.ps1.

Add to an existing ADFS Farm

Once the ADFS role has been installed on the virtual machine, we will create a Custom Script Extension (CSE) to add it to the ADFS farm.

In order to do this, we need the following:

  • the certificate that was used to create the ADFS farm
  • the ADFS service account credentials that were used to create the ADFS farm

Once the above prerequisites have been met, we need a method for making the files available to the CSE. I documented a neat trick to “sneak-in” the certificate and password files onto the virtual machine by using Desired State Configuration (DSC) package files in my previous blog. Please refer to Automate ADFS Farm Installation and Configuration under the section Create ADFS Farm for the steps.

Also note that for adding the node to the ADFS farm, the domain user credentials are not required. The certificate file will be named adfs_certificate.pfx and the file containing the encrypted ADFS service account password will be named adfspass.key.

Assuming that the prerequisites have been satisfied and the files have been “sneaked” onto the virtual machine, let’s proceed to creating the CSE.

Open Windows PowerShell ISE and paste the following:

param (
  $DomainName,
  $PrimaryADFSServer,
  $AdfsSvcUsername
)

The above shows the parameters that need to be passed to the CSE where

$DomainName is the name of the Active Directory domain
$PrimaryADFSServer is the hostname of the primary ADFS server
$AdfsSvcUsername is the username of the ADFS service account

Save the file with a name of your choice (do not close the file as we will be adding more lines to it). I named my script AddToADFSFarm.ps1

Next, we need to define a variable that will contain the path to the directory where the certificate file and the file containing the encrypted adfs service account password are stored. Also, we need a variable to hold the key that was used to encrypt the adfs service account password. This will be required to decrypt the password.

Add the following to the CSE file

Next, we need to decrypt the encrypted adfs service account password.

Now, we need to import the certificate into the local computer certificate store. To make things simple, when the certificate was exported from the primary ADFS server, it was encrypted using the adfs service account password.

After importing the certificate, we will read it to get its thumbprint.

Up until now, the steps are very similar to creating an ADFS farm. However, below is where they diverge.

Add the following lines to add the virtual machine to the existing ADFS farm
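The elided steps above can be sketched end-to-end as follows. This is a sketch only: the file layout and AES key are carried over from the farm-creation post, and the certificate subject assumed here is the fs.adfsfarm.com certificate used there.

```powershell
# Path to the files "sneaked in" via the DSC package, and the AES key
# used to encrypt the ADFS service account password (assumed values)
$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

# Decrypt the ADFS service account password and build its credential
$AdfsSvcPassword = Get-Content ($localpath + "adfspass.key") |
    ConvertTo-SecureString -Key $Key
$AdfsSvcCreds = New-Object System.Management.Automation.PSCredential `
    ("$DomainName\$AdfsSvcUsername", $AdfsSvcPassword)

# Import the farm certificate (exported encrypted with the same password)
# and keep the returned object so we can read its thumbprint
$cert = Import-PfxCertificate -FilePath ($localpath + "adfs_certificate.pfx") `
    -CertStoreLocation Cert:\LocalMachine\My -Password $AdfsSvcPassword

# Join this server to the existing farm as a secondary node
Add-AdfsFarmNode -CertificateThumbprint $cert.Thumbprint `
    -PrimaryComputerName $PrimaryADFSServer `
    -ServiceAccountCredential $AdfsSvcCreds -OverwriteConfiguration
```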

You now have a custom script extension file that will add a virtual machine as a secondary node to an existing ADFS Farm.

Below is the full CSE

All that is missing now is the method to bootstrap the scripts described above (InstallADFS.ps1 and AddToADFSFarm.ps1) using Azure Resource Manager (ARM) templates.

Below is part of an ARM template that can be added to your existing template to install the ADFS role on a virtual machine and then add the virtual machine as a secondary node to the ADFS farm

In the above ARM template, the parameter ADFS02VMName refers to the hostname of the virtual machine that will be added to the ADFS Farm.

Listed below are the variables that have been used in the ARM template above

The above method can be used to add as many nodes to the ADFS farm as needed.

I hope this comes in handy when creating an ARM template to automatically deploy an ADFS Farm with additional nodes.

Automate ADFS Farm Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

In this multi-part blog, I will be showing how to automatically install and configure a new ADFS Farm. We will accomplish this using Azure Resource Manager templates, Desired State Configuration scripts and Custom Script Extensions.

Overview

We will use Azure Resource Manager to create a virtual machine that will become our first ADFS Server. We will then use a desired state configuration script to join the virtual machine to our Active Directory domain and to install the ADFS role. Finally, we will use a Custom Script Extension to install our first ADFS Farm.

Install ADFS Role

We will be using the xActiveDirectory and xPendingReboot experimental DSC modules.

Download these from

https://gallery.technet.microsoft.com/scriptcenter/xActiveDirectory-f2d573f3

https://gallery.technet.microsoft.com/scriptcenter/xPendingReboot-PowerShell-b269f154

After downloading, unzip the files and place the contents in the PowerShell modules directory located at $env:ProgramFiles\WindowsPowerShell\Modules (by default, this is C:\Program Files\WindowsPowerShell\Modules).

Open your Windows PowerShell ISE and let’s create a DSC script that will join our virtual machine to the domain and also install the ADFS role.

Copy the following into a new Windows PowerShell ISE file and save it with a filename of your choice (I saved mine as InstallADFS.ps1).

In the above, we are declaring some mandatory parameters and some variables that will be used within the script

$MachineName is the hostname of the virtual machine that will become the first ADFS server

$DomainName is the name of the domain where the virtual machine will be joined

$AdminCreds contains the username and password for an account that has permissions to join the virtual machine to the domain

$RetryCount and $RetryIntervalSec hold values that will be used to  check if the domain is available

We need to import the experimental DSC modules that we had downloaded. To do this, add the following lines to the DSC script

Import-DscResource -Module xActiveDirectory, xPendingReboot

Next, we need to convert the supplied $AdminCreds into a domain\username format. This is accomplished by the following lines (the converted value is held in $DomainCreds )
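That conversion is typically a one-liner like the following (a sketch; it assumes $AdminCreds carries the bare username):

```powershell
# Re-wrap the supplied credential as DOMAIN\username so the domain join
# and the AD availability checks authenticate against the right domain
[System.Management.Automation.PSCredential]$DomainCreds = `
    New-Object System.Management.Automation.PSCredential `
    ("${DomainName}\$($AdminCreds.UserName)", $AdminCreds.Password)
```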

Next, we need to tell DSC that the command needs to be run on the local computer. This is done by the following line (localhost refers to the local computer)

Node localhost

We need to tell the LocalConfigurationManager that it should reboot the server if needed, continue with the configuration after the reboot, and apply the settings only once. (DSC can apply a setting and constantly monitor it to check that it has not been changed; if the setting is found to be changed, DSC can re-apply it. In our case we will not do this, we will apply the settings just once.)
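Inside the Node block, that translates to an LCM settings block along these lines (a sketch):

```powershell
LocalConfigurationManager
{
    RebootNodeIfNeeded = $true                     # reboot when a resource requires it
    ActionAfterReboot  = 'ContinueConfiguration'   # resume this configuration after reboot
    ConfigurationMode  = 'ApplyOnly'               # apply once, don't re-enforce
}
```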

Next, we need to check if the Active Directory domain is ready. For this, we will use the xWaitForADDomain function from the xActiveDirectory experimental DSC module.
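A typical use of that resource looks like this (a sketch; $DomainCreds, $RetryCount and $RetryIntervalSec come from the script parameters and variables above):

```powershell
xWaitForADDomain DscForestWait
{
    DomainName           = $DomainName        # domain we are waiting for
    DomainUserCredential = $DomainCreds       # DOMAIN\username credential
    RetryCount           = $RetryCount        # how many times to re-check
    RetryIntervalSec     = $RetryIntervalSec  # seconds between checks
}
```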

Once we know that the Active Directory domain is available, we can go ahead and join the virtual machine to the domain.

The JoinDomain function depends on xWaitForADDomain; if xWaitForADDomain fails, JoinDomain will not run.

Once the virtual machine has been added to the domain, it needs to be restarted. We will use xPendingReboot function from the xPendingReboot experimental DSC module to accomplish this.

Next, we will install the ADFS role on the virtual machine
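Installing the role is a standard WindowsFeature resource (a sketch; the DependsOn value assumes the reboot resource above was named Reboot1):

```powershell
WindowsFeature InstallADFS
{
    Ensure    = "Present"
    Name      = "ADFS-Federation"          # feature name for the ADFS role
    DependsOn = "[xPendingReboot]Reboot1"  # assumed name of the reboot resource
}
```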

Our script has now successfully added the virtual machine to the domain and installed the ADFS role on it. Next, create a zip file with InstallADFS.ps1 and upload it to a location that Azure Resource Manager can access (I would recommend uploading to GitHub). Include the xActiveDirectory and xPendingReboot experimental DSC module directories in the zip file as well. Also add a folder called Certificates inside the zip file and put the ADFS certificate and the encrypted password files (discussed in the next section) inside the folder.

In the next section, we will configure the ADFS Farm.

The full InstallADFS.ps1 DSC script is pasted below

Create ADFS Farm

Once the ADFS role has been installed, we will use Custom Script Extensions (CSE) to create the ADFS farm.

One of the requirements to configure ADFS is a signed certificate. I used a 90 day trial certificate from Comodo.

There is a trick that I am using to make my certificate available on the virtual machine. If you bootstrap a DSC script to your virtual machine in an Azure Resource Manager template, the script along with all the non out-of-box DSC modules have to be packaged into a zip file and uploaded to a location that ARM can access. ARM will then download the zip file, unzip it, and place all directories inside the zip file into $env:ProgramFiles\WindowsPowerShell\Modules (C:\Program Files\WindowsPowerShell\Modules). ARM assumes the directories are PowerShell modules and puts them in the appropriate directory.

I am using this feature to sneak my certificate on to the virtual machine. I create a folder called Certificates inside the zip file containing the DSC script and put the certificate inside it. Also, I am not too fond of passing plain passwords from my ARM template to the CSE, so I created two files, one to hold the encrypted password for the domain administrator account and the other to contain the encrypted password of the adfs service account. These two files are named adminpass.key and adfspass.key and will be placed in the same Certificates folder within the zip file.

I used the following to generate the encrypted password files

AdminPlainTextPassword and ADFSPlainTextPassword are the plain text passwords that will be encrypted.

$Key is used to convert the secure string into an encrypted standard string. Valid key lengths are 16, 24, or 32 bytes.

For this blog, we will use

$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)
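Under those assumptions, generating the two files can be sketched like this (the plain-text passwords are placeholders):

```powershell
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

# Encrypt each plain-text password with the AES key and write it to a file
"AdminPlainTextPassword" | ConvertTo-SecureString -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File .\adminpass.key

"ADFSPlainTextPassword" | ConvertTo-SecureString -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File .\adfspass.key
```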

Open Windows PowerShell ISE and paste the following (save the file with a name of your choice. I saved mine as ConfigureADFS.ps1)

param (
 $DomainName,
 $DomainAdminUsername,
 $AdfsSvcUsername
)

These are the parameters that will be passed to the CSE

$DomainName is the name of the Active Directory domain
$DomainAdminUsername is the username of the domain administrator account
$AdfsSvcUsername is the username of the ADFS service account

Next, we will define the value of the Key that was used to encrypt the password and the location where the certificate and the encrypted password files will be placed

$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Now, we have to read the encrypted passwords from the adminpass.key and adfspass.key files and then convert them into a domain\username format.

Next, we will import the certificate into the local computer certificate store. We will mark the certificate exportable and set the password same as the domain administrator password.

In the above, after the certificate is imported, $cert is used to hold the certificate thumbprint.

Next, we will configure the ADFS Farm

The ADFS Federation Service display name is set to “Active Directory Federation Service” and the Federation Service Name is set to fs.adfsfarm.com.
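The farm-creation call itself is along these lines (a sketch; $cert holds the thumbprint and $AdfsSvcCreds is the decrypted service account credential from the steps above):

```powershell
# Create the first node of the farm using the imported certificate,
# the ADFS service account, and the chosen federation service name
Install-AdfsFarm -CertificateThumbprint $cert `
    -FederationServiceDisplayName "Active Directory Federation Service" `
    -FederationServiceName "fs.adfsfarm.com" `
    -ServiceAccountCredential $AdfsSvcCreds `
    -OverwriteConfiguration
```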

Upload the CSE to a location that Azure Resource Manager can access (I uploaded my script to GitHub)

The full ConfigureADFS.ps1 CSE is shown below

Azure Resource Manager Template Bootstrapping

Now that the DSC and CSE scripts have been created, we need to add them in our ARM template, straight after the virtual machine is provisioned.

To add the DSC script, create a DSC extension and link it to the DSC Package that was created to install ADFS. Below is an example of what can be used

The extension will run after the ADFS virtual machine has been successfully created (referred to as ADFS01VMName)

The MachineName, DomainName and domain administrator credentials are passed to the DSC extension.

Below are the variables that have been used in the json file for the DSC extension (I have listed my GitHub repository location)

Next, we have to create a Custom Script Extension to link to the CSE for configuring ADFS. Below is an example that can be used

The CSE depends on the ADFS virtual machine being successfully provisioned and the DSC extension that installs the ADFS role to have successfully completed.

The DomainName, Domain Administrator Username and the ADFS Service Username are passed to the CSE script

The following contains a list of the variables being used by the CSE (the example below shows my GitHub repository location)

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"ConfigureADFSScriptUrl": "[concat(variables('repoLocation'),'ConfigureADFS.ps1')]",

That’s it, folks! You now have an ARM template that can be used to automatically install the ADFS role and then configure a new ADFS Farm.

In my next blog, we will explore how to add another node to the ADFS Farm and we will also look at how we can automatically create a Web Application Proxy server for our ADFS Farm.

Azure AD Application SSO and Provisioning – Things to consider

I’ve had the opportunity to work on a couple of customer engagements recently integrating SaaS based cloud applications with Azure Active Directory, one being against a cloud-only Azure AD tenant and the other federated with on-premises Active Directory using ADFS. The Azure AD Application Gallery now has over 2,700 applications listed which provide a supported and easy process to integrate applications with Azure AD, although not every implementation is the same. Most of them have a prescribed tutorial on how to perform the integration (listed here), while some application vendors have their own guides.

This blog won’t describe what is Single Sign-On (SSO) or User Provisioning with Azure AD (which is detailed here), but rather to highlight some things to consider when you start planning your application integrations with Azure AD.

User provisioning into the application

Azure AD supports user provisioning and de-provisioning into some target SaaS applications based on changes made in Windows Server Active Directory and/or Azure AD. Organisations will generally be managing user accounts in these SaaS applications manually, using scripts, or via some other automated method. Some notes about provisioning in Azure AD:

  • ‘Featured’ apps in the Azure AD Application Gallery support automatic provisioning and de-provisioning. A privileged service account with administrative permissions within the SaaS application is required to allow Azure AD the appropriate access
  • Some applications (e.g. Lucidchart and Aha!) can provision new users on their own; this is managed directly by the application with no control from Azure AD. Some applications (e.g. Lucidchart) will also automatically apply a license to the new user account, which makes correct user assignment to the application important. Other applications (e.g. Aha!) will automatically create the new accounts but not assign a license or provide access to any information; license assignment in this case would be a manual task for an administrator in the target SaaS application
  • All other applications that do not provide automatic provisioning require the user accounts to already exist within the target application. User accounts will therefore need to be created manually, by script, or through another form of Identity Management. You need to ensure you know how the application matches the user from Azure AD so that accurate matching can be performed. For example, the ‘UserPrincipalName’ value from the Azure AD user account matches the ‘NameID’ value within the application

Role mapping between Azure AD and the application

Access to applications can be assigned either directly against user accounts in Azure AD or by using groups. Using groups is recommended to ease administration, with the simplest form using a single Security Group to allow users access to the application. Some applications like Splunk support the capability to assign user access based on roles, which is performed using ‘Role Mapping’. This provides the capability to have a number of Security Groups represent different roles; assigning users to these groups in Azure AD not only enables access to the application but also assigns the user’s role. Consider how you manage user memberships to these groups, how they are named so that they are easily identified for management, and how the application knows about the groups. Splunk, for example, requires groups to be created using the ‘Group Object Id’ of the Azure AD group and not its name, as shown in this example:

The Group Object Id for groups can be found by going to your Directory Page and then navigating to the group whose Object Id is to be retrieved.
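As an alternative to clicking through the portal, the Object Id can also be retrieved with the Azure AD PowerShell (MSOnline) module. A quick sketch (the group name below is hypothetical, substitute your own):

```powershell
# Connect to the tenant with the MSOnline module
Connect-MsolService

# 'Splunk-Admins' is a hypothetical group name used for illustration
Get-MsolGroup -SearchString "Splunk-Admins" | Select-Object DisplayName, ObjectId
```

The `ObjectId` value returned is what the application (Splunk in this example) needs to be configured with.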


User interface changes

Some applications have a modified interface when SSO is enabled, allowing users to select whether to log in with userID/password credentials or a federated login. You need to consider end-user communications for any sudden changes to the application that users may face, and let them know which option they should select. For example, the Aha! application changes from this:

to this when SSO is enabled:

Have a backout plan

In addition to the point above, some applications require an all-or-nothing approach when SSO is enabled. If you have not sufficiently planned and prepared, you may be locked out of the application as an administrator. If you have access to a test subscription of your application, test, test and test! If you only have your production subscription, I would suggest having an open dialog with the application support team in case you inadvertently lock yourself out of authenticating. For example, the New Relic application allows the SSO configuration to be made in advance, requiring only the account owner to enable it. Once enabled, all authentication uses SSO, and you had better hope the configuration is correct or else you’ll be asking support to back out the changes.

Applications such as ServiceNow support multiple SSO providers, allowing you to implement a phased approach to enabling SSO for users, with the ultimate goal of making SSO authentication the default. ServiceNow also has a ‘side door’ feature where you can bypass external authentication and log in with a local ServiceNow user (as detailed here).

Accessing applications

Users will be able to access applications configured for SSO with Azure AD using either of the following methods:

  • the Microsoft MyApps application panel (https://myapps.microsoft.com)
  • the Office 365 application launch portal
  • Service Provider Initiated Authentication, where authentication is initiated directly at the SaaS application

Most federated applications that support SAML 2.0 also support the ability for users to start at the application login page and then be signed in through Azure AD, either by automatic redirection or by clicking an SSO link to sign in, as shown in the Aha! images above. This is known as Service Provider Initiated Authentication, and most federated applications in the Azure AD Application Gallery support this capability. Some applications, such as AWS, do not support Service Provider Initiated Authentication, and SSO does not work if users attempt to authenticate from the application’s login screen. In these cases the alternate SSO access methods need to be used, and end-users should be told how to access these types of applications. For AWS, you can access the application using SSO from the MyApps application panel; alternatively, you can provide users with a ‘Single Sign-On URL’ which can be used as a direct, deep link to the application (as detailed here).

Conclusion

Hopefully you can see that although integrating an application with Azure AD can seem quite simple, it pays to take the time to plan and test the integration of your applications.

Debugging an Office 365 ADFS/SSO issue when accessing Office Store in browser

We recently came across an issue with a customer who had configured a standard SSO experience with Office 365 using ADFS, and it was working perfectly except for one specific use case. When a user accessed the Office Store via the Office 365 portal (e.g. portal.office.com/store) they got into an endless SSO login loop. Specifically, they would see the following:

  1. Connection to portal.office.com
  2. Redirection to login.microsoftonline.com
  3. Redirection to adfs.customerdomain.com (automatically signed in because of WIA SSO)
  4. Redirection to login.microsoftonline.com
  5. Redirection to the portal.office.com/store page, which loads partially and then redirects to login.microsoftonline.com
  6. Redirection to adfs.customerdomain.com (automatically signed in because of WIA SSO)
  7. Rinse and repeat steps 4-6 ad nauseam

Normally, steps 1-4 are expected, because what normally happens here, in layman’s terms, is:

  1. portal.office.com responds to the user’s browser saying “please provide sign-in credentials, and do so by redirecting to this URL”
  2. login.microsoftonline.com responds to the user’s browser saying “we know you are trying to access @customerdomain.com resources, which is federated, so please connect to adfs.customerdomain.com and get a valid auth token to present to us”
  3. The user connects to adfs.customerdomain.com, and because it’s in the trusted sites list, and trusted sites is configured to perform Windows Integrated Auth (WIA), the user’s browser uses the computer’s cached Kerberos/NTLM auth token to sign into ADFS. ADFS responds with a valid SAML token which the user can present to Azure AD.
  4. The user connects back to login.microsoftonline.com and presents the auth token. From there it is validated and an auth token browser cookie is created.
  5. At this point, the user would normally then connect to portal.office.com and access the relevant resources.

Now this was certainly the case for the majority of services in Office 365, including the main portal, Exchange Online, SharePoint etc. It was just the Office Store that was the problem, and bizarrely it was doing a partial load and then getting into the loop.

The resolution to the problem was discovered by doing a Fiddler trace of the sign in traffic.

First, we confirmed the normal ADFS SSO components were working (highlighted in red)

  1. Connection to login.microsoftonline.com
  2. Redirection to our ADFS servers (sts.{domain}.com) against the WIA path
  3. Connection back to login.microsoftonline.com with the SAML token response
  4. Subsequent connection to portal.office.com/store which is what we’re trying to access

ADFS-SSO-Process

The cause of the looping however can be seen further down in the fiddler trace, shown here:

Store-Office-NoAuth

From this we can see an unexpected connection to a different domain URL, store.office.com, and looking at it we see an authentication error: no auth cookies are being presented to it.

A quick inventory of the configuration on the client gave us the answer as to why. While the customer had followed the common practice of including the key Office 365 URLs in their trusted sites (office365.com, outlook.com, sharepoint.com etc.), they had not included store.office.com. This in itself is not specifically a problem, but they also had a ‘mixed’ mode set up, where their ‘Internet Zone’ was set to run in Protected Mode, while their ‘Trusted Sites’ and ‘Intranet Zone’ had Protected Mode turned off.

For a bit of background on what IE Protected Mode is, refer to this article, but the short version is: Protected Mode is a security feature in IE which ensures that sites run in an ‘isolated instance’. An effect of this isolation is that sites in Protected Mode cannot access the regular cookie cache.

As such, there was a ‘zone mismatch’ between the two, and what effectively was happening was:

  1. ADFS sign in was working as expected, and the user was successfully accessing portal.office.com resources and creating an auth cookie
  2. Portal.office.com/store however was actually loading content that was hosted under the site store.office.com
  3. store.office.com needed authentication credentials in order to correctly serve the content.  But because it was not included in trusted sites, that content request was running in ‘protected mode’.
  4. Because of the isolation, it couldn’t see the valid auth token which was stored in the regular cookie cache, and so triggers an auth sign in request to login.microsoftonline.com to get one.  And thus begins the endless authentication cycle.

The fix was simply adding *.office.com into the trusted sites zone to ensure that it did not execute in Protected Mode. Once we did, we could see that when a connection to store.office.com is made, the appropriate auth token is presented and the page loads as expected:

Store-Office-Auth

Now, I’m not personally aware of any Microsoft guidance in terms of what should go into trusted sites for Office 365 functionality, but generally at Kloud we recommend the following to avoid SSO issues:

  • *.sharepoint.com
  • *.office.com
  • *.microsoftonline.com
  • *.lync.com
  • *.outlook.com
  • URL of ADFS Server (e.g. sts.customerdomain.com)
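If you want to push these out without GPO UI work, the standard IE ZoneMap registry layout can be scripted. A minimal sketch for one domain (the DWORD value 2 maps a URL scheme to the Trusted Sites zone; test on a pilot machine before deploying broadly):

```powershell
# Add *.office.com to the Trusted Sites zone (zone 2) for the current user.
# The same keys can be deployed machine-wide via GPO under HKLM.
$zoneMap = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains"

# Create a key for the domain and map the https scheme to zone 2 (Trusted Sites)
New-Item -Path "$zoneMap\office.com" -Force | Out-Null
Set-ItemProperty -Path "$zoneMap\office.com" -Name "https" -Value 2 -Type DWord
```

Repeat for each of the domains in the list above, plus your ADFS URL.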

Hopefully that gives you a bit of insight around the thinking, components and tools needed in debugging SSO issues with ADFS and Office 365!

Modern Authentication and MAPI-HTTP

If you haven’t heard, Modern Authentication (aka ADAL), has now officially gone GA (https://blogs.office.com/2015/11/19/updated-office-365-modern-authentication-public-preview/) – which means that if you are utilising Office 365 services, particularly Exchange Online, and Office 2013/2016 as your client, you should really be looking at enabling this functionality for your end users.

For those unfamiliar with Modern Auth, there are numerous benefits, but one of the most obvious for end users is it removes the need for the use of ‘save my credentials’ when signing into Exchange Online and provides a true SSO experience when combined with ADFS Federation.

Now, the process for enabling Modern Auth is very well documented in the above blog post, but the short version is:

  1. Enable Modern Auth on the tenant side via a PowerShell command
  2. Enable Modern Auth on the client side via a registry key
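For reference, the two steps look roughly like this (treat the linked post as authoritative; the `15.0` Office version key below applies to Office 2013, and Office 2016 has Modern Auth enabled on the client by default):

```powershell
# Tenant side: enable Modern Authentication in Exchange Online
# (run in a remote PowerShell session connected to Exchange Online)
Set-OrganizationConfig -OAuth2ClientProfileEnabled $true

# Client side (Office 2013): enable ADAL via the registry
$idPath = "HKCU:\SOFTWARE\Microsoft\Office\15.0\Common\Identity"
New-Item -Path $idPath -Force | Out-Null
Set-ItemProperty -Path $idPath -Name "EnableADAL" -Value 1 -Type DWord
Set-ItemProperty -Path $idPath -Name "Version" -Value 1 -Type DWord
```

In practice the client-side keys would usually be rolled out via GPO rather than set by hand.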

What isn’t explicitly called out as a pre-requisite, however, is that your Outlook client also needs to be running in MAPI over HTTP mode. For a large percentage of environments this is probably not an issue, but if you are like a recent customer of ours, you may have specifically disabled the use of MAPI-HTTP. There are a number of valid reasons why you might have done this (theirs was that they were using an old version of Riverbed that didn’t support optimisation of the MAPI-HTTP protocol), but as it turns out, the introduction of the MAPI over HTTP protocol to replace the legacy ‘RPC over HTTP’ protocol over 3 years ago was actually one of the precursors to allowing all this fancy Modern Authentication stuff to work.

For full details around what MAPI-HTTP protocol brought in and the benefits it introduced, I recommend reading this great blog post from the Microsoft Exchange team.

But in short, if you find that you have enabled Modern Auth as per the described steps, and you’re still getting the ‘basic auth prompt’ – I’d go ahead and check to see if the following registry key has been set (via GPO or otherwise)

Key: HKEY_CURRENT_USER\Software\Microsoft\Exchange
DWORD: MapiHttpDisabled
Value: 1

The above needs to be either deleted or set to ‘0’ in order for Modern Auth to work. The support article KB2937684 also gives you some more info on ensuring MAPI-HTTP is enabled for your Office 2013/2016 client.
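A quick sketch to check and clear the value described above from PowerShell:

```powershell
# Check whether MAPI-HTTP has been disabled for the current user
$key = "HKCU:\Software\Microsoft\Exchange"
$val = Get-ItemProperty -Path $key -Name MapiHttpDisabled -ErrorAction SilentlyContinue

if ($val -and $val.MapiHttpDisabled -eq 1) {
    # Either delete the value or set it to 0 so Outlook can use MAPI over HTTP
    Remove-ItemProperty -Path $key -Name MapiHttpDisabled
}
```

If the value is being pushed by GPO, remove it at the policy level instead, or it will simply reappear.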

Note that changing this value does not take effect until the client next performs an ‘autodiscover’ to switch over.  Depending on the situation, this may cause the user to see the following pop up:

AdministratorChange

Generally speaking, I’d recommend you test the registry update first with a subset of pilot users, before rolling this out to the wider audience.  Once that is confirmed working, then you can look at rolling out Modern Auth to your end users.

Office 365 SSO: Configuring multiple Office 365 tenants to use a single AD FS instance

Q: Can multiple Office 365 tenants use a single AD FS instance to provide SSO?

A: Yes

Overview

  • Office 365 tenant 1 is configured with the domain contoso.com
  • Office 365 tenant 2 is configured with the domain sub.contoso.com
  • Single Active Directory Forest with multiple UPNs configured (contoso.com and sub.contoso.com)
  • Single AD FS instance including an AD FS Proxy/Web Application Proxy published with the name sts.contoso.com
  • Two instances of Azure AD Connect configured with container filtering to ensure users are only synchronised to a single tenant

Configuring SSO

The Federation Trust for Tenant 1 is configured by establishing a Remote PowerShell session (with the Azure Active Directory Module loaded) and running the standard ‘Convert-MsolDomainToFederated’ cmdlet:

Convert-MsolDomainToFederated -DomainName contoso.com -SupportMultipleDomain

When it comes to configuring Tenant 2, things become a little trickier. One of the features of the ‘Convert-MsolDomainToFederated’ cmdlet is that it performs the required configuration on Office 365 as well as on the AD FS farm. If you attempt to run this cmdlet against an AD FS farm that has a Federation Trust established with a different tenant, it will fail and return an error. Therefore, we need to make use of the ‘Set-MsolDomainAuthentication’ cmdlet, which only makes configuration changes to Office 365 and is usually used for establishing Federation Trusts with third-party IdPs.

The first step is to export the token-signing certificate from the AD FS farm either via Windows Certificate Manager or via PowerShell:

$certRefs=Get-AdfsCertificate -CertificateType Token-Signing
$certBytes=$certRefs[0].Certificate.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert)
[System.IO.File]::WriteAllBytes("c:\temp\tokensigning.cer", $certBytes)

Next, establish a Remote PowerShell session with Tenant 2 and then run the following script to configure the trust:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\temp\tokensigning.cer")
$certData = [system.convert]::tobase64string($cert.rawdata)
$dom="sub.contoso.com"
$url="https://sts.contoso.com/adfs/ls/"
$uri="http://sub.contoso.com/adfs/services/trust/"
$ura="https://sts.contoso.com/adfs/services/trust/2005/usernamemixed"
$logouturl="https://sts.contoso.com/adfs/ls/"
$metadata="https://sts.contoso.com/adfs/services/trust/mex"
#command to enable SSO
Set-MsolDomainAuthentication -DomainName $dom -Authentication Federated -ActiveLogOnUri $ura -PassiveLogOnUri $url -MetadataExchangeUri $metadata -SigningCertificate $certData -IssuerUri $uri -LogOffUri $logouturl -PreferredAuthenticationProtocol WsFed

Once configured, the configuration of both tenants can be validated using the ‘Get-MsolDomainFederationSettings’ cmdlet. The only difference when comparing the tenant configuration should be the ‘FederationBrandName’ and the ‘IssuerUri’ values.
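That validation is a quick side-by-side comparison of the two domains:

```powershell
# Compare the federation settings of both domains - apart from
# FederationBrandName and IssuerUri, the values should match
Get-MsolDomainFederationSettings -DomainName contoso.com
Get-MsolDomainFederationSettings -DomainName sub.contoso.com
```

In particular, check that both domains point at the same ‘PassiveLogOnUri’ and present the same signing certificate.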

ADFS sign-in error: “An error occurred. Contact your administrator for more information.”

Originally posted @ Lucian.Blog. Follow Lucian on twitter @Lucianfrango.


I’ve not had much luck deploying Azure AD Connect and ADFS 3.0 in Azure for a client over the last few weeks. After some networking woes I moved on to the server provisioning and again got stuck. Now, I know IT is not meant to be easy, otherwise there wouldn’t be some of the salaries paid out to the best and brightest, but this install was simple and nothing out of the ordinary: a standard deployment that I and many others have done before.

Let me paint the picture: ADFS is now running, although not working, in Azure compute across a load-balanced set of two servers, with a further load-balanced set of Web Application Proxy (WAP) servers in front. There are two domain controllers and an AAD Connect server, all across a couple of subnets in a VNET.

Read More