Automate Secondary ADFS Node Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

Additional nodes in an ADFS farm are required to provide redundancy in case your primary ADFS node goes offline. This ensures your ADFS service is still up and servicing all incoming requests. Additional nodes also help load balance the incoming traffic, which provides a better user experience during periods of high authentication traffic.

Overview

Once an ADFS farm has been created, adding additional nodes is quite simple and mostly relies on the same concepts used to create the ADFS farm. I would suggest reading my previous blog Automate ADFS Farm Installation and Configuration, as some of the steps we will use in this blog were documented in it.

In this blog, I will show how to automatically provision a secondary ADFS node to an existing ADFS farm. The learnings in this blog can be easily used to deploy more ADFS nodes automatically, if needed.

Install ADFS Role

After provisioning a new Azure virtual machine, we need to install the Active Directory Federation Services role on it.  To do this, we will use the same Desired State Configuration (DSC) script that was used in the blog Automate ADFS Farm Installation and Configuration. Please refer to the section Install ADFS Role in the above blog for the steps to create the DSC script file InstallADFS.ps1.

Add to an existing ADFS Farm

Once the ADFS role has been installed on the virtual machine, we will create a Custom Script Extension (CSE) to add it to the ADFS farm.

In order to do this, we need the following:

  • the certificate that was used to create the ADFS farm
  • the ADFS service account credentials that were used to create the ADFS farm

Once the above prerequisites have been met, we need a method for making the files available to the CSE. I documented a neat trick to “sneak-in” the certificate and password files onto the virtual machine by using Desired State Configuration (DSC) package files in my previous blog. Please refer to Automate ADFS Farm Installation and Configuration under the section Create ADFS Farm for the steps.

Also note that the domain user credentials are not required for adding the node to the ADFS farm. The certificate file will be named adfs_certificate.pfx and the file containing the encrypted ADFS service account password will be named adfspass.key.

Assuming that the prerequisites have been satisfied, and the files have been “sneaked” onto the virtual machine, let’s proceed with creating the CSE.

Open Windows PowerShell ISE and paste the following.

param (
  $DomainName,
  $PrimaryADFSServer,
  $AdfsSvcUsername
)

The above shows the parameters that need to be passed to the CSE, where:

$DomainName is the name of the Active Directory domain
$PrimaryADFSServer is the hostname of the primary ADFS server
$AdfsSvcUsername is the username of the ADFS service account

Save the file with a name of your choice (do not close the file as we will be adding more lines to it). I named my script AddToADFSFarm.ps1.

Next, we need to define a variable that will contain the path to the directory where the certificate file and the file containing the encrypted ADFS service account password are stored. We also need a variable to hold the key that was used to encrypt the ADFS service account password, as this will be required to decrypt it.

Add the following to the CSE file
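A minimal sketch of these definitions, reusing the path and encryption key from the farm post (the key must match the one used when the password files were generated):

# Path where the DSC package contents (including the Certificates folder) land,
# and the key that was used to encrypt the ADFS service account password.
$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)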

Next, we need to decrypt the encrypted ADFS service account password.
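A sketch of this step (the credential variable names are illustrative; $AdfsSvcUsername comes from the param block above):

# Read the encrypted ADFS service account password, convert it back into a
# SecureString using the same key it was encrypted with, and build a credential
# in domain\username format.
$AdfsSvcPassword = Get-Content ($localpath + "adfspass.key") | ConvertTo-SecureString -Key $Key
$AdfsSvcCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$AdfsSvcUsername", $AdfsSvcPassword)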

Now, we need to import the certificate into the local computer certificate store. To make things simple, the certificate was encrypted using the ADFS service account password when it was exported from the primary ADFS server.

After importing the certificate, we will read it to get its thumbprint.
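A hedged sketch of both steps, assuming the PKI module's Import-PfxCertificate cmdlet (available on Windows Server 2012 R2 and later):

# Import the PFX into the local computer's personal store. The PFX was exported
# with the ADFS service account password, so the same password unlocks it.
# Import-PfxCertificate returns the certificate object, whose thumbprint we keep.
$cert = Import-PfxCertificate -FilePath ($localpath + "adfs_certificate.pfx") -CertStoreLocation Cert:\LocalMachine\My -Password $AdfsSvcPassword
$certThumbprint = $cert.Thumbprint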

Up until now, the steps are very similar to creating an ADFS farm. However, below is where they diverge.

Add the following lines to add the virtual machine to the existing ADFS farm
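A sketch using the Add-AdfsFarmNode cmdlet; the certificate thumbprint, service account credentials and primary server name must all match the values used when the farm was created:

# Join this server to the existing ADFS farm as a secondary node.
Add-AdfsFarmNode -CertificateThumbprint $certThumbprint `
                 -ServiceAccountCredential $AdfsSvcCreds `
                 -PrimaryComputerName $PrimaryADFSServer `
                 -OverwriteConfiguration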

You now have a custom script extension file that will add a virtual machine as a secondary node to an existing ADFS Farm.

The full CSE is simply the snippets above combined, in order, into AddToADFSFarm.ps1.

All that is missing now is the method to bootstrap the scripts described above (InstallADFS.ps1 and AddToADFSFarm.ps1) using Azure Resource Manager (ARM) templates.

Below is part of an ARM template that can be added to your existing template to install the ADFS role on a virtual machine and then add the virtual machine as a secondary node to the ADFS farm
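As a sketch, the Custom Script Extension resource could look like the following; the DSC extension that installs the ADFS role mirrors the one shown in the farm post, and names such as AddToADFSFarmScriptUrl and AdfsSvcUsername are illustrative:

{
  "apiVersion": "2015-06-15",
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('ADFS02VMName'),'/AddToADFSFarm')]",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[concat('Microsoft.Compute/virtualMachines/', parameters('ADFS02VMName'))]",
    "[concat('Microsoft.Compute/virtualMachines/', parameters('ADFS02VMName'),'/extensions/InstallADFS')]"
  ],
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.7",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "fileUris": [
        "[variables('AddToADFSFarmScriptUrl')]"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -File AddToADFSFarm.ps1 -DomainName ', parameters('DomainName'), ' -PrimaryADFSServer ', parameters('ADFS01VMName'), ' -AdfsSvcUsername ', parameters('AdfsSvcUsername'))]"
    }
  }
}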

In the above ARM template, the parameter ADFS02VMName refers to the hostname of the virtual machine that will be added to the ADFS Farm.

Listed below are the variables that have been used in the ARM template above
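For example (repoLocation is the value used throughout these posts; the script URL variable name is illustrative):

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"AddToADFSFarmScriptUrl": "[concat(variables('repoLocation'),'AddToADFSFarm.ps1')]",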

The above method can be used to add as many nodes to the ADFS farm as needed.

I hope this comes in handy when creating an ARM template to automatically deploy an ADFS Farm with additional nodes.

Automate ADFS Farm Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

In this multi-part blog, I will be showing how to automatically install and configure a new ADFS Farm. We will accomplish this using Azure Resource Manager templates, Desired State Configuration scripts and Custom Script Extensions.

Overview

We will use Azure Resource Manager to create a virtual machine that will become our first ADFS Server. We will then use a desired state configuration script to join the virtual machine to our Active Directory domain and to install the ADFS role. Finally, we will use a Custom Script Extension to install our first ADFS Farm.

Install ADFS Role

We will be using the xActiveDirectory and xPendingReboot experimental DSC modules.

Download these from

https://gallery.technet.microsoft.com/scriptcenter/xActiveDirectory-f2d573f3

https://gallery.technet.microsoft.com/scriptcenter/xPendingReboot-PowerShell-b269f154

After downloading, unzip the files and place the contents in the PowerShell modules directory located at $env:ProgramFiles\WindowsPowerShell\Modules (unless you have changed the default location, this will be C:\Program Files\WindowsPowerShell\Modules).

Open your Windows PowerShell ISE and let’s create a DSC script that will join our virtual machine to the domain and also install the ADFS role.

Copy the following into a new Windows PowerShell ISE file and save it with a filename of your choice (I saved mine as InstallADFS.ps1).
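A sketch of the opening of the script; the retry defaults are assumptions for this sketch:

Configuration InstallADFS
{
    param (
        [Parameter(Mandatory)]
        [String]$MachineName,

        [Parameter(Mandatory)]
        [String]$DomainName,

        [Parameter(Mandatory)]
        [System.Management.Automation.PSCredential]$AdminCreds
    )

    # Values used when checking whether the Active Directory domain is available.
    $RetryCount = 20
    $RetryIntervalSec = 30

    # ... the body of the configuration is built up in the steps that follow ...
}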

In the above, we are declaring some mandatory parameters and some variables that will be used within the script

$MachineName is the hostname of the virtual machine that will become the first ADFS server

$DomainName is the name of the domain where the virtual machine will be joined

$AdminCreds contains the username and password for an account that has permissions to join the virtual machine to the domain

$RetryCount and $RetryIntervalSec hold values that will be used to check if the domain is available

We need to import the experimental DSC modules that we had downloaded. To do this, add the following lines to the DSC script

Import-DscResource -Module xActiveDirectory, xPendingReboot

Next, we need to convert the supplied $AdminCreds into a domain\username format. This is accomplished by the following lines (the converted value is held in $DomainCreds).
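A sketch of the standard pattern:

# Re-create the supplied credential in domain\username format so the domain
# join succeeds regardless of how the username was passed in.
[System.Management.Automation.PSCredential]$DomainCreds = New-Object System.Management.Automation.PSCredential ("${DomainName}\$($AdminCreds.UserName)", $AdminCreds.Password)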

Next, we need to tell DSC that the command needs to be run on the local computer. This is done by the following line (localhost refers to the local computer)

Node localhost

We need to tell the LocalConfigurationManager that it should reboot the server if needed, continue with the configuration after the reboot, and apply the settings only once. (DSC can apply a setting and constantly monitor it to check that it has not been changed; if it has, DSC can re-apply the setting. In our case we will not do this, we will apply the settings just once.)
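A sketch of those settings, placed inside the Node block:

LocalConfigurationManager
{
    RebootNodeIfNeeded = $true                    # reboot the server if a resource requires it
    ActionAfterReboot = 'ContinueConfiguration'   # resume the configuration after the reboot
    ConfigurationMode = 'ApplyOnly'               # apply the settings once, do not monitor for drift
}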

Next, we need to check if the Active Directory domain is ready. For this, we will use the xWaitForADDomain function from the xActiveDirectory experimental DSC module.
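A sketch using the parameters declared earlier (the resource instance name DscForestWait is illustrative):

xWaitForADDomain DscForestWait
{
    DomainName = $DomainName
    DomainUserCredential = $DomainCreds
    RetryCount = $RetryCount
    RetryIntervalSec = $RetryIntervalSec
}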

Once we know that the Active Directory domain is available, we can go ahead and join the virtual machine to the domain.

The JoinDomain function depends on xWaitForADDomain. If xWaitForADDomain fails, JoinDomain will not run.
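A sketch of the domain join. Note that the xComputer resource used here comes from the xComputerManagement DSC module, which is an assumption of this sketch and would also need to be downloaded and packaged alongside the modules above:

xComputer JoinDomain
{
    Name = $MachineName
    DomainName = $DomainName
    Credential = $DomainCreds
    DependsOn = "[xWaitForADDomain]DscForestWait"   # only run once the domain is reachable
}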

Once the virtual machine has been added to the domain, it needs to be restarted. We will use the xPendingReboot function from the xPendingReboot experimental DSC module to accomplish this.
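A sketch (the instance name is illustrative):

xPendingReboot Reboot1
{
    Name = "RebootAfterDomainJoin"
    DependsOn = "[xComputer]JoinDomain"   # reboot only after the domain join
}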

Next, we will install the ADFS role on the virtual machine
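A sketch using the built-in WindowsFeature resource:

WindowsFeature InstallADFSRole
{
    Ensure = "Present"
    Name = "ADFS-Federation"              # the ADFS role
    DependsOn = "[xPendingReboot]Reboot1" # install only after the post-join reboot
}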

Our script has now successfully added the virtual machine to the domain and installed the ADFS role on it. Next, create a zip file with InstallADFS.ps1 and upload it to a location that Azure Resource Manager can access (I would recommend uploading to GitHub). Include the xActiveDirectory and xPendingReboot experimental DSC module directories in the zip file as well. Also add a folder called Certificates inside the zip file and put the ADFS certificate and the encrypted password files (discussed in the next section) inside the folder.

In the next section, we will configure the ADFS Farm.

The full InstallADFS.ps1 DSC script is the combination of the snippets above.

Create ADFS Farm

Once the ADFS role has been installed, we will use Custom Script Extensions (CSE) to create the ADFS farm.

One of the requirements to configure ADFS is a signed certificate. I used a 90 day trial certificate from Comodo.

There is a trick that I am using to make my certificate available on the virtual machine. If you bootstrap a DSC script to your virtual machine in an Azure Resource Manager template, the script along with all the non out-of-box DSC modules have to be packaged into a zip file and uploaded to a location that ARM can access. ARM then downloads the zip file, unzips it, and places all directories inside the zip file into $env:ProgramFiles\WindowsPowerShell\Modules (C:\Program Files\WindowsPowerShell\Modules). ARM assumes the directories are PowerShell modules and puts them in the appropriate directory.

I am using this feature to sneak my certificate onto the virtual machine. I create a folder called Certificates inside the zip file containing the DSC script and put the certificate inside it. Also, I am not too fond of passing plain passwords from my ARM template to the CSE, so I created two files, one to hold the encrypted password for the domain administrator account and the other to contain the encrypted password of the ADFS service account. These two files are named adminpass.key and adfspass.key and will be placed in the same Certificates folder within the zip file.

I used the following to generate the encrypted password files
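A sketch of how the two files could be generated, using the $Key defined below (replace the plain text passwords with your own):

# Encrypt the plain text passwords with the AES key and write them out to the
# .key files that will be packaged into the Certificates folder.
ConvertTo-SecureString "AdminPlainTextPassword" -AsPlainText -Force | ConvertFrom-SecureString -Key $Key | Out-File "adminpass.key"
ConvertTo-SecureString "ADFSPlainTextPassword" -AsPlainText -Force | ConvertFrom-SecureString -Key $Key | Out-File "adfspass.key"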

AdminPlainTextPassword and ADFSPlainTextPassword are the plain text passwords that will be encrypted.

$Key is used to convert the secure string into an encrypted standard string. Valid key lengths are 16, 24, or 32 bytes.

For this blog, we will use

$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Open Windows PowerShell ISE and paste the following (save the file with a name of your choice. I saved mine as ConfigureADFS.ps1)

param (
 $DomainName,
 $DomainAdminUsername,
 $AdfsSvcUsername
)

These are the parameters that will be passed to the CSE:

$DomainName is the name of the Active Directory domain
$DomainAdminUsername is the username of the domain administrator account
$AdfsSvcUsername is the username of the ADFS service account

Next, we will define the value of the Key that was used to encrypt the password and the location where the certificate and the encrypted password files will be placed

$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Now, we have to read the encrypted passwords from the adminpass.key and adfspass.key files, convert them back into secure strings, and build credentials in domain\username format
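A sketch (the credential variable names are illustrative):

# Read the encrypted passwords, convert them back into SecureStrings using the
# same key they were encrypted with, and build domain\username credentials.
$AdminPass = Get-Content ($localpath + "adminpass.key") | ConvertTo-SecureString -Key $Key
$DomainAdminCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$DomainAdminUsername", $AdminPass)
$AdfsPass = Get-Content ($localpath + "adfspass.key") | ConvertTo-SecureString -Key $Key
$AdfsSvcCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$AdfsSvcUsername", $AdfsPass)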

Next, we will import the certificate into the local computer certificate store. We will mark the certificate exportable and set its password to be the same as the domain administrator password.
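A sketch, assuming the PKI module's Import-PfxCertificate cmdlet:

# Import the signed certificate into the local computer store, mark it
# exportable, protect it with the domain administrator password, and keep
# only its thumbprint.
$cert = (Import-PfxCertificate -FilePath ($localpath + "adfs_certificate.pfx") -CertStoreLocation Cert:\LocalMachine\My -Exportable -Password $AdminPass).Thumbprint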

In the above, after the certificate is imported, $cert holds the certificate thumbprint.

Next, we will configure the ADFS Farm
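A sketch using the Install-AdfsFarm cmdlet with the values described below:

# Create the first node of the new ADFS farm.
Install-AdfsFarm -CertificateThumbprint $cert `
                 -FederationServiceDisplayName "Active Directory Federation Service" `
                 -FederationServiceName "fs.adfsfarm.com" `
                 -ServiceAccountCredential $AdfsSvcCreds `
                 -OverwriteConfiguration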

The ADFS Federation Service display name is set to “Active Directory Federation Service” and the Federation Service Name is set to fs.adfsfarm.com.

Upload the CSE to a location that Azure Resource Manager can access (I uploaded my script to GitHub)

The full ConfigureADFS.ps1 CSE is the combination of the snippets above.

Azure Resource Manager Template Bootstrapping

Now that the DSC and CSE scripts have been created, we need to add them in our ARM template, straight after the virtual machine is provisioned.

To add the DSC script, create a DSC extension and link it to the DSC Package that was created to install ADFS. Below is an example of what can be used
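A sketch of such a DSC extension resource; the parameter and variable names are illustrative:

{
  "apiVersion": "2015-06-15",
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('ADFS01VMName'),'/InstallADFS')]",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[concat('Microsoft.Compute/virtualMachines/', parameters('ADFS01VMName'))]"
  ],
  "properties": {
    "publisher": "Microsoft.Powershell",
    "type": "DSC",
    "typeHandlerVersion": "2.19",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "ModulesUrl": "[variables('InstallADFSPackageURL')]",
      "ConfigurationFunction": "[variables('InstallADFSConfigurationFunction')]",
      "Properties": {
        "MachineName": "[parameters('ADFS01VMName')]",
        "DomainName": "[parameters('DomainName')]",
        "AdminCreds": {
          "UserName": "[parameters('DomainAdminUsername')]",
          "Password": "PrivateSettingsRef:AdminPassword"
        }
      }
    },
    "protectedSettings": {
      "Items": {
        "AdminPassword": "[parameters('DomainAdminPassword')]"
      }
    }
  }
}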

The extension will run after the ADFS virtual machine has been successfully created (referred to as ADFS01VMName)

The MachineName, DomainName and domain administrator credentials are passed to the DSC extension.

Below are the variables that have been used in the json file for the DSC extension (I have listed my GitHub repository location)
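For example (repoLocation is the value from my repository; the package and configuration function variable names are illustrative):

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"InstallADFSPackageURL": "[concat(variables('repoLocation'),'InstallADFS.zip')]",
"InstallADFSConfigurationFunction": "InstallADFS.ps1\\InstallADFS"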

Next, we have to create a Custom Script Extension to link to the CSE for configuring ADFS. Below is an example that can be used
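A sketch of the CSE resource; again, the parameter and variable names are illustrative:

{
  "apiVersion": "2015-06-15",
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('ADFS01VMName'),'/ConfigureADFS')]",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[concat('Microsoft.Compute/virtualMachines/', parameters('ADFS01VMName'))]",
    "[concat('Microsoft.Compute/virtualMachines/', parameters('ADFS01VMName'),'/extensions/InstallADFS')]"
  ],
  "properties": {
    "publisher": "Microsoft.Compute",
    "type": "CustomScriptExtension",
    "typeHandlerVersion": "1.7",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "fileUris": [
        "[variables('ConfigureADFSScriptUrl')]"
      ]
    },
    "protectedSettings": {
      "commandToExecute": "[concat('powershell -ExecutionPolicy Unrestricted -File ConfigureADFS.ps1 -DomainName ', parameters('DomainName'), ' -DomainAdminUsername ', parameters('DomainAdminUsername'), ' -AdfsSvcUsername ', parameters('AdfsSvcUsername'))]"
    }
  }
}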

The CSE depends on the ADFS virtual machine having been successfully provisioned and the DSC extension that installs the ADFS role having successfully completed.

The DomainName, Domain Administrator Username and the ADFS Service Username are passed to the CSE script

The following contains a list of the variables being used by the CSE (the example below shows my GitHub repository location)

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"ConfigureADFSScriptUrl": "[concat(parameters('repoLocation'),'ConfigureADFS.ps1')]",

That’s it Folks! You now have an ARM Template that can be used to automatically install the ADFS role and then configure a new ADFS Farm.

In my next blog, we will explore how to add another node to the ADFS Farm and we will also look at how we can automatically create a Web Application Proxy server for our ADFS Farm.

Azure AD Application SSO and Provisioning – Things to consider

I’ve had the opportunity to work on a couple of customer engagements recently integrating SaaS based cloud applications with Azure Active Directory, one being against a cloud-only Azure AD tenant and the other federated with on-premises Active Directory using ADFS. The Azure AD Application Gallery now has over 2,700 applications listed which provide a supported and easy process to integrate applications with Azure AD, although not every implementation is the same. Most of them have a prescribed tutorial on how to perform the integration (listed here), while some application vendors have their own guides.

This blog won’t describe what Single Sign-On (SSO) or User Provisioning with Azure AD is (that is detailed here), but rather highlights some things to consider when you start planning your application integrations with Azure AD.

User provisioning into the application

Azure AD supports user provisioning and de-provisioning into some target SaaS applications based on changes made in Windows Server Active Directory and/or Azure AD. Organisations will generally either be managing user accounts in these SaaS applications manually, using scripts or some other automated method. Some notes about provisioning in Azure AD:

  • ‘Featured’ apps in the Azure AD Application Gallery support automatic provisioning and de-provisioning. A privileged service account with administrative permissions within the SaaS application is required to allow Azure AD the appropriate access
  • Some applications (e.g. Lucidchart and Aha!) are able to perform the provisioning of new users on their own, which is managed directly by the application with no control from Azure AD. Some applications (e.g. Lucidchart) will also automatically apply a license to the new user account, which makes the correct user assignment to the application important. Other applications (e.g. Aha!) will automatically create the new accounts but not assign a license or provide access to any information. License assignment in this case would be a manual task for an administrator in the target SaaS application
  • All other applications that do not provide the capability for automatic provisioning require the user accounts to be present within the target application. User accounts will therefore need to be either manually created, scripted, or provisioned by another form of Identity Management. You need to ensure the method by which the application matches the user from Azure AD is known so that accurate matching can be performed. For example, the ‘UserPrincipalName’ value from the Azure AD user account matches the ‘NameID’ value within the application

Role mapping between Azure AD and the application

Access to applications can be assigned either directly against user accounts in Azure AD or by using groups. Using groups is recommended to ease administration, with the simplest form using a single Security Group to allow users access to the application. Some applications like Splunk support the capability to assign user access based on roles, which is performed using ‘Role Mapping’. This provides the capability to have a number of Security Groups represent different roles, and assigning users to these groups in Azure AD not only enables access to the application but also assigns the user’s role. Consider how you manage user memberships to these groups, how they are named so that they are easily identified for management, and how the application knows about the groups. Splunk for example requires groups to be created using the ‘Group Object Id’ of the Azure AD group and not its name, as shown in this example:

The Group Object Id for groups can be found by going to your Directory Page and then navigating to the group whose Object Id is to be retrieved.


User interface changes

Some applications have a modified interface when SSO is enabled, allowing users to select whether to login with userID/password credentials or a federated login. You need to consider end-user communications for any sudden changes to the application that users may face and let them know which option they should select. For example, the Aha! application changes from this:

to this when SSO is enabled:

Have a backout plan

In addition to the point above, some applications require an all-or-nothing approach when SSO is enabled. If you have not sufficiently planned and prepared, you may be locked out of the application as an administrator. If you have access to a test subscription of your application, test, test and test! If you only have your production subscription, I would suggest having an open dialogue with the application support team in case you inadvertently lock yourself out of authenticating. For example, the New Relic application allows the SSO configuration to be made in advance, which then requires only the account owner to enable it. Once enabled, all authentication uses SSO, and you had better hope the configuration is correct or else you’ll be asking support to back out the changes.

Applications such as ServiceNow have the ability to have multiple SSO providers, where you can implement a phased approach for enabling SSO to users, with the ultimate goal of making SSO authentication the default. ServiceNow also has a ‘side door’ feature where you can bypass external authentication and login with a local ServiceNow user (as detailed here).

Accessing applications

Users will be able to access applications configured for SSO with Azure AD using either of the following methods:

  • the Microsoft MyApps application panel (https://myapps.microsoft.com)
  • the Office 365 application launch portal
  • Service Provider Initiated Authentication, where authentication is initiated directly at the SaaS application

Most federated applications that support SAML 2.0 also support the ability for users to start at the application login page, and then be signed in through Azure AD either by automatic redirection or by clicking on an SSO link to sign in, as shown in the Aha! images above. This is known as Service Provider Initiated Authentication, and most federated applications in the Azure AD Application Gallery support this capability. Some applications such as AWS do not support Service Provider Initiated Authentication, and SSO does not work if users attempt to authenticate from the application’s login screen. The alternate methods of accessing the application via SSO need to be followed, with communications to inform end-users how to access these types of applications. For AWS, you can access the application using SSO from the MyApps application panel, with an alternate method being to provide users with a ‘Single Sign-On URL’ which can be used as a direct, deep link to access the application (as detailed here).

Conclusion

Hopefully you can see that although it can seem quite simple to integrate an application with Azure AD, you should take the time to plan and test the integration of your applications.

Debugging an Office 365 ADFS/SSO issue when accessing Office Store in browser

We recently came across an issue with a customer where they had configured a standard SSO experience with Office 365 using ADFS, and it was working perfectly except for a specific use case. When a user accessed the Office Store via the Office 365 portal (e.g. portal.office.com/store) they got into an endless SSO login loop. Specifically, they would see the following:

  1. Connection to Portal.Office.com
  2. Redirection to login.microsoftonline.com
  3. Redirection to adfs.customerdomain.com (automatically signed in because of WIA SSO)
  4. Redirection to login.microsoftonline.com
  5. Redirection to Portal.Office.com\Store page but loads partially and then redirection to login.microsoftonline.com
  6. Redirection to adfs.customerdomain.com (automatically signed in because of WIA SSO)
  7. Rinse and repeat steps 4-6 ad nauseam

Normally, steps 1-4 are expected, because what normally happens here, in layman’s terms, is:

  1. Portal.office.com provides a response to the user’s browser saying “please provide sign in credentials, and do so via redirecting to this url”
  2. login.microsoftonline.com provides a response to the user’s browser saying “we know you are trying to access @customerdomain.com resources, which is federated, so please connect to adfs.customerdomain.com and get a valid auth token to present to us”
  3. User connects to adfs.customerdomain.com, and because it is in the user’s trusted sites list, and trusted sites are configured to perform Windows Integrated Auth (WIA), the user’s browser uses the computer’s cached Kerberos/NTLM auth token to sign into ADFS. ADFS responds with a valid SAML token which the user can present to Azure AD.
  4. User connects back to login.microsoftonline.com and presents the auth token.  From there it is validated and an auth token browser cookie is created.
  5. At this point, the user would normally then connect to portal.office.com and access the relevant resources.

Now this was certainly the case for the majority of services in Office 365, including the main portal, Exchange Online, SharePoint etc. It was just the Office Store that was the problem, and bizarrely it was doing a partial load and then getting into the loop.

The resolution to the problem was discovered by doing a Fiddler trace of the sign in traffic.

First, we confirmed the normal ADFS SSO components were working (highlighted in red)

  1. Connection to login.microsoftonline.com
  2. Redirection to our ADFS servers (sts.{domain}.com) against the WIA path
  3. Connection back to login.microsoftonline.com with the SAML token response
  4. Subsequent connection to portal.office.com/store which is what we’re trying to access

[Image: ADFS-SSO-Process]

The cause of the looping however can be seen further down in the fiddler trace, shown here:

[Image: Store-Office-NoAuth]

From this we can see an unexpected connection to a different domain URL, store.office.com, and looking at it we see an authentication error and no auth cookies being presented to it.

A quick inventory of the configuration on the client gave us the answer as to why. While the customer had followed the common practice of including the key Office 365 URLs in their trusted sites (office365.com, outlook.com, sharepoint.com etc.), they did not include store.office.com. This in itself is not specifically a problem, BUT they also had a ‘mixed’ mode set up where their ‘Internet Zone’ was set to run in Protected Mode, while their ‘Trusted Sites’ and ‘Intranet Zone’ were configured with Protected Mode turned off.

For a bit of background on what IE Protected Mode is, refer to this article, but the short version is: Protected Mode is a security feature in IE which ensures that sites run in an ‘isolated instance’. An effect of this isolation is that sites in Protected Mode cannot access the regular cookie cache.

As such, there was a ‘zone mismatch’ between the two, and what effectively was happening was:

  1. ADFS sign in was working as expected, and the user was successfully accessing portal.office.com resources and creating an auth cookie
  2. Portal.office.com/store however was actually loading content that was hosted under the site store.office.com
  3. store.office.com needed authentication credentials in order to correctly serve the content.  But because it was not included in trusted sites, that content request was running in ‘protected mode’.
  4. Because of the isolation, it couldn’t see the valid auth token which was stored in the regular cookie cache, and so triggers an auth sign in request to login.microsoftonline.com to get one.  And thus begins the endless authentication cycle.

The fix was simply to add *.office.com into the trusted sites zone to ensure that it did not execute in Protected Mode. Once we did, we could see that when a connection to store.office.com is made, the appropriate auth token is presented and the page load works as expected:

[Image: Store-Office-Auth]

Now, I’m not personally aware of Microsoft guidance in terms of what should go into trusted sites for Office 365 functionality, but generally at Kloud we recommend the following to be used to avoid SSO issues:

  • *.sharepoint.com
  • *.office.com
  • *.microsoftonline.com
  • *.lync.com
  • *.outlook.com
  • URL of ADFS Server (e.g. sts.customerdomain.com)

Hopefully that gives you a bit of insight around the thinking, components and tools needed in debugging SSO issues with ADFS and Office 365!

Modern Authentication and MAPI-HTTP

If you haven’t heard, Modern Authentication (aka ADAL), has now officially gone GA (https://blogs.office.com/2015/11/19/updated-office-365-modern-authentication-public-preview/) – which means that if you are utilising Office 365 services, particularly Exchange Online, and Office 2013/2016 as your client, you should really be looking at enabling this functionality for your end users.

For those unfamiliar with Modern Auth, there are numerous benefits, but one of the most obvious for end users is it removes the need for the use of ‘save my credentials’ when signing into Exchange Online and provides a true SSO experience when combined with ADFS Federation.

Now, the process for enabling Modern Auth is very well documented in the above blog post, but the short version is:

  1. Enable Modern Auth on the tenant side via a PowerShell command
  2. Enable Modern Auth on the client side via a registry key
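For reference, a hedged sketch of both steps; the tenant-side cmdlet is run from an Exchange Online PowerShell session, and the registry values assume Office 2013's 15.0 hive (see the linked post for the authoritative steps):

# Tenant side: turn on Modern Authentication for the Exchange Online tenant.
Set-OrganizationConfig -OAuth2ClientProfileEnabled $true

# Client side: enable ADAL for Office 2013 via the registry (e.g. through a GPO).
New-ItemProperty -Path "HKCU:\SOFTWARE\Microsoft\Office\15.0\Common\Identity" -Name EnableADAL -Value 1 -PropertyType DWORD -Force
New-ItemProperty -Path "HKCU:\SOFTWARE\Microsoft\Office\15.0\Common\Identity" -Name Version -Value 1 -PropertyType DWORD -Force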

What isn’t explicitly called out as a pre-requisite, however, is that your Outlook client also needs to be running in MAPI over HTTP mode. Now, for a large percentage of environments this is probably not an issue, but if you are like a recent customer of ours, you may have specifically disabled the use of MAPI-HTTP. There are a number of valid reasons why you might have wanted to do this (theirs was that they were using an old version of Riverbed that didn’t support optimisation of the MAPI-HTTP protocol), but as it turns out, the introduction of the MAPI over HTTP protocol to replace the legacy ‘RPC over HTTP’ protocol over 3 years ago was actually one of the precursors to allowing all this fancy Modern Authentication stuff to work.

For full details around what MAPI-HTTP protocol brought in and the benefits it introduced, I recommend reading this great blog post from the Microsoft Exchange team.

But in short, if you find that you have enabled Modern Auth as per the described steps, and you’re still getting the ‘basic auth prompt’ – I’d go ahead and check to see if the following registry key has been set (via GPO or otherwise)

Key: HKEY_CURRENT_USER\Software\Microsoft\Exchange
DWORD: MapiHttpDisabled
Value: 1

The above needs to either be deleted, or set to ‘0’ in order for Modern Auth to work.  The support article KB2937684 also gives you some more info around ensuring MAPI-HTTP is enabled for your Office 2013/2016 client.

Note that changing this value does not take effect until the client next performs an ‘autodiscover’ to switch over.  Depending on the situation, this may cause the user to see the following pop up:

[Image: AdministratorChange]

Generally speaking, I’d recommend you test the registry update first with a subset of pilot users, before rolling this out to the wider audience.  Once that is confirmed working, then you can look at rolling out Modern Auth to your end users.

Office 365 SSO: Configuring multiple Office 365 tenants to use a single AD FS instance

Q: Can multiple Office 365 tenants use a single AD FS instance to provide SSO?

A: Yes

Overview

  • Office 365 tenant 1 is configured with the domain contoso.com
  • Office 365 tenant 2 is configured with the domain sub.contoso.com
  • Single Active Directory Forest with multiple UPNs configured (contoso.com and sub.contoso.com)
  • Single AD FS instance including an AD FS Proxy/Web Application Proxy published with the name sts.contoso.com
  • Two instances of Azure AD Connect configured with container filtering to ensure users are only synchronised to a single tenant

Configuring SSO

The Federation Trust for Tenant 1 is configured by establishing a Remote PowerShell session (with the Azure Active Directory Module loaded) and running the standard ‘Convert-MsolDomainToFederated’ cmdlet:

Convert-MsolDomainToFederated -DomainName contoso.com -SupportMultipleDomain

When it comes to configuring Tenant 2, things become a little trickier. One of the features of the ‘Convert-MsolDomainToFederated’ cmdlet is that it performs the required configuration on Office 365 as well as on the AD FS Farm. If you attempt to run this cmdlet against an AD FS Farm that has a Federation Trust established with a different tenant, it will fail and return an error. Therefore, we need to make use of the ‘Set-MsolDomainAuthentication’ cmdlet, which only makes configuration changes to Office 365 and is usually used for establishing Federation Trusts with third party IdPs.

The first step is to export the token-signing certificate from the AD FS farm either via Windows Certificate Manager or via PowerShell:

$certRefs=Get-AdfsCertificate -CertificateType Token-Signing
$certBytes=$certRefs[0].Certificate.Export([System.Security.Cryptography.X509Certificates.X509ContentType]::Cert)
[System.IO.File]::WriteAllBytes("c:\temp\tokensigning.cer", $certBytes)

Next, establish a Remote PowerShell session with Tenant 2 and then run the following script to configure the trust:

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\temp\tokensigning.cer")
$certData = [system.convert]::tobase64string($cert.rawdata)
$dom="sub.contoso.com"
$url="https://sts.contoso.com/adfs/ls/"
$uri="http://sub.contoso.com/adfs/services/trust/"
$ura="https://sts.contoso.com/adfs/services/trust/2005/usernamemixed"
$logouturl="https://sts.contoso.com/adfs/ls/"
$metadata="https://sts.contoso.com/adfs/services/trust/mex"
#command to enable SSO
Set-MsolDomainAuthentication -DomainName $dom -Authentication Federated -ActiveLogOnUri $ura -PassiveLogOnUri $url -MetadataExchangeUri $metadata -SigningCertificate $certData -IssuerUri $uri -LogOffUri $logouturl -PreferredAuthenticationProtocol WsFed

Once configured, the configuration of both tenants can be validated using the ‘Get-MsolDomainFederationSettings’ cmdlet. The only difference when comparing the tenant configuration should be the ‘FederationBrandName’ and the ‘IssuerUri’ values.

ADFS sign-in error: “An error occurred. Contact your administrator for more information.”

Originally posted on Lucian’s blog over at Clouduccino.com. Follow Lucian on twitter @Lucianfrango.


I’ve not had that much luck deploying Azure AD Connect and ADFS 3.0 in Azure for a client in the last few weeks. After some networking woes I’ve moved onto the server provisioning and again got stuck. Now, I know IT is not meant to be easy, otherwise there wouldn’t be some of the salaries paid out to the best and brightest, but this install was simple and nothing out of the ordinary. A standard deployment that I and many others have done before.

Let me paint the picture: ADFS is now running, although not working, in Azure compute across a load balanced set of two servers, with a further load balanced set of Web Application Proxy (WAP) servers in front. There are two domain controllers and an AAD Connect server, all across a couple of subnets in a VNET.

Read More

How to implement Multi-Factor Authentication in Office 365 via ADFS, Part 5, the finale!

Originally posted on Lucian’s blog over at clouduccino.com.

I know what you’re thinking: does Lucian really have to create another part in this long MFA series? In short, probably not, but I’ll have saved your index finger the thousands of years of scrolling you would have done to read the entire brain dump in a one-page post.

So to explain this ‘epilogue’, if you will, on MFA: using X.509 SSL certificates for your second factor of authentication is a powerful means to automate and manage a process for your mobile and external users. This blog post will explain how to leverage an on-prem Microsoft System Centre Configuration Manager (SCCM) 2012 R2 deployment linked to Microsoft InTune to deliver SSL certificates to mobile and external devices to use in MFA.

Read More

How to implement Multi-Factor Authentication in Office 365 via ADFS – Part 4

Originally posted on Lucian’s blog over at clouduccino.com.

This is the final installment in the long series that’s taken me a lot longer to get around to writing than I had initially thought. However, I hope it’s worth the wait and that the solution that has been proven works well for you. Before I get into the technical aspects of the final piece of this MFA implementation puzzle, I’d like to make a quick shout out to all the awesome consultants at Kloud Solutions who helped both with the technical implementation and with the initial design and work required to see this solution through: a big thank you!

In the previous blog post I went through what an internal configuration of MFA would look like, with everything ready for the ADAL component, which was previously under NDA with preview-only availability and is now generally available for testing. So let me quickly delve into that ADAL in Office 2013 and Office 365 component before an in-depth guide on how to utilise Microsoft InTune and System Centre Configuration Manager as a means to deliver SSL certificates to users and use those certificates as your second factor of authentication! It’s exciting, as it’s been a long build-up to get to this point, with several moments where I was questioning whether this would work in the real world. Let’s start.

Read More

How to implement Multi-Factor Authentication in Office 365 via ADFS – Part 3

Originally posted on Lucian’s blog over at clouduccino.com.

In this blog post I’ll go into the configuration and implementation of Active Directory Federation Services v3.0 Multi-Factor Authentication (MFA). This is in line with a recent proof-of-concept project I conducted for a large customer in the FMCG sector. ADFSv3 MFA, coupled with some new functionality that Microsoft is working on in Office 365 (MFA in Office 2013, which will be covered in part 4 of this series), offers a fantastic solution to organisations wanting to leverage MFA, whether to adhere to company policy or simply to further secure their users accessing Office 365 cloud services.

The good we secure for ourselves is precarious and uncertain until it is secured for all of us and incorporated into our common life

-Jane Addams

Read More