Azure MFA: Architecture Selection Case Study

I’ve been working with a customer on designing a new Azure Multi-Factor Authentication (MFA) service, replacing an existing two-factor authentication (2FA) service based on RSA Authenticator version 7.

In the past few years, Azure MFA solutions have typically been architected in the detail, i.e. a ‘bottom up’ approach to design: which apps are we enforcing MFA on? Which token are we going to use: phone call, SMS, smartphone app? Is it a one-way or two-way message? And so on.

Typically a customer knew quite quickly which MFA ‘architecture’ was required, i.e. the ‘cloud’ version of Azure MFA was really only capable of securing Azure Active Directory authenticated applications, while the ‘on-prem’ (local data centre or private cloud) version using Azure MFA Server (the server software Microsoft acquired with PhoneFactor) was used to secure ‘on-prem’ directory-integrated applications. There wasn’t really a need to look at the ‘top down’ architecture.

In aid of a ‘bottom up’ detailed approach, my colleague Lucian posted a very handy ‘cheat sheet’ last year comparing the various architectures and the features they support, which you can find here: https://blog.kloud.com.au/2016/06/03/azure-multi-factor-authentication-mfa-cheat-sheet

 

New Azure MFA ‘Cloud Service’ Features

In the last few months however, Microsoft have been bulking up the Azure MFA ‘cloud’ option with new integration support for on-premise AD FS (provided with Windows Server 2016) and now on-premise Radius applications (with the recent announcement of the ‘public preview’ of the NPS Extension last month).

(On a side note, what is also interesting, and potentially reveals wider trends in token ‘popularity’, is that the Azure ‘cloud’ option still does not support OATH (i.e. third-party) tokens or two-way SMS (i.e. reply with a ‘Y’ to authenticate).)

These new features have therefore forced consideration of the primarily ‘cloud service’ architecture for both Radius and AD FS ‘on-prem’ apps.

 

“It’s all about the Apps”

Now, in my experience, many organisations share common application architectures that they want to secure with multi-factor authentication. They broadly fit into the following categories:

1. Network Gateway Applications that use Radius or SDI authentication protocols, such as network VPN clients and application presentation virtualisation technologies such as Citrix and Remote App

2. SaaS Applications that choose to use local directory credentials (such as Active Directory) using Federation technologies such as AD FS (which support SAML or WS-Federation protocols), and

3. SaaS applications that use remote (or ‘cloud’) directory credentials for authentication such as Azure Active Directory.

Applications that are traditionally accessed only via the corporate network are being phased out for ones that exist either purely in the cloud (SaaS) or in a hybrid ‘on-prem’/‘cloud’ architecture.

These newer application architectures allow access from untrusted networks (read: the Internet), and those access points are used by both trusted (read: corporate workstations or ‘Standard Operating Environment’) and untrusted (read: BYOD or ‘nefarious’) devices.

In order to secure these newer points of access, 2FA or MFA solution architectures have had to adapt (or die).

What hasn’t changed, however, is that a customer reviewing their choice of 2FA or MFA vendors will always want to keep the number of MFA vendors low (read: one), and expects that provider to support all of their applications. This keeps both user training and operational costs low. Many are also fed up with dealing with ‘point solutions’, i.e. securing only one or two applications and requiring a different 2FA or MFA solution per application.

 

Customer Case Study

So in light of that background, this section now goes through the requirements in detail to really ‘flesh out’ all the detail before making the right architectural decision.

 

Vendor Selection

This took place prior to my working with the customer; however, it was agreed that Azure MFA and Microsoft were the ‘right’ vendor to replace RSA, primarily based on:

  • EMS (Enterprise Mobility + Security) licensing was in place, therefore the customer could take advantage of Azure AD Premium licensing for their user base. This meant we would use the ‘Per User’ charge model for Azure MFA (and not the alternative ‘Per Authentication’ charge model, i.e. being charged for each Azure MFA token delivered).
  • Tight integration with existing Microsoft services including Office 365, local Active Directory and AD FS authentication services.
  • Re-use of strong IT department skills in the use of Azure AD features.

 

 Step 1: App Requirements Gathering

The customer I’ve been working with has two ‘types’ of applications:

1. Network Gateway Applications – Cisco VPN using an ASA appliance and SDI protocol, and Citrix NetScaler using Radius protocol.

2. SaaS Applications using local Directory (AD) credentials via the use of AD FS (on Server 2008 currently migrating to Server 2012 R2) using both SAML & WS-Federation protocols.

They wanted a service that could replace the RSA service integrated with their VPN and Citrix services, but also ‘extend’ that solution to integrate with AD FS as well. They currently don’t use 2FA or MFA with their AD FS authenticated applications (which include Office 365).

They did not want to extend 2FA services to Office 365, primarily because that would require the use of static ‘app passwords’ for their Outlook 2010 desktop clients.

 

Step 2:  User Service Migration Requirements

The move from RSA to Azure MFA was also going to involve the following changes to the way users consume two-factor services:

  1. Retire the use of ‘physical’ RSA tokens but preserve a similar smart phone ‘soft token’ delivery capability
  2. Support two ‘token’ options going forward: a ‘soft token’ (i.e. a smart phone application) or tokens received via SMS
  3. Modify some applications to use the local AD password instead of the RSA ‘PIN’ as a ‘what you know’ factor
  4. Avoid the IT Service Desk for ‘soft token’ registration. RSA required the supply of a static number to the Service Desk, who would then enable the service for that user. Azure MFA uses a ‘rotating number’ for ‘soft token’ registrations (using the Microsoft Authenticator application), and this process can only be performed on the smart phone itself.

Having mapped out these requirements, I then had to find the architecture that best met them (in light of the new ‘cloud’ Azure MFA features):

 

Step 3: Choosing the right Azure MFA architecture

I therefore had a unique situation, whereby I had to present an architectural selection: whether to use the Azure MFA on-premise Server solution, or Azure MFA cloud services. Now, both options technically use the Azure MFA ‘cloud’ to deliver the tokens, but for the sake of simplicity, it boils down to two choices:

  1. Keep the service “mostly” on premise (Solution #1), or
  2. Keep the service “mostly” ‘in the cloud’ (Solution #2)

The next section goes through the ‘on-premise’ and ‘cloud’ requirements of both options, including specific requirements that came out of a solution workshop.

 

Solution Option #1 – Keep it ‘On Prem’

New on-premise server hardware and services required:

  • One or two Azure MFA Servers on Windows Server integrating with local (or remote) NPS services, which perform Radius authentication for three customer applications
  • An on-premise database storing user token selection preferences and mobile phone numbers, requiring backup and restoration procedures
  • One or two Windows Server (IIS) web servers hosting the Azure MFA User Portal and Mobile App web service
  • Use of the existing reverse proxy capability to publish the user portal and mobile app web services to the Internet under a custom website FQDN. This published mobile app website is used for Microsoft Authenticator mobile app registrations and potential user self-selection of factor (e.g. choosing between SMS and the mobile app).

New Azure MFA Cloud services required:

  • Users of Azure MFA services must exist in local Active Directory as well as Azure Active Directory
  • Azure MFA Premium license assigned to the user account stored in Azure Active Directory

[Diagram: Solution Option #1 architecture (‘on-prem’ Azure MFA Server)]

Advantages:

  • If future requirements dictate that Office 365 services use MFA, then AD FS version 3 (Windows Server 2012 R2) integrates directly with the on-premise Azure MFA Server. Only AD FS version 4 (Windows Server 2016) is capable of integrating directly with the cloud-based Azure MFA.
  • The ability to allow all MFA-integrated authentications through in case Internet services (HTTPS) to the Azure cloud are unavailable. This is configurable with the ‘Approve’ option for the Azure MFA Server setting “when Internet is not accessible:”

 

Disadvantages:

  • On-premise MFA Servers require uptime and maintenance (such as patching etc.)
  • Have to host the on-premise user portal and mobile app websites and publish them to the Internet using existing customer capability for user self-service (if required). This includes on-premise IIS web servers to host mobile app registration and user factor selection options (choosing between SMS and the mobile app etc.)
  • Disaster Recovery planning and implementation to protect the local Azure MFA Server database of user token selections and mobile phone numbers (although mobile phone numbers can be imported from local Active Directory, assuming they are present and accurate).
  • SSL certificates used to secure the on-premise self-service portal need to be already trusted by mobile devices such as Android and Apple. Android devices, for example, do not support installing custom certificates and require an SSL certificate from an already trusted vendor (such as Thawte).

 

Solution Option #2 – Go the ‘Cloud’!

New on-prem server hardware and services required:

  • One or two Windows Servers hosting local NPS services, which perform Radius authentication for three customer applications. These can be existing Windows Servers that host other software and are not yet utilising local NPS services for Radius authentication (assuming they also fit the requirements for security and network location)
  • New Windows Server 2016 server farm operating ADFS version 4, replacing the existing ADFS v3 farm.

New Azure MFA Cloud services required:

  • Users of Azure MFA services must exist in local Active Directory as well as Azure Active Directory
  • User token selection preferences and mobile phone numbers stored in the Azure Active Directory cloud directory
  • Azure MFA Premium license assigned to the user account stored in Azure Active Directory
  • Use of the Azure hosted website ‘myapps.microsoft.com’ for Microsoft Authenticator mobile app registrations and potential user self-selection of factor (e.g. choosing between SMS and the mobile app)
  • Configuring Azure MFA policies to avoid enabling MFA for other Azure hosted services such as Office 365.

[Diagram: Solution Option #2 architecture (‘cloud’ Azure MFA with NPS extension)]

Advantages:

  • All MFA services are public cloud based with little maintenance required from the customer’s IT department apart from uptime for on-premise NPS servers and AD FS servers (which they’re currently already doing)
  • Potential to reuse existing Windows NPS server infrastructure (would have to review existing RSA Radius servers for compatibility with Azure MFA plug in, i.e. Windows Server versions, cutover plans)
  • Azure MFA user self-service portal (for users to register their own Microsoft soft token) is hosted in cloud, requiring no on-premise web servers, certificates or reverse proxy infrastructure.
  • No local disaster recovery planning and configuration required. NPS services are stateless apart from IP addressing configuration. User token selections and mobile phone numbers are stored in Azure Active Directory, with inherent recovery options.

 

Disadvantages:

  • Does not support AD FS version 3 (Windows Server 2012 R2) for future MFA integration with AD FS SaaS-enabled apps such as Office 365 or other third-party applications (i.e. those that use AD FS so users can authenticate with local AD credentials). These applications require AD FS version 4 (Windows Server 2016), which supports the Azure MFA extension (similar to the NPS extension for Radius)
  • The Radius NPS extension and the Windows Server 2016 AD FS Azure MFA integration do not currently support the ability to approve authentications should the Internet connection to the Azure cloud go offline (i.e. the Azure MFA service cannot be reached over HTTPS). However, this may be because…
  • The Radius NPS extension is still in ‘public preview’ (a sketch of its configuration follows this list). Support from Microsoft at this time is limited if there are any issues with it. It is expected that this NPS extension will go into general release shortly, however.
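
For reference, the NPS extension is configured on each on-premises NPS server after installing the extension itself. The snippet below is a minimal sketch based on the public preview documentation at the time of writing; the install path and script name may change as the extension matures, and the tenant ID is a placeholder.

# Run on each NPS server after installing the Azure MFA NPS extension (assumed default install path)
Import-Module MSOnline
Connect-MsolService                                    # sign in with an Azure AD Global Administrator account

Set-Location "C:\Program Files\Microsoft\AzureMfa\Config"
.\AzureMfaNpsExtnConfigSetup.ps1                       # prompts for the Azure AD tenant ID (e.g. contoso.onmicrosoft.com)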

 

Conclusion and Architecture Selection

After the workshop, it was generally agreed that Option #2 fit the customer’s on-going IT strategic direction of “Cloud First”.

It was agreed that the key was replacing the existing RSA service integrating with Radius protocol applications in the short term, with AD FS integration viewed as very much ‘optional’ in light of Office 365 not being viewed as requiring two-factor services (at this stage).

This meant that AD FS services were not going to be upgraded to Windows Server 2016 to allow integration with Option #2 services (particularly in light of the current upgrade to Windows Server 2012 R2 needing to be completed first).

The decision was to take Option #2 into the detailed design stage, and I’m sure I’ll post future blog posts, particularly on any production ‘gotchas’ regarding the Radius NPS extension for Azure MFA.

During the workshop, the customer was still deciding whether to allow a user to select their own token ‘type’, but agreed that if they did they wanted to limit it to three choices: one-way SMS (code delivered via SMS), phone call (i.e. press ‘#’ to continue) or the Microsoft Authenticator app. Since these features are available in both architectures (albeit with different UX), this wasn’t a factor in the architecture choice.

The current limitation of Option #2 around the lack of automatic approval of authentications should the Internet service ‘go down’ was disappointing to the customer; however, at this stage it was going to be managed with an ‘outage process’ in case they lost their Internet service. The workaround of having a second NPS server without the Azure MFA extension was going to be considered as part of that process in the detailed design phase.


ADFS v 2.0 Migration to ADFS 2016

Introduction

Some organisations may still have ADFS v2 or ADFS v2.1 running in their environment and haven’t yet moved to ADFS v3. In this blog, we will discuss how you can move away from ADFS v2 or ADFS v2.1 and migrate or upgrade to ADFS 2016.

In previous posts, Part 1 and Part 2 we have covered the migration of ADFS v3.0 to ADFS 2016. I have received some messages on LinkedIn to cover the migration process from ADFS v2 to ADFS 2016 as there currently isn’t much information about this.

As you may have noticed from the previous posts, upgrading to ADFS 2016 couldn’t be any easier. Upgrading from ADFS v2 or ADFS v2.1 is just as simple, albeit done differently than upgrading from those versions to ADFS v3.

Migration Process

Before we begin however, this post assumes you already have a running ADFS v2 or ADFS v2.1 environment. This blog post will not go into a step-by-step installation of ADFSv2/ADFSv2.1, and will only cover the migration or upgrade to ADFS 2016.

This blog post assumes you already have a running Windows Server 2016 with the ADFS 2016 role installed, if not, please follow the procedures outlined in part 2.

Prerequisites

Prior to commencing the upgrade, there are a few tasks you need to perform (a sketch after the list shows one way to capture the first three values from the existing farm).

  1. Take a note of your ADFS Server Display Name
  2. Take a note of your Federation Service Name
  3. Take note of your federation Service Identifier
  4. Export the service communication certificate (if you don’t already have a copy of it)
  5. Install/Import the service communication certificate onto ADFS 2016 server
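
As an illustration only, the values in items 1–3 can be captured from the existing ADFS v2/v2.1 server using the ADFS snap-in that shipped with those versions; this is a sketch, assuming the snap-in is available on that server.

# On the existing ADFS v2/v2.1 server
Add-PSSnapin Microsoft.Adfs.PowerShell

# Display name, federation service name (host name) and federation service identifier
Get-ADFSProperties | Select-Object DisplayName, HostName, Identifier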

Notes:

  • There is no need to make your ADFS 2016 server primary, since this should be a new installation; you can’t add an ADFS 2016 server to an ADFS v2/ADFS v2.1 farm anyway.
  • There is no need to raise the farm behavior level, since this is not a farm member like we did when migrating from ADFS v3 to ADFS 2016.
  • However, you will still need to upgrade the schema as outlined in part 2.

Before you begin, you will need to download the PowerShell scripts found here.

Those scripts can also be found in the support\adfs location on the Windows Server 2016 installation media. They are provided by Microsoft, not Kloud.

The two main scripts handle exporting and importing the federation configuration.

Let’s begin

Firstly, we will need to export the federation configuration by running the “export-federationconfiguration.ps1”.

Here are the current relying party trusts I have in ADFS v2.1.

The “Claims Provider Trust” is Active Directory; this too will be exported and imported.

[Screenshot: existing relying party trusts in ADFS v2.1]

1- Navigate to the folder you have just downloaded:

Then, type

.\export-federationconfiguration.ps1 -path "c:\users\username\desktop\exported adfs configuration"


[Screenshot: export-federationconfiguration.ps1 output]

Once successful, you will see the same results in the above picture.

Open your folder, and you should see the extracted configuration in .xml files.

[Screenshot: exported configuration .xml files]

2- Head over to your ADFS 2016 server and copy across both the folder into which you extracted your federation configuration and the folder containing the downloaded scripts.

Then open PowerShell and run:

.\import-federationconfiguration.ps1 -path "c:\users\username\desktop\exported adfs configuration"

 

[Screenshot: import-federationconfiguration.ps1 output]

When successful, you will see a similar message as above.

Now, when you open the ADFS Management console, you should see the same relying party trusts you had in ADFS v2/ADFS v2.1.

[Screenshot: relying party trusts now present in ADFS 2016]

Basically, by now you have completed the move from ADFSv2/ADFSv2.1 to ADFS 2016.

Notice how the token-signing and token-decrypting certificates are the same. For the purpose of this blog, the screenshots below show only the token-signing certificates.

ADFS v2.1:

[Screenshot: ADFS v2.1 token-signing certificate]

ADFS 2016:

[Screenshot: ADFS 2016 token-signing certificate]

You can also check the certificates through PowerShell:

Get-AdfsCertificate

Lastly, make sure that the service account that was running Active Directory Federation Services in ADFS v2/ADFS v2.1 is the same one running it in ADFS 2016.

Notice the message in the exported results:

Ensure that the destination farm has the farm name ‘adfs.iknowtech.com.au’ and uses service account ‘IKNOWTECH\svc-adfs’.

If this is not set up, head over to Services and select the account that you were originally using to run the ADFS service.

In addition, make sure that the service account has read-only access to the certificate private key.
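
A quick way to check which account the ADFS service is running under on the new server is sketched below; the private-key permission itself is granted through the certificate MMC as described above.

# Show the logon account used by the ADFS service (adfssrv) on this server
Get-WmiObject -Class Win32_Service -Filter "Name='adfssrv'" |
    Select-Object Name, StartName, State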

[Screenshot: ADFS service configured with the original service account]

Conclusion

This is a very straightforward process; all you need to be sure of is that the right components are in place: service certificate installed, service account set up, and so on.

After you have completed the steps, follow the steps here to configure your Web Application Proxy. Although that post covers a migration, it also helps with configuring a new deployment.

I hope you’ve found this informative. Should you have any question or feedback please leave a comment below.

 

WAP (2012 R2) Migration to WAP (2016)

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In Part 3 we discussed the integration of Azure MFA with ADFS 2016, and in this post (technically Part 4) we will cover the migration, or better yet the upgrade, of WAP 2012 R2 to WAP 2016.

Again, this blog assumes you have already installed the Web Application Proxy feature while adding the Remote Access role, and have prepared the WAP server to establish a trust with the Federation Service.

In addition, a friendly reminder once again that the domain name and federation service name have changed from swayit.com.au to iknowtech.com.au. The certificate of swayit.com.au expired before completing the lab, hence the change.

Before we begin, note that the current WAP servers (WAP01, WAP02) are the only connected servers in the cluster:

[Screenshot: current connected WAP servers in the cluster]

To install the WebApplicationProxy, run the following cmdlet in PowerShell:

Install-WindowsFeature Web-Application-Proxy -IncludeManagementTools

Once installed, follow these steps:

Step 1: Specify the federation service name, and provide the local Administrator account for your ADFS server.

[Screenshot: federation service name and credentials]

Step 2: Select your certificate.

[Screenshot: certificate selection]

Step 3: Confirm the configuration.

[Screenshot: configuration confirmation]

Do this for both the WAP servers you’re adding to the cluster.

Alternatively, you could do so via PowerShell:


$credential = Get-Credential
Install-WebApplicationProxy -FederationServiceName "sts.swayit.com.au" -FederationServiceTrustCredential $credential -CertificateThumbprint "071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" 

Once complete, the WAP servers will be added to the cluster.

Run the following cmdlet to get the WebApplication Configuration:

Get-WebApplicationProxyConfiguration

You will notice that we now have all four WAP servers in the ConnectedServersName and are part of the cluster.

You will also notice that the ConfigurationVersion is Windows Server 2012 R2, which we will need to upgrade to Windows Server 2016.

[Screenshot: Get-WebApplicationProxyConfiguration output showing the Windows Server 2012 R2 ConfigurationVersion]

Head back to one of the previous WAP servers running Windows Server 2012 R2, and run the following cmdlet to remove the old servers from the cluster, keeping only the WAP 2016 servers:

 Set-WebApplicationProxyConfiguration -ConnectedServersName WAP03, WAP04 

Once complete, check the WebApplicationProxyConfiguration by running the Get-WebApplicationProxyConfiguration cmdlet.

Notice the change in the ConnectedServersName (this information is obtained from the WAP 2012 R2 server).

[Screenshot: updated ConnectedServersName]

If you run Get-WebApplicationProxyConfiguration from WAP 2016, you will get more detailed information.

[Screenshot: Get-WebApplicationProxyConfiguration output from WAP 2016]

The last remaining step before publishing a Web Application (in my case) was to upgrade the ConfigurationVersion, as it was still in Windows Server 2012 R2 mode.

If you already have published Web Applications, you can do this at any time.

Set-WebApplicationProxyConfiguration -UpgradeConfigurationVersion

When successful, check again your WebApplicationProxyConfiguration by running

Get-WebApplicationProxyConfiguration

Notice the ConfigurationVersion:

[Screenshot: upgraded ConfigurationVersion]

You have now completed the upgrade and migration of your Web Application Proxy servers.

If this is a new deployment, of course you don’t need to go through this whole process; your WAP 2016 servers would already be at the Windows Server 2016 ConfigurationVersion.

Assuming this was a new deployment or if you simply need to publish a new Web Application, continue reading and follow the steps below.

Publishing a Web Application

There’s nothing new or different in publishing a Web Application in WAP 2016; it’s pretty similar to WAP 2012 R2. The only addition Microsoft has made is an option to redirect HTTP to HTTPS.
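
If you prefer PowerShell, the publishing step can be sketched as below; the application name, relying party name, URLs and thumbprint are placeholders for your own values, and the HTTP-to-HTTPS redirection switch is the WAP 2016 addition mentioned above.

Add-WebApplicationProxyApplication `
    -Name "Intranet App" `
    -ExternalPreauthentication ADFS `
    -ADFSRelyingPartyName "Intranet App" `
    -ExternalUrl "https://app.iknowtech.com.au/" `
    -BackendServerUrl "https://app.internal.iknowtech.com.au/" `
    -ExternalCertificateThumbprint "071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
    -EnableHTTPRedirect:$true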

Step 1: Publish a new Web Application.

[Screenshot: Publish New Application wizard]

Step 2: Once complete, it will appear in the list of published Web Applications. Also notice that WAP03 and WAP04 are the only WAP servers in the cluster, as we previously removed WAP01 and WAP02 running Windows Server 2012 R2.

[Screenshot: published Web Applications]

There you have it, you have now upgraded your WAP servers that were previously running WAP 2012 R2 to WAP 2016.

By now, you have completed migrating from ADFS v3 to ADFS 2016, integrated Azure MFA with ADFS 2016, and upgraded WAP 2012 R2 to WAP 2016. No additional configuration is required, we have reached the end of our series, and this concludes the migration and upgrade of your SSO environment.

I hope you’ve enjoyed those posts and found them helpful. For any feedback or questions, please leave a comment below.

ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 3 – Azure MFA Integration

In Part 1 and Part 2 of this series we covered the migration from ADFS v3 to ADFS 2016. In this post we will continue by configuring Azure MFA on ADFS 2016.

Azure MFA – What is it about?

It is a bit confusing when we mention that we need to enable Azure MFA on ADFS. Technically, this method is actually integrating Azure MFA with ADFS: MFA itself is performed against Azure AD, while ADFS prompts you to enter an MFA code which is then verified with Azure AD to sign you in.

In theory, this by itself is not multi-factor authentication. When users choose to log in with Azure Multi-Factor Authentication on ADFS, they’re not prompted to enter a password; they simply log in with the six-digit code they receive on their mobile devices.

Is this secure enough? Arguably. Of course, users had to previously set up MFA to be able to log in by choosing this method, but if someone has control or possession of your device they could simply log in with the six-digit code. Assuming the device is not locked, or MFA is set up to receive calls or messages (on some phones message notifications appear on the main display), almost anyone could log in.

Technically, this is how Azure MFA will look once integrated with the ADFS server. I will outline the process below, and show you how we got this far.

[Screenshot: ADFS sign-in page with the Azure Multi-Factor Authentication option]

Once you select Azure Multi-Factor Authentication you will be redirected to another page

[Screenshot: Azure MFA verification code prompt]

And when you click on “Sign In” you will simply sign in to the Office or Azure Portal, without any other prompt.

The whole idea here is not so much about security as it is about simplicity.

Integrating Azure MFA on ADFS 2016

Important note before you begin: Before integrating Azure MFA on ADFS 2016, please be aware that users should already have set up MFA using the Microsoft Authenticator mobile app, or they can do so when first signing in, after being redirected to the ADFS page. The aim of this post is to use the six-digit code generated by the mobile app.

If users have MFA set up to receive calls or texts, the configuration in this blog (making Azure MFA primary) will not support that. To continue using SMS or a phone call whilst using Azure MFA, “Azure MFA” needs to be configured as a secondary authentication method under “Multi-Factor”, and “Azure MFA” under “Primary” should be disabled.
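
As a hedged sketch only, the same switch between ‘Primary’ and ‘Multi-Factor’ can also be made from PowerShell on the primary ADFS 2016 server; adjust the provider lists to suit your environment.

# Keep forms (password) sign-in as primary and offer Azure MFA as the additional factor
Set-AdfsGlobalAuthenticationPolicy `
    -PrimaryExtranetAuthenticationProvider @('FormsAuthentication') `
    -PrimaryIntranetAuthenticationProvider @('FormsAuthentication', 'WindowsAuthentication') `
    -AdditionalAuthenticationProvider @('AzureMfaAuthentication')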

Integrating Azure MFA on ADFS 2016 couldn’t be any easier. All that is required is running a few PowerShell cmdlets and enabling the authentication method.

Before we do so however, let’s have a look at our current authentication methods.

[Screenshot: current authentication methods, with Azure MFA unavailable until the Azure AD tenant is configured]

As you may have noticed, we couldn’t enable Azure MFA without first configuring the Azure AD tenant.

The steps below are intended to be performed on all AD FS servers in the farm.

Step 1: Open PowerShell and connect to your tenant by running the following:

Connect-MsolService

Step 2: Once connected, run the following cmdlet to configure the Azure AD tenant:

$cert = New-AdfsAzureMfaTenantCertificate -TenantID swayit.onmicrosoft.com

When successful, head to the Certificates snap-in and check that a certificate with the name of your tenant has been added to the Personal store.
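
You can also check for it from PowerShell; a small sketch, assuming the swayit.onmicrosoft.com tenant used in this lab (the subject filter is illustrative only).

Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*swayit.onmicrosoft.com*" } |
    Select-Object Subject, Thumbprint, NotAfter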

[Screenshots: tenant certificate in the local computer Personal certificate store]

Step 3: In order to enable the AD FS servers to communicate with the Azure Multi-Factor Auth Client, you need to add the credentials to the SPN for the Azure Multi-Factor Auth Client. The certificate that we generated in the previous step will serve as these credentials.

To do so run the following cmdlet:

New-MsolServicePrincipalCredential -AppPrincipalId 981f26a1-7f43-403b-a875-f8b09b8cd720 -Type asymmetric -Usage verify -Value $cert

[Screenshot: New-MsolServicePrincipalCredential output]

Note that the GUID 981f26a1-7f43-403b-a875-f8b09b8cd720 is not made up; it is the well-known GUID of the Azure Multi-Factor Auth Client, so you can copy and paste the cmdlet as is.

Step 4: When you have completed the previous steps, you can now configure the ADFS Farm by running the following cmdlet:

Set-AdfsAzureMfaTenant -TenantId swayit.onmicrosoft.com -ClientId 981f26a1-7f43-403b-a875-f8b09b8cd720

Note how we used the same GUID from the previous step.

[Screenshot: Set-AdfsAzureMfaTenant output]

When that is complete, restart the ADFS service on all your ADFS farm servers.

net stop adfssrv

net start adfssrv

Head back to your ADFS Management console and open the authentication methods; you will notice that Azure MFA has been enabled and the earlier message prompt has disappeared.

[Screenshots: Azure MFA enabled as a primary and multi-factor authentication method]

If the Azure MFA Authentication methods were not enabled, then enable them manually and restart the services again (on all your ADFS servers in the farm).

Now that you have completed all the steps, when you try and access Office 365 or the Azure Portal you will be redirected to the pages posted above.

Choose Azure Multi-Factor Authentication

[Screenshot: choosing Azure Multi-Factor Authentication at sign-in]

Enter the six digit code you have received.

[Screenshot: entering the verification code]

And then you’re signed in.

[Screenshot: signed in to the portal]

By now you have completed migrating from ADFS v3 to ADFS 2016, and in addition have integrated Azure MFA authentication with your ADFS farm.

The last part in this series will be about WAP 2012 R2 upgrade to WAP 2016. So please make sure to come back tomorrow and check in details the upgrade process.

I hope you’ve enjoyed the posts so far. For any feedback or questions, please leave a comment below.

 

 

ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 2

In Part 1 of this series we have been getting ready for our ADFS v3.0 migration to ADFS v4.0 (ADFS 2016).

In part 2 we will cover the migration process, step-by-step. However, a friendly reminder that this series does not cover installation of ADFS and federation from scratch. This post assumes you already have a federated domain and Single Sign On (SSO) for your applications.

You may notice the domain and federation service name change from swayit.com.au to iknowtech.com.au. This doesn’t impact our migration; the certificate for swayit.com.au expired before the lab was completed. : )

Migration Process – ADFS – Phase 1:

This assumes you have already installed the Active Directory Federation Services role on your new ADFS 2016 servers; if not, you can do so through PowerShell:

Install-WindowsFeature ADFS-Federation -IncludeManagementTools

Once complete, follow these steps:

Step 1: Add the new ADFS 2016 server to the existing farm.

[Screenshot: add a federation server to a federation server farm]

Step 2: Connect to AD.

[Screenshot: connecting to Active Directory]

Step 3: Specify the primary federation server (or federation service).

[Screenshot: specifying the primary federation server]

Step 4: Select your certificate.

[Screenshot: certificate selection]

Step 5: Select your service account. For the sake of this lab, I created a user and gave it permission to run the ADFS service. It is advisable, however, to use a group Managed Service Account (gMSA).

[Screenshot: service account selection]

Step 6: Complete. The warnings shown are irrelevant to the ADFS 2016 server being added to the farm.

[Screenshot: completion screen with prerequisite warnings]

Alternatively, you could do so through PowerShell:

If you’re using Windows Internal Database:

 Import-Module ADFS

#Get the credential used for the federation service account

$serviceAccountCredential = Get-Credential -Message "Enter the credential for the Federation Service Account."

Add-AdfsFarmNode `
-CertificateThumbprint:"071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
-PrimaryComputerName:"sts.swayit.com.au" `
-ServiceAccountCredential:$serviceAccountCredential 

or:

Import-Module ADFS

$credentials = Get-Credential

Install-AdfsFarm `
-CertificateThumbprint:"071E6FD450A9D10FEB42C77F75AC3FD16F4ADD5F" `
-FederationServiceDisplayName:"SwayIT" `
-FederationServiceName:"sts.swayit.com.au" `
-GroupServiceAccountIdentifier:"SWAYIT\ADFSgMSA`$"

Once the machine has restarted, open the ADFS Management console and you’ll notice it’s not the primary federation server in the farm. You now need to make the newly added ADFS 2016 server the primary server. Follow the steps below.

On the newly added ADFS 2016 server, run the following cmdlet:

Step 1:

Set-AdfsSyncProperties -Role PrimaryComputer

[Screenshot: Set-AdfsSyncProperties output]

Open the ADFS Management Console and you’ll notice that ADFS03 (our ADFS2016 server) is now primary:

[Screenshot: ADFS03 shown as the primary federation server]

Step 2: Run the following cmdlet on all other federation servers in the farm

Set-AdfsSyncProperties -Role SecondaryComputer -PrimaryComputerName ADFS03.swayit.com.au 

Step 3: Run the following cmdlet on the secondary server to confirm that the ADFS Properties are correct.

Get-AdfsSyncProperties

[Screenshot: Get-AdfsSyncProperties output on a secondary server]

Migration Process – ADPREP – Phase 2

Now that we’ve made our new ADFS 2016 server the primary, it is time to upgrade the schema.

This assumes you have already downloaded the Windows Server 2016 ISO file; if not, you can obtain a copy from the TechNet Evaluation Centre.

I performed these steps on the ADFS2016 server:

  1. Open a command prompt and navigate to support\adprep directory.
  2. Type in: adprep /forestprep

[Screenshot: adprep /forestprep output]

Once the first step is complete, you will get “The command has completed successfully.”

Next, run: adprep /domainprep

Migration Process – ADFS – Phase 3:

At this stage we had already completed:

  • Adding ADFS 2016 to the existing farm
  • Promoting one of the new ADFS 2016 servers to primary
  • Pointing all secondary servers to the new primary server
  • Upgrading the schema

The next phase is to remove the existing ADFS v3 (ADFS 2012 R2) servers from the Azure load balancer (or any load balancer you have in place).

After you have removed ADFS v3 from the load balancer, and possibly from the farm (or simply turned those servers off), you will need to raise the Farm Behavior Level (FBL).

When raising the FBL, any ADFS v3 servers will be removed from the farm automatically, so you don’t have to remove them yourself.

When the ADFS v3 servers are no longer part of the farm, I would recommend keeping them turned off rather than decommissioning them immediately; should anything go wrong, you can simply turn the ADFS v3 servers back on, make one of them primary, and in this way you may avoid impacting the business.

If you find yourself in this situation, just make sure everything else is pointing back to the ADFS v3 servers.

When you’re ready again, just start from the beginning by adding ADFS 2016 back to the farm.

Here are the steps:

  1. On the Primary ADFS 2016 server open an elevated PowerShell and run the following cmdlet:
Invoke-AdfsFarmBehaviorLevelRaise

[Screenshot: Invoke-AdfsFarmBehaviorLevelRaise detecting the ADFS 2016 servers]

As you may have noticed, it automatically detected which ADFS servers the operation will be performed on. Both ADFS03 and ADFS04 are ADFS 2016 versions.

During the process, you will see the usual PowerShell execution message:

[Screenshot: Invoke-AdfsFarmBehaviorLevelRaise progress]

Once complete, you will see a successful message:

[Screenshot: Invoke-AdfsFarmBehaviorLevelRaise completed successfully]

If the service account failed to be added to the Enterprise Key Admins group, add it manually.
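
A hedged example of adding the account manually is below; it assumes the gMSA name used earlier in this series (gMSA account names end with a trailing $), so substitute your own service account.

# Requires the ActiveDirectory PowerShell module
Add-ADGroupMember -Identity "Enterprise Key Admins" -Members "ADFSgMSA$"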

In order to confirm the Farm Behavior Level, run the following cmdlet:

Get-AdfsFarmInformation

[Screenshot: Get-AdfsFarmInformation output]

If you go to https://portal.office.com, and enter the email address of a federated domain, you should be redirected to your ADFS login page:

[Screenshot: redirection to the ADFS login page]

And this is it. You have successfully migrated from ADFS v3.0 to ADFS 2016.

The next post in our series is on Azure MFA integration with ADFS 2016, so please make sure to come back tomorrow to check the configuration process in detail.

I hope you’ve enjoyed this post. For any feedback or questions, please leave a comment below.

ADFS v 3.0 (2012 R2) Migration to ADFS 4.0 (2016) – Part 1

Introduction

With the release of Windows Server 2016, Microsoft has introduced new and improved features. One of those features is ADFS 4.0, better known as ADFS 2016.

Organisations have already started leveraging ADFS 2016 as it covers most of their requirements, especially in terms of security.

In this series of blog posts, I will demonstrate how you can upgrade from ADFS v 3.0 (Running Windows Server 2012 R2) to ADFS 2016 (Running Windows Server 2016 Datacenter). In the series to come I will also cover Web Application Proxy (WAP) migration from Windows Server 2012 R2 to Windows Server 2016. Moreover, I will cover integration of Azure MFA with the new ADFS 2016.

The posts in this series assume you have knowledge of Windows Server, AD, ADFS, WAP, and MFA. This blog post will not go into detailed step-by-step installation of roles and features. It also assumes you already have a running environment with AD, ADFS/WAP (2012 R2) and AAD Connect configured.

What’s New in ADFS 2016?

ADFS 2016 offers new and improved features, including:

  • Eliminate Passwords from the Extranet
  • Sign in with Azure Multi-factor Authentication
  • Password-less Access from Compliant Devices
  • Sign in with Microsoft Passport
  • Secure Access to Applications
  • Better Sign in experience
  • Manageability and Operational Enhancements

For detailed description on the aforementioned points, please refer to this link.

Current Environment

  • 2x ADFS v3 Servers (behind an internal load balancer)
  • 2x WAP 2012 R2 Server (behind an external load balancer)
  • 2x AD 2012 R2 Servers
  • 1x AAD Connect server

At a high level design, this is how the ADFS/WAP environment looks:

[Diagram: high-level ADFS/WAP environment]

Future environment:

  • 2x ADFS 2016 Servers (behind the same internal load balancer)
  • 2x WAP 2016 Servers (behind the same external load balancer)
  • 2x AD 2012 R2 Servers
  • 1x AAD Connect Server

Planning for your ADFS and WAP Migration

First, you need to make sure that your applications can support ADFS 2016; some legacy applications may not be supported.

The steps to implement SSO are as follows:

  1. Active Directory schema update using ‘ADPrep’ with the Windows Server 2016 additions
  2. Build Windows Server 2016 servers with ADFS and install into the existing farm and add the servers to the Azure load balancer
  3. Promote one of the ADFS 2016 servers as “primary” of the farm, and point all other secondary servers to the new “primary”
  4. Build Windows Server 2016 servers with WAP and add the servers to the Azure load balancer
  5. Remove the WAP 2012 servers from the Azure load balancer
  6. Remove the ADFSv3 servers from the Azure load balancer
  7. Raise the Farm Behavior Level feature (FBL) to ‘2016’
  8. Remove the WAP servers from the cluster
  9. Upgrade the WebApplicationProxyConfiguration version to ‘2016’
  10. Configure ADFS 2016 to support Azure MFA and complete remaining configuration

The steps for the AD schema upgrade are as follows:

  1. Prior to starting, Active Directory needs to be in a healthy state; in particular, replication needs to be performing without error.
  2. Active Directory needs to be backed up. It is best to back up (at a minimum) a few Active Directory Domain Controllers, including the ‘system state’
  3. Identify which Active Directory Domain Controller maintains the Schema Master role (see the sketch after this list)
  4. Perform the update using an administrative account by temporarily adding the account to the Schema Admins group
  5. Download and have handy the Windows Server 2016 installation media
  6. When ready to update the schema, perform the following:
  7. Open an elevated command prompt and navigate to the support\adprep directory in the Windows Server 2016 installation media. Run the following: adprep /forestprep.
  8. Once that completes, run the following: adprep /domainprep
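
A hedged sketch of steps 3 and 4 using the ActiveDirectory PowerShell module is below; the account name is a placeholder for your own administrative account.

# Identify the domain controller holding the Schema Master FSMO role
Get-ADForest | Select-Object SchemaMaster

# Temporarily add the administrative account to the Schema Admins group
Add-ADGroupMember -Identity "Schema Admins" -Members "schema-upgrade-admin"

# Remove it again once adprep has completed
Remove-ADGroupMember -Identity "Schema Admins" -Members "schema-upgrade-admin"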

Upgrading the Active Directory schema will not impact your current environment, nor will it raise the domain/forest level.

Part 2 of this series will be published early next week. Therefore make sure to please come back and check in details the migration process.

 

Azure AD Connect pass-through authentication. Yes, no more AD FS required.

Originally posted on Lucian’s blog at clouduccino.com. Follow Lucian on Twitter: @LucianFrango.

***

Yesterday I received a notification email from Alex Simons (Director of PM, Microsoft Identity Division) which started like this:

Todays news might well be our biggest news of the year. Azure AD Pass-Through Authentication and Seamless Single Sign-on are now both in public preview!

So I thought I’d put together a streamlined overview of what this means for authentication with regards to the Microsoft Cloud and my thoughts on if I’d use it.

Before:

[Diagram: authentication flow before pass-through authentication]

What is Azure AD pass-through auth?

When working with the Microsoft Cloud, organisations from small to enterprise leverage the ability to have common credentials across on-premises directories (ADDS) and the cloud. It’s the best user experience, and it’s the best IT management experience as well. The overhead of facilitating this can be quite a large endeavour: there are all the complexities of AD FS and AADConnect to work through and build with high availability and disaster recovery in mind, as this core identity infrastructure needs to be online 24/7/365.

Azure AD Pass-through authentication (public preview) simplifies this down to Azure AD Connect. This new feature can, YES, do away with AD FS. I’m not in the “I hate AD FS” boat. I think as a tool it does a good job: proxying authentication from external to ADDS and from Kerberos to SAML. For all those out there, I know you’re out there, this would be music to your ears.

Moreover, if you wanted to enjoy SINGLE SIGN ON, you needed AD FS. Now, with pass-through authentication, SSO works with just Azure AD Connect. This is a massive win!

So, what do I need?

Nothing too complicated or intricate. The caveat is that support for this public preview feature is limited, as with all preview offerings from Azure. Don’t get too excited and roll this out into production just yet! Dev or test it as much as possible and get an understanding of it, but don’t go replacing AD FS just yet.

Azure AD Connect generally needs a few ports to communicate with ADDS on-premises and Azure AD in the cloud, the key port being TCP 443. With pass-through authentication, there are ~17 other ports (10 of which are included in a range) that need to be opened up for communication. While this could be locked down at the firewall level to just Azure IPs, subnets and hosts, it will still make the security team question and probe for details.

The version of AADConnect also needs to be updated to the latest, released on December 7th (2016). From version 1.1.371.0, the preview feature is now publicly available. Consider upgrade requirements as well before taking the plunge.

What are the required components?

Apart from the latest version of Azure AD Connect, I touched on a couple more items required to deploy pass-through authentication. The following rapid-fire list outlines what is required:

  • A current, latest, version of Azure AD Connect (v 1.1.371.0 as mentioned above)
  • Windows Server 2012 R2 or higher is listed as the operating system for Azure AD Connect
    • If you still have Server 2008 R2, get your wiggle on and upgrade!
  • A second Windows Server 2012 R2 instance to run the Azure App Proxy .exe to leverage high availability
    • Yes, HA, but not what you think… more details below
  • New firewall rules to allow traffic to a couple wildcard subdomains
  • A bunch of new ports to allow communication with the Azure Application proxy

For a complete list, check out the docs.microsoft.com article with detailed config.

What is this second server business? AADConnect has HA now?

Much like AD FS being designed to provide high availability for that workload, there is the ability to provide some HA around the connectors required for the pass-through auth process. This is where it gets a little squirrelly: you won’t be deploying a second Azure AD Connect server and load balancing the two.

The second server is actually an additional server (I would imagine a vanilla Windows Server instance) on which the Azure AD Application Proxy connector is downloaded, executed and run.

The Azure Application Proxy is enabled in Azure AD, and the client (AADApplicationProxyConnectorInstaller.exe) is downloaded and run (as mentioned above). This introduces two services that run on the Windows Server instance and provide that connector to Azure AD. For more info, review the docs.microsoft.com article that outlines the setup process.

After:

[Diagram: authentication flow after pass-through authentication]

Would I use this?

I’ll answer this in two ways, then I’ll give you my final verdict.

Yes, I would use this. Simplifying what was “federated identity” to “hybrid identity” has quite a few benefits, most importantly the reduced complexity and the reduced requirements to maintain the solution. This reduced overhead can maintain a higher level of security, with no credentials (rather, passwords) being stored in the cloud at all. No hashes. Nada. Zilch. Zero. Security managers and those tightly aligned to government regulations: a small bit of rejoicing for you.

No, I would not use this. AD FS is a good solution. If I had an existing AD FS solution that tied in with other applications, I would not go out of my way to change that just for Office 365 / Azure AD. Additionally, in-preview functionality does not have the same level of SLA support; in fact, Azure preview features are provided “as is”. That is not ideal for production environments. Sure, I’d put this in a proof of concept, but I wouldn’t recommend rolling this out anytime soon.

Personally, I think this is a positive piece of SSO evolution within the Microsoft stack. I would certainly be trying to get customers to proof of concept this and trial it in dev or test environments. It can further streamline identity deployments and could well defer customers from ever needing or deploying AD FS, thus saving in instance cost and managed services cost supporting the infrastructure. Happy days!

Final thoughts

Great improvement, great step forward. If this had been announced at Microsoft Ignite a couple of months back, I think it would have been big news. Try it, play with it, send me your feedback. I always want to be learning, improving and finding out new gotchas.

Best,

Lucian

Auto-Acceleration for SharePoint Online

Working with one of my colleagues recently, we were tasked with implementing Smart Links to speed up the login processes for a client’s SharePoint Online implementation.

The client was working towards replacing their on-premises implementation of SharePoint and OpenSpaces with SharePoint Online. The issue they faced was that when a user tries to access a SharePoint Online site collection and is not already authenticated with Office 365, the user is directed to the default Microsoft Online login page. The user then provides their email address and something called home-realm discovery is performed. This is where the Microsoft Online login screen uses the domain portion of the user’s email address to determine whether the domain is managed by Office 365, in which case the user will need to provide a password. If the domain is federated (i.e. not managed by Office 365), the user is re-directed to the Identity Provider (IdP) for the user’s organisation, i.e. AD FS. At this point, provided the internet security settings within the user’s web browser are correct, they will be authenticated using their logged-in credentials. Although this is the default SharePoint Online experience, it was different to the seamless on-premises SharePoint experience with Windows Integrated Authentication.

After spending a little time working through David Ross’ great blog on the topic, we discovered that due to a change in the URL construct with AD FS 3.0, the Smart Link process is no longer supported by Microsoft and has been replaced with Auto-Acceleration for SharePoint Online. This feature reduces logon prompts for the user by instructing SharePoint Online to use a pre-defined home-realm, rather than perform home-realm discovery, when a user accesses the site and “accelerating” the user through the logon process.

However, there are two caveats for getting Auto-Acceleration to work:

  • You can have multiple domains but there must be a single SSO end-point to authenticate users (AD FS or a 3rd party IdP such as Okta, can be used).
  • Auto-acceleration only works with sites that are accessible internally to the organisation; it will not work for external sites.

There isn’t much documentation around for this feature, however it’s pretty simple:

  1. If you haven’t already, download the SharePoint Online Management Shell https://www.microsoft.com/en-au/download/details.aspx?id=35588
  2. Connect to SharePoint Online using the SPO Management Shell:
    $adminUPN="[UPN for Office 365 Admin]"
    $orgName="[Your tenant name]"
    $userCredential = Get-Credential -UserName $adminUPN -Message "Type the password."
    Connect-SPOService -Url https://$orgName-admin.sharepoint.com -Credential $userCredential 
  3. Type the command Set-SPOTenant -SignInAccelerationDomain “yourdomain.com”
  4. If your IdP supports guest users, you can also run Set-SPOTenant -EnableGuestSignInAcceleration $true (you can verify both settings afterwards, as shown below)
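
To confirm the settings afterwards, a minimal check from the same SPO Management Shell session might look like this (a sketch only):

Get-SPOTenant | Select-Object SignInAccelerationDomain, EnableGuestSignInAcceleration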

Automate Secondary ADFS Node Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

Additional nodes in an ADFS farm are required to provide redundancy in case your primary ADFS node goes offline. This ensures your ADFS service is still up and servicing all incoming requests. Additional nodes also help in load balancing the incoming traffic, which provides a better user experience in cases of high authentication traffic.

Overview

Once an ADFS farm has been created, adding additional nodes is quite simple and mostly relies on the same concepts for creating the ADFS farm. I would suggest reading my previous blog Automate ADFS Farm Installation and Configuration as some of the steps we will use in this blog were documented in it.

In this blog, I will show how to automatically provision a secondary ADFS node to an existing ADFS farm. The learnings in this blog can be easily used to deploy more ADFS nodes automatically, if needed.

Install ADFS Role

After provisioning a new Azure virtual machine, we need to install the Active Directory Federation Services role on it.  To do this, we will use the same Desired State Configuration (DSC) script that was used in the blog Automate ADFS Farm Installation and Configuration. Please refer to the section Install ADFS Role in the above blog for the steps to create the DSC script file InstallADFS.ps1.

Add to an existing ADFS Farm

Once the ADFS role has been installed on the virtual machine, we will create a Custom Script Extension (CSE) to add it to the ADFS farm.

In order to do this, we need the following:

  • the certificate that was used to create the ADFS farm
  • the ADFS service account credentials that were used to create the ADFS farm

Once the above prerequisites have been met, we need a method for making the files available to the CSE. I documented a neat trick to “sneak-in” the certificate and password files onto the virtual machine by using Desired State Configuration (DSC) package files in my previous blog. Please refer to Automate ADFS Farm Installation and Configuration under the section Create ADFS Farm for the steps.

Also note, for adding the node to the adfs farm, the domain user credentials are not required. The certificate file will be named adfs_certificate.pfx  and the file containing the encrypted adfs service account password will be named adfspass.key.

Assuming that the prerequisites have been satisfied, and the files have been “sneaked” onto the virtual machine, let’s proceed to creating the CSE.

Open Windows Powershell ISE and paste the following.

param (
  $DomainName,
  $PrimaryADFSServer,
  $AdfsSvcUsername
)

The above shows the parameters that need to be passed to the CSE where

$DomainName is the name of the Active Directory domain
$PrimaryADFSServer is the hostname of the primary ADFS server
$AdfsSvcUsername is the username of the ADFS service account

Save the file with a name of your choice (do not close the file as we will be adding more lines to it). I named my script AddToADFSFarm.ps1

Next, we need to define a variable that will contain the path to the directory where the certificate file and the file containing the encrypted adfs service account password are stored. Also, we need a variable to hold the key that was used to encrypt the adfs service account password. This will be required to decrypt the password.

Add the following to the CSE file
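
The original embedded snippet isn’t reproduced here, but following the same convention as the companion post (Automate ADFS Farm Installation and Configuration), it would look something like this:

# Location where ARM drops the 'Certificates' folder from the DSC package
$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"

# Key used to encrypt the adfs service account password (must match the key used to create adfspass.key)
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)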

Next, we need to decrypt the encrypted adfs service account password.

Now, we need to import the certificate into the local computer certificate store. To make things simple, when the certificate was exported from the primary ADFS server, it was encrypted using the adfs service account password.

After importing the certificate, we will read it to get its thumbprint.
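
A sketch of those three steps (decrypt the password, import the certificate, read its thumbprint) is below; the certificate subject filter is illustrative only and should match your own certificate.

# Decrypt the adfs service account password and build a credential object
$adfsPassword = Get-Content "$localpath\adfspass.key" | ConvertTo-SecureString -Key $Key
$AdfsSvcCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$AdfsSvcUsername", $adfsPassword)

# Import the certificate (it was exported protected by the adfs service account password)
Import-PfxCertificate -FilePath "$localpath\adfs_certificate.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My -Password $adfsPassword | Out-Null

# Read the thumbprint back from the certificate store
$cert = (Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*adfs*" }).Thumbprint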

Up until now, the steps are very similar to creating an ADFS farm. However, below is where they diverge.

Add the following lines to add the virtual machine to the existing ADFS farm
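
The joining step itself is essentially the Add-AdfsFarmNode cmdlet shown earlier in this series; a hedged sketch using the values prepared above:

Add-AdfsFarmNode `
    -CertificateThumbprint $cert `
    -PrimaryComputerName $PrimaryADFSServer `
    -ServiceAccountCredential $AdfsSvcCreds `
    -OverwriteConfiguration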

You now have a custom script extension file that will add a virtual machine as a secondary node to an existing ADFS Farm.

Below is the full CSE

All that is missing now is the method to bootstrap the scripts described above (InstallADFS.ps1 and AddToADFSFarm.ps1) using Azure Resource Manager (ARM) templates.

Below is part of an ARM template that can be added to your existing template to install the ADFS role on a virtual machine and then add the virtual machine as a secondary node to the ADFS farm

In the above ARM template, the parameter ADFS02VMName refers to the hostname of the virtual machine that will be added to the ADFS Farm.

Listed below are the variables that have been used in the ARM template above

The above method can be used to add as many nodes to the ADFS farm as needed.

I hope this comes in handy when creating an ARM template to automatically deploy an ADFS Farm with additional nodes.

Automate ADFS Farm Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

In this multi-part blog, I will be showing how to automatically install and configure a new ADFS Farm. We will accomplish this using Azure Resource Manager templates, Desired State Configuration scripts and Custom Script Extensions.

Overview

We will use Azure Resource Manager to create a virtual machine that will become our first ADFS Server. We will then use a desired state configuration script to join the virtual machine to our Active Directory domain and to install the ADFS role. Finally, we will use a Custom Script Extension to install our first ADFS Farm.

Install ADFS Role

We will be using the xActiveDirectory and xPendingReboot experimental DSC modules.

Download these from

https://gallery.technet.microsoft.com/scriptcenter/xActiveDirectory-f2d573f3

https://gallery.technet.microsoft.com/scriptcenter/xPendingReboot-PowerShell-b269f154

After downloading, unzip the files and place the contents in the PowerShell modules directory located at $env:ProgramFiles\WindowsPowerShell\Modules (unless you have changed your system folders, this will be C:\Program Files\WindowsPowerShell\Modules).

Open Windows PowerShell ISE and let’s create a DSC script that will join our virtual machine to the domain and also install the ADFS role.

Copy the following into a new Windows Powershell ISE file and save it as a filename of your choice (I saved mine as InstallADFS.ps1)

In the above, we are declaring some mandatory parameters and some variables that will be used within the script

$MachineName is the hostname of the virtual machine that will become the first ADFS server

$DomainName is the name of the domain where the virtual machine will be joined

$AdminCreds contains the username and password for an account that has permissions to join the virtual machine to the domain

$RetryCount and $RetryIntervalSec hold values that will be used to  check if the domain is available

We need to import the experimental DSC modules that we had downloaded. To do this, add the following lines to the DSC script

Import-DscResource -Module xActiveDirectory, xPendingReboot

Next, we need to convert the supplied $AdminCreds into a domain\username format. This is accomplished by the following lines (the converted value is held in $DomainCreds )
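
Those lines typically follow the standard ARM/DSC pattern; a sketch:

[System.Management.Automation.PSCredential]$DomainCreds = New-Object System.Management.Automation.PSCredential `
    ("${DomainName}\$($AdminCreds.UserName)", $AdminCreds.Password)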

Next, we need to tell DSC that the command needs to be run on the local computer. This is done by the following line (localhost refers to the local computer)

Node localhost

We need to tell the LocalConfigurationManager that it should reboot the server if needed, continue with the configuration after reboot, and apply the settings only once. (DSC can apply a setting and constantly monitor it to check that it has not been changed; if the setting is found to have changed, DSC can re-apply it. In our case we will not do this; we will apply the settings just once.)
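
A minimal sketch of that block (it sits inside the Node block):

LocalConfigurationManager
{
    RebootNodeIfNeeded = $true                      # reboot the server if a resource requires it
    ActionAfterReboot  = 'ContinueConfiguration'    # carry on with the configuration after the reboot
    ConfigurationMode  = 'ApplyOnly'                # apply the settings once, do not re-enforce them
}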

Next, we need to check if the Active Directory domain is ready. For this, we will use the xWaitForADDomain function from the xActiveDirectory experimental DSC module.

Once we know that the Active Directory domain is available, we can go ahead and join the virtual machine to the domain.

The JoinDomain function depends on xWaitForADDomain; if xWaitForADDomain fails, JoinDomain will not run.

Once the virtual machine has been added to the domain, it needs to be restarted. We will use xPendingReboot function from the xPendingReboot experimental DSC module to accomplish this.

Next, we will install the ADFS role on the virtual machine
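
A hedged sketch of those resources is below; they sit inside the Node block, the resource names are illustrative, and the domain-join and reboot resources referenced in this post are indicated only as comments since the author’s full script is not reproduced here.

xWaitForADDomain DscForestWait
{
    DomainName           = $DomainName
    DomainUserCredential = $DomainCreds
    RetryCount           = $RetryCount
    RetryIntervalSec     = $RetryIntervalSec
}

# (the JoinDomain and xPendingReboot resources described above sit here, chained with DependsOn)

WindowsFeature InstallADFS
{
    Ensure = 'Present'
    Name   = 'ADFS-Federation'
}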

Our script has now successfully added the virtual machine to the domain and installed the ADFS role on it. Next, create a zip file with InstallADFS.ps1 and upload it to a location that Azure Resource Manager can access (I would recommend uploading to GitHub). Include the xActiveDirectory and xPendingReboot experimental DSC module directories in the zip file as well. Also add a folder called Certificates inside the zip file and put the ADFS certificate and the encrypted password files (discussed in the next section) inside the folder.

In the next section, we will configure the ADFS Farm.

The full InstallADFS.ps1 DSC script is pasted below

Create ADFS Farm

Once the ADFS role has been installed, we will use Custom Script Extensions (CSE) to create the ADFS farm.

One of the requirements to configure ADFS is a signed certificate. I used a 90 day trial certificate from Comodo.

There is a trick that I am using to make my certificate available on the virtual machine. If you bootstrap a DSC script to your virtual machine in an Azure Resource Manager template, the script along with all the non out-of-box DSC modules has to be packaged into a zip file and uploaded to a location that ARM can access. ARM will then download the zip file, unzip it, and place all directories inside the zip file into $env:ProgramFiles\WindowsPowerShell\Modules (C:\Program Files\WindowsPowerShell\Modules). ARM assumes the directories are PowerShell modules and puts them in the appropriate directory.

I am using this feature to sneak my certificate onto the virtual machine. I create a folder called Certificates inside the zip file containing the DSC script and put the certificate inside it. Also, I am not too fond of passing plain-text passwords from my ARM template to the CSE, so I created two files: one to hold the encrypted password for the domain administrator account and the other to contain the encrypted password of the ADFS service account. These two files are named adminpass.key and adfspass.key and will be placed in the same Certificates folder within the zip file.

I used the following to generate the encrypted password files

AdminPlainTextPassword and ADFSPlainTextPassword are the plain text passwords that will be encrypted.

$Key is used to convert the secure string into an encrypted standard string. Valid key lengths are 16, 24 and 32 (bytes).

For this blog, we will use

$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)
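
Putting that together, the password files can be generated along these lines (a sketch; run it once and copy the resulting .key files into the Certificates folder of the zip):

# $Key is the 24-byte array defined above
ConvertTo-SecureString -String $AdminPlainTextPassword -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File .\adminpass.key

ConvertTo-SecureString -String $ADFSPlainTextPassword -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File .\adfspass.key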

Open Windows PowerShell ISE and paste the following (save the file with a name of your choice. I saved mine as ConfigureADFS.ps1)

param (
 $DomainName,
 $DomainAdminUsername,
 $AdfsSvcUsername
)

These are the parameters that will be passed to the CSE

$DomainName is the name of the Active Directory domain
$DomainAdminUsername is the username of the domain administrator account
$AdfsSvcUsername is the username of the ADFS service account

Next, we will define the value of the Key that was used to encrypt the password and the location where the certificate and the encrypted password files will be placed

$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Now, we have to read the encrypted passwords from the adminpass.key and adfspass.key file and then convert them into a domain\username format

Next, we will import the certificate into the local computer certificate store. We will mark the certificate exportable and set the password same as the domain administrator password.

In the above, after the certificate is imported, $cert is used to hold the certificate thumbprint.
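
A sketch of those two steps, matching the file names and conventions described above:

# Read and decrypt the passwords, then build domain\username credential objects
$domainPass   = Get-Content "$localpath\adminpass.key" | ConvertTo-SecureString -Key $Key
$adfsPass     = Get-Content "$localpath\adfspass.key"  | ConvertTo-SecureString -Key $Key
$DomainCreds  = New-Object System.Management.Automation.PSCredential ("$DomainName\$DomainAdminUsername", $domainPass)
$AdfsSvcCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$AdfsSvcUsername", $adfsPass)

# Import the certificate as exportable, protected with the domain administrator password
$cert = Import-PfxCertificate -FilePath "$localpath\adfs_certificate.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My -Exportable -Password $domainPass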

Next, we will configure the ADFS Farm

The ADFS Federation Service display name is set to “Active Directory Federation Service” and the Federation Service name is set to fs.adfsfarm.com.
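
The farm configuration itself is essentially the Install-AdfsFarm cmdlet; a hedged sketch using the values above:

Install-AdfsFarm `
    -CertificateThumbprint $cert.Thumbprint `
    -FederationServiceDisplayName "Active Directory Federation Service" `
    -FederationServiceName "fs.adfsfarm.com" `
    -ServiceAccountCredential $AdfsSvcCreds `
    -OverwriteConfiguration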

Upload the CSE to a location that Azure Resource Manager can access (I uploaded my script to GitHub)

The full ConfigureADFS.ps1 CSE is shown below

Azure Resource Manager Template Bootstrapping

Now that the DSC and CSE scripts have been created, we need to add them to our ARM template, straight after the virtual machine is provisioned.

To add the DSC script, create a DSC extension and link it to the DSC Package that was created to install ADFS. Below is an example of what can be used

The extension will run after the ADFS virtual machine has been successfully created (referred to as ADFS01VMName)

The MachineName, DomainName and domain administrator credentials are passed to the DSC extension.

Below are the variables that have been used in the json file for the DSC extension (I have listed my GitHub repository location)

Next, we have to create a Custom Script Extension to link to the CSE for configuring ADFS. Below is an example that can be used

The CSE depends on the ADFS virtual machine being successfully provisioned and the DSC extension that installs the ADFS role to have successfully completed.

The DomainName, Domain Administrator Username and the ADFS Service Username are passed to the CSE script

The following contains a list of the variables being used by the CSE (the example below shows my GitHub repository location)

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"ConfigureADFSScriptUrl": "[concat(parameters('repoLocation'),'ConfigureADFS.ps1')]",

That’s it Folks! You now have an ARM Template that can be used to automatically install the ADFS role and then configure a new ADFS Farm.

In my next blog, we will explore how to add another node to the ADFS Farm and we will also look at how we can automatically create a Web Application Proxy server for our ADFS Farm.