Quickly generating a dataset of fictitious Users using Randomised Real Data and PowerShell

Introduction

I’ve lost count of the number of times I’ve needed to generate a representative dataset of users. Of course I have access to many production datasets, but for many reasons they can’t be used. Finding the datasets I’ve previously generated always seems to take longer than it should. So, with my most recent iteration of generating a fictitious list of users with Australian addresses, I’ve documented how I went about it, along with the source data I used and the script to create it.

Source Data

For the data sources to base my dataset on, I wanted representative Australian data for both people’s names and locations. After a few quick searches I found;

  • that Data South Australia has lists of baby names for both male and female babies in SA. I downloaded the 2017 lists as CSVs.
  • for surnames, also from Data South Australia, I borrowed the 19th Century Arrivals list, split the Fullname column on “,”, and then used Excel’s Remove Duplicates function. I deleted all the other columns so that I was left with just over 13,000 surnames in a CSV file.
  • Matthew Proctor’s list of Australian Postcodes as a CSV. This provides Postcode, Suburb and State.
  • Brisbane City Council (Australia’s largest council) has a dataset with all bus locations that includes street names, as a CSV. As I did for surnames, I used Excel’s Remove Duplicates function, removed the blanks and the other columns, and was left with just over 1,600 street names.

The Script

The script is pretty simple. It imports each of the CSVs listed above and generates a random number based on the number of records in each file to select a random record from each.

The GitHub Repo contains the PowerShell script along with the source files. Change line 3 for the location where you store the CSV files, and change line 66 for the number of users to generate. I’ve left the end of the script empty; depending on where I’m creating the users, I insert either the API call to create them or the PowerShell cmdlet with the data to do the creation.

PowerShell Script
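The full script is in the repo; a condensed sketch of the approach is below. The CSV file names and column headings here are assumptions, so adjust them to match the source files you downloaded.

$dataPath = 'C:\data'    # equivalent of line 3: where the CSV files are stored
$numberOfUsers = 4       # equivalent of line 66: how many users to generate

$givenNames = Import-Csv "$dataPath\BabyNames.csv"
$surnames   = Import-Csv "$dataPath\Surnames.csv"
$postcodes  = Import-Csv "$dataPath\Postcodes.csv"
$streets    = Import-Csv "$dataPath\Streets.csv"

$users = for ($i = 0; $i -lt $numberOfUsers; $i++) {
    # Get-Random -Maximum returns an index between 0 and Count-1
    $location = $postcodes[(Get-Random -Maximum $postcodes.Count)]
    [pscustomobject]@{
        GivenName = $givenNames[(Get-Random -Maximum $givenNames.Count)].GivenName
        Surname   = $surnames[(Get-Random -Maximum $surnames.Count)].Surname
        Street    = "$(Get-Random -Minimum 1 -Maximum 999) $($streets[(Get-Random -Maximum $streets.Count)].Street)"
        Suburb    = $location.Suburb
        Postcode  = $location.Postcode
        State     = $location.State
    }
}
$users | ForEach-Object { $_ | ConvertTo-Json }

Converting each generated object to JSON produces output like the sample shown in the next section.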

The Output

Here is a sample output in JSON format.

{
  "Street": "370 Miskin St",
  "Surname": "Burne",
  "Suburb": "WOODBROOK",
  "Postcode": "3451",
  "State": "VIC",
  "GivenName": "Miro"
}
{
  "Street": "293 Preston Rd",
  "Surname": "Partingale",
  "Suburb": "MARRARA",
  "Postcode": "812",
  "State": "NT",
  "GivenName": "Daniella"
}
{
  "Street": "409 Orchard St",
  "Surname": "Liaseyer",
  "Suburb": "THURGOONA",
  "Postcode": "2640",
  "State": "NSW",
  "GivenName": "Ariana"
}
{
  "Street": "775 Station Rd",
  "Surname": "Nevin",
  "Suburb": "AVON DOWNS",
  "Postcode": "862",
  "State": "NT",
  "GivenName": "Naria"
}

Summary

Using data publicly available and PowerShell it is possible to quickly generate a dataset of representative users and addresses. Generating other attributes is as easy as extrapolating from the existing data or supplementing it with additional source data files.

 

A Voice Assistant for Microsoft Identity Manager

This is the third and final post in my series around using your voice to query/search Microsoft Identity Manager, or as I’m now calling it, the Voice Assistant for Microsoft Identity Manager.

The two previous posts in this series detail some of my steps and processes in developing and fleshing out this concept. The first post detailed the majority of the base functionality, whilst the second post detailed the auditing and reporting aspects into Table Storage and Power BI.

My final architecture is depicted below.

Identity Manager integration with Cognitive Services and IoT Hub 4x3

I’ve put together more of an overview in presentation format using GitPitch, which you can check out here.

The why and how of the Voice Assistant for Microsoft Identity Manager

If you’re interested in building the solution, check out the GitHub Repo here, which includes the ReSpeaker Python script, the Azure Function, etc.

Let me know how you go @darrenjrobinson

Replacing Personal Privileged Accounts with Shareable Broker Accounts

Introduction

Most organizations still have the practice of personal privileged accounts on their corporate platforms and applications. It’s very challenging to manage and monitor those accounts, which give unrestricted access to the most valuable systems in the organization. Effective procedures around managing these privileged accounts are extremely difficult without specialized tools.

The CyberArk Privileged Account Management solution enables these organizations to secure, provision, manage, control and monitor all activities associated with the privileged accounts present in their IT landscape.

One of the primary goals of implementing a Privileged Account Management solution is replacing personal privileged accounts with shareable broker accounts. This drastically reduces the total number of privileged accounts for each application and system. These broker accounts also get the other benefits of the CyberArk PAM solution, for example One Time Passwords, enforcement of the corporate Password Policy, tamper-proof audit trails, etc.

Replacing AD Personal Privileged Accounts with Shareable Broker Accounts

The typical CyberArk approach to replacing Active Directory personal privileged accounts with shareable broker accounts is depicted graphically below.


Note: Assume all green line connectors are the customization needed to implement this use case.

1) In this scenario, two new AD shared accounts (App_Broker_Acc1|2) are created and added as members of the domain admin groups. (After this implementation we can disable all the existing personal privileged accounts that are members of this group, e.g. S-XXXX, S-YYYY.)


2) A new AD group (PAM_Domain Admins) will be created specifically to map users’ normal AD IDs to a CyberArk Safe (Safes are logical containers within the CyberArk Vault). This will provide the end users (289705, 289706) access to fetch passwords and initiate sessions to target platforms.

3) The administrators’ normal AD IDs will be added as members of the newly created AD group for PAM. (The AD side of steps 1 to 3 can be scripted; a sketch follows.)
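A minimal sketch using the ActiveDirectory PowerShell module is below. The account and group names come from the steps above, while the passwords, OUs and other attributes are assumptions.

# Step 1: create the shared broker accounts and add them to the domain admin group
New-ADUser -Name 'App_Broker_Acc1' -AccountPassword (Read-Host -AsSecureString) -Enabled $true
New-ADUser -Name 'App_Broker_Acc2' -AccountPassword (Read-Host -AsSecureString) -Enabled $true
Add-ADGroupMember -Identity 'Domain Admins' -Members 'App_Broker_Acc1','App_Broker_Acc2'

# Step 2: create the AD group that will be mapped to the CyberArk Safe
New-ADGroup -Name 'PAM_Domain Admins' -GroupScope Global

# Step 3: add the administrators' normal AD IDs as members of the PAM group
Add-ADGroupMember -Identity 'PAM_Domain Admins' -Members '289705','289706'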


4) A Safe (AD_Domain Admins_Safe) will be created in CyberArk. The AD group (PAM_Domain Admins) we created in step 2 will be made a member of this Safe with the required permissions enabled.


5) On-board the shared accounts created in step 1 into CyberArk. These accounts will be stored in the Safe we created in step 4.


6) Now the administrators will be able to log on to the CyberArk Web Portal (PVWA) using their normal AD IDs and then connect to the target platform by selecting a broker account, without knowing its credentials.


7) The session is initiated through the shareable broker account without the end user ever knowing its password.


Using your Voice to Search Microsoft Identity Manager – Part 2

Introduction

Last month I wrote this post detailing how to use your voice to search/query Microsoft Identity Manager. That post demonstrated a working solution (GitHub repository coming next month) but was still incomplete if it was to be used in production within an enterprise. I hinted then that there were additional enhancements I was looking to make, one of which is an auditing/reporting aspect, and that is what I cover in this post.

Overview

The one element of the solution that has visibility of each search scenario is the IoT Device (as a potential future enhancement this could also be a Bot). For each request I wanted to log/audit;

  • Device the query was initiated from (it is possible to have many IoT devices, physical or bot, leveraging this function)
  • The query
  • The response
  • Date and Time of the event
  • User the query targeted

To achieve this my solution is to;

  • Hold the query, target user and date/time on the IoT Device during the query event
  • Send the response, along with the earlier information, to the IoT Hub using the IoT Hub REST API at the completion of the query
  • Consume the event from the IoT Hub with an Azure Event Hub
  • Process the message containing the information with Stream Analytics and put it into Azure Table Storage and Power BI

Azure Table Storage provides the logging/auditing trail of what requests have been made and the responses. Power BI provides the reporting aspect. These two services provide visibility into what requests have been made, against whom, when, etc. The graphic below shows this in the bottom portion of the image.

Auditing Reporting Searching MIM with Speech.png

Sending IoT Device Events to IoT Hub

I covered sending events to the IoT Hub with PowerShell in a previous post here, and converted that to Python to run on my device. For initial end-to-end testing while developing the solution though, building the message body and sending it in PowerShell looks like this;

# $deviceID, $Headers (containing the SAS token) and $iotHubRestURI are set
# earlier in the script, as detailed in the previous post
[string]$datetime = Get-Date
$datetime = $datetime.Replace("/","-")   # strip '/' so the date can be used as the messageId

$body = @{
 deviceId = $deviceID
 messageId = $datetime
 messageString = "$($deviceID)-to-Cloud-$($datetime)"
 MIMQuery = "Does the user Jerry Seinfeld have an Active Directory Account"
 MIMResponse = "Yes. Their LoginID is jerry.seinfeld"
 User = "Jerry Seinfeld"
}

# Convert the hashtable to JSON and POST it to the IoT Hub REST API
$body = $body | ConvertTo-Json
Invoke-RestMethod -Uri $iotHubRestURI -Headers $Headers -Method Post -Body $body

Event Hub and IoT Hub Configuration

First I created an Event Hub. Then on my IoT Hub I added an Event Subscription and pointed it to my Event Hub.

IoTHub Event Hub.PNG

Streaming Analytics

I then created a Stream Analytics Job. I configured two Inputs. One each from my IoT Hub and from my Event Hub.

Stream Analytics Inputs.PNG

I then created two Outputs. One for Table Storage, for which I used an existing Storage Account from my solution, and the other for Power BI, using an existing Workspace but creating a new Dataset. For Table Storage I specified deviceId as the Partition key and messageId as the Row key.

Stream Analytics Outputs.PNG

Finally, as I’m keeping the data I’m sending simple, my query basically copies from the Inputs to the Outputs: one statement gets the events to Table Storage and the other gets them to Power BI. The query therefore looks like this.

Stream Analytics Query.PNG
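In text form, the two pass-through statements are as simple as the following. The input and output alias names here are assumptions; use the aliases you created for your own Inputs and Outputs.

SELECT * INTO [table-storage-output] FROM [iothub-input]

SELECT * INTO [powerbi-output] FROM [eventhub-input]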

Events in Table Storage

After sending through some events I could see rows being added to Table Storage. When I added an additional column to the data, the schema-less Table Storage obliged and dynamically added the new column to the table.

Table Storage.PNG

A full record looks like this.

Full Record.PNG

Events in Power BI

Just as in Table Storage, in Power BI I could see the dataset and the table with the event data. I could create a report with some nice visuals, just as you would with any other dataset. When I added an additional field to the event being sent from the IoT Device, it magically showed up in the Power BI dataset table.

PowerBI.PNG

Summary

Using the Azure IoT Hub REST API I can easily send information from my IoT Device and have it processed through Stream Analytics into Table Storage and Power BI: instant auditing and reporting functionality.

Let me know what you think on twitter @darrenjrobinson

Using your Voice to Search Microsoft Identity Manager – Part 1

Introduction

Yes, you’ve read the title correctly: speaking to Microsoft Identity Manager. The concept behind this was born off the back of some other work I was doing with Microsoft Cognitive Services. I figured it shouldn’t be that difficult if I just broke the concept down into individual elements of functionality and put together a proof of concept to validate the idea. That’s what I did, and this is the first post of the solution as an overview.

Here’s a quick demo.

Overview

The diagram below details the basis of the solution. There are a few extra elements I’m still working on that I’ll cover in a future post if there is any interest in this.

Searching MIM with Speech Overview

The solution works like this;

  1. You speak to a microphone connected to a single board computer with the query for Microsoft Identity Manager
  2. The spoken phrase is converted to text using Cognitive Speech to Text (Bing Speech API)
  3. The text phrase is;
    1. sent to the Cognitive Services Language Understanding Intelligent Service (LUIS) to identify the target of the query (firstname lastname) and the query entity (e.g. Mailbox); a sketch of this call follows the list
    2. used to query Microsoft Identity Manager via API Management and the Lithnet REST API for the MIM Service
  4. The result is returned to the single board computer as a text result phrase, which it then converts to audio using Cognitive Services Text to Speech
  5. The result is spoken back
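To make step 3.1 concrete, here is a minimal sketch of the LUIS call from PowerShell. The region, App ID and key are assumptions; in the actual solution this call is made from the Azure Function.

$luisAppId = 'your-luis-app-id'
$luisKey   = 'your-luis-subscription-key'
$phrase    = 'Does the user Jerry Seinfeld have an Active Directory Account'

# LUIS v2.0 endpoint; returns the top scoring intent and any detected entities
$luisURI = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/$($luisAppId)" +
           "?subscription-key=$($luisKey)&verbose=true&q=$([uri]::EscapeDataString($phrase))"

$luisResult = Invoke-RestMethod -Uri $luisURI -Method Get
$luisResult.topScoringIntent.intent   # the query entity, e.g. Mailbox
$luisResult.entities                  # the target of the query, e.g. Jerry Seinfeld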

Key Functional Elements

  • The microphone array I’m using is a ReSpeaker Core v1 with a ReSpeaker Mic Array
  • All credentials are stored in an Azure Key Vault
  • An Azure Function App (PowerShell) interfaces with the majority of the Cognitive Services being used
  • Azure API Management is used to front end the Lithnet MIM Webservice
  • The Lithnet REST API for the MIM Service provides easy integration with the MIM Service

Summary

Leveraging a lot of Serverless (PaaS) Services, a bunch of scripting (Python on the ReSpeaker and PowerShell in the Azure Function) and the Lithnet REST API, it was pretty simple to integrate the ReSpeaker with Microsoft Identity Manager. An alternative to MIM could be any other service you have an API interface into; MIM is obviously a great choice as it can aggregate from many other applications/services.

Why a female voice? From the small number of responses I received, it was the popular choice.

Let me know what you think on twitter @darrenjrobinson

Auto-redirect ADFS 4.0 home realm discovery based on client IP

As promised in my previous post here, in this post I explain how to auto-redirect the home realm discovery page to an ADFS namespace (claims provider trust) based on the client’s IP.

Let’s say you have many ADFS servers (claims provider trusts) linked to a central ADFS 4.0 server, and you want to auto-redirect the user to a linked ADFS server’s login page based on the user’s IP, instead of letting the user choose the respective ADFS server from the list on the home realm discovery page, as explained in the request flow diagram below.

You can do so with some customization, as described below:

  1. Create a database of IP ranges mapped to ADFS namespaces
  2. Develop a Web API which returns the relevant ADFS namespace based on request IP
  3. Add custom code in onload.js file on the central ADFS 4.0 server to call the Web API and do the redirection

It is assumed that all the boxes, including the central ADFS, linked ADFS, web server and SQL Server, are set up, and that all the nitty-gritty details are sorted out in terms of firewall rules, DNS lookups and SSL certificates. If not, get help from your infrastructure team on that.

Let’s perform the required actions on the SQL, Web and ADFS servers.

SQL Server

Perform the following actions on the SQL Server:

  1. Create a new database
  2. Create a new table called Registration as shown below
  3. Insert some records in the table for the linked ADFS server IP range, for example;

Start IP: 172.31.117.1, End IP: 172.31.117.254, Redirect Name: http://adfs.adminlab.com/adfs/services/trust
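A minimal T-SQL sketch of that table and the example record is below. The column names follow the example above; the data types and identity column are assumptions.

CREATE TABLE Registration (
    Id           INT IDENTITY(1,1) PRIMARY KEY,
    StartIP      VARCHAR(15)  NOT NULL,
    EndIP        VARCHAR(15)  NOT NULL,
    RedirectName VARCHAR(255) NOT NULL
);

INSERT INTO Registration (StartIP, EndIP, RedirectName)
VALUES ('172.31.117.1', '172.31.117.254', 'http://adfs.adminlab.com/adfs/services/trust');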

Web Server

Perform the following actions for the Web API development and deployment:

  1. Create a new ASP.NET MVC Web API project using Visual Studio
  2. Create a new class called Redirect.cs as shown below (I would have used the same name as the database table, ‘Registration’, but it’s OK for now)
  3. Insert a new Web API controller class called ResolverController.cs as shown below. What we are doing here is getting the request IP address, fetching the IP ranges from the database, and comparing the request IP with those ranges by converting both to long IP values. If the request IP is in a range, the matching redirect object is returned (the conversion logic is sketched after this list).
  4. Add a connection string named DbConnectionString in the web.config, pointing to the database we created above
  5. Deploy this Web API project to the web server IIS
  6. Configure the HTTPS binding for this Web API project using the SSL certificate
  7. Note down the URL of the Web API, something like ‘https://{Web-Server-Web-API-URL}/api/resolver/get’; this will be used in onload.js
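The IP comparison at the heart of the controller converts each dotted IPv4 address to a single number, so the range check becomes two comparisons. A sketch of just that logic is below; the names are assumptions, and the full controller also reads the Registration rows via the DbConnectionString and returns the matching Redirect object.

using System;
using System.Linq;

public static class IpRangeHelper
{
    // a.b.c.d -> (a * 2^24) + (b * 2^16) + (c * 2^8) + d
    public static long ToLongIp(string ip)
    {
        var octets = ip.Split('.').Select(long.Parse).ToArray();
        return (octets[0] << 24) + (octets[1] << 16) + (octets[2] << 8) + octets[3];
    }

    // True if requestIp falls within a Registration row's StartIP/EndIP range
    public static bool InRange(string requestIp, string startIp, string endIp)
    {
        long ip = ToLongIp(requestIp);
        return ip >= ToLongIp(startIp) && ip <= ToLongIp(endIp);
    }
}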

Central ADFS 4.0 Server

Perform the following actions on the central ADFS 4.0 server:

  1. Run the following PowerShell command to export current theme to a location

Export-AdfsWebTheme -Name default -DirectoryPath D:\Themes\Custom

  2. Run the following PowerShell command to create a new custom theme based on the current theme

New-AdfsWebTheme -Name custom -SourceName default 

  3. Update the onload.js file extracted in step 1 at D:\Themes\Custom\theme\script with the following code added at the end of the file, a sketch of which is below. What we are doing here is calling the Web API, which returns the matched Redirect object with RedirectName as the ADFS namespace, and setting HRD.selection to that redirect name.
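A sketch of that addition follows. The Web API URL placeholder comes from step 7 of the Web Server section, and the RedirectName property matches the Redirect class; confirm the behaviour of HRD.selection on your ADFS version.

// Only attempt the auto-redirect on the home realm discovery page
if (typeof HRD !== "undefined" && document.getElementById("hrdArea")) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "https://{Web-Server-Web-API-URL}/api/resolver/get", true);
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var redirect = JSON.parse(xhr.responseText);
            // If the client IP matched a range, auto-select that claims
            // provider; otherwise fall through to the normal HRD list
            if (redirect && redirect.RedirectName) {
                HRD.selection(redirect.RedirectName);
            }
        }
    };
    xhr.send();
}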

  4. Run the following PowerShell command to update the onload.js file back into the theme

Set-AdfsWebTheme -TargetName custom -AdditionalFileResource @{Uri='/adfs/portal/script/onload.js';path="D:\Themes\Custom\theme\script\onload.js"}

  5. Run the following PowerShell command to make the custom theme your default theme

Set-AdfsWebConfig -ActiveThemeName custom -HRDCookieEnabled $false

Now when you test from a client machine linked to one of the linked ADFS servers, the auto-redirect kicks in from onload.js: it calls the Web API, which matches the client IP against the stored ranges, and the user is redirected to the relevant ADFS login page, instead of selecting the relevant ADFS namespace from the list on the home realm discovery page.

If no relevant match is found, the default home realm discovery page with the list of available ADFS namespaces is shown.

Implementing Azure API Management with the Lithnet Microsoft Identity Manager Rest API

Introduction

Earlier this week I wrote this post that detailed implementing the Lithnet REST API for FIM/MIM Service. I also detailed using PowerShell to interact with the API Endpoint.

Now let’s imagine you are looking to have a number of Azure Serverless features leverage your Rest API enabled Microsoft Identity Manager environment, or even offer it “as-a-Service”. You’ll want to have some visibility as to how it is performing, and you’ll probably want to implement features such as caching and rate limiting, let alone putting more security controls around it. Enter Azure API Management, which provides all those functions and more.

In this post I detail getting started with Azure API Management by using it to front-end the Lithnet FIM/MIM Rest API.

Overview

In this post I will detail;

  • Enabling Azure API Management
  • Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management
  • Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell
  • Reporting

Prerequisites

For this particular scenario I’m interfacing Azure API Management with a Rest API that uses Digest Authentication. So even though mine is a Windows WCF web service, you could do something similar with any comparable API endpoint. If the backend API endpoint is using SSL it will need a valid certificate. Even though Azure API Management allows you to add your own certificates, I had issues with Self Signed Certificates; I have it working fine with Lets Encrypt issued certificates. Obviously you’ll need an Azure Subscription, as well as an App/Service with an API.

Enabling Azure API Management

From the Azure Portal select Create a resource and search for API management and select it.

Add API Mgmt.PNG

Select Create

Create API Mgmt.PNG

Give your API Management Service a name, select a subscription, resource group etc and select Create.

API Mgmt Config 1.PNG

Once you select Create it will take about 30 minutes to be deployed.

Configuring the Lithnet FIM/MIM Rest API integration with Azure API Management

Once your new API Management service has been deployed, from the Azure Portal select the API Management services blade and select the API Management service that you just created. Select APIs.

API Config 1.PNG

Select Add API and then select Add a new API

API Mgmt Config 2.PNG

Give the API a name and description, enter the URI for your API endpoint, and select HTTPS. I’m going to call this MIMSearcher, so I have entered that under API URL suffix. For initial testing, under Products select Starter. Finally select Create.

API Mgmt Config 4.PNG

We now have our base API set up. From the Backend tile select the Edit icon.

API Mgmt Config 5.PNG

As the backend is authenticated using Basic Authentication, select Basic under Gateway credentials and enter the details of an account with access that will be used by the API Gateway. Select Save.

API Mgmt Config 6.PNG

Now from our API Configuration select Add operation.

API Mgmt Config 7.PNG

First we will create a test operation for the Help page on the Lithnet FIM/MIM Rest API. Provide a Display name, and for the URL add /v2/help. Give it a description and select Create.

Note: I could have had v2 as part of the base URI for the API in the previous steps. I didn’t, as I will be using APIs from both v1 and v2 and didn’t want to create multiple operations.

API Mgmt Config 8.PNG

Select the new Operation (Help)

API Mgmt Config 9.PNG

Select the Test menu. Select Send.

API Mgmt Config 10.PNG

If everything is set up correctly you will get a 200 Success OK response as below.

API Mgmt Config 11.PNG

Accessing MIM via Azure API Management and the Lithnet FIM/MIM Rest API using PowerShell

Head over to your API Portal. The URL is https://{yourservicename}.portal.azure-api.net/ where {yourservicename} is the name you gave your API Management Service, shown in the third screenshot at the top of this post. If you are doing this from the browser you used to create the API Management Service you should be signed in already. From the Administrator menu on the right select Profile.

Test API Mgmt 1.PNG

Click on Show under one of the keys and record its value.

Test API Mgmt 2.PNG

Using PowerShell ISE or VSCode update the following Code Snippet and test.

# Replace {yourservicename} with your API Management Service name and
# {APIUrlSuffix} with the API URL suffix you configured (e.g. MIMSearcher)
$APIURL = 'https://{yourservicename}.azure-api.net/{APIUrlSuffix}/v2/help'
$secret = 'yourSecret'   # one of the subscription keys recorded above
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$response = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$response

The snippet will create a Web Request to the new API and display the results.

Test API Mgmt 3.PNG

Querying the Lithnet Rest API via Azure API Management

Now that we have a working solution end-to-end, let’s do something useful with it. Looking at the Lithnet Rest API, the Resources URI is the key one exposing Resources from the MIM Service.

Resources.PNG

Let’s create a new Operation for Resources similar to what we did for the Help. After selecting Create configure the Backend for Basic Authentication like we did for Help.

Get Resources.PNG

Testing out the newly exposed endpoint is very similar to before: just a new APIURL, with the addition of /?Person to return all Person Resources from the MIM Portal. It lets us know it has returned 7256 Person Objects, and the Results are still paged (100 by default).

Get Persons.PNG
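With the same headers as the earlier snippet, that call is simply the following (placeholders as before);

$APIURL = 'https://{yourservicename}.azure-api.net/{APIUrlSuffix}/v2/resources/?Person'
$persons = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$persons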

Let’s now search for just a single user: a Person object whose Display Name is ‘darrenjrobinson’.

$query = "Person[DisplayName='darrenjrobinson']"
$queryEncoded = [System.Web.HttpUtility]::UrlEncode($query)

$APIURL = "https://.azure-api.net//v2/resources/?filter=/$($queryEncoded)" 
$secret = 'yourSecret'
$Headers = @{'Ocp-Apim-Subscription-Key' = $secret} 
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

$user = Invoke-RestMethod -Uri $APIURL -Headers $Headers -ContentType "application/json" -UseBasicParsing -Method Get
$user

Executing, we get a single user returned.

Search for User.PNG

Reporting

Using the Publisher Portal we can get some Stats on what is happening with our API Management implementation.

Go to https://{yourservicename}.portal.azure-api.net/admin and select Analytics.

We then have visibility of what has been using the API Management Service. At a Glance gives an overview, and you can drill down into;

  • Top Users
  • Top Products
  • Top subscriptions
  • Top APIs
  • Top Operations

At a glance looks like this;

At a Glance Stats.PNG

And Top Operations looks like this;

Top Operations.PNG

Summary

That is a quick start guide to implementing Azure API Management in front of a Rest API, and to using PowerShell to integrate with it. Next steps would be to enable caching and to get into more of the advanced features. Enjoy.

 

Getting started with the Lithnet REST API for the Microsoft Identity Manager Service

Introduction

A common theme in my posts on Microsoft Identity is its extensibility, particularly with the Lithnet tools that Ryan has released.

One such tool that I’ve used but never written about is the Lithnet REST API for the Microsoft Identity Manager Service. For a small proof of concept I’m working on, I was again using this REST API and needed to update it, as Ryan has recently added some new functionality. I realised I hadn’t set it up in a while, and while Ryan’s documentation is very good, it was written some time ago when IIS Manager looked a little different. So here are a couple of screenshots and a little extra info to get you started if you haven’t used it before, supplementing Ryan’s documentation located here.

Configuring the Lithnet REST API for the Microsoft Identity Manager Service

You can download the Lithnet REST API for the FIM/MIM Service from here.

If you are using the latest version of the Lithnet Rest API you will need to make sure you have .NET 4.6.1 installed. If you are running Windows Server 2012 R2 you can get it from here.

When configuring your website in IIS, make sure you choose .NET v4.5 Classic for the Application Pool.

WebSite AppPool Settings.PNG

The web.config must match your MIM version. Currently the latest is 4.4.1749.0, as detailed here. That therefore looks like this.

WebConfig Resource Management Version.PNG

Finally you’ll need an SSL Certificate. For development environments a Self Signed Certificate is fine. Personally, I use this Cert Generator. Make sure you put the certificate in the certificate store on the machine you will be testing access from. Here’s an example of my command line for generating a cert.

Cert Generation.PNG

You could also use Lets Encrypt.

In your IIS bindings, have the Host Name match your certificate.

Bindings.PNG

If you’ve done everything right you will be able to hit the v2 endpoint help page. By default, with Basic Auth enabled, you’ll be prompted for a username and password.

v2 EndPoint.PNG

Using PowerShell to query MIM via the Lithnet Rest API

Here is an example script to query MIM via the Lithnet MIM Rest API. Update it for your credentials (lines 2 and 3), the URL of the server running the API Endpoint (line 11) and what you are querying for (line 14). My script takes into account Self Signed Certs in a development environment.
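The script itself is embedded in the original post; a minimal equivalent is sketched below. The host name and query values are assumptions, and the comments map to the line references mentioned above.

# Lines 2 and 3: credentials for an account with access to the MIM Service
$username = 'yourDomain\yourAccount'
$password = ConvertTo-SecureString 'yourPassword' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username, $password)

# Development environments only: trust the Self Signed Certificate
[System.Net.ServicePointManager]::ServerCertificateValidationCallback = { $true }

# Line 11: the URL of the server running the API Endpoint
$baseURL = 'https://mimservice.yourdomain.com'

# Line 14: what you are querying for
Add-Type -AssemblyName System.Web
$query = [System.Web.HttpUtility]::UrlEncode("Person[DisplayName='darrenjrobinson']")

$result = Invoke-RestMethod -Uri "$($baseURL)/v2/resources/?filter=/$($query)" -Credential $credential -ContentType 'application/json'
$result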

Example output from a query is shown below.

Example Output.PNG

Summary

Hopefully that helps you quickly get started with the Lithnet REST API for the FIM/MIM Service. I showed an example using PowerShell directly, but using an Azure Function is also a valid pattern; I’ve covered similar functionality in the past.

 

Preventing double-app-loading behaviour with ADAL.js

Microsoft’s JavaScript implementation of its Azure Active Directory Authentication Library (ADAL.js) allows for some great client-side-only Single Page App (SPA) scenarios.

Unfortunately (as with most things auth-related), there are some gotchas to be aware of. One relates to how ADAL obtains refresh tokens in this crazy world of implicit auth.

Implicit Auth Flow

Implicit auth allows the application developer to avoid hosting their own token authentication service; ADAL.js and the Azure AD auth endpoint do all the heavy lifting:

It’s the bottom third of the diagram (after the token expires) that causes the issue I am addressing in this post. This is where ADAL.js creates a hidden iframe that sends a request to get a fresh token. This will show up in the DOM (if you inspect it in the browser dev tools) as an iframe element with an ID of “adalRenewFrame” followed by the endpoint it is renewing the token for (in the below example this is https://graph.microsoft.com).

What’s the problem?

So what’s the big deal? Little hidden iframe refreshing a token never did anyone any harm, right? Well, if you just leave the ADAL config with the default redirectUri setting, the iframe logs in to the Azure AD endpoint and then heads on back to your SPA and… reloads. So, we have our app’s JavaScript (and any HTTP calls) loading twice: once in the main window and again in the iframe. Bummer.

What’s the solution?

The ADAL.js team recommends a couple of different approaches to getting around this issue on their FAQ page on GitHub. The simpler solution is to control how your app is bootstrapped so that it only loads if window === window.parent (i.e. it isn’t in an iframe), which is fine if you have this kind of control over how your app starts (as with AngularJS or React). But this won’t always suit.

The other option is to have an alternative redirect page that is targeted after the iframe renews the token (by specifying it in the ADAL config with the redirectUri property). N.B. you have to specify the exact URL in the Azure AD app settings for your app as well.

Suffice it to say, just pointing to an empty page doesn’t do the trick and there is a bunch of hassle with getting everything working (see the comments on this gist for the full adventure), but to cut a long story short – here’s what worked for me.

The redirect page itself redirects back to our SPA (in this case, the root of the web app) only if window === window.parent (i.e. it is not in an iframe), and passes the token etc. along in the window.location.hash as well. See the below example.
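Here is a minimal sketch of such a redirect page; the file name and the SPA root path are assumptions.

<!DOCTYPE html>
<html>
<!-- frame-redirect.html: the page ADAL's redirectUri points at -->
<body>
<script>
  // In the hidden ADAL iframe (window !== window.parent) do nothing:
  // ADAL.js reads the renewed token from this frame's location hash itself.
  // In the top-level window (e.g. after a full login redirect), go back to
  // the SPA root and pass the token hash along.
  if (window === window.parent) {
    window.location.href = "/" + window.location.hash;
  }
</script>
</body>
</html>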

Hope this saves you some time/grief with getting this all working. I haven’t seen a satisfactory write up of a solution to this issue before (hence this post).

Any questions, suggested improvements, please let me know in the comments!

‘Till next time…

Some advanced ADFS 4.0 branding customization

As you may be aware, you can use PowerShell commands to update the logo and banner/illustration images, as well as the home, privacy and other links, of the ADFS 4.0 home realm discovery or sign in pages. Below is an example of doing so.

Set-AdfsWebTheme -TargetName custom -Logo @{path="P:\Theme\Logo\logo.png"}

The above command updates the logo image on the custom theme.

Set-AdfsGlobalWebContent -HomeLink https://{www.YourWebsite.Com}/ -HomeLinkText Home

The above command updates the “Home” link on all pages of your ADFS theme.

How about doing the below customizations?

 

You can achieve the above customizations by writing custom code in onload.js. The process of exporting the onload.js file and importing it back into the ADFS 4.0 theme is explained in my previous post here.

So our requirements are mentioned below:

  1. Change the home realm discovery page sign in label message from “Sign in with one of these accounts” to something else
  2. Change the home realm discovery page title
  3. Add “User agreement” and “Code of conduct” links along with some text message on the home realm discovery page
  4. Change the sign in page title
  5. Change the sign in label message on the sign in page from “Sign in with your organizational account” to something else
  6. Add “User agreement” and “Code of conduct” links along with some text message on the sign in page
  7. Add a forgotten password message on the sign in page with links to the staff and student password reset pages
  8. Change the “Home”, “Privacy” and “HelpDesk” link texts

But just to repeat some of the required steps to export/import the onload.js file: run the following PowerShell command to export the current theme to a location

Export-AdfsWebTheme -Name default -DirectoryPath D:\Themes\Custom

Run the following PowerShell command to create a new custom theme based on current theme

New-AdfsWebTheme -Name custom -SourceName default

Update the onload.js file extracted at D:\Themes\Custom\theme\script with code snippets like the following added at the end of the file (a sketch covering requirements 1 to 5 is below):
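A sketch of the kind of code required is below. 'loginMessage' is the standard element ID for the sign in page label; the home realm discovery page element IDs can vary between ADFS versions, so confirm them with your browser's dev tools first.

// Requirements 2 and 4: page titles (both pages load onload.js)
document.title = "Your Company Sign In";

// Requirement 5: the "Sign in with your organizational account" label
var signInMessage = document.getElementById("loginMessage");
if (signInMessage) {
    signInMessage.innerHTML = "Sign in with your Your-Company account";
}

// Requirements 1 and 3: the home realm discovery page; change the
// "Sign in with one of these accounts" label and append the links
var hrdArea = document.getElementById("hrdArea");
if (hrdArea) {
    hrdArea.insertAdjacentHTML("beforeend",
        "<p>View our <a href='{User Agreement URL}'>User agreement</a> and " +
        "<a href='{Code of Conduct URL}'>Code of conduct</a>.</p>");
}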

Run the following PowerShell command to update the onload.js file back into the theme

Set-AdfsWebTheme -TargetName custom -AdditionalFileResource @{Uri='/adfs/portal/script/onload.js';path="D:\Themes\Custom\theme\script\onload.js"}

Run the following PowerShell command to make the custom theme your default theme

Set-AdfsWebConfig -ActiveThemeName custom -HRDCookieEnabled $false

You must be thinking that we have missed requirements 6, 7 and 8. No, we haven’t. We don’t need to update anything in onload.js for those; they can be achieved with the PowerShell commands mentioned below:

Requirements 6 and 7 (combined) can be done with the below PowerShell command

Set-AdfsGlobalWebContent -SignInPageDescriptionText "View the {Your Company Name} <A href='{User Agreement URL}' target='_blank'>User Agreement</A> and <A href='{Code of Conduct URL}' target='_blank'>Code of Conduct</A>. </br></br>Forgotten your password? Please click <A href='{Staff password change URL}' target='_blank'>staff</A> or <A href='{Student password change URL}' target='_blank'>student</A> to reset your password."

Requirement 8: run the following PowerShell commands to set the Home, Privacy and HelpDesk link texts

Set-AdfsGlobalWebContent -HomeLink {Your company URL} -HomeLinkText Home

Set-AdfsGlobalWebContent -PrivacyLink {Your company privacy policy URL} -PrivacyLinkText Privacy

Set-AdfsGlobalWebContent -HelpDeskLink {Your company disclaimer URL} -HelpDeskLinkText Disclaimer

In the next article I will explain how to auto-redirect the home realm discovery page to an ADFS namespace (claims provider trust) based on the client’s IP.