Creating Azure AD B2C Service Principals with PowerShell


I’ve been lucky enough over the last few months to be working on some cool consumer-facing solutions with one of my customers. A big part of the work we’ve been doing has been building Minimum Viable Product (MVP) solutions that allow us to quickly test concepts in-market using stable, production-ready technologies.

As these are consumer solutions, the Azure Active Directory (AAD) B2C service was an obvious choice for identity management, made even more so by AAD B2C’s ability to act as a source-of-truth for consumer identity and profile information across a portfolio of applications and services.

AAD B2C and Graph API

The AAD B2C schema is extensible, which allows you to add custom attributes to an identity. Some of these extension attributes you may wish the user to manage themselves (e.g. mobile phone number), and some may be system-managed or remotely-sourced values associated with the identity (e.g. Salesforce ContactID) that…



Simple reporting from the FIM/MIM Metaverse to PowerBI using the Lithnet FIM/MIM Sync Service PowerShell Module

I have a customer that is looking to report on FIM/MIM identity information. The reports they are looking for aren’t overly complex and don’t necessarily justify the full FIM/MIM reporting infrastructure, so I spent a few hours over a couple of days looking at alternatives. In this blog post I give an overview of using the awesome Lithnet FIM/MIM Sync Service PowerShell Module, recently released by Ryan Newington, to do basic reporting on the Microsoft (Forefront) Identity Manager Metaverse into PowerBI.

I’ll briefly show how to leverage the Lithnet FIM/MIM Sync Service PowerShell Module to extract Person objects and their metadata (based on a search filter criteria) from the MIM/FIM Metaverse and output to a file for PowerBI.

I cover:

  • Building a query
  • Executing the query
  • Filtering the results for output to a file (CSV)
  • Importing to PowerBI as a dataset, creating a report showing results in a Dashboard

First up you’ll need to download and install the module from
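If the module is published to the PowerShell Gallery (the module name 'LithnetMiisAutomation' is my assumption from the Lithnet project's naming; verify it against the official download page), a sketch of the install is:

```powershell
# Install the Lithnet MIIS Automation module from the PowerShell Gallery.
# Run this elevated on the MIM Synchronisation Server itself, as the module
# talks to the local Sync Service.
Install-Module -Name LithnetMiisAutomation
```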

Using the FIM/MIM Sync Service PowerShell Module to query the Metaverse

The same operators you can choose for your attribute types (boolean, string, integer, reference etc.) in the Metaverse Search function in the Synchronisation Service Manager can also be used with the Lithnet FIM/MIM Sync Service PowerShell Module.

By creating a search with multiple criteria in the Metaverse Search you can filter the results from the Metaverse.

As shown below you can see that we get 302 results.

So let’s import the Lithnet FIM/MIM Sync Service PowerShell Module, create a filter, execute it, and look at the results. As you’d expect, we get the same result. Excellent.
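That flow, as a minimal sketch (the cmdlet names are as I recall them from the Lithnet MIIS Automation module, and the attribute names are placeholders for your own criteria – verify both against the module's documentation):

```powershell
Import-Module LithnetMiisAutomation

# Build the same criteria used in the Metaverse Search GUI.
# Attribute names here are placeholders - use your own.
$queries = @(
    (New-MVQuery -Attribute employeeType -Operator Equals -Value "Contractor"),
    (New-MVQuery -Attribute accountName -Operator IsPresent)
)

# Execute the query against the Metaverse; multiple criteria are ANDed together
$results = Get-MVObject -ObjectType person -Queries $queries
$results.Count   # should match the count shown in the Synchronisation Service Manager
```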

Remember that with this PowerShell automation module, the backend is still the WMI interface behind the Synchronisation Service Manager. This means you can’t, for example, create a query filter using “greater than/less than” if you can’t do it in the UI.

Take my Twitter FriendsCount attribute of type Number/Integer as an example.

I can’t create a query filter that would return results where FriendsCount > 20,000. I can only use the IsPresent, IsNotPresent and Equals operators.

As a side note, the PowerShell error message will give you a hint as to which operators you can use, as shown below.

However, if you try to use StartsWith for an Integer attribute, the search will execute but return no results. My tip, then, is to define your query in the Metaverse Search GUI and, once you get the results you want/expect, create the equivalent query in PowerShell and validate that you get the same number of results.

A final note on query filters: multiple criteria are combined as an AND operation, not OR.

Let’s do something with the results

Now that we have a query sorted, let’s do something with the results. The result set is the full attribute list and values for each object that matched our query in the Metaverse. That’s way more information than I (and probably you) need. So we iterate through the results, pull out the attribute values we want, and export them as a CSV file.
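A sketch of that iteration, assuming the query output is already in a $results variable and that the attribute names below exist in your Metaverse (the exact accessor shape for attribute values may differ slightly between module versions):

```powershell
# Project just the attributes we care about into flat objects for CSV export
$report = foreach ($mvObject in $results) {
    [PSCustomObject]@{
        AccountName  = $mvObject.Attributes["accountName"].Values.ValueString
        DisplayName  = $mvObject.Attributes["displayName"].Values.ValueString
        FriendsCount = $mvObject.Attributes["FriendsCount"].Values.ValueString
    }
}

# Write the result set out for PowerBI to consume
$report | Export-Csv -Path .\MetaversePersons.csv -NoTypeInformation
```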

What to do with the output?

For this overview I’ve just chosen the local file (CSV) that I exported as part of the script as the input dataset in PowerBI.

On the right hand side I’ve chosen the columns that were exported to the CSV and they appear in the main window.

Click Pin to Live Page. You’ll be prompted to save the report first, so do that, then choose New Dashboard for the report and click Pin live.

I can then refine and quickly get some visual reports using text-based queries with keywords from the dataset columns, such as “Top 10 by number of friends”.

Create a couple of queries and pin them to the Dashboard and the data comes to life.


The Lithnet FIM/MIM Sync Service PowerShell Module provides a really easy way to expose information from the Metaverse that may satisfy many reporting and other requirements. Taking the concept further it wouldn’t be too complex to export the data to an Azure SQL DB on a schedule and have the results dynamically update on a PowerBI Dashboard.
The concept of exporting data for reporting is just one practical example using the tools. Huge thanks to Ryan for creating the Lithnet tools and publishing to the community. Keep in mind the tools disclaimer too.

Here is the sample PowerShell.

Follow Darren on Twitter @darrenjrobinson

Modern Authentication and MAPI-HTTP

If you haven’t heard, Modern Authentication (aka ADAL) has now officially gone GA, which means that if you are utilising Office 365 services, particularly Exchange Online, with Office 2013/2016 as your client, you should really be looking at enabling this functionality for your end users.

For those unfamiliar with Modern Auth, there are numerous benefits, but one of the most obvious for end users is that it removes the need for ‘save my credentials’ when signing into Exchange Online and provides a true SSO experience when combined with ADFS Federation.

Now, the process for enabling Modern Auth is very well documented in the above blog post, but the short version is:

  1. Enable Modern Auth on the tenant side via a PowerShell command
  2. Enable Modern Auth on the client side via a registry key
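As a sketch, the two steps look something like the following. The Set-OrganizationConfig switch is the documented Exchange Online setting; the registry path shown is for Office 2013 (15.0) – use 16.0 for Office 2016 – and per Microsoft's guidance Office 2013 also needs the Version value set:

```powershell
# 1. Tenant side - run in a remote Exchange Online PowerShell session
Set-OrganizationConfig -OAuth2ClientProfileEnabled $true

# 2. Client side - enable ADAL for the Office 2013 client
$idPath = "HKCU:\SOFTWARE\Microsoft\Office\15.0\Common\Identity"
New-ItemProperty -Path $idPath -Name EnableADAL -Value 1 -PropertyType DWORD -Force
New-ItemProperty -Path $idPath -Name Version    -Value 1 -PropertyType DWORD -Force
```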

What isn’t explicitly called out as a pre-requisite, however, is that your Outlook client also needs to be running in MAPI over HTTP mode. For a large percentage of environments this is probably not an issue – but if you are like a recent customer of ours, you may have specifically disabled the use of MAPI-HTTP. There are a number of valid reasons why you might have done this (theirs was an old version of Riverbed that didn’t support optimisation of the MAPI-HTTP protocol), but as it turns out, the introduction of the MAPI over HTTP protocol to replace the legacy ‘RPC over HTTP’ protocol over three years ago was actually one of the precursors to allowing all this fancy Modern Authentication stuff to work.

For full details around what MAPI-HTTP protocol brought in and the benefits it introduced, I recommend reading this great blog post from the Microsoft Exchange team.

But in short, if you find that you have enabled Modern Auth as per the described steps and you’re still getting the ‘basic auth prompt’, I’d go ahead and check whether the following registry key has been set (via GPO or otherwise):

Key: HKEY_CURRENT_USER\Software\Microsoft\Exchange
DWORD: MapiHttpDisabled
Value: 1

The above needs to either be deleted, or set to ‘0’ in order for Modern Auth to work.  The support article KB2937684 also gives you some more info around ensuring MAPI-HTTP is enabled for your Office 2013/2016 client.
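A quick check you could run on an affected client (using the registry key above):

```powershell
# Check whether MAPI-HTTP has been explicitly disabled on this client
$key  = "HKCU:\Software\Microsoft\Exchange"
$prop = Get-ItemProperty -Path $key -Name MapiHttpDisabled -ErrorAction SilentlyContinue

if ($prop -and $prop.MapiHttpDisabled -eq 1) {
    # Set it back to 0 (or use Remove-ItemProperty to delete the value entirely)
    Set-ItemProperty -Path $key -Name MapiHttpDisabled -Value 0
}
```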

Note that changing this value does not take effect until the client next performs an ‘autodiscover’ to switch over.  Depending on the situation, this may cause the user to see the following pop up:


Generally speaking, I’d recommend you test the registry update first with a subset of pilot users before rolling it out to the wider audience. Once that is confirmed working, you can look at rolling out Modern Auth to your end users.

Goodbye Set-MsolUser, Hello Set-AzureADUser & Azure Graph API

Recently Microsoft released the preview of the v2.0 Azure AD PowerShell cmdlets.

I’ve got a project coming up where I’m looking to change my approach for managing users in Azure using Microsoft Identity Manager. Good timing to do a quick proof of concept to manage users with the new cmdlets and directly using the Graph API in preparation to move away from the msol cmdlets.

New Modules

First up, the Azure AD v2.0 PowerShell module was released in public preview on July 13, 2016. There will likely be changes before it becomes GA, so keep that in mind.

The v2.0 Azure AD PowerShell modules themselves are available for download from here.

If you have Windows Management Framework v5 installed you can download and install from PowerShell (as below).
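For example (the preview shipped to the PowerShell Gallery as 'AzureADPreview'; once GA it becomes simply 'AzureAD'):

```powershell
# Requires WMF5 / PowerShellGet; run from an elevated PowerShell session
Install-Module -Name AzureADPreview
```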

Once installed, you can quickly import the module, authenticate to your tenant, retrieve a user and update a few attributes (as below).
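A minimal sketch of that flow (the UPN and attribute values are placeholders):

```powershell
Import-Module AzureADPreview

# Authenticate to the tenant (interactive prompt)
Connect-AzureAD

# Retrieve a user and update a couple of attributes
$user = Get-AzureADUser -ObjectId "jane.doe@contoso.onmicrosoft.com"
Set-AzureADUser -ObjectId $user.ObjectId -Department "Identity" -TelephoneNumber "+61 2 0000 0000"
```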

Whilst functional, it doesn’t really work for how we need to interact with Azure from an identity management perspective. So how can we still use PowerShell to enumerate and manipulate identities in Azure?

Now that we have the AzureAD v2.0 module installed, we can reference the Active Directory Authentication Library it installs (Microsoft.IdentityModel.Clients.ActiveDirectory.dll), authenticate to our tenant, retrieve users, and update them. That’s exactly what is shown in the commands below.
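The commands below are a sketch of that approach. The client ID is the well-known Azure AD PowerShell application ID; the DLL path and tenant name are assumptions you’ll need to adjust for your environment, and older ADAL builds expose a synchronous AcquireToken instead of AcquireTokenAsync:

```powershell
# Load ADAL from the module's install location (adjust the path to your environment)
$adalPath = "${env:ProgramFiles}\WindowsPowerShell\Modules\AzureADPreview\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Add-Type -Path $adalPath

$tenant   = "contoso.onmicrosoft.com"
$resource = "https://graph.windows.net"
$clientId = "1b730954-1685-4b74-9bfd-dac224a7b894"   # well-known Azure AD PowerShell app
$redirect = New-Object Uri("urn:ietf:wg:oauth:2.0:oob")

$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext(
    "https://login.microsoftonline.com/$tenant")
$platform = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters(
    [Microsoft.IdentityModel.Clients.ActiveDirectory.PromptBehavior]::Auto)
$token = $authContext.AcquireTokenAsync($resource, $clientId, $redirect, $platform).Result

# Call the Azure AD Graph API directly to enumerate users
$headers = @{ Authorization = $token.CreateAuthorizationHeader() }
$users = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users?api-version=1.6" -Headers $headers
$users.value | Select-Object displayName, userPrincipalName
```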

Where interacting with the Graph API directly really shines, however, is at the directory services layer with the Differential Query functionality.
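A sketch of the differential query flow. The aad.nextLink/aad.deltaLink property names are from the Azure AD Graph documentation at the time; $tenant and $headers are assumed to hold your tenant name and an authenticated Authorization header, and the exact link handling (relative vs absolute URLs, re-appending api-version) should be verified against the docs:

```powershell
# First differential sync: an empty deltaLink returns the full set plus a new token
$uri = "https://graph.windows.net/$tenant/users?api-version=1.6&deltaLink="

do {
    $page = Invoke-RestMethod -Uri $uri -Headers $headers
    # ... process $page.value (only changed objects on subsequent syncs) ...
    $uri = $page.'aad.nextLink'      # present while more pages remain
} while ($uri)

# Persist this and pass it as the deltaLink on the next run to get only the changes
$deltaLink = $page.'aad.deltaLink'
```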

As such, this is the approach I’ll be taking for integrating Azure with Microsoft Identity Manager to manage users and their entitlements (such as Azure licensing).
I hope this also saves a few people some time in working out how to use PowerShell to manage Azure objects via the Graph API (using either the PowerShell module or the REST API).

Effective Error Handling in PowerShell Scripting

There are two typical situations we meet almost every day while writing PowerShell scripts: one is seeing red-coloured error messages, and the other is the number of result messages displayed on the screen. Have you ever wished you could see nothing or, at least, minimise those messages? In this post, we walk through a few tricks to suppress those error (or unwanted) messages for happy PowerShell scripting.

Hiding Execution Messages

By design, PowerShell shows result messages after running cmdlets. Let’s take a look at the cmdlet below and run it.
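For instance, the AzureRM login cmdlet:

```powershell
# Running this on its own prints the authentication context - Environment,
# Account, TenantId, SubscriptionId etc. - straight to the screen
Login-AzureRmAccount
```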

This ends up with the following screen.

As we can see, sensitive values such as Account, TenantId and SubscriptionId are displayed on the screen. If your company or client is seriously concerned about security, they will want these hidden. What should we do then? Let’s try the cmdlet below.
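The same cmdlet, with its output captured instead of rendered:

```powershell
# Assign the return value to a variable so nothing is rendered to the screen
$result = Login-AzureRmAccount
```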

We can’t see the result on the screen! What we did was declare a variable, $result, and assign the result value to it. If you now call $result, you’ll see the output that was previously displayed on the screen.

Most PowerShell cmdlets return an instance of type PSObject, and it is this instance that renders output to the screen. Assigning the return value to a temporary variable like $result therefore prevents it from being displayed. The temporary-variable approach gives us another benefit: because of its strongly-typed nature, we can access other values like $result.Context.Subscription.SubscriptionId, if necessary.

Do you want to remove the $result object after use, for extra security? Look at the following:
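For example:

```powershell
# Dispose of the temporary variable once we're done with it
Remove-Variable -Name result
```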

Now we’ve completely removed the temporary variable, too.

So far we’ve briefly looked at how to hide unnecessary or sensitive information from the screen after running a cmdlet. Not too bad, huh?

Handling Error Messages

This time, let’s remove the scary red error messages with the blue background. More specifically, let’s handle those error messages properly. There are two ways of handling errors in PowerShell scripting.

Try…Catch…Finally Block

Try the following cmdlet which causes an exception.
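A representative example – the exact cmdlet from the original post isn’t shown, but logging in as a service principal without a credential fails in a similar way:

```powershell
# Fails: no -Credential supplied for the service principal
# ($tenantId is a placeholder for your tenant ID)
Login-AzureRmAccount -ServicePrincipal -TenantId $tenantId
```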

This is because we didn’t provide the cmdlet with a credential for the service principal. To handle the exception, we can simply use a try...catch...finally block like:
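A sketch of the pattern (again with a placeholder $tenantId and an illustrative failing cmdlet):

```powershell
try {
    # Throws: no -Credential supplied for the service principal
    Login-AzureRmAccount -ServicePrincipal -TenantId $tenantId
}
catch {
    # $_ is the ErrorRecord; surface a friendly message instead of the red wall
    Write-Host "An error occurred: $($_.Exception.Message)"
}
finally {
    Write-Host "Processing completed."
}
```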

This is the result from the cmdlet being executed.

No more scary error message, right? Instead we see a proper, readable error message.

ErrorAction Parameter

If we use the try...catch...finally block, we will be able to handle most error messages. Wait, not all of them? Unfortunately, no. Not all exceptions can be caught by the try...catch block. Let’s run the following cmdlet.
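For example, a non-terminating error inside a plain try...catch (the failing cmdlet here is illustrative – reading a file that doesn’t exist):

```powershell
try {
    # A non-terminating error: the red message still appears and catch never runs
    Get-Item -Path .\no-such-file.txt
}
catch {
    Write-Host "An error occurred: $($_.Exception.Message)"   # never reached
}
```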

Even though we use the try...catch block, it still displays the red error message we don’t want to see. This is because the error is non-terminating, so it never reaches the catch block. In this case, the -ErrorAction parameter is particularly useful.
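The same sketch, with the non-terminating error promoted so the catch block can see it:

```powershell
try {
    # -ErrorAction Stop turns the non-terminating error into a terminating one
    Get-Item -Path .\no-such-file.txt -ErrorAction Stop
}
catch {
    Write-Host "An error occurred: $($_.Exception.Message)"
}
```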

With the -ErrorAction Stop parameter added, the cmdlet in the try...catch block works well.

ErrorVariable Parameter

Using the try...catch block is good, but it has drawbacks. One of them is verbosity. In an ordinary coding world like C#, we’re encouraged to use the try...catch block where necessary. In the scripting world, however, most scripts are written once and disposed of, so a try...catch can be overkill. The -ErrorVariable parameter is useful in this case. Let’s have a look at the following cmdlet.
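A sketch of such a cmdlet, using -ErrorVariable together with -ErrorAction SilentlyContinue so the red output is suppressed while the error is still captured:

```powershell
# readme.txt doesn't exist, so this fails - quietly, with the details captured
Get-Content -Path .\readme.txt -ErrorVariable ex -ErrorAction SilentlyContinue

# Inspect the captured error when needed
$ex[0].Exception.Message
```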

This cmdlet is supposed to display the content of the readme.txt file, but it doesn’t, which implies there was an error. If we use -ErrorVariable ex, the error details are stored in the $ex variable, and $ex[0].Exception.Message returns the actual error message if we want to inspect it.

If we want to avoid the try...catch block, the ErrorVariable and ErrorAction parameters give us a more convenient way.

We’ve walked through how to control what appears on the screen when running a PowerShell script. Looks easy, doesn’t it? Let’s do error-free PowerShell scripting!

The art of managing a service

In the age of technology, it’s important to deliver value to customers in terms of outcomes, and a service typifies what this means: providing value to the customer.

There is an art to managing a service, and it’s here I want to focus the attention. “Art” implies mastery of a craft, and I love the picture this paints: one could say that mastering the management of a service is, in turn, mastering value.

Here are a few pointers on how to do that.

Know thy customer

Relationship is key when it comes to managing a service. This might sound strange – how does managing a (technological) service fall into this category? Surely it’s all just binary: either the service works or it doesn’t? Not necessarily. Take the scenario where a service is in place but people don’t really use it. Relationship ensures you are listening to the people who utilise the service and understand whether the value is there. Having the ability to listen is crucial; listeners have the ability to steer their way to a positive outcome. It is effectively an act of support, encouragement and growth, which ultimately reflects value.


A healthy rhythm

At Kloud we focus extensively on ensuring we have a healthy rhythm in place; it ensures we move faster and are able to respond in a timely manner. It introduces agility onto our canvas.

Some examples of what we have at our heartbeat:

A daily Kuddle where we focus on what’s been happening, important metrics and any of our customers’ challenges.

Weekly, we take the time to improve what we do: we focus on news, numbers, customer and employee data, and the one thing we personally can each do to improve our contribution.


Watch the events

This might seem odd in the context of this blog – it could even find its way under the rhythm section – but we need to call it out: the ability to discover events, make sense of them and determine the appropriate control action is critical to service management. Like an artist studying a canvas for its creative expression, we study our systems to ensure their expression is of a healthy nature. It’s here we can build a visual of how we are progressing and touch it up if required.

Cost: a new approach

Cost is an important factor when it comes to managing a service; customers want to see value at this layer. Imagine the cost actually varying – where, as a customer, you see the cost decrease. At Kloud we have introduced consumption-based service management (to be fair, it’s not a new concept, but it is a model that will potentially redefine the way services are delivered). In this model the consumer pays for the value that you, as the managed service provider, deliver on a consumption, rather than a fixed, basis.


Governance

Too often we spend far too much time doing post-incident reviews as opposed to pre-incident reviews. Governance is important to the way we manage our services, so we spend time ensuring we have a considered and robust governance framework in place. This framework allows us to build on hindsight, stay relevant and anticipate what’s to come in the digital age.


Passion

This could be my all-time favourite; I think of our team when this comes to mind. Wikipedia describes passion as a very strong feeling about a person or thing: an intense emotion, a compelling enthusiasm or desire for something. We care about the Cloud and the value it provides for the modern-day organisation. Its very essence allows organisations to concentrate on their core business function, and so we passionately throw ourselves at our mission of the inevitable shift to the Cloud.

Something to Remember


  • Know thy customer: listen more than you talk.
  • Passion: care about your customer and the services you deliver.
  • Evaluate your cost: ensure you are competitive and relevant in today’s cost climate.

And never:

  • Forsake a healthy rhythm.
  • Ignore the events.

Exception from HRESULT 0x80230729 creating a new FIM/MIM Management Agent

Another day, another piece of FIM/MIM experimentation. I had built a fresh MIM 2016 environment in Azure to test a few scenarios out. That all went quickly and seamlessly thanks to some great templates and a few scripts – until I came to create the Management Agent (the purpose of today’s experimentation).

It didn’t matter if I tried to create a new Management Agent or import the Management Agent – I just got “Exception from HRESULT 0x80230729”. The common element was that the Management Agent I was creating was a 3rd-party MA based on Microsoft’s Extensible Connectivity Management Agent (ECMA). Specifically, I was using Soren Granfeldt’s PowerShell MA.

HResult 0x80230729

Now I’ve used this MA extensively and not had a problem previously.

So I retraced my steps, clean build, pre-requisites etc. All good. I then tried creating an MA from the out of the box connectors. That worked. I successfully created an Active Directory Management Agent.

In the Windows Application Log I found the following from when I was trying to create the PSMA – a little more to go on from that information.


The link in the error message provides some info, but it is a generic .NET article. Having experience with MIIS/ILM/FIM/MIM, I figured the Sync Engine’s web services config file would be the appropriate place for the setting described in the MSDN link.

The Fix

The miiserver.exe.config file located in the default installation directory C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin is what you need to edit.

Near the end of the miiserver.exe.config file find the <runtime> section. Insert the line <loadFromRemoteSources enabled="true"/> as shown below.
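The edited section ends up looking like this (surrounding elements elided):

```xml
<configuration>
  <runtime>
    <!-- existing <runtime> entries remain here -->
    <loadFromRemoteSources enabled="true"/>
  </runtime>
</configuration>
```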


Restart the Forefront Identity Manager Synchronization Service from the Services control panel and you’re back in action.

Management Agent created and back to the task at hand. Happy days.


Follow Darren on Twitter @darrenjrobinson






The new Azure AD Connect built-in user filter: adminDescription.

Originally posted here on Lucian’s blog. Follow Lucian on Twitter @LucianFrango, or like the Facebook page here.



TL;DR

Really? I need to shorten an already short post? Well, you’re welcome, Generation Y.

  • New Azure AD Connect user filter
  • Inbound rule
  • Leverages ADDS attribute: adminDescription
  • Add in a value with a prefix of User_ or Group_ to filter out that object


Azure AD Connect, like previous versions of the directory synchronisation application, is able to filter users, groups or contacts that are synchronised to Azure AD / Office 365 through a number of methods. The Microsoft Azure documentation page outlines this and explains the various methods available to filter out objects. The filter options include:

  • domain based filtering
  • organisational unit filtering
  • filtering by group membership
  • filtering by a certain attribute

In numerous customer sites where I’ve worked with, or even deployed, AADConnect, a handy approach was to create a manual filter based on a CustomAttribute or CustomExtensionAttribute. This usually went along the lines of: if the attribute contained a word or phrase like “DoNotSync”, the user would be filtered out. This is most certainly handy when you have multiple forests synced with Azure AD and want to filter out a user that has a secondary account in your second forest – often required when group-based or organisational-unit filtering is hard to maintain due to large object counts.

Change in Azure AD Connect

A few months back, though, an update to Azure AD Connect added this user-based filter functionality out of the box. I came across this when working at a client site where the attribute adminDescription was being used for a custom purpose. The customer upgraded Azure AD Connect and found a fault with their custom rule. So, what happened?

AADConnect now has an INBOUND rule whereby, when the adminDescription attribute in Active Directory has a value with a prefix of User_ or Group_, that object is filtered out and not synced into the metaverse.

An example to use would be: “User_DoNotSync” or “Group_DoNotSync”.
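Setting the attribute is a one-liner with the ActiveDirectory module (the identities below are placeholders):

```powershell
# Flag a user so the built-in AADConnect inbound rule filters it from sync
Set-ADUser -Identity jdoe -Replace @{adminDescription = "User_DoNotSync"}

# Groups use the Group_ prefix
Set-ADGroup -Identity "Legacy-App-Users" -Replace @{adminDescription = "Group_DoNotSync"}
```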

Sidebar – there are two types of AADConnect rules: inbound and outbound. Inbound rules sync data from Active Directory to the metaverse, and outbound rules sync data from the metaverse to Azure AD.

Final words

It’s not every project that I work with the same technology. Part of the charm, and one of the positives, of working at Kloud is that there’s opportunity to work across multiple public clouds like Azure, Office 365 and AWS. I say it’s been a good day if you learn something new that day. I firmly believe in always learning or trying to learn. Always! Never stop! So coming across this was a great find, and a trivial piece of luck given the client and project I happened to be working on. Timing is everything.

I would recommend putting steps in place to move away, as much as possible, from custom filters used for user or group filtering, and leveraging the now built-in filter via the adminDescription attribute.

The more standard the deployment, the easier it is to manage, monitor, upgrade and/or maintain moving forward.




PowerShell error “Run Login-AzureRmAccount to login.” in AzureRM when already logged in

Usually when I’m writing PowerShell scripts I do it from a development virtual machine with a known environment state. However, yesterday I was trying to do something simple and quick and was writing it on my everyday laptop.

My script was using Windows Management Framework 5.0 and I was creating a new burn environment in AzureRM. I was authenticated and could query and enumerate most of my AzureRM environment; however, I was getting erroneous responses on some cmdlets and was unable to create a new resource group. Essentially, whenever I tried to perform anything of value to what I was trying to achieve, PowerShell would return “Run Login-AzureRmAccount to login.”

I was authenticated and all looked as it should.


Querying an ARM resource group got the error, “Run Login-AzureRmAccount to login.”


I started digging to find out what my environment looked like. Did I have WMF5 installed? $PSVersionTable showed I did, and all looked as it should.


What modules did I have installed? Get-InstalledModule | Out-GridView


Lots of differing versions were what I saw. *Note: the screenshot above is after resolving my issue.

So what fixed my issue? I ran Update-Module and watched PowerShell update my modules and get my environment back into spec. A restart of my laptop and I was back in action.
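For reference, the inspect-and-repair steps were simply:

```powershell
# See what module versions are actually installed
Get-InstalledModule | Out-GridView

# Bring every PowerShellGet-installed module (and its dependencies) up to date
Update-Module
```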

What caused my issue? I’m thinking I updated an individual module at some point and dependencies were missed.

Hope this helps someone else and saves some of those “this should just be working, why isn’t it?” moments.


Accelerating Azure Multi-Factor Authentication in an Enterprise Organisation

At Kloud we get incredible opportunities to partner with organisations who are global leaders in their particular industry.

Recently we were asked to accelerate Microsoft’s Azure Multi-Factor Authentication for Office 365 users in the cloud throughout an enterprise organisation.

This blog is not so much focused on the technical implementation (there is an incredible amount of technical documentation provided by Microsoft that covers this) but more on what we discovered whilst accelerating the technology throughout the organisation.

Implementing Microsoft multi-factor authentication for Office 365 users is relatively straightforward; it’s actually quite easy from a technical point of view.

The technical steps as detailed by Microsoft;

Our approach was pieced into a few key building blocks.

It’s the post-enablement phase that I want to take the time to focus on. My hope is to stimulate a few thoughts around the areas where we spent the most time, in the hope that it will help you roll this out successfully to a large user base (thousands of people!).

We endeavoured to keep this relatively simple by focusing on the Standard Operating Environment and communicating around Azure Multi-Factor Authentication.

Let’s unpack these key points in a little more detail.

The Standard Operating Environment

This particular enterprise organisation had the majority of their SOEs running an instance of Microsoft Office 2010.

The Office 2013 device apps support multi-factor authentication through the use of the Active Directory Authentication Library (ADAL).

A key task, therefore, was to ensure all the Office clients were able to support multi-factor authentication as outlined above.

As a result, a key dependency in accelerating Azure MFA was having a reliable removal and installation package for Microsoft Office, e.g. automated removal of Office 2010 and a process for packaging any new Office 2013 plugins. It’s important to factor this time into your deployment of Multi-Factor Authentication.

If you don’t have a package that removes and reinstalls the correct version of Microsoft Office, you will encounter a roadblock.

Under-communicating the Multi-Factor Authentication transformation

I cannot stress how important the communication part actually is.

We decided on conducting a couple of waves of pilots over a short period of time. The benefit of this approach was to weed out any minor issues that might confuse the larger groups, refining our learnings as we progressed through the pilots in a healthy order.

We prioritised the following groups:

  1. Technology group (Security and End user Computing)
  2. Executive Team (Yes we did this early)
  3. Technology IT (the whole department in one go)
  4. We targeted two Business units (around 300 users combined)
  5. Bulk Azure Multi Factor Authentication production rollout (all remaining Business units)

We noticed a few patterns with respect to communication, all very common, detailed as follows.

In the first group, the Technology pilot, we conducted a workshop, ensuring that they had all the relevant requirements beforehand, and stepped them through the enablement process. We find this group tends to stumble along initially and figure it out for themselves. Not a lot of noise is generated at this level and the change is generally well received, which is fantastic bearing in mind that they are generally the drivers of a more secure environment.

In the second group, the Executive Team are not overly concerned and generally welcome the additional layer of security, knowing the current climate around data loss and the publicity it can generate. It’s almost as if they are relieved it has finally arrived. They have an executive support team who are agile and ready to process any communication around the technology and how to successfully deploy it.

In the third group, the Technology department, we find much more effort goes into communicating the technology, including workshops on the how and why, yet some very visible senior individuals still behave in ways that are antithetical to the transformation (“Do I have to add this additional step to authenticate?”). The net result is that cynicism among the people goes up, while belief in the communication goes down. It’s important to spend time at this layer (over-communicate) to ensure they understand the importance of safeguarding the organisation’s data; I cannot stress how important the “why” is in this instance. It’s at this level that having already done the executive layer is crucial, as they will have seen the benefits of using multi-factor authentication and you will have senior stakeholders actively engaged in the technology component.

Transformation is impossible unless tens, hundreds or thousands of people are willing to help, often to the point of making short-term sacrifices. “I know it’s a pain to click verify on the Azure Multi-Factor Authentication application, but by doing so I am safeguarding my organisation’s data.” They get this revelation by understanding why they are doing what they are doing – the very essence of communication.

Employees will not make sacrifices, even if they are unhappy with the status quo, unless they believe that useful change is possible. Without credible communication, and a lot of it, the hearts and minds of the team are never captured and you run the risk of another failed transformation project.

In the more successful transformation pilots, we used all existing communication channels to broadcast the technology transformation. Our guiding principle was simple: use whatever we can to communicate why multi-factor authentication is critical to the organisation and how it will take place.

Perhaps even more important, most of the successful pilots of this change learned to “walk the talk.” Communication comes in both words and actions, and nothing undermines change more than behaviour by important individuals that is inconsistent with their words.

Where we landed

The technology component is relatively straightforward; the challenge lies in user adoption. Spend most of your time in this space and over-communicate so people understand why you are making the leap forward! Your data will be all the safer for it.