
How to configure a Graphical PowerShell Dev/Admin/Support User Interface for Azure/Office365/Microsoft Identity Manager

During the development of an identity management solution I find myself with multiple PowerShell/RDP sessions connected to multiple environments, using different credentials, often just to obtain trivial data or information. It is also easy to trip yourself up with remote PowerShell sessions to differing environments. If only there was a simple UI that could front-end a set of PowerShell modules and make those simple queries quick and painless, and likewise allow support staff to execute a canned set of queries without providing them elevated permissions.

I figured someone would have already solved this problem, and after some searching with the right keywords I found the powershell-command-executor-ui from bitsofinfo. Looking into it, its author had already solved a lot of the issues with building a UI front-end for PowerShell via the powershell-command-executor and the stateful-process-command-proxy. That solution provided the framework for what I was thinking: the ability to provide a UI for PowerShell using PowerShell modules, including remote PowerShell, was exactly what I was after. And it was built on NodeJS and AngularJS, so it was simple enough for some customization.

Introduction

In this blog post I’ll detail how I’ve leveraged the projects listed above for integration with Azure AD/Office 365 and Microsoft Identity Manager.

Initially I had a vision of serving up the UI from an Azure Web App. NodeJS on Azure Web Apps is supported, however with all the solution dependencies I just couldn’t get it working.

My fallback was then to look at serving up the UI from a Windows Server 2016 Nano Server. I learnt from my efforts that a number of the PowerShell modules I was looking to provide a UI for have .NET Framework dependencies. Nano Server does not have full .NET Framework support; Microsoft state that adding it would mean the server would no longer be Nano.

For now I’ve deployed it to an Azure Windows Server 2016 VM, secured by an Azure NSG to only allow my machine to access it. More on security later.

Overview

Simply put, the details in GitHub for the powershell-command-executor provide the architecture and integration. What I will detail are the modifications I’ve made to utilise the more recent AzureADPreview PowerShell Module over the MSOL PowerShell Module, the updates to the solution’s dependencies for the latest versions, hooking it into Microsoft Identity Manager, and a few changes to allow different credentials to be used for Azure and Microsoft Identity Manager.

Getting Started

I highly recommend you start with your implementation on a local development workstation/development virtual machine. When you have a working version you’re happy with you can then look at other ways of presenting and securing it.

NodeJS

NodeJS is the webserver for this solution. Download NodeJS for your Windows host here. I’m using the 64-bit version, but have also implemented the solution on 32-bit. Install NodeJS on your local development workstation/development virtual machine.

You can accept all the defaults.

Following the installation of NodeJS, download the powershell-command-executor-ui from GitHub. Select Clone or Download, then Download ZIP, and save it to your machine.

Right-click the download when it has finished and select Extract All. Select Browse and create a folder at the root of C:\ named nodejs, then extract powershell-command-executor-ui into it.

Locate the c:\nodejs\powershell-command-executor-ui-master\package.json file.

Using an editor such as Notepad++, update the package.json file so that it looks like the following. This will utilise the latest versions of the dependencies for the solution.

From an elevated (Administrator) command prompt in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\program files\nodejs\npm” install. This will read the package.json file you edited and download the dependencies for the solution.

You can see in the screenshot below NodeJS has downloaded all the items in package.json including the powershell-command-executor and stateful-process-command-proxy.

When you now list the directories under C:\nodejs\powershell-command-executor-ui-master\node_modules you will see those packages and all their dependencies.

We can now test that we have a working PowerShell UI NodeJS website. From an elevated command prompt whilst still in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\node.exe” bin\www

Open a browser on the same host and go to http://localhost:3000. You should see the default UI.

Configuration and Customization

Now it is time to configure and customize the PowerShell UI for our needs.

The files we are going to edit are:

  • C:\nodejs\powershell-command-executor-ui-master\routes\index.js
    • Update the paths to the encrypted credentials files used to connect to Azure and MIM. We’ll create the encrypted credentials files soon.
  • C:\nodejs\powershell-command-executor-ui-master\public\console.html
    • Update for your customizations for CSS etc.
  • C:\nodejs\powershell-command-executor-ui-master\node_modules\powershell-command-executor\O365Utils.js
    • Update for PowerShell Modules to Import
    • Update for Commands to make available in the UI

We also need to get a couple of PowerShell modules installed on the host so they are available to the site. The two I’m using I’ve mentioned earlier. With WMF5 installed, we can simply install them using PowerShell as per the commands below.

Install-Module AzureADPreview
Install-Module LithnetRMA

In order to connect to our Microsoft Identity Manager Synchronization Server we need to enable Remote PowerShell on it. This post I wrote here details all the setup tasks to make that work. Test that you can connect via Remote PowerShell to your MIM Sync Server before updating the scripts below.
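A quick test looks something like this (the server name is a placeholder):

# Verify Remote PowerShell connectivity to the MIM Sync Server
Enter-PSSession -ComputerName mimsyncserver -Credential (Get-Credential)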

Likewise for the Microsoft Identity Manager Service Server. Make sure after installing the LithnetRMA Powershell Module you can connect to the MIM Service using something similar to:

# Import LithnetRMA PS Module
import-module lithnetrma

# MIM AD User Admin
$username = "mimadmin@mim.mydomain.com"
# Password 
$password = "Secr3tSq1rr3l!" | convertto-securestring -AsPlainText -Force
# PS Creds
$credentials = New-Object System.Management.Automation.PSCredential $username,$password

# Connect to the FIM service instance
# Will require an inbound rule for TCP 5725 (or your MIM Service Server port) in your Resource Group Network Security Group config
Set-ResourceManagementClient -BaseAddress http://mymimportalserver:5725 -Credentials $credentials

 

\routes\index.js

This file details the encrypted credentials the site uses. You will need to generate the encrypted credentials for your environment. You can do this using the powershell-credentials-encryption-tools. Download the tools to your workstation and unzip them. Open the credentialEncryptor.ps1 script using an Administrator PowerShell ISE session.

I’ve changed the index.js to accept two sets of credentials. This is because your Azure Admin Credentials are going to be different from your MIM Administrator Credentials (both in name and password). The username for my Azure account looks something like myname@mycompany.com whereas for MIM it is Domainname\Username.

Provide an account name for your Azure environment and the associated password.

The tool will create the encrypted credential files.

Rename the encrypted.credentials file to whatever makes sense for your environment. I’ve renamed it creds1.encrypted.credentials.

Now we re-run the script to create another set of encrypted credentials. This time for Microsoft Identity Manager. Once created, rename the encrypted.credentials file to something that makes sense in your environment. I’ve renamed the second set to creds2.encrypted.credentials.
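Putting the two runs together, here is a minimal sketch, assuming the tool writes its output as encrypted.credentials in the current directory:

# First run – Azure admin credentials
.\credentialEncryptor.ps1
Rename-Item -Path .\encrypted.credentials -NewName creds1.encrypted.credentials

# Second run – MIM admin credentials
.\credentialEncryptor.ps1
Rename-Item -Path .\encrypted.credentials -NewName creds2.encrypted.credentials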

We now need to copy the following files to your UI Website C:\nodejs\powershell-command-executor-ui-master directory:

  • creds1.encrypted.credentials
  • creds2.encrypted.credentials
  • decryptUtil.ps1
  • secret.key

Navigate back to routes\index.js and open the file in an editor such as Notepad++.

Update the index.js file with the paths to your credentials files, and add the additional variable PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE for the second set of credentials used for Microsoft Identity Manager.

var PATH_TO_DECRYPT_UTILS_SCRIPT = "C:\\nodejs\\powershell-command-executor-ui-master\\decryptUtil.ps1";
var PATH_TO_ENCRYPTED_CREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds1.encrypted.credentials";
var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds2.encrypted.credentials";
var PATH_TO_SECRET_KEY = "C:\\nodejs\\powershell-command-executor-ui-master\\secret.key";


We also update initCommands to pass through the additional credentials file:


initCommands: o365Utils.getO365PSInitCommands(
 PATH_TO_DECRYPT_UTILS_SCRIPT,
 PATH_TO_ENCRYPTED_CREDENTIALS_FILE,
 PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE,
 PATH_TO_SECRET_KEY,
 10000,30000,3600000),

Here is the full index.js file for reference.

 

public/console.html

The public/console.html file is for formatting and associated UI components. The key things I’ve updated are the Bootstrap and AngularJS versions, which are referenced at the top of the HTML document. A summary is below.

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.1/angular.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.6.1/angular-resource.min.js"></script>
<script src="javascripts/ui-bootstrap-tpls-2.4.0.min.js"></script>
<script src="javascripts/console.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap-theme.min.css">

You will also need to download the updated Bootstrap UI (ui-bootstrap-tpls-2.4.0.min.js). I’m using v2.4.0 which you can download from here. Copy it to the javascripts directory.

I’ve also updated the table types, buttons, colours, header, logo etc. in the appropriate locations (CSS, tables, divs etc.). Here is my full file for reference. You’ll need to update it for your colours, branding etc.

powershell-command-executor\O365Utils.js

Finally the O365Utils.js file. This contains the commands that will be displayed along with their options, as well as the connection information for your Microsoft Identity Manager environment.

You will need to change:

  • Line 52 for the address of your MIM Sync Server
  • Line 55 for the address of your MIM Service Server
  • Line 141 onwards for the commands, and the parameters for those commands, that you want to make available in the UI

Here is an example with a couple of AzureAD commands, a MIM Sync and a MIM Service command.

Show me my PowerShell UI Website

Now that we have everything configured, let’s start the site and browse to it. If you haven’t stopped the NodeJS site from earlier, go to the command window and press Ctrl+C a couple of times. Run “c:\Program Files\nodejs\node.exe” bin\www again from the C:\nodejs\powershell-command-executor-ui-master directory, unless you have restarted the host and now have NodeJS in your environment path.

In a browser on the same host go to http://localhost:3000 again and you should see the site as it is below.

Branding and styling come from console.html, menu options from O365Utils.js, and when you select a command and execute it, data comes from the associated service and you can see the results. In the screenshot below a Get-AzureADUser command for the associated search string executed in milliseconds.

 

Summary

The powershell-command-executor-ui from bitsofinfo is a very extensible and powerful NodeJS website as a front-end to PowerShell.

With a few tweaks and updates the look and feel can be easily changed along with the addition of any powershell commands that you wish to have a UI for.

As it sits though, keep in mind you have a UI with hard-coded credentials that can execute whatever commands you expose.

Personally I am running it for my use only, and I have it hosted in Azure in its own Resource Group with an NSG allowing outgoing traffic to Azure and my MIM environment. Incoming traffic is only allowed from my personal management workstation’s IP address. I also needed to allow port 3000 into the server on the NSG, as well as on the firewall on the host. I did that quickly using the command below.

# Enable the WebPort NodeJS is using on the firewall 
netsh advfirewall firewall add rule name="NodeJS WebPort 3000" dir=in action=allow protocol=TCP localport=3000

Follow Darren on Twitter @darrenjrobinson


Resolving “The Microsoft Identity Manager server database could not be successfully populated” installation error

Here is yet another of those Microsoft Identity Manager installation errors that doesn’t give you much information and when looking for a resolution you can’t find an exact match through Dr Google.

Nearing the end of the Microsoft Identity Manager Service and Portal installation you receive the “The Microsoft Identity Manager server database could not be successfully populated” error.

Looking into the installation log (these days I’m in the good practice of always generating one when doing an installation of the MIM Service/Portal, e.g. msiexec /i “e:\Service and Portal\Service and Portal.msi” /l*v c:\temp\install.log) didn’t give up much information at all. Fatal Error. Dialog created.

Looking at the server that the installation was being done on, I could see that it was being spanked. This server is for a customer’s development environment, hosted in Azure but also done rather frugally (my virtual machine was running SQL, MIM Sync, all the dependencies for the MIM Portal, and then the MIM Service/Portal itself). FWIW the initially provisioned VM was an Azure DS1v2 server. It seems a character may have got lost in the VM request, where an Azure DS11v2 server would have been more appropriate.

I re-sized the Azure VM and actually chose to go with an Azure DS3v2 VM size. I kicked off the Microsoft Identity Manager Service & Portal installation again and ….

….. SUCCESS.

Hope this helps someone else who may find themselves in a similar position.

Follow Darren on Twitter @darrenjrobinson


Microsoft Identity Manager installation error “Internal Error 2337. 0, Microsoft.MetadirectoryServices.host.dll”

Today I was doing a fresh installation of Microsoft Identity Manager 2016 with Service Pack 1 into a new development environment. The exact binary is “en_microsoft_identity_manager_2016_with_service_pack_1_x64_dvd_9270854”.

Not too far into the installation of the Microsoft Identity Manager Synchronization Server I got the “Internal Error 2337. 0, Microsoft.MetadirectoryServices.host.dll” error as shown below.

Doing a few searches didn’t throw me any bones. I could see that the installation had added the MIM Sync Server Service Account to the Logins on the SQL Server.

I then recalled that there was an updated version that was released just before Xmas (14 Dec 2016). The exact binary is “en_microsoft_identity_manager_2016_with_service_pack_1_x64_dvd_9656597” and you can get the updated Microsoft Identity Manager 2016 with SP1 media here.

Re-running the installation and ….

….. SUCCESS.

Now I’m not going to spend any time trying to figure out what is bad with the base MIM 2016 SP1 media; I’m just going to pretend it never existed and use the latest. Onward and upwards.

Follow Darren on Twitter @darrenjrobinson


Real world Azure AD Connect: the case for TWO Azure AD Connect servers

Originally posted on Lucian’s blog at clouduccino.com. Follow Lucian on Twitter @LucianFrango.

***

I was exchanging some emails with an account manager (Andy Walker) at Kloud and thought the exchange would make for some interesting reading. Here’s the outcome in an expanded and much more helpful (to you, dear reader) format…

***

Background

When working with the Microsoft Cloud, and in particular with identity, depending on some of the configuration options it can be quite important to have Azure AD Connect highly available. Unfortunately for us, Microsoft has not developed AADConnect to be highly available. That’s not ideal in today’s 24/7/365 cloud-centric IT world.

When disaster strikes (and yes, that was a deliberate use of the word WHEN), with email, files, real-time communication and everything in between now living up in the sky with those clouds, should your primary data centre be out of action there is a high probability that a considerable chunk of functionality will be affected by your on-premises outage.

Azure AD Connect

AADConnect’s purpose, its only purpose, is to synchronise on-premises directories to Azure AD. There is no other option: you cannot sync one ADDS forest to another with Azure AD Connect. So this simple and lightweight tool only requires a single deployment in any given ADDS forest.

Even when working with multiple forests, you only need one. It can even be deployed on a domain controller, which can save on an instance when working with the cloud.

The case for two Azure AD Connect servers

As of Monday April 17th 2017, just 132 days from today (Tuesday December 6th), DirSync and Azure AD Sync, the predecessors to Azure AD Connect, will be deprecated. This is good news. It means there is an opportunity to review your existing synchronisation architecture to take advantage of a feature of AADConnect and deploy two servers.

[Diagram: two AADConnect servers topology]

Staging mode is fantastic. Yes, short and sweet and to the point. It’s fantastic. Enough said, blog done; use it and we’re all good here? Right?

Okay, I’ll elaborate. Staging mode allows for the following rapid-fire benefits, which greatly improve IT as a result:

1 – Redundancy and disaster recovery, not high availability

I’ve read in certain articles that staging mode offers high availability. I disagree, and argue it offers redundancy and disaster recovery. High availability is putting into practice architecture and design around applications, systems or servers that keeps them operational and accessible as near as possible to 100% uptime, or always online, mostly through automated solutions.

I see staging mode as offering the ability to manually bring online a secondary AADConnect server should the primary be unavailable. This could be due to a disaster or a scheduled maintenance window. Therefore it makes sense to deploy a secondary server in an alternate data centre, site or geographic location to your primary server.

To do the manual update of the secondary AADConnect server config and bring it online, the following needs to take place:

  • Run Azure AD Connect
    • That is the AzureADConnect.exe which is usually located in the Microsoft Azure Active Directory Connect folder in Program Files
  • Select Configure staging mode (current state: enabled), then select Next
  • Validate credentials and connect to Azure AD
  • Configure staging mode – un-tick the option
  • Configure and complete the change-over
  • Once this has been run, run a full import/full sync process

As this is a manual process which takes time and breaks service availability to complete (importantly: you can’t have two AADConnect servers synchronising to a single Azure AD tenant), it’s not highly available.
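If you want to confirm which server is currently in staging mode, and kick off the full import/full sync after the change-over, the ADSync module that ships with AADConnect helps. A minimal sketch, run on the AADConnect server itself:

# Check whether this AADConnect server is in staging mode
Import-Module ADSync
(Get-ADSyncScheduler).StagingModeEnabled

# After un-ticking staging mode on the new primary, run a full import/full sync
Start-ADSyncSyncCycle -PolicyType Initial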

2 – Update and test management

AADConnect now offers (since February 2016) an in-place automatic upgrade feature. Updates are rolled out by Microsoft and installed automatically. Cue the alarm bells!

I’m going to take a stab at making an analogy here: you wouldn’t have your car serviced while it’s running and taking you from A to B, so why would you have a production directory synchronisation server upgraded while it’s running? A bit of a stretch, but can you see where I was going there?

Having a secondary AADConnect server allows updates and/or server upgrades without impacting the core functionality of directory synchronisation to Azure AD. While this seems trivial, it’s certainly not trivial in my books. I’d want my AADConnect server functioning correctly all the time with minimal downtime. Having the two servers can allow for a 15 minute (depending on how fast you can click) managed change-over from server to server.

3 – Improved management and less impact on production

Lastly, and this is more of a stretch, I think it’s good practice not to have user accounts assigned Domain Admin rights. The same applies to logging into production servers to do trivial tasks. Having a secondary AADConnect server allows IT administrators or service desk engineers to complete troubleshooting of sync processes, for the service or for objects, away from the production server. That, again in my books, is better practice and just as efficient as using primary or production servers.

Final words

Staging mode is fantastic. I’ve said it again. Since I found out about staging mode around this time last year, I’ve recommended two servers to every customer I’ve mentioned AADConnect to, referencing the three core arguments listed above. I hope that’s enough to convince you, dear reader, to do so as well.

Best,

Lucian

Real world Azure AD Connect: multi forest user and resource + user forest implementation

Originally posted on Lucian’s blog at clouduccino.com. Follow Lucian on Twitter @LucianFrango.

***

Disclaimer: During October I spent a few weeks working on this blog post’s solution at a customer and had to do the responsible thing and pull the pin on further time, as I had hit a glass ceiling. I had reached what I thought was possible with Azure AD Connect. In comes Nigel Jones (Identity Consultant @ Kloud) who, with a bit of persuasion from Darren (@darrenjrobinson), took it upon himself to smash through that glass ceiling of Azure AD Connect and figure this solution out. Full credit and a big high five!

***

tl;dr

  • Azure AD Connect multi-forest design
  • Using AADC to sync user/account + resource shared forest with another user/account only forest
  • Why it won’t work out of the box
  • How to get around the issue and leverage precedence to make it work
  • Visio diagrams of how it all works, for easy digestion

***

In true Memento style, after the quick disclaimer above, let me take you back for a quick background of the solution and then (possibly) blow your mind with what we have ended up with.

Back to the future

A while back in the world of directory synchronisation with Azure AD, having a user and resource forest solution synchronised required the use of Microsoft Forefront Identity Manager (FIM), now Microsoft Identity Manager (MIM). From memory, you needed the former (FIM) whenever you had a multi-forest synchronisation environment with Azure AD.

Just like Marty McFly, Azure AD synchronisation went from relative obscurity to the mainstream. In doing so, there have been many advancements and improvements that negate the need to ever deploy FIM or MIM for all but the more complex environments.

When Azure AD Connect, then Azure AD Sync, introduced the ability to synchronise multiple forests in a user + resource model, it opened the door for a lot of organisations to streamline the federated identity design for Azure and Office 365.

[Diagram: multi-forest AADConnect design]

In the beginning…

The following outlines a common real-world scenario for numerous enterprise organisations. In this environment we have an existing Active Directory forest which includes an Exchange organisation, SharePoint, Skype for Business and many more common services and infrastructure. The business grows and, with the wealth and equity, purchases another business to diversify or expand. With that comes integration and the sharing of resources.

We have two companies: Contoso and Fabrikam. A two-way trust is set up between the ADDS forests, and users can start to collaborate and share resources.

In order to use Exchange, which is the most common example, we need to start to treat Contoso as a resource forest for Fabrikam.

Over in the Contoso forest, IT creates disabled user objects and linked mailboxes for Fabrikam users. While we’re in the on-premises world, this works fine. I won’t go into too much more detail, but I’m sure that you, Mr or Mrs Reader, understand the particulars.

In summary, Contoso is a user and resource forest for itself, and a resource forest for Fabrikam. Fabrikam is simply a user forest with no deployment of Exchange, SharePoint etc.

How does a resource forest topology work with Azure AD Connect?

For the better part of two years now, going back to when AADConnect was AADSync, Microsoft has supported multi-forest connectivity. Last year, Jason Atherton (awesome Office 365 consultant @ Kloud) wrote a great blog post summarising this compatibility and usage.

In AADConnect, a user/account and resource forest topology is supported. The supported topology assumes that a customer has that simple, no-nonsense architecture. There’s no room for any shared funny business…

AADConnect is able to select the two forests’ common identities and merge them before synchronising to Azure AD. This process uses the attributes associated with the user objects (objectSID in the user/account forest and msExchMasterAccountSID in the resource forest) to join the user account and the resource account.

There is also the option for customers to have multiple user forests and a single resource forest. I’ve personally not tried this with more than two forests, so I’m not confident enough to say how additional user/account forests would work out as well. However, please try it out and be sure to let me know via comments below, via Twitter or email me your results!

Quick note: you can also merge two objects by a sAMAccountName and mailNickName attribute match, or by specifying any ADDS attribute to match between the forests.

Compatibility

[Image: AADConnect multi-forest topology compatibility]

If you’d like to read up on this a little more, here are two articles that reference the above-mentioned topologies in detail:

Why won’t this work in the example shown?

Generally speaking, the first forest to sync in AADConnect in a multi-forest implementation is the user/account forest, which is likely the primary/main forest in an organisation. Let’s assume this is the Contoso forest. This will be the first connector to sync in AADConnect, and it will have the lowest precedence number as well; with AADConnect, the lower the designated precedence number, the higher the priority.

When the additional user/account forest(s) or the resource forest are added, those connectors run after the initial Contoso connector due to the default precedence set. From an external perspective this doesn’t seem like much of a big deal: AADConnect merges two matching or mirrored user objects by way of the (commonly used) objectSID and msExchMasterAccountSID and away we go. In theory, precedence shouldn’t really matter.

Give me more detail

The issue is that precedence does indeed matter when we go back to our Contoso and Fabrikam example. Here’s what happens:

[Diagram: what happens with the default precedence]

  • #1 – Contoso is sync’ed to AADC first as it was the first forest connected to AADC
    • Adding in Fabrikam first, ahead of Contoso, doesn’t work either
  • #2 – The Fabrikam forest is joined with a second forest connector
  • AADC is configured with “user identities exist across multiple directories”
  • objectSID and msExchMasterAccountSID are selected to merge identities
  • When the objects are merged, sAMAccountName is taken from the Contoso forest – #1
    • This happens for Contoso forest users AND Fabrikam forest users
  • When the objects are merged, mail or primarySMTPAddress is taken from the Contoso forest – #1
    • This happens for Contoso forest users AND Fabrikam forest users
  • Should the two objects not have a completely identical set of attributes, the attributes that are set are pulled from whichever forest has them
    • In this case, most of the user object details come from Fabrikam – #2
    • Attributes like the user’s firstname, lastname, employee ID and branch/office

The result of this standard setup is that Fabrikam users have their resource accounts in Contoso sync’ed, but have their UPN set with the suffix from the Contoso forest. An example would be a UPN of user@contoso.com rather than the desired user@fabrikam.com. When this happens there is no SSO, as Windows Integrated Authentication in the Fabrikam forest does not recognise the Contoso forest UPN suffix of @contoso.com.

Yes, even with ADDS forest trusts configured correctly and UPN routing etc. all working correctly, authentication just does not work. AADC uses the incorrect attributes and syncs those to Azure AD.

Is there any other way around this?

I’ve touched on and referenced precedence a number of times in this blog post so far, and the solution is indeed precedence. The issue I experienced came from a lack of understanding of precedence in AADConnect. Sure, it works on connector-rule-level precedence, which is set by AADConnect during the configuration process as forests are connected.

Playing around with precedence was not something I wanted to do, as I didn’t have enough Microsoft Identity Manager or Forefront Identity Manager background to be certain of the outcome of the joining/merging process of user and resource account objects. I knew that FIM/MIM has the option of attribute-level precedence, which is what we really wanted here, so my thinking was that we needed FIM/MIM to do the job. Wrong!

In comes Nigel…

Nigel dissected the requirements over the course of a week. He reviewed the configuration in an existing FIM 2010 R2 deployment and worked out what was needed of AADConnect. Having got AADConnect set up, all that was required was tweaking a couple of the inbound rules and moving them higher up the precedence order.

Below is the AADConnect Sync Rules editor output from the final configuration of AADConnect:

[Screenshot: AADConnect Sync Rules Editor output of the final configuration]

The solution centres on moving the main precedence rule for Fabrikam (red arrows and yellow highlight) above the highest (and default) Contoso rule (originally #1). When this happened, AADConnect was able to pull the correct sAMAccountName and mail attributes from Fabrikam and keep all the other attributes associated with Exchange mailboxes from Contoso. Happy days!
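If you want to review the resulting rule order outside the Sync Rules Editor, the ADSync module on the AADConnect server can list it. A quick read-only sketch:

# List inbound sync rules by precedence (lower number = higher priority)
Import-Module ADSync
Get-ADSyncRule | Where-Object { $_.Direction -eq "Inbound" } |
    Sort-Object Precedence | Select-Object Precedence, Name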

Final words

Tinkering around with AADConnect shows just how powerful this “cut-down FIM/MIM” application is. While AADConnect lacks the in-depth configuration and customisation that you find in FIM/MIM, it packs a lot into a small package! #Impressed

Cheers,

Lucian


Using an Azure Function to search the FIM/MIM Metaverse, create a Set and update the Set membership in the FIM/MIM Service

Introduction

This is the third and last post in this series of integrating Microsoft Identity Manager with Azure Functions.

The first detailed how to use an Azure Function to retrieve data from the MIM Service Server. The second detailed how to use an Azure Function to retrieve data from the MIM Sync (Metaverse) Server.

This third post combines the two and then performs an action in the MIM Service. The practical purpose of this could be functions like “find all users in location y” and “enable them for entitlement x” or “add an attribute value on each of their objects”.

Overview

The reasoning for the two-stage approach is that, in my experience, it is a lot easier to search the Metaverse than the MIM Service to find an object (or objects). The Metaverse also has all the information about objects, whereas the MIM Service is a ShadowVerse of the Metaverse containing a subset of the managed objects’ metadata.

Moving forward then, the architecture is a hybrid of the first two posts that introduced the concepts associated with integrating MIM with Azure Functions. As per the other two posts, this is a base working example and concept.

Prerequisites

The prerequisites are the same as for the 1st and 2nd posts in this series. You’ll need to work through those examples to setup the dependencies and prerequisites. From there you can create one more Azure Function that brings everything together. That’s what I’m covering in this post.

Therefore the prerequisites are;

  • Azure Tenant and a Function Plan
  • Microsoft Identity Manager implementation
  • Remote Powershell configured for your MIM Sync Server
  • Lithnet FIM MIIS Automation Powershell Module installed on your MIM Sync Server
  • The necessary Firewall Rules on your MIM Sync Server and your Azure Network Security Group (assuming your MIM Infrastructure is in Azure) to allow Azure Functions to communicate with MIM Sync and Service Servers

 

This Example performs the following

In this example the HTTP Trigger Azure Function;

  • Takes input for ObjectType, Attribute, AttributeValue, SetName
  • Searches the MIM Sync Metaverse for the input ObjectType, with the AttributeValue in the Attribute
  • Connects to the MIM Service
  • Creates the Set based on the input SetName if it doesn’t exist
  • Adds the objects from the search to the Set
  • Returns the objects added to the Set

In a real-world implementation you’d do the above with a criteria-based Set. This post is a concept of search and find, then performing a create and an update. That has many practical applications.

Create your new Azure Function

Just like the other two posts, we’re going to create a new Powershell HTTP Trigger Azure Function.

Upload the Lithnet RMA PS Module to your new Azure Function (as per blog post 1 in this series). You should also be using protected credentials now as well, so upload your username/password encryption key.

 

Here is the Azure Function Powershell Script that performs the process detailed above.
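As a rough sketch of the MIM Service half of that process, the shape is below. The Lithnet RMA cmdlets are as named, but the variable names and the Set-membership handling are assumptions to adapt to your environment:

# Sketch only: create the Set if it doesn't exist, then add the search results
Import-Module LithnetRMA
Set-ResourceManagementClient -BaseAddress http://mymimservice:5725 -Credentials $credentials

$set = Search-Resources -XPath "/Set[DisplayName='$setName']" -ExpectedObjectType Set
if ($set -eq $null) {
    $set = New-Resource -ObjectType Set
    $set.DisplayName = $setName
}

# $searchResults holds the MIM Service objects matched from the Metaverse search
foreach ($object in $searchResults) {
    $set.ExplicitMember.Add($object.ObjectID)
}
Save-Resource $set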

Test it out. Looks good. 88 users matched the value of Sydney in their location attribute.

Verify that the Set was created and the membership updated.

Test calling the Azure Function remotely

Now that it is all working in the Azure Function, let’s try doing it from Powershell remotely. This time I’m again looking for Person objects that have Sydney in their location attribute, and I’ll create a set named Sydney-NSW and put them in it.

Brilliant, that works nicely. Let’s verify that the Set was created and has the correct number of users in it. Yes, a perfect match.

Summary

Putting Azure Functions and Powershell together along with the Lithnet Powershell Modules opens up a world of possibilities for automation and integration of the MIM Service without the need for any additional infrastructure or any considerable effort.

Experiment and let me know what you do with this style of integration.

Follow Darren on Twitter @darrenjrobinson


Remotely managing your FIM/MIM Synchronisation Server using Powershell and the Lithnet MIIS Automation Powershell Module

Background

I’ve been using Ryan’s Lithnet MIIS Automation Powershell Module for a few months now as you’ve likely seen from some of my blog posts.

The module and its installer direct you to install the module on your FIM/MIM Synchronisation Server. This all makes perfect sense, as the FIM/MIM Synchronisation Server is more of your traditional server application. However, we are no longer living in that kind of IT world. Consultants, administrators, architects, DevOps engineers and the like all want the flexibility to use their own workstations, administrative workstations, automation services etc.

So how do we apply that to the FIM/MIM Synchronisation Server? Well, with the Lithnet MIIS Automation Powershell Module installed on your FIM/MIM Synchronisation Server you can, thanks to the wonderful thing that is Powershell. Thanks, Jeffrey Snover.

Overview

In this blog post I’ll detail the quick and easy steps to enable you to remotely administer, orchestrate, report on and query your FIM/MIM Sync Server and Metaverse using Powershell and the Lithnet MIIS Automation Powershell Module.

The diagram below outlines the topology: essentially a standard MIM Sync Server deployment in an Active Directory domain, and an admin with a workstation in the same domain with domain credentials.

Prerequisites

It should be pretty obvious by now, but you’ll need;

  • A FIM/MIM Synchronisation Server
    • at least one connected system with a configuration that populates your Metaverse with holograms
  • Download and install the Lithnet MIIS Automation Powershell Module on your FIM/MIM Sync Server
  • The account you will use to connect to the FIM/MIM Sync Server must be in the Administrators Group on the FIM/MIM Sync Server
  • The account you will use to connect to the FIM/MIM Sync Server must be in the FIM/MIM Admins Role Group

Enable the MIM Sync Server for Remote Powershell

In a domain environment as described above this is straightforward. On your FIM/MIM Sync Server we need to enable Powershell Remoting so we can leverage the Lithnet MIIS Automation Powershell module (which is a prerequisite that you’ve already installed, right?).

On the FIM/MIM Synchronisation Server open Powershell (as Administrator) and execute the command  Enable-PSRemoting -Force 

Test from another server in your network that you can access the MIM Sync Server. I did this from my MIM Service Server.
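A quick connectivity check looks like this (hypothetical server name):

# e.g. run from the MIM Service Server
Test-WSMan -ComputerName mimsyncserver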

Establishing a Remote Powershell Session to your FIM/MIM Sync Server

Now you’re ready to start a remote session into your FIM/MIM Sync Server. Take the following snippets, put them into an Administrator Powershell ISE session, modify them for your FIM/MIM Sync Server name and your admin username (if you’re not already in a session with that privileged account), and try connecting.
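If you don’t have those snippets handy, a minimal equivalent looks like this (server name and credential prompt are placeholders):

# Establish the remote session to the FIM/MIM Sync Server
$mimSyncServer = "mimsyncserver.mydomain.com"
$creds = Get-Credential -Message "Account in the Administrators and FIM/MIM Admins groups"
Enter-PSSession -ComputerName $mimSyncServer -Credential $creds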

Success we’re connected, a remote session.

Now let’s run a couple of queries using two of the cmdlets from the Lithnet MIIS Automation PS Module. One to get a user and the other to get the MA Stats for the Twitter MA.
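A rough equivalent of those two queries, with the caveat that the attribute, value and MA name here are assumptions for illustration:

# Find a user object in the Metaverse
$query = New-MVQuery -Attribute accountName -Operator Equals -Value "darrenjrobinson"
Get-MVObject -Queries $query -ObjectType person

# Get the Management Agent statistics for the Twitter MA
Get-MAStatistics -MA "Twitter"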

Success. Brilliant. Simple.

Troubleshooting

Server User Role Permissions

If you are authenticating with an account without enough permissions for Remote Powershell you’ll get the following message: Access is Denied. Whilst you would expect that putting the user account into the “Remote Management Users” group would/should be enough, in my experience you need to have the account you’re connecting with in the Administrators group on the FIM/MIM Sync Server. If there is another method of least privilege, please let me know.

 

MIM Sync Permissions

If you aren’t in a FIM/MIM Role for the tasks you are looking to perform, you will get an error similar to that below. You can see I could connect to the MIM Sync Server with Remote Powershell, but could not run the Get-MVObject cmdlet.

If you are in the FIM/MIM Operators Role Group you’d think you could return an object. No. You get an error message like the one below.

When the same account is in the FIM/MIM Admins Role Group, Success.

 

Summary

That is the quick start guide to using Remote Powershell and the Lithnet MIIS Automation Powershell Module to manage your FIM/MIM Sync Server. Automate and manage away.

You should now think about additional security and restricting what hosts can connect to your FIM/MIM Sync Server using RPS. See Restricting WinRM Hosts here.

 

Follow Darren on Twitter @darrenjrobinson


Using Azure Functions with the Lithnet MIIS Automation Powershell Module to query your Microsoft Identity Manager Metaverse

This is the second blog post, continuing on from this post which is an introduction to using Azure Functions with the Lithnet FIM/MIM Powershell Modules. If you haven’t read that one please do so to get up to speed before this one, as it has more detail around the setup.

Overview

This post details similar functionality to the first post but with integration to the FIM/MIM Synchronisation Server and the FIM/MIM Metaverse rather than the FIM/MIM Service.

The solution is based around an Azure Function that;

  • takes a HTTP WebRequest that contains a payload with the ObjectType, AttributeName and AttributeValue to search for in the Metaverse
  • The Azure Function uses Remote Powershell to call the Lithnet MIIS Automation Powershell Module installed on the FIM/MIM Sync Server
  • The Lithnet Powershell Module takes the query from the Azure Function, executes the query and returns the result to the Azure Function and the requesting client
  • Note: My MIM Infrastructure is all located in Azure so there are configuration steps in this solution to allow access into my Azure environment. If your FIM/MIM infrastructure is elsewhere you’ll need to transpose the appropriate firewall rules for your architecture

Let’s get started.

Prerequisites

The prerequisites for this solution are;

  • An Azure Tenant
  • FIM/MIM Sync Server (as per the diagram above) with data in your Metaverse from a connected directory service (such as Active Directory)
  • I’ll also be using Ryan Newington’s awesome Lithnet MIIS Automation Powershell Module for Microsoft FIM/MIM, available from here. A fantastic contribution to the FIM/MIM community
    • You’ll need to download and install it on your FIM/MIM Synchronisation Server. This differs from the Lithnet Module from the first post in this series as this one is specific to the Metaverse not the FIM/MIM Service.

Enable Powershell Remoting on the FIM/MIM Sync Server

On the FIM/MIM Sync Server, to which we will be sending requests from the Function App, we need to enable Powershell Remoting. This is so we can leverage the Lithnet MIIS Automation Powershell module (which is a prerequisite to be installed on your FIM/MIM Sync Server).

On the FIM/MIM Synchronisation Server open Powershell (as Administrator) and execute the command  Enable-PSRemoting -Force 

 

Test from another server in your network that you can access the MIM Sync Server. I did this from my MIM Service Server.

 

PSRemote Inbound Security Rule (Azure NSG)

Using Remote Powershell means we need an inbound rule into the Azure network where my MIM Sync Server is located, to allow connections from Azure Functions to the MIM Sync Server. Create an Inbound Rule in your Azure Network Security Group for TCP port 5986 as per the rule below.

 

Create a Self Signed Cert on the FIM/MIM Sync Server

To secure the connection using Remote Powershell we will secure the HTTPS connection with a certificate. This is because the Azure Function is not a member of the domain where your FIM/MIM Sync Server is located. In this example I’m using a self-signed certificate.

In Powershell (as Administrator) on your FIM/MIM Sync Server run the following command where the DNSName is the DNS name of your FIM/MIM Sync that will resolve from Azure Functions to your FIM/MIM Sync server.

New-SelfSignedCertificate -DnsName mymimsyncserver.westus.cloudapp.azure.com -CertStoreLocation Cert:\LocalMachine\My
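Capturing the command’s output keeps the thumbprint handy for the listener configuration in the next step:

# The certificate object exposes the thumbprint needed below
$cert = New-SelfSignedCertificate -DnsName mymimsyncserver.westus.cloudapp.azure.com -CertStoreLocation Cert:\LocalMachine\My
$cert.Thumbprint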

Create a Remote Powershell HTTPS Listener

Copy the thumbprint from the self-signed certificate above and use it along with the DNS name of your FIM/MIM Sync Server to run the following command in an Administrator command prompt on your FIM/MIM Sync Server.

winrm create winrm/config/Listener?Address=*+Transport=HTTPS @{Hostname="mymimsyncserver.westus.cloudapp.azure.com";CertificateThumbprint="536E41D6089F35ABCDEFD8C52BE754EFF0B279B"}

 

Allow Powershell Remote (HTTPS) through your firewall on your FIM/MIM Sync Server

In an Administrator command prompt run the following command to create a new inbound firewall rule for the Remote Powershell session from your Azure Function.

netsh advfirewall firewall add rule name="WinRM-HTTPS" dir=in localport=5986 protocol=TCP action=allow

Check that the new firewall rule was created successfully.
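From the same prompt you can confirm it with:

netsh advfirewall firewall show rule name="WinRM-HTTPS"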

 

Create your HTTP Request Function

Create a new HTTP Trigger Function, choosing Powershell as the language. More detailed steps on how to do this are in the first post in this series here.

Search FIM/MIM Metaverse Function App Script

Here is the base script to get you started. This differs a little from the first blog post’s example in that I’ve secured the username and password for the connection to my MIM Sync Server. Details on how to do that are also linked to in the first blog post.

Also in this example I’m running Remote Powershell to execute the command on the FIM/MIM Sync Server as that is where the Lithnet MIIS Automation Powershell Module is installed and needs to run.

The following script;

  • Takes an HTTP request with Object Type, AttributeName, AttributeValue
  • It uses a Script Block to take the input variables from the HTTP request and perform a Powershell Remote command (in this example Get-MVObject)
  • Returns the object to the output
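As a rough sketch of that flow, using the classic Azure Functions Powershell conventions ($req/$res bindings); the server name, credential handling and Metaverse query parameters are placeholders and assumptions to adapt:

# Sketch only: parse the HTTP request, run Get-MVObject remotely, return the result
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$objectType  = $requestBody.ObjectType
$attrName    = $requestBody.AttributeName
$attrValue   = $requestBody.AttributeValue

# Credentials for the MIM Sync Server (protect these as per the first post)
$username = "MYDOMAIN\mimadmin"
$password = ConvertTo-SecureString "Secr3tSq1rr3l!" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ($username, $password)

# Script block executed on the MIM Sync Server via Remote Powershell
$scriptBlock = {
    param($objectType, $attrName, $attrValue)
    Import-Module LithnetMIISAutomation
    $query = New-MVQuery -Attribute $attrName -Operator Equals -Value $attrValue
    Get-MVObject -Queries $query -ObjectType $objectType
}

# Self-signed certificate on the listener, so skip the CA/CN checks
$options = New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck
$result = Invoke-Command -ComputerName "mymimsyncserver.westus.cloudapp.azure.com" `
    -Port 5986 -UseSSL -SessionOption $options -Credential $creds `
    -ScriptBlock $scriptBlock -ArgumentList $objectType, $attrName, $attrValue

# Return the result to the caller
Out-File -Encoding Ascii -FilePath $res -InputObject ($result | ConvertTo-Json)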

Save the function once you’ve added the script (and updated it for your credentials, target FIM/MIM Sync Server etc).

Bring up the Test dialog and give the script some input values in the Request Body that will result in a successful query result from your Metaverse. Select Run. If you’ve done everything correctly you’ll see an object returned from the Metaverse.

Test the Function App

 

Execute the Azure Function from an HTTP Trigger

Now let’s try it remotely. Here is a quick Powershell query to the Azure Function using Invoke-RestMethod, with the same input to the Azure Function. And huzzah, a returned object.
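Something along these lines (the function URL and key are placeholders):

# Build the request body and call the Azure Function
$body = @{
    ObjectType     = "person"
    AttributeName  = "accountName"
    AttributeValue = "darrenjrobinson"
} | ConvertTo-Json

Invoke-RestMethod -Method Post -ContentType "application/json" -Body $body `
    -Uri "https://mimmetaversesearch.azurewebsites.net/api/FunctionName?code=<yourfunctionkey>"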

Summary

This concept provides a framework that opens up a plethora of possibilities, all through a combination of Azure Functions and the Lithnet MIIS Automation PS Module. The Lithnet MIIS PS Module provides all the functionality you get from being on the MIM Sync Server, but now you can retrieve information or trigger functions remotely.

Follow Darren on Twitter @darrenjrobinson


Microsoft Identity Manager Service and Portal Setup Wizard ended prematurely

Last week I was installing the Microsoft Identity Manager Service and Portal on a relatively fresh build of a Windows Server 2012 R2 server that also included an automated installation of SharePoint Server 2013 with SP1.

After going through all the installation configuration options and having the installation start I got the extremely helpful “Setup Wizard ended prematurely” error message.

Having been in this situation previously on other installs (but for different reasons), I knew it was time to kick off the installation again from the command prompt, with logging to an installation log file as shown below.

msiexec /i “e:\Service and Portal\Service and Portal.msi” /l*v c:\temp\install.log 

Providing all the same installation configuration options again, the install kicked off and obviously returned the same “Setup Wizard ended prematurely” error message. But this time I had the installation log file to find out why.

Reading through the log file I found the System.Reflection.TargetInvocationException error highlighted in the screenshot below. A SharePoint issue: http://localhost could not be found. What? I could access the SharePoint Central Administration site and I could see everything I’d normally expect.

Until I looked in the Alternate Access Mappings.

For whatever reason my only entry was for the MIM Portal and Service URL that I was trying to install.

I updated the Alternate Access Mappings to include an additional mapping for http://localhost as per the screenshot below.

 

Running the installation process again (and re-entering all the setup configuration), everything went through as it should. Hope this helps by saving someone else an hour or so.

Follow Darren on Twitter @darrenjrobinson

 

 


Get Users/Groups/Objects from Microsoft/Forefront Identity Manager with Azure Functions and the Lithnet Resource Management Powershell Module

Introduction

As an Identity Management consultant, if I had a dollar for every time I’ve been asked “what is user x’s current status in IDAM?”, “is user x active?”, “does user x have an account in y?” or “what is user x’s primary email address?”, particularly after go-live of an IDAM solution, my holidays would be a lot more exotic.

From a Service Desk perspective, IDAM implementations are often a black box in the middle of the network that, for the most part, does what it was designed and implemented to do. However, as soon as something doesn’t look normal for a user, the Service Desk is inclined to point their finger at that black box (the IDAM solution), and the “what is the current value of ..” and “does user x ..” type questions start flying.

What if we could give the Service Desk a simple query interface into FIM/MIM without needing to give them access to another complicated application?

This is the first (of potentially a series of) blog posts on leveraging community libraries and Azure PaaS services to provide visibility of FIM/MIM information. This first post really just introduces the concept with a working example in a way that is easy to understand and replicate. It is not intended for production implementation without additional security and optimisation.

Overview

The following graphic shows the concept of using Azure Functions to take requests from a client (a web app, Powershell, or some other script), query the FIM/MIM Service and return the result. This post details the setup and configuration for the section in the yellow shaded box, with the process outlined by the numbers 1, 2 & 3. This post assumes you already have your FIM/MIM implementation set up and configured for your connected, integrated applications/services such as Active Directory. In my example my connected data source is actually Twitter.

Prerequisites

The prerequisites for this solution are;

 

Creating your Azure App Service

First up you’ll need to create an Azure App Service in your Azure Tenant. To keep everything logically structured for this example I created an Azure App Service in the same resource group that contains my MIM IaaS infrastructure (MIM Sync Server, MIM Service Server, SQL Server, AD Domain Controller etc).

In the Azure Marketplace select New (+) and search for Function App. Select the Function App item from the results and select Create.

Give your Azure App Service a name and choose the Resource Group where you want to locate it. Choose Dynamic for the Hosting Plan. This means you don’t have to worry about resource management for your Web App and you only pay for execution time which, unless you put this into production and go crazy with it, should be zero as it will (should) be well under the free grant tier. Put the application in an appropriate location, such as close to the FIM/MIM resources it’ll be interacting with, and select Create.

Now that you have your Azure App Service setup, you need to create your Azure Function.

Create an HTTP Trigger Powershell Function App

In the Azure Portal locate your App Services Blade and select the Function App created in the steps above. Mine was named MIMMetaverseSearch in the example above. Select PowerShell as the Language and HttpTrigger-Powershell as the Function type.

Give your Function a name. I’ve kept it simple in this example and named it the same as my App Service Plan. Select Create.

Adding the Lithnet Powershell Module into your Function App

As you’d expect, the Powershell Function App by default only has a handful of core Powershell modules. As we’re using something pretty specific, we’ll need to put the module into our Function App so we can load it and use the library.

Download and save the Lithnet Resource Management Powershell Module to your local machine. Something like the Powershell command below will do that.
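Something like this will do it (the save path is an example):

# Requires PowerShellGet (WMF5); saves the module and its version folder locally
Save-Module -Name LithnetRMA -Path C:\temp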

Next, follow this great blog post here from Tao to upload the Lithnet RMA PS Module you downloaded earlier into your function directory. I used WinSCP as my FTP client, as shown below, to upload the Lithnet RMA PS Module.

FTP to the host for your App Service and navigate to /site/wwwroot/.

Create a bin folder and upload via FTP the Lithnet RMA PS Module.

Using Kudu, navigate to the path and version of the Lithnet RMA PS Module. I’m using v1.0.6088 and my app is named MIMMetaverseSearch, so my path is D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088

Note: the Lithnet RMA PS Module is 64-bit so you’ll need to configure your Web App for 64-bit as per the info in the same blog you followed to upload the module here.

Test loading the Lithnet RMA PS Module in your Function App

In your Function App select </> Develop. Remove the sample script and, in your first line, import the Lithnet RMA PS Module using the path from the previous step. Then, to check that it loads, add a line that references a cmdlet in the module. I used Help Get-Resource. Select Save, then Run.
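For reference, those two lines look something like this, with the path as per the earlier step:

# Import the module from the bin folder uploaded earlier, then verify it loads
Import-Module "D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1"
Help Get-Resource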

If you’ve done everything correctly, when you select Run and look at the Logs you’ll see that the module was loaded and the Help Get-Resource command was run.

Allow your Function App to access your FIM/MIM Service Server

Even though you have logically placed your Function App in the same Resource Group (if you did it like I have), you’ll still need to allow the Function App, which runs in a shared PaaS environment, to connect to your FIM/MIM Service Server.

Create an inbound rule in your Network Security Group to allow access to your FIM/MIM Service Server. The example below isn’t as secure as it could (and should) be, as it allows access from anywhere. You should restrict the source of the request(s) accordingly; I’m just showing how to quickly get a working example. TCP port 5725 is required to access your MIM Service Server. Enter the details as per below and select Ok.

Using an Azure Function to query FIM/MIM Service

Note: Again, this is an example to quickly show the concept. In the script below your credentials are in the script in clear text (and of course those below are not valid). For anything other than validating the concept you must protect your credentials. A great example is available here in Tao’s post.

The PS Azure Function gets the incoming request and converts it from JSON. In my request which you’ll see in the next step I’m passing in “displayName” and “objectType”.

In this example I’m using Get-Resource from the Lithnet RMA to get an object from the FIM/MIM Service. First you need to open a connection to the FIM/MIM Service Server. On my Azure IaaS MIM Service Server I’ve configured a DNS name so you can see I’m using that name in line 17 to connect to it using the unsecured credentials from earlier in the script. If you haven’t set up a DNS name for your FIM/MIM Service Server you can use the Public IP Address instead.

Line 20 queries for the ObjectType and DisplayName passed into the Function (see calling the Function in the next step) and returns the response in line 22. Again this is just an example. There is no error checking, validation or anything. I’m just introducing the concept in this post.
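Pulling those pieces together, a sketch of the function body. The DNS name is hypothetical and the clear-text credentials are only to illustrate the concept, as per the warning above:

# Sketch of the flow described above
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

Import-Module "D:\home\site\wwwroot\MIMMetaverseSearch\bin\LithnetRMA\1.0.6088\LithnetRMA.psd1"

# Clear text for concept only – protect these in anything real
$password = ConvertTo-SecureString "Secr3tSq1rr3l!" -AsPlainText -Force
$creds = New-Object System.Management.Automation.PSCredential ("mimadmin@mim.mydomain.com", $password)

# Line 17: connect to the MIM Service Server by its DNS name (or Public IP Address)
Set-ResourceManagementClient -BaseAddress http://mymimservice.westus.cloudapp.azure.com:5725 -Credentials $creds

# Line 20: query for the objectType and displayName passed in; line 22: return it
$resource = Get-Resource -ObjectType $requestBody.objectType -AttributeName DisplayName -AttributeValue $requestBody.displayName
Out-File -Encoding Ascii -FilePath $res -InputObject ($resource | ConvertTo-Json)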

Testing your Function App

Now that you have the function script saved, you can test it from the Function App itself. Select Test from the options up in the top right of your function. Change the Request Body to what the Function is expecting, in my case displayName and objectType. Select Run and, in the Logs, if you’ve got everything configured correctly (inbound network rules, DNS name, your FIM/MIM Service Server online, a query for a valid resource) you should see an object returned.

Calling the Function App from a Client

Now that we have our Function App all set up and configured (and tested in the Function App), let’s send a request to the Azure Function using the Powershell Invoke-RestMethod function. The following call I did from my laptop. It is important to note that there is no authN in this example and the Function App will be using whatever credentials you gave it to execute the request. In a deployed solution you’ll need to scope who can make the requests, limit via the inbound network rules who can submit requests, and of course further protect the account credentials used to connect to your FIM/MIM Service Server.

Successful Response

The following screenshot shows calling the Function App and getting the responding object. Success. In a couple of lines I created a hashtable for the request, converted it to JSON and submitted it and got a response. How powerful is that!?
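In essence the call is just a couple of lines (hypothetical URL and function key):

# Hashtable request, converted to JSON and posted to the Function App
$body = @{ displayName = "Darren Robinson"; objectType = "Person" } | ConvertTo-Json
Invoke-RestMethod -Method Post -ContentType "application/json" -Body $body `
    -Uri "https://mimmetaversesearch.azurewebsites.net/api/MIMMetaverseSearch?code=<yourfunctionkey>"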

 

Summary

Using the awesome Lithnet Resource Management PowerShell Module with Azure Functions it is pretty quick and flexible to access a wealth of information we may want to expose for business benefit.

Now if only there was an affiliate program for Azure Functions that could deposit funds into my holiday fund for each IDAM request to an Azure Functions App.

Stay tuned for more posts on taking this concept to the next level.

Follow Darren on Twitter @darrenjrobinson