Continuous Credential Prompt when accessing MIM Password Registration Portal

First published at https://nivleshc.wordpress.com

Recently I was at a customer site, setting up a Microsoft Identity Manager (MIM) 2016 environment, which included the deployment of the Self Service Password Registration and Self Service Password Reset portals. For additional security, I was using Kerberos instead of the default NTLM.

I finished installing the MIM Portal, Service, Password Registration and Password Reset Portals without any issues.

I then proceeded to secure all HTTP endpoints by enabling SSL and then removing the HTTP bindings, so that the MIM Portal, Password Registration and Password Reset Portals could only be accessed via HTTPS. No issues there either.

By this time I was pretty pleased with myself 😉 Everything was going as planned, no issues faced at all. Finally, lady luck was showering me with her blessings.

Having finished the installation and configuration, I proceeded to testing the solution.

The first thing to check was the MIM Portal site. I opened up a web browser and navigated to the Microsoft Identity Manager Portal. When prompted, I logged in with the mimadmin domain account credentials. I was successfully logged in and could access all the parts without any hitch.

Now kids, if you are faint at heart, be very wary of what happens next (hint. this is the time when you cover your eyes with your hands when watching a horror movie).

I then tried accessing the Self Service Password Registration Portal and got prompted for credentials.

SSPRegistration_CredentialPrompt

I entered the mimadmin account credentials and pressed enter. Just as I thought I had successfully logged in, the credential prompt returned! Hmm, that is weird. I was pretty sure I had typed the username and password correctly. Oh well, maybe I didn’t.

I typed the credentials again and pressed enter. Quick as a flash, the credential prompt returned! Huh? What was happening here? {scratching my head} Hmm, I seemed to be making a lot of typos today. I carefully entered the username and password again, taking my time to ensure I was entering them correctly. I then pressed enter and waited.

Well, I didn’t have to wait for long since within a second, I was greeted with the Not Authorized screen!

SSPRegistration_NotAuthorised

Fascinating. It seems that lady luck had flown away, because here indeed was an issue with the Self Service Password Registration Portal! Ok, Mister. Let’s have a look at what’s causing this kerfuffle!

I opened up the event viewer on the Self Service Password Registration server and went through each of the logged events in the Application and System logs, however I couldn’t find any clue as to why the credentials were not working. I secretly had suspicions that the issue could be due to Kerberos token errors, however I couldn’t find anything in the event logs to substantiate my suspicion. Hmm, the plot was indeed getting thicker!

I next started doing some Google searches, thinking that someone else might have encountered the same issue. Alas, it seemed that I was alone in my woes, as the results were quite thin in regard to any possible solution for my issue.

Finally, I decided to follow my dear ol’ Sherlock’s advice: “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”.

I went through the whole Self Service Password Registration setup process, checking each and every part of the configuration, to ensure that the values were as expected. After 10 minutes, I was almost done checking and had found no clues so far 😦

Lastly, I opened IIS Manager and checked all the settings. Nothing here either. Hey, back up a bit. What is this?

The Self Service Password Registration Portal site had its useAppPoolCredentials set to False.

SSPRegistration_useAppPoolCredentialsFalse

Now, this should be True! Is this what was causing the issue?

I quickly changed the value for useAppPoolCredentials from False to True.

SSPRegistration_useAppPoolCredentialsTrue
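
If you hit the same prompt loop, the setting can also be flipped from PowerShell rather than through the IIS Manager UI. A minimal sketch follows; the site name used here is a placeholder, so substitute the name of your Password Registration site as it appears in IIS Manager.

# Sketch only - 'MIM Password Registration Portal' is a placeholder site name
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'MIM Password Registration Portal' -Filter 'system.webServer/security/authentication/windowsAuthentication' -Name 'useAppPoolCredentials' -Value $true
# Confirm the change took effect
Get-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'MIM Password Registration Portal' -Filter 'system.webServer/security/authentication/windowsAuthentication' -Name 'useAppPoolCredentials'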

I then opened my web browser again and navigated to the Self Service Password Registration Portal. Once again the familiar credential prompt came up. I entered the same credentials as before and pressed enter.

Woo hoo!! This time around I was successfully logged in.

I sincerely hope that this post helps others who might be encountering the same error.

Have a great day 😉

Easier portability of the FIMAutomation PowerShell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use 3rd party or open source code, or other tools may expect exports to be in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out and no internet access, just working with what is currently there. I have limited access on the servers and management VM, with only access to 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0).

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted these files and renamed them:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll
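
A minimal sketch of that rename, assuming the three Common.* files have already been extracted from the CAB into the current folder with 7-Zip (the pairing of old to new names is the obvious guess from the file names):

# Sketch only - rename the extracted CAB payload files to the assembly names the snap-in expects
Rename-Item -Path .\Common.Microsoft.IdentityManagement.Logging.dll -NewName Microsoft.IdentityManagement.Logging.dll
Rename-Item -Path .\Common.Microsoft.RM.Automation.dll -NewName Microsoft.ResourceManagement.Automation.dll
Rename-Item -Path .\Common.Microsoft.RM.dll -NewName Microsoft.ResourceManagement.dll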

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation
New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)
Import-Module .\FIMAutomation.psd1
Pop-Location

Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see every day, but it is also a useful approach to other legacy PSSnapin problems.

Display Microsoft Identity Manager Sync Engine Statistics in the MIM Portal

Introduction

In the Microsoft / Forefront Identity Manager Synchronization Service Manager, under Tools, we have a Statistics Report. This gives a breakdown of each of the Management Agents and the Connectors on each MA.

I had a recent requirement to expose this information for a customer, but I didn’t want them to have to connect to the Synchronization Server (and be given the permissions to allow them to). So I looked into another way of providing a subset of this information in the MIM Portal itself. This post details that solution.

MIM / FIM Synchronization Server Management Agent & Metaverse Statistics

Overview

I approached this in a similar way to the User Object Report I recently developed. The approach is:

  • An Azure PowerShell Function App that uses Remote PowerShell to connect to the MIM Sync Server and leverages the Lithnet MIIS Automation PowerShell Module to enumerate all Management Agents and build the report data
  • A NodeJS WebApp that calls the Azure PowerShell Function App on load to generate the report and display it
  • The NodeJS WebApp is embedded in the MIM Portal as a new Nav Bar Resource and Page

The graphic below details the basic logical integration.

MVStatsReportOverview

Prerequisites

The prerequisites to perform this I’ve covered in other posts. In concept, as described above, it is similar to the User Object report, which has the same prerequisites, and I detailed those pretty thoroughly here. That post is the required reading to get you ready to implement this.

Azure PowerShell Function App

Below is the raw script from my Function App that connects to the MIM Sync Server and retrieves the Management Agent Statistics for the report.
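
The full script is embedded in the original post. As a rough, hedged sketch of its shape only: the App Setting names ($env:...), the output binding variable and the properties selected from each Management Agent are assumptions here. The pattern is simply Remote PowerShell into the MIM Sync Server, enumerate the Management Agents with the Lithnet MIIS Automation module, and return the result as JSON.

# Hedged sketch only - see the original post for the full Function App script
$password = $env:MIMSyncAdminPassword | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($env:MIMSyncAdminUser, $password)

# Remote PowerShell session to the MIM Sync Server
$session = New-PSSession -ComputerName $env:MIMSyncServer -Credential $credential -Authentication Kerberos

$maStats = Invoke-Command -Session $session -ScriptBlock {
    Import-Module LithnetMiisAutomation
    # Enumerate the Management Agents; the properties pulled for the report are an assumption
    Get-ManagementAgent | Select-Object -Property Name
}
Remove-PSSession $session

# Return the report data to the caller as JSON
Out-File -Encoding ascii -FilePath $res -InputObject ($maStats | ConvertTo-Json)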

NodeJS Web App

The NodeJS Web App is the app that gets embedded in the MIM Portal; it calls the Azure Function to retrieve the data and then displays it. To get started you’ll want a base NodeJS WebApp. This post will get you started: Implementing a NodeJS WebApp using Visual Studio Code.

The only extension I’m using on top of what is listed there is JQuery. So once you have NodeJS up and running, in your VSCode Terminal type npm install jquery and then npm install.

I’ve kept it simple and contained all in a single HTML file using JQuery.

In your NodeJS project you will need to reference your report.html file. It should look like this (assuming you named your report report.html):

var express = require('express');
var router = express.Router();
/* GET - Report page */
router.get('/', function(req, res, next) {
   res.sendFile('report.html', { root:'./public'});
});

module.exports = router;

The Embedded Report

This is what my report looks like embedded in the MIM Portal.

Microsoft Identity Manager Statistics Report

Summary

Integration of FIM / MIM with Azure Platform as a Service Services opens a world of functionality including the ability to expose information that was previously only obtainable by the FIM / MIM Administrator.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing; it works straight out of the box when you’re in the same domain. Getting it working across Forests though can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewalls in between then you’ll have additional hoops to jump through.

GALSync

This article is the most definitive I’ve found about what is required, but it isn’t easily found even when you know it exists. In the last few months I’ve had to set up GALSync with FIM/MIM a number of times, and I have visibility that I’ll be needing to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it’ll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites all sorted. They are pretty clear in the linked article above, such as a one-way Cross Forest Trust and connectivity on the necessary ports if there are firewalls in between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Have that account be given (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to, or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be opened from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos and allow unencrypted traffic:
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true
  • Then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must also have Kerberos authentication enabled. Run the following commands in an elevated (Administrator) PowerShell ISE/Shell session. The first enables Kerberos.
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server
    • set-item wsman:\localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic
    • set-item wsman:\localhost\client\AllowUnencrypted -value true

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote PowerShell Tab
  2. Enter the ExchangeCASFQDN for the Computer field
  3. Enter the Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for User name, in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote PowerShell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and your AD MA Service Account has the necessary permissions in Exchange), run the following script line-by-line. If you have configured Remote PowerShell correctly and have met all the prerequisites you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

MIM2016 Upgrade Hanging on Custom Action – SetPermissionEval

I was upgrading a client’s environment from FIM 2010 R2 to MIM 2016. During the upgrade of the Synchronization Service, the installer appeared stuck; I waited for over an hour with no activity and no progress update. I checked the MSI installation log and found the last activity was CustomAction = SetPermissionEval, ActionType=3073. Other than this, there were no errors or any indication of failure.

msilog

According to this TechNet article, SetPermissionEval sets access permission (ACLs) for file folders, registry, DCOM launch/access permission and WMI.

ExtensionsCache

So I opened Process Monitor and discovered the cause was the hidden folder Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache. This directory contained over 260,000 folders with approximately 2 million objects, and the SetPermissionEval custom action was applying an ACL on each of them!

procmon

I couldn’t find the exact purpose of the ExtensionsCache; there is no Microsoft documentation on it, nor any mention in the official upgrade guidance or best practice. However, looking at the contents of the folder, I suspect FIM/MIM creates these folders when running synchronisation or export using custom code-based extension rules.

Based on an earlier forum post, I decided to delete the contents of this folder.

delete

Once all the items were deleted, I restarted the Synchronization Service upgrade, and the upgrade continued and finished without delay.

I still don’t understand why the installer should try to set file permissions in the cache directory when the whole directory content can be removed without problem, so why bother?

Anyway, if you are upgrading or patching your FIM or MIM instance, it might be worthwhile checking your ExtensionsCache hidden directory. If you have too many folders there, stop the Synchronization Service and delete those cache folders to avoid this problem.
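
A quick way to check is with PowerShell. A minimal sketch, assuming the default installation path (adjust the path for your environment and run it elevated):

# Count the folders in the (hidden) ExtensionsCache directory
$cache = 'C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\ExtensionsCache'
(Get-ChildItem -Path $cache -Directory -Force | Measure-Object).Count

# If the count is huge, stop the sync service and clear the cache before running the installer
Stop-Service -Name FIMSynchronizationService
Get-ChildItem -Path $cache -Force | Remove-Item -Recurse -Force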

Receive Push Notifications from Microsoft Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment, a daily automated process was executing, but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say, FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts, and your subconscious has all but programmed itself to ignore “new email” notifications. Push Notifications, however, I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway, it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Push Bullet free accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month, I’ve got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you’re probably wanting multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices to. Now that you have an account, head to your Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script with your Access Token in line 3 and execute it. You should see information returned associated with your new Push Bullet account.
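
The script itself is embedded in the original post; the following is a minimal stand-in that performs the same connectivity test (the Access Token value on line 3 is a placeholder).

# Test connectivity to the Push Bullet API
$apiURI = 'https://api.pushbullet.com/'
$accessToken = 'o.xxxxxxxxxxxxxxxxxxxxxxxx'   # placeholder - your Access Token
$header = @{ 'Access-Token' = $accessToken }

# Returns the details of the account associated with the Access Token
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + 'v2/users/me')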

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query to get the devices connected to the account. In the same PowerShell session you tested the API with above run this API call

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will go to all devices. In my example above, with two devices registered, my iPhone was the second device. So the target Iden I could get with:

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one that I’ll be using. More info on the other types is available here.

Sending Test Notification

To perform a notification test, update the following script with your Access Token (line 3). I’ve omitted the Device Identifier so the message goes to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
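
Again, the script is embedded in the original post; a minimal stand-in that sends a Note-type push to every device on the account (the Access Token on line 3 is a placeholder) looks like this:

# Send a test Note push to all devices on the account
$apiURI = 'https://api.pushbullet.com/'
$accessToken = 'o.xxxxxxxxxxxxxxxxxxxxxxxx'   # placeholder - your Access Token
$header = @{ 'Access-Token' = $accessToken }
$body = @{
    type  = 'note'
    title = 'MIM Notification Test'
    body  = 'If you can read this, Push Bullet is wired up.'
    # device_iden = $iphoneIden   # uncomment to target a single device
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + 'v2/pushes') -Body $body -ContentType 'application/json'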

Success. I received the notification on my iPhone and also in my Chrome browser.

IMG-5194

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected, I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it was based on full failure, not partial (which is what I was experiencing, whereby only one part of the process wasn’t returning data).

NOTE: I also had to add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification, as I was getting the following dreaded PowerShell Invoke-RestMethod / Invoke-WebRequest error when sending the notification via the Function App. I didn’t get that error on my dev workstation, which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps), insert this line before your Invoke-RestMethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Set your eyes on the Target!

So in my previous posts I’ve discussed a couple of key points in what I define as the basic principles of Identity and Access Management:

Now that we have all the information needed, we can start to look at your target systems. In the simplest terms this could be your local Active Directory (Authentication Domain), but it could be anything, and with the adoption of cloud services, these target systems are often what drives the need for robust IAM services.

Something that we are often asked as IAM consultants is why. Why should corporate applications be integrated with any IAM Service? These are valid questions. Sometimes, depending on what the system is and what it does, integrating with an IAM system isn’t a practical solution, but more often there are many benefits to having your applications integrated with an IAM system. These benefits include:

  1. Automated account provisioning
  2. Data consistency
  3. Central Authentication services (if supported)

Requirements

With any target system, much like the IAM system itself, the one thing you must know before you go into any detail is the requirements. Every target system will have individual requirements. Some could be as simple as just needing basic information: first name, last name and date of birth. But for most applications there is a lot more to it, and the requirements will be derived largely from the application vendor, and to a lesser extent from the application owners and business requirements.

IAM Systems are for the most part extremely flexible in what they can do; they are built to be customised to an enormous degree, and the target systems used by the business will play a large part in determining the amount of customisation within the IAM system.

This could be as simple as requiring additional attributes that are not standard within both the IAM system and your source systems, or it could be the way in which you want the IAM system to interact with the application, i.e. utilising web services and building custom Management Agents to connect and synchronise data sets between them.

The root of all this is that when using an IAM system you have a constant flow of data that is all stored within the “Vault”. This helps ensure that any changes to a user flow to all systems, not just the phone book, and it also ensures that any changes are tracked through the governance processes that have been established and implemented as part of the IAM System. Changes made to a user’s identity information within a target application can be easily identified, to the point of saying this change was made on this date/time because a change to this person’s data occurred within the HR system at that time.

Integration

Most IAM systems will have management agents or connectors (the phrases can vary depending on the vendor you use) built for the typical “out of the box” systems, and these will for the most part satisfy the requirements of many, so you don’t tend to have to worry so much about that. But if you have “bespoke” systems that have been developed and built up over the years for your business, then this is where custom management agents play a key part, and how they are built will depend on the applications themselves. In a Microsoft IAM Service the custom management agents would be built using an Extensible Connectivity Management Agent (ECMA). How you would build and develop management agents for FIM or MIM is quite an extensive discussion and something that would be better off in a separate post.

One of the “sticky” points here is that most of the time, in order to integrate applications, you need to have elevated access to the application’s back end to be able to populate data to and pull data from the application. The way this is done through any IAM system is through specific service accounts that are restricted to only perform the functions the application requires.

Authentication and SSO

Application integration is seen as a way to tighten the security of data, with access to applications being controlled through various mechanisms, and authentication plays a large part in the IAM process.

During the provisioning process, passwords are usually set when an account is created. This is done either by using random password generators (preferred) or by setting a specific temporary password. When doing this though, it’s always done with the intent of the user resetting their password when they first log on. The Self Service functionality that can be introduced to do this enables the user to reset their password without ever having to know what the initial password was.

Depending on the application, separate passwords might be created that need to be managed. In most cases IAM consultants/architects will try to minimise this to not being required at all, but this isn’t always possible. In these situations, the IAM System has methods to manage this as well. In the Microsoft space this can be controlled through Password Synchronisation using the “Password Change Notification Service” (PCNS); this basically means that if a user changes their main password, that change can be propagated to all the systems that have separate passwords.

Most applications today use standard LDAP authentication to provide access to their application services, which makes the password management process much simpler. Cloud Services, however, generally need to be set up to do one of two things.

  1. Store local passwords
  2. Utilise Single Sign-On Services (SSO)

SSO uses standards-based protocols to allow users to authenticate to applications with managed accounts and credentials which you control. Examples of these standard protocols are the likes of SAML, OAuth, WS-Fed/WS-Trust and many more.

There is a growing shift in the industry for these to be cloud services, however, such as Microsoft Azure Active Directory or any number of other services that are available today.
The obvious benefit of SSO is that you have a single username and password to remember. This also greatly reduces the security risk your business has from an auditing and compliance perspective, and having a single authentication directory can help reduce the overall exposure your business has to compromise from external or internal threats.

Well, that about wraps it up. IAM for the most part is an enabler; it enables your business to be adequately prepared for the consumption of Cloud services and cloud enablement, which can help reduce the overall IT spend your business has over the coming years. But one thing I think I’ve highlighted throughout this particular series is requirements, requirements, requirements… repetitive I know, but for IAM so crucially important.

If you have any questions about this post or any of my others please feel free to drop a comment or contact me directly.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, and the final part, in a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post wraps it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process;

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS Config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
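
The script is attached to the original post. As a heavily hedged sketch of its shape only: the parameter names follow the Granfeldt PSMA password script pattern and the log output further below, while the database file name, server name and Notes ID password handling are placeholders; the Notes COM calls assume the Notes client configured in the prerequisites.

# Sketch only - parameter names, field names and the database path are assumptions
param ($Username, $Password, $Action, $OldPassword, $UnlockAccount, $ForceChangeAtLogon, $ValidatePassword)

# Open a Notes session via the locally installed and configured Notes client
$notes = New-Object -ComObject Lotus.NotesSession
$notes.Initialize('NotesIDPassword')   # placeholder - source this securely

# Open the ID Vault PWD Sync database created in Part 2
$database = $notes.GetDatabase('xxxNotes1/xxxxx-Aus', 'IDVaultPWDSync.nsf')

# Create the password change document the agents will process
$doc = $database.CreateDocument()
[void]$doc.ReplaceItemValue('server', 'xxxNotes1/xxxxx-Aus')
[void]$doc.ReplaceItemValue('username', $Username)
[void]$doc.ReplaceItemValue('password', $Password)
[void]$doc.Save($true, $false)

# Kick off the trigger agent, which hands the document to MIMPasswordSync
$agent = $database.GetAgent('MIMPwdTrigger')
[void]$agent.Run()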

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screenshots show me tracing through the logs for a password change as it makes its way from the AD Domain Controller to MIM Sync, to the MA, to the MA Password script, to the Notes DB as a document triggered to be processed by the Notes Agent, and finally to the user being updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

PCNS provides all the details for the password change.

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Our PS Notes MA logged the process.

Notes MA LOG
=============================================================
Display Name: Jane XXX/xxx/xxxxx-Aus
Action: Set
Old pwd:
New pwd: Password123456
Unlock: False
Force change: False
Validate: False
Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password ...
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true
MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from 'xxxNotes1/xxxxx-Aus'

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Summary

This three-part blog post has shown how to get passwords from Active Directory to the MIM Sync connected source across to IBM Domino and into the ID Vault using the Granfeldt PowerShell Management Agent and some configuration with a Database in Domino with two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user’s password is in the pwned password dataset or not. If you haven’t read that post this won’t make a lot of sense, so read that first, then come back.

The solution when receiving a new password for a user (via Microsoft Password Change Notification Service) was checking against the Have I Been Pwned API. The disclaimer at the start of the blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially taking the Pwned Password Datasets available here and loading them into a local network SQL Server and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

USE PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO
SET ANSI_PADDING ON
GO
CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords] ( [password_id] ASC )
INCLUDE ( [passwords] )
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I’m far from a DBA. I’m an identity guy. So using tools I was most familiar with (PowerShell) I created a simple script to open the password dump files as a stream (as Get-Content wasn’t going to handle the file sizes), read in the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000 and I performed it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved as I only spend about 20-30 minutes on it. It is opening and closing a connection to the SQL DB each time it inserts 1000 rows. That could be moved outside the Insert2DB Function and maybe the batch size increased. Either way it is a starting point and I used it to write millions of rows into the DB successfully.
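
For what it’s worth, a simplified sketch of that approach is below. It assumes the version 1 dump format of one SHA-1 hash per line, and it uses plain parameterised inserts instead of the Out-DataTable approach described above, so treat it as illustrative only (the file path and connection string are placeholders).

# Illustrative sketch - file path and connection string are placeholders
$connString = 'Server=localhost;Database=PwnedPasswords;Integrated Security=SSPI'

function Insert2DB ($rows) {
    $conn = New-Object System.Data.SqlClient.SqlConnection $connString
    $conn.Open()
    foreach ($hash in $rows) {
        $cmd = $conn.CreateCommand()
        $cmd.CommandText = 'INSERT INTO dbo.pwnedPasswords (passwords) VALUES (@p)'
        [void]$cmd.Parameters.AddWithValue('@p', $hash)
        [void]$cmd.ExecuteNonQuery()
    }
    $conn.Close()
}

# Stream the dump file rather than loading it all into memory
$reader = [System.IO.StreamReader]::new('D:\Dumps\pwned-passwords-1.0.txt')
$batch = New-Object System.Collections.Generic.List[string]
while (($line = $reader.ReadLine()) -ne $null) {
    if ($line.Trim().Length -gt 0) { $batch.Add($line.Trim().ToUpper()) }
    if ($batch.Count -ge 1000) { Insert2DB $batch; $batch.Clear() }
}
if ($batch.Count -gt 0) { Insert2DB $batch }
$reader.Close()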

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This then is the only other change to the solution. Rather than querying the PwnedPasswords API, the Password.ps1 script queries the SQL DB and sets the pwned boolean flag accordingly.
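
As a hedged illustration of that change only (the full MA script is in the original post, and the $Password parameter name and connection string are assumptions), the lookup boils down to hashing the incoming password with SHA-1 and checking the local table:

# Illustrative sketch - $Password comes from the MA password script parameters
$sha1 = [System.Security.Cryptography.SHA1]::Create()
$hashBytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Password))
$hash = ($hashBytes | ForEach-Object { $_.ToString('X2') }) -join ''

# Look the hash up in the local PwnedPasswords database
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=sqlserver;Database=PwnedPasswords;Integrated Security=SSPI'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = 'SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @hash'
[void]$cmd.Parameters.AddWithValue('@hash', $hash)
$pwned = ([int]$cmd.ExecuteScalar()) -gt 0   # boolean pushed to the MIM Service attribute
$conn.Close()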

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory users’ passwords.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 2

Introduction

As the title suggests this is Part 2 of a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.

This post details the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

Part 3 here  details calling the Domino Agents on password sync events (from PCNS via MIM)

Creating a New Domino Application

As mentioned above and in Part 1 we need to create Domino Agents to process password change events into the ID Vault. Domino Agents are required as Domino security will not allow password change events (called using the resetUserPassword method) to be run remotely.  The resetUserPassword method is only supported using the RunOnServer method.

In order to create a Domino Agent we need to install and run the IBM Domino Designer.

With that installed we can start with our first Domino Agent. We will create two Agents. The first will be the one that will perform the execution of the resetUserPassword method. The second will be the trigger that will retrieve the details of the user to change the password for and pass it to the first agent to execute.

In IBM Domino Designer select File => New => Application

Specify the Server to create the new Application on (and subsequently where it will run) and give the Application a name. I used ID Vault PWD Sync.

Create the MIM Password Sync Domino Agent

With the New Application created we can navigate to Code => Agents and select New Agent

Give the Agent a name. I named this one MIMPasswordSync and make sure the type is Java

With the Agent created we need to give it the script that will perform the password changes. Double click on the agent, then in the Agent Contents double-click on JavaAgent.java and paste in the script (from GitHub further below). The only change you may need to make is the location where you want the logging to go. You will need to create that path if it doesn’t exist.

Selecting the Agent Tab in the main pane, locate the Agent Properties and configure them as per the screenshot below.

Select the Security Tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can’t select them, close the agent and re-open it and you will be able to configure this option.

Create the MIM Password Trigger Domino Agent

Create the MIM Password Trigger Domino Agent just as you did the MIM Password Sync Domino Agent. Name it MIMPwdTrigger and make sure the type is also Java. Double click on the Agent and then double-click on JavaAgent.java in the Agent Contents. Use the following script. Note it calls the MIMPasswordSync Agent so if you called yours something different you will need to change it in this script (line 12).

Select the MIMPwdTrigger Agent in the main pane and look at the Properties. Set the Runtime to On event and After documents are created or modified.

Select the Security Tab in the Properties pane and set the Runtime security level to 3. If the options are blanked out and you can’t select them, close the agent and re-open it and you will be able to configure this option

Configuration ID Vault Password Reset Authority

In order for our MIMPasswordSync Agent to actually change users passwords in the ID Vault we need to configure the ID Vault to allow the account that created and signs the Agents and the Server that the Agents will run on to be Trusted Password Reset Authorities.

Using the IBM Domino Administrator select the Administration menu item and then Configuration. Expand Security from the left hand pane and select ID Vaults.

Having selected the ID Vault from the main pane you will be changing passwords in, on the right hand menu pane expand ID Vaults and double-click on Password Reset Authority.

From the Password reset authority by organisation box select the OrgU/Org you will be syncing passwords to. If you have many you will need to complete this step for each one. You will need to do one OrgU/Org at a time if they have different certifiers.

From the Available users, groups and servers box select the Server that you run the Agents with, and select Add. Repeat for selecting the user that you created the Agents with and that will sign the Agents. Then select the user you just added in the Password reset authority by organisation and then select the Password reset agent authority checkbox.  That will put the red @ symbol on the user which identifies it as a Password Reset Agent Authority.

Select Next/Configure, locate the certifier ID for that OrgU/Org, provide the password and complete the process. Repeat for each OrgU/Org.

Signing the Agents

Back using the Domino Designer double-click Agents in the left menu pane. Select each Agent and then click Sign.

Creating a Form to test the Agents

Now we will create a form to allow us to create a document in the DB easily and test that our agents work. In Domino Designer, right-click on Forms and select New Form.

Give the Form a name and an alias. It doesn’t matter what you call it. We’re just using it to test the agents.

Double Click on your new Form. Click in the empty pane and then from the Create menu select Field. Name it server. Repeat for another field and name it username.

Repeat for the third text field, but name it password and select the Type Password.

Finally from the Create menu select Hotspot => Button. Name it Submit and then select it. In the Properties of the button for Run select Client. For Formula enter the formula below.

@Command([FileSave]);
@Command([FileCloseWindow]);
@Command([ToolsRunMacro];"MIMPwdTrigger")

Testing the Agents

In the Domino Designer right-click on the form and select Preview in Notes.

The format for the fields is;

  • server: Server/Org
  • user: Joe Smith/OrgU/Org
  • password: P@SSw0rd

Enter valid input for your environment.

And click on the Submit button. If you have everything correct the document you just created will be processed by the Trigger Agent and then the MIM Password Sync Agent.

If there is an error you will likely have a document still in the IDVault PWD Sync database as shown below. Check the document to make sure you got the details for the user and server correct.

Also check the log file, C:\PWDSync\AgentLog.txt by default as per the script path. When working correctly you will see an entry as per below. If it wasn’t successful, the error message should point you to where you have gone wrong: more than likely different names for the Agents, an incorrect format or name for the user and/or server, or the Trusted Password Authority not set for the account the agent was signed with for the OrgU/Org containing the user you are trying to change a password for.

MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Reseting password ...
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:24 PM: Server: XXXNotes1 User:Jane Doe/OrgU/Org-Aus
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Return value: true
MIMPasswordSync|mimpasswordsync: 08/08/2017 02:08:26 PM: Removed User ID Vault change document from 'XXXNotes1'

Summary

Now that we have our Agents built and working we need to be able to call them from our MIM Sync Server. That will be covered in the third and final post in this series.