Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I’ve posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways. Two such examples I’ve listed below. The first uses a Username and Password method for Authentication, whilst the second uses a registered application and therefore ClientID and Client Secret.

As time has gone on I have numerous WebApps doing all sorts of automation. However, they all rely on accounts with a username and password, or a ClientID and Client Secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which, depending on the environment, roll every 30-45 days.

However, using certificates allows a script that is part of an automated process to run for much longer than the key lifetime available for WebApps, and definitely longer than passwords. Obviously there is security around the certificate to consider, so do keep that in mind.

Overview

This post details a couple of simple but versatile scripts:

  1. Using PowerShell we will:
    1.  Configure AzureAD
      1. Create a Self-Signed 10-year Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks, as you step through the process. You will need the AzureRM and Azure AD Utils PowerShell modules installed on the machine you run this script on. An illustrative sketch of the key steps follows the list below.

Change:

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and location it’ll be stored
  • Line 8 for a valid location to export it to
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp
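
As a guide only, the key steps of such a script look roughly like the sketch below. This is not the author's script (so the line numbers above won't match it exactly); it assumes the AzureRM module, and the certificate subject, password, export path and application names are illustrative.

# Create a self-signed 10-year certificate and export it with its private key
$notBefore = Get-Date
$notAfter  = $notBefore.AddYears(10)
$pfxPwd    = ConvertTo-SecureString -String "Sup3rS3cretP@ssw0rd!" -Force -AsPlainText
$cert      = New-SelfSignedCertificate -Subject "CN=AADCertAuthApp" -CertStoreLocation "Cert:\CurrentUser\My" -KeySpec KeyExchange -NotBefore $notBefore -NotAfter $notAfter
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\AADCertAuthApp.pfx" -Password $pfxPwd

# Create the AzureAD WebApp and attach the certificate as its credential
Login-AzureRmAccount
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$app = New-AzureRmADApplication -DisplayName "CertAuthWebApp" -IdentifierUris "https://CertAuthWebApp" -CertValue $keyValue -StartDate $cert.NotBefore -EndDate $cert.NotAfter
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# Record the key parameters needed by the second (connection) script:
# the Tenant Id (from Get-AzureRmContext or the Azure Portal), the ApplicationId and the certificate thumbprint
Write-Output "AppId: $($app.ApplicationId)  Thumbprint: $($cert.Thumbprint)"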

Before running line 37, log in to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory Permissions. When you then run line 37, a GUI will be presented for AuthN and AuthZ. Sign in as an Admin and accept the oAuth2 Permission Authorizations for whatever you have requested on the WebApp.

e.g. Graph API Read/Write Permissions

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 with the values captured as you stepped through lines 40-43 of the configuration script above (which copies the key configuration settings to the clipboard).

The following script then gets our certificate out of the local store, takes the Tenant and WebApp parameters, and passes them to Connect-AzureAD in line 15, which will connect you to AAD and allow you to run AzureAD cmdlets.

If you wish to go directly to the Graph API, lines 20 and 23 show leveraging the AzureADUtils module to connect to AzureAD via the Graph API.
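
As a minimal sketch of that connection script (assuming the AzureAD module, with the three values being the ones recorded by the configuration script):

# Substitute the values recorded when the WebApp was created
$tenantId   = "<your tenant Id>"
$appId      = "<the WebApp ApplicationId>"
$thumbprint = "<the certificate thumbprint>"

# The certificate must exist in the store of the account running the script
$cert = Get-ChildItem -Path Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq $thumbprint }
if (-not $cert) { throw "Certificate $thumbprint not found in the CurrentUser certificate store" }

Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $cert.Thumbprint
# We can now run AzureAD cmdlets, e.g.
Get-AzureADUser -Top 5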

Notes on creating your Self-Signed Certificate in PowerShell

I’m using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If you get the error shown below when you run New-SelfSignedCertificate, make sure you have Windows Management Framework 5.1, and if you don’t have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and just install the base SDK as shown further below.

Once the install is complete copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32

Summary

The two scripts above show how, using PowerShell, we can quickly create a self-signed certificate, create an Azure AD WebApp and grant it some permissions. Then, using a small PowerShell script, we can connect to and query AAD/GraphAPI using our certificate and not be concerned about passwords or keys expiring for 10 years (in this example; the timeframe can be anything you wish).

Swashbuckle Pro Tips for ASP.NET Web API – Example(s) Using AutoFixture

In the previous post, we implemented IOperationFilter of Swashbuckle to emit the consumes and produces properties in a Swagger document. This post will implement another IOperationFilter to emit example(s) properties containing auto-generated values by AutoFixture.

The sample code used in this post can be found here.

Acknowledgement

The sample application uses the following spec:

Defining example(s) in Operation Object

There are a few places, in a Swagger definition, where example objects are declared:

  • example property in parameter
  • examples property in responses
  • example field in definitions

Swashbuckle library automatically generates those example data with default values based on their data type. In other words, if its data type is string, the example value becomes string, if its data type is number, the example value becomes 0, and so on:

Is there any way we can have meaningful data in those example objects? Of course there is. This NuGet package helps us create more meaningful example objects. Download the package and write some code like:

Then add a decorator to an action of Web API:

The first one, SwaggerResponse, is a built-in Swashbuckle decorator, and the second one, SwaggerResponseExample, is what we’re going to use for example data generation. Once the decorator is added, we need to add another OperationFilter action in the Swagger config file like:

Reload the Swagger UI page and we can see the example object with more meaningful values:

This is what the Swagger definition looks like:

This is certainly a good way to show example data. But when we refresh the page, the example objects still show the same values, as we hard-coded them.

Automatic Example Data Generation with AutoFixture

We were able to generate an example object by implementing IExampleProvider of the Swashbuckle.Examples library. What if we want to create an example object with automatically generated values? AutoFixture is designed for unit testing: it generates fixture instances filled with random values. Why not use it to randomly generate our examples? Let’s have a look. This time, we create an abstract class to apply to both requests and responses.

The abstract class, ModelExample, internally implements IFixture from AutoFixture and creates a random instance of the type T. Here are request and response classes inheriting the abstract class:

And this is the action method of a Web API. SwaggerRequestModel defines the request example and SwaggerResponseExample defines the response example.

Reload the Swagger UI page several times.

Each time the page is refreshed, the example objects have different values. If we need to put some restrictions like string length or value range, we can simply put other decorators like StringLength or Range on those request/response models.


So far, we have walked through how the Swashbuckle library can generate example(s) objects with meaningful values rather than default values. There are many other ways we can customise the Swashbuckle library to enrich a Swagger document. I’ll introduce some other cases later on.

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, the final part in a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post will wrap it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM).

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process:

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC, using the PCNS Config in the domain, locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
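
The script itself isn’t reproduced here, so as an illustration only, the sketch below shows the shape of such a script: it takes the password event parameters, writes a document into the ID Vault password sync database via the Notes COM client, and then runs the MIMPwdTrigger agent. The parameter names follow the Granfeldt PS MA password script convention, and the server, database file, form and field names are assumptions; adjust them to match the database and agents you created in Part 2.

param (
    $Username,
    $Password,
    $Action,              # "Set" or "Change"
    $OldPassword,
    $UnlockAccount,
    $ForceChangeAtLogon,
    $ValidatePassword
)

# Requires the IBM Notes client on the MIM Sync Server (COM access to Domino)
$session = New-Object -ComObject Lotus.NotesSession
$session.Initialize("<NotesIDPassword>")     # password of the Notes ID used on the MIM Sync Server

# Open the ID Vault password sync database created in Part 2 (server and file names are illustrative)
$db = $session.GetDatabase("xxxNotes1/xxxxx-Aus", "mimpwdsync.nsf")

# Create a document containing the password change details for the Domino agents to process
$doc = $db.CreateDocument()
$doc.ReplaceItemValue("Form", "PasswordChange") | Out-Null
$doc.ReplaceItemValue("FullName", $Username)    | Out-Null
$doc.ReplaceItemValue("NewPassword", $Password) | Out-Null
$doc.Save($true, $false) | Out-Null

# Kick off the trigger agent, which passes the document to the MIMPasswordSync agent
$agent = $db.GetAgent("MIMPwdTrigger")
$agent.Run() | Out-Null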

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screenshots show me tracing through the logs for a password change as it makes its way from the AD Domain Controller to MIM Sync, to the MA, to the MA Password script, to the Notes DB as a document triggered to be processed by the Notes Agent, and finally to the user updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

PCNS provides all the details for the password change.

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Our PS Notes MA logged the process.

Notes MA LOG

=============================================================

Display Name: Jane XXX/xxx/xxxxx-Aus

Action: Set

Old pwd:

New pwd: Password123456

Unlock: False

Force change: False

Validate: False

Database: System.__ComObject

As did the Notes Agent as it processed the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password …

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from ‘xxxNotes1/xxxxx-Aus’

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Summary

This three-part blog post has shown how to get passwords from Active Directory into the MIM Sync connected source and across to IBM Domino and into the ID Vault, using the Granfeldt PowerShell Management Agent and some configuration with a database in Domino and two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user’s password is in the pwned password dataset or not. If you haven’t read that post this won’t make a lot of sense, so read that then come back.

The solution checked each new password received for a user (via Microsoft Password Change Notification Service) against the Have I Been Pwned API. The disclaimer at the start of the blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially taking the Pwned Password Datasets available here and loading them into a local network SQL Server and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

USE PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO

SET ANSI_PADDING ON
GO

CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]
( [password_id] ASC )
INCLUDE ( [passwords] )
WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I’m far from a DBA. I’m an identity guy. So using tools I was most familiar with (PowerShell) I created a simple script to open the password dump files as a stream (as Get-Content wasn’t going to handle the file sizes), read in the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000 and I performed it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved as I only spent about 20-30 minutes on it. It is opening and closing a connection to the SQL DB each time it inserts 1000 rows. That could be moved outside the Insert2DB function, and maybe the batch size increased. Either way it is a starting point, and I used it to write millions of rows into the DB successfully.
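
As an illustration only (this is not the author’s script), a stream-based loader along the following lines does the same job, using SqlBulkCopy instead of the Out-DataTable plus batched INSERT approach described above; the dump file path, connection string and batch size are assumptions.

# Stream the dump file and bulk-load it into dbo.pwnedPasswords in batches
$dumpFile   = "C:\Temp\pwned-passwords-1.0.txt"
$connString = "Server=localhost;Database=PwnedPasswords;Integrated Security=SSPI;"
$batchSize  = 1000

$table = New-Object System.Data.DataTable
[void]$table.Columns.Add("passwords", [string])

$bulk = New-Object System.Data.SqlClient.SqlBulkCopy($connString)
$bulk.DestinationTableName = "dbo.pwnedPasswords"
[void]$bulk.ColumnMappings.Add("passwords", "passwords")   # password_id is an identity column

$reader = New-Object System.IO.StreamReader($dumpFile)
try {
    while (($line = $reader.ReadLine()) -ne $null) {
        [void]$table.Rows.Add($line.Trim())
        if ($table.Rows.Count -ge $batchSize) {
            $bulk.WriteToServer($table)
            $table.Clear()
        }
    }
    if ($table.Rows.Count -gt 0) { $bulk.WriteToServer($table) }   # flush the final partial batch
}
finally {
    $reader.Close()
    $bulk.Close()
}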

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This then is the only other change to the solution. The Password.ps1 script, rather than querying the PwnedPasswords API, queries the SQL DB and sets the pwned boolean flag accordingly.
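
As a sketch of that change only (assuming the dataset was loaded as the SHA-1 hashes it is distributed as, and that $Password is the new password handed to the MA password script), the lookup portion looks something like:

# Hash the incoming password the way the dataset stores it (SHA-1, upper-case hex)
$sha1  = [System.Security.Cryptography.SHA1]::Create()
$bytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Password))
$hash  = ([System.BitConverter]::ToString($bytes)) -replace '-', ''

# Query the local PwnedPasswords DB instead of the public API
$connString = "Server=localhost;Database=PwnedPasswords;Integrated Security=SSPI;"
$conn = New-Object System.Data.SqlClient.SqlConnection($connString)
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @hash"
[void]$cmd.Parameters.AddWithValue("@hash", $hash)
$pwned = ([int]$cmd.ExecuteScalar()) -gt 0
$conn.Close()

# $pwned is then used to set the boolean flag on the user, as in the original solution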

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory users’ passwords.

SharePoint Online forecast storage requirements

Background

Office 365 users get quite a bit of storage on SharePoint Online for content, be it files, metadata, etc. Still, managing that storage and forecasting when additional storage will need to be added is a challenge, with very limited analytics available in SharePoint Online. Since adding more storage costs money, buying it before you actually need it, or a bit too late, is not ideal.

SharePoint Online provides two ways to track storage: one from the admin center, and the other from within the site collection using Storage Metrics.

The SharePoint Online tenant admin center gives us a bar graphic with details of the total storage, used storage and available storage. Here customers can manage the total amount of space allocated to each site collection, and see the total amount of space utilized and available.

To check Storage Metrics for a site collection, under Site Settings we can find the Storage Metrics link. Storage Metrics provides us with the space utilization breakdown by each subsite, library and list in that site collection.

As shown in the image below, subsites, libraries, lists etc. are listed with their size and % of parent. This can help in finding which resources are consuming the most storage.

There are some third party storage monitoring utilities available to connect to SharePoint online to determine usage patterns and trends.

But if we need to forecast storage usage and find the site collections that are growing the most over a period of time, in a tenant with hundreds of site collections it becomes a challenge to keep track.

Solution

We can use a PowerShell script to get the storage information from the tenant and store it in a SharePoint list on a periodic basis. Once this data is available, Power BI can be used on top of it to build a report which shows the growth % for each site collection over a period and gives some insight into the storage.

Using PowerShell we can get heaps of information from the SharePoint tenant with the command Get-SPOSite, but here we will use it to get the list of site collections and their storage information.

Get-SPOSite -Detailed -Limit All | select *

Then once the information is retrieved from SPO, use PowerShell to iterate over the data and save the relevant data to a SharePoint list.

SharePoint list Columns: Site Name/Title, Site Url, Storage, Report Run Date.
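
A sketch of that collection script is below, assuming the SharePoint Online Management Shell and the SharePoint PnP PowerShell module; the tenant URLs, list name and internal column names are illustrative.

# Pull storage data with Get-SPOSite and write it to a tracking list with PnP PowerShell
Connect-SPOService -Url "https://tenant-admin.sharepoint.com"
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/StorageReports" -UseWebLogin

$runDate = Get-Date
Get-SPOSite -Detailed -Limit All | ForEach-Object {
    Add-PnPListItem -List "Storage Tracking" -Values @{
        "Title"         = $_.Title                  # Site Name/Title
        "SiteUrl"       = $_.Url                    # Site Url
        "Storage"       = $_.StorageUsageCurrent    # storage currently used, in MB
        "ReportRunDate" = $runDate                  # Report Run Date
    } | Out-Null
}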

Data in the SharePoint list can be filtered, grouped and sorted in list views to make it more usable for the business to analyze and predict storage growth and requirements.

In order to get detailed analytics on the data, use Power BI, which can help put the data together in the form of reports and predict the growth percentage for each site collection. Power BI provides many connectors; one of them is for SharePoint Online, to easily connect with lists. To set up the connector in Power BI check this Kloud blog.

Below is the Power BI report that was built on the SharePoint list data pulled in using PowerShell. Data in the report below is sorted by the sites with the maximum storage increase within a selected interval.

Using Power BI, more smarts like filters and graphs can be added to make the data more usable.

Report details

  • x-axis: list of site collections
  • y-axis: storage values in MB

As we hover the cursor over the graph it shows the max and min values for the storage over the selected period of time. The graph captures the sites in decreasing order of storage growth.

Tabular view for the report.

The PowerShell script can be scheduled to run periodically and gather the data weekly or fortnightly, so we can calculate the average weekly or fortnightly storage requirement for the tenant.

Secondly, the captured data is used to forecast the weekly storage requirement. From the SPO tenant we get details of total storage, used storage and storage available.

The average weekly storage requirement can be taken from a list view grouped by week with an aggregate of the storage, as shown in the image below.

Once we have the above information, we can see the average requirement per week is 24,349 MB, i.e. 24 GB in this case (the difference in storage growth between two weeks). This can be used to predict the weekly storage requirement, which helps forecast when the tenant storage is going to run out and additional storage will be required.

SharePoint Online external user access error “User Not in directory”

Background

Organizations want to share their SharePoint Online site collections and documents and collaborate with external partners, vendors or customers. By default, site collections are shared with internal users only, but this can be extended to authenticated external users, or with limited sharing to anonymous users. External users do not have a license to the Office 365 subscription; they are limited to basic collaboration tasks.

I had recently enabled external access for a site collection on the SPO tenant, limited to selected domains and authenticated external users. External sharing in SharePoint Online works well in most scenarios, but there are a few issues which pop up while enabling access for external users, and with limited error details it becomes a bit challenging to understand the cause.

Problem

Error: “User Not in directory”

The error message users get as they try to log in to the external SharePoint site is the quite generic “User not in directory”. It is not very descriptive and does not point to the cause of the issue.

Solution

To troubleshoot access for the user, clear the browser cache or open an Incognito/Private session, then try the steps below.

First, check to make sure the account which was used to accept the email invitation to the site is the same account being used to log in later.

On the Office 365 login screen, if the screen below pops up prompting “Which account do you want to use?” when you sign in, it means that two different accounts have been configured with Microsoft using the same email address:

A “Work or school” account, which was probably created by your IT department.

A “personal” account, which was probably created later on by the user.

The personal account can be renamed, which means using a different email address to sign in to it. To fix this, follow this KB article (https://support.microsoft.com/en-us/help/11545/microsoft-account-rename-your-personal-account).

If the external user accepted the invite using the personal account and later tries to connect by selecting the work account, they will get the error “User not in directory”. This is the most common cause of the error. Make sure the user uses the same account to accept the invite and to log on to the site.

Secondly, if the account used for accepting the invite and logging in is the same and the error screen still pops up, then the user account has to be set up again. Before that we need to clean up the existing references to the user profile, remove the user from SharePoint, and then send a fresh invite. To remove the user and all references follow the steps below.

External users are managed on a site-collection-by-site-collection basis. An external user account must be removed from each site collection that the user was granted access to.

Browse to each site collection that the user previously had access to, and then follow the steps below (a combined sketch of the PowerShell cmdlets follows the list):

  • In the site collection, edit the URL in the browser by appending the following string to the site address:
    _layouts/15/people.aspx?MembershipGroupId=0

  • Select the user from the list and click Delete. Then, once the user is removed, continue with the following steps:

  • Start the SharePoint Online Management Shell.
  • Type the following cmdlet:
    $cred = Get-Credential
    In the Windows PowerShell Credential required dialog box, type your site collection admin account and password, and then click OK.
  • Connect to SharePoint Online, and then type the following cmdlet:
    Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred
  • Remove the user from each site collection by using the following cmdlet:
    $ExtUser = Get-SPOExternalUser -filter someone@example.com
  • Type the following cmdlet:
    Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)
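
For reference, the PowerShell steps above can be combined into a single sequence along these lines (the admin URL and the external user’s email address are placeholders):

$cred = Get-Credential     # site collection / SharePoint Online admin account
Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred

# Find the external user by email address and remove them
$ExtUser = Get-SPOExternalUser -Filter someone@example.com
Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)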

Then we can add the user back and resend the invite. This should fix the issue.

The last thing to check is that the user has a proper role assigned to their account under the user profile in the Office portal.

  • To check the role assigned to the user, go to the Office 365 admin center.
  • Sign in with a global administrator account.
  • Check the external user in Users > Active users, then check the role of the external user and change it to User (no admin access).

Making application configuration files dynamic with confd and Azure Redis

Service discovery and hot reconfiguration is a common problem we face in cloud development nowadays. In some cases we can rely on an orchestration engine like Kubernetes to do all the work for us. In other cases we can leverage a configuration management system and do the orchestration ourselves. However, there are still some cases where either of these solutions is impractical or just too complex for the immediate problem… and you don’t have a Consul cluster at hand either :(.

confd to the rescue

Confd is a binary written in Go that allows us to make configuration files dynamic by providing a templating engine driven by backend data stores like etcd, Consul, DynamoDB, Redis, Vault and ZooKeeper. It is commonly used to allow classic load balancers like Nginx and HAProxy to automatically reconfigure themselves when new healthy upstream services come online under different IP addresses.

NOTE: For the sake of simplicity I will use a very simple example to demonstrate how to use confd to remotely reconfigure an Nginx route by listening to changes performed against an Azure Redis Cache backend. However, this idea can be extrapolated to solve service discovery problems whereby application instances continuously report their health and location to a Service Registry (in our case Azure Redis) that is monitored by the Load Balancer service in order to reconfigure itself if necessary.

https://www.nginx.com/blog/service-discovery-in-a-microservices-architecture

Just as a side note, confd was created by Kelsey Hightower (now Staff Developer Advocate, Google Cloud Platform) in the early Docker and CoreOS days. If you haven’t heard of Kelsey I totally recommend you YouTube around for him to watch any of his talks.

Prerequisites

Azure Redis Cache

Redis, our Service Discovery data store, will be listening on XXXX-XXXX-XXXX.redis.cache.windows.net:6380 (where XXXX-XXXX-XXXX is your DNS prefix). confd will monitor changes on the /myapp/suggestions/drink cache key and then update the Nginx configuration accordingly.

Container images

confd + nginx container image
confd’s support for Redis backend using a password is still not available under the stable or alpha release as of August 2017. I explain how to easily compile the binary and include it in an Nginx container in a previous post.

TLDR: docker pull xynova/nginx-confd

socat container image
confd is currently unable to connect to Redis through TLS (required by Azure Redis Cache). To overcome this limitation we will use a protocol translation tool called socat which I also talk about in a previous post.

TLDR: docker pull xynova/socat

Preparing confd templates

Driving Nginx configuration with Azure Redis

We first start a xynova/nginx-confd container and mount our prepared confd configurations as a volume under the /etc/confd path. We are also binding port 80 to 8080 on localhost so that we can access Nginx by browsing to http://localhost:8080.
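
The screenshots aren’t reproduced here; the command was along the lines of the following sketch (the container name and port mapping follow the description above, while the local confd configuration path is illustrative):

# Start the nginx + confd container, mounting the prepared confd configs and publishing port 80 on 8080
docker run -it --rm --name nginx -p 8080:80 -v ${PWD}/confd:/etc/confd xynova/nginx-confd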


The interactive session logs show us that confd fails to connect to Redis on 127.0.0.1:6379 because there is no Redis service inside the container.

To fix this we bring in xynova/socat to create a tunnel that confd can use to talk to Azure Redis Cache in the cloud. We open a new terminal and type the following (note: replace XXXX-XXXX-XXXX with your own Azure Redis prefix).
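
A sketch of that command is below. Exactly how the arguments are passed depends on the xynova/socat image’s entrypoint, and the TLS verification options should be adjusted to suit your environment.

# Join the nginx container's network namespace and tunnel local 6379 to Azure Redis over TLS
docker run -it --rm --net container:nginx xynova/socat socat TCP-LISTEN:6379,fork,reuseaddr OPENSSL:XXXX-XXXX-XXXX.redis.cache.windows.net:6380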

Notice that by specifying --net container:nginx option, I am instructing the xynova/socat container to join the xynova/nginx-confd container network namespace. This is the way we get containers to share their own private localhost sandbox.

Now looking back at our interactive logs we can see that confd is now talking to Azure Redis but it cannot find the /myapp/suggestions/drink cache key.

Let’s just set a value for that key:

confd is now happily synchronized with Azure Redis and the Nginx service is up and running.

We now browse to http://localhost:8080 and test our container composition:

Covfefe… should we fix that?
We just set the /myapp/suggestions/drink key to coffee.

Watch how confd notices the change and proceeds to update the target config files.

Now if we refresh our browser we see:

Happy hacking.

 

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 1

Introduction

Recently I wrote about getting started with the latest IBM/Lotus Notes/Domino Management Agent for Microsoft Identity Manager. In a recent engagement we are using that MA to provision and manage identities into Domino. We are also using the MA to synchronise passwords via PCNS and MIM to the Notes users’ Internet (HTTP) password.

What you may or may not be aware of is that IBM introduced a new feature with Domino 8.5 called the ID Vault. The ID Vault is a Domino based application that holds protected copies of Notes user IDs. Now here’s the twist. The Microsoft Domino MA only supports password synchronisation to the HTTP password, not to the ID Vault.

My customer is using the ID Vault and naturally we need to synchronise password changes to both the HTTP Password and the ID Vault (for users’ Notes IDs). This post is the first in a series that details how I recently accomplished synchronising passwords to the Domino ID Vault.

  1. This post provides the introduction and the creation of a PowerShell Management Agent into Domino to join identities into the MIM Metaverse
  2. Post two details creating the Domino Agents that will handle taking requests from the MIM PS MA to change users’ ID Vault passwords
  3. Post three will detail calling the Domino Agents on password sync events (from PCNS via MIM)

Overview

The following diagram shows a high-level overview of password synchronisation using FIM/MIM from AD to Domino. Password changes/resets can be initiated using a number of methods: the FIM/MIM Self Service Password Reset functionality, users changing their password via their domain-joined workstations as defined by AD Group Password Polic(y)ies, using the AD FS Password Change function, or even on behalf of users by a Service Desk/Administrator. In each scenario, implementing Microsoft’s Password Change Notification Service will get the password change to FIM/MIM. I’m not going to cover PCNS as it is out of the box and straightforward to install and configure. This MS PFE PCNS implementation document covers it quite well.

Likewise I’m not going to go into any detail about password sync to the HTTP Password. That’s out-of-the-box functionality that is pretty much the same as any other MA configured as a Password Sync Target. That said, in my environment I did have to configure the MS Domino MA like this to get password events out to Domino.

ID Vault FIM/MIM PowerShell Management Agent

First up, we are going to need a Management Agent to join Notes users to our users from Active Directory in the Metaverse. I’ve gone to my favourite PowerShell Management Agent (Granfeldt) for this.

The Granfeldt PS MA will be configured to:

  • Import and join Domino users to the Metaverse. The MA will be slimline in the number of attributes it brings in: enough to perform the join and to have enough information about the user’s context in Domino to be able to perform the password sync event
  • be a target for Password Synchronisation
  • send the password change event to the Domino Agent we will build to perform the password change. A Domino Agent is required as the ID Vault will only accept password changes from a process run on the Domino Server(s). More on this in parts 2 and 3

The integration of the MIM Sync Engine with Domino via the PowerShell Management Agent is done using LDAP. The Name and Address Book is easily accessed via LDAP.

To get started I looked up the Server Document for the Domino Server I wanted to connect to that had the Name and Address Book. Selecting the Directory tab I could see that LDAP(S) 389/636 was enabled.

I then went to the Name and Address Book and looked up my Admin Notes ID to get its context so I could translate it to LDAP format. Darren Kloud Robinson/OrgU/Org-AUS translates to CN=Darren Kloud Robinson,OU=Org,O=Org-AUS.

Knowing my Notes ID Password I used LDP from Windows Server to bind to the Domino Directory. You could use any LDAP Browser/Tool.

Once I validated I could connect and browse the tree, I knew I had my connection details sorted, and I translated that to PowerShell.

That looked something like this:
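
The original snippet isn’t included here; a minimal sketch of that LDAP bind (server, port and credentials are placeholders) is:

# Bind to the Domino LDAP (Name and Address Book) and run a quick search to confirm connectivity
Add-Type -AssemblyName System.DirectoryServices.Protocols

$server = New-Object System.DirectoryServices.Protocols.LdapDirectoryIdentifier("dominoserver.customer.com", 389)
$creds  = New-Object System.Net.NetworkCredential("CN=Darren Kloud Robinson,OU=Org,O=Org-AUS", "<NotesIDPassword>")
$ldap   = New-Object System.DirectoryServices.Protocols.LdapConnection($server, $creds, [System.DirectoryServices.Protocols.AuthType]::Basic)
$ldap.SessionOptions.ProtocolVersion = 3
$ldap.Bind()

$search = New-Object System.DirectoryServices.Protocols.SearchRequest("O=Org-AUS", "(mail=*)", "Subtree", @("cn", "mail"))
$result = $ldap.SendRequest($search)
$result.Entries.Count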

I then wrapped this into a FIM/MIM PowerShell Management Agent. The Granfeldt PS MA Scripts are below.

Domino PowerShell Management Agent Schema Script

As described above the Schema Script is very light on the number of attributes specified. Basically the minimum required to get a join and give us the context of the user to process password sync events.
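
The schema script itself isn’t reproduced here. A Granfeldt PS MA schema script of that shape looks roughly like the sketch below, where each property defines an attribute name and type via a sample value; the attribute set shown (an anchor, objectClass, display name and mail) is illustrative.

$obj = New-Object -Type PSCustomObject
$obj | Add-Member -Type NoteProperty -Name "Anchor-dn|String" -Value "CN=User,OU=Org,O=Org-AUS"
$obj | Add-Member -Type NoteProperty -Name "objectClass|String" -Value "user"
$obj | Add-Member -Type NoteProperty -Name "displayName|String" -Value "Full Name/OrgU/Org"
$obj | Add-Member -Type NoteProperty -Name "mail|String" -Value "user@customer.com"
$obj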

Domino PowerShell Management Agent Import Script

As detailed above the Import brings through enough metadata to perform the join and give us the attributes needed for the user context to be able to sync passwords through.

Domino PowerShell Management Agent Export Script

The file just needs to exist; it is not used in this scenario.

Domino PowerShell Management Agent Password Script

See Part 3 in this series for the Password.ps1 script. But if you are following sequentially, copy the empty Export.ps1 script for now and name it Password.ps1 and have it located in the same directory as the other PS MA scripts.

Wiring it all together

As for creating the PS MA, I’ve detailed this in-depth many times. Check out this post (or the many similar ones I’ve posted) and the Getting Started section if you are new to the Granfeldt PowerShell Management Agent. Copy the above scripts to the directory you create, and when creating the MA provide the paths to the scripts (in 8.3 format).

A key item though is to configure the PS MA as a Password Sync Target as per the screenshot below. You will also need to configure where passwords are coming from to send to this new MA. If it is Active Directory, open the Properties of your AD MA, select Configure Directory Partitions, then under Password Synchronization enable the checkbox Enable this partition as a password synchronization source. Select Targets and select your newly created Notes ID Vault Password MA. Select OK, then OK again.

After creating a Run Profile and doing a Stage and Import, based on your Join rule (probably email address) you should have a heap of connectors. In my environment displayName contains the context of the user, e.g. Full Name/OrgU/Org. We’ll need this to send the password change event to the ID Vault.

Summary

Through the PowerShell MA as detailed above we have been able to enumerate users from Domino and join them to existing users in the MIM Sync Metaverse. We can now set about creating Domino Agents to take password sync events from this MA and change users’ passwords in the Domino ID Vault.

Part 2 in this series here details creating the Domino Agents and configuring Domino to accept the changes.

Build from source and package into a minimal image with the new Docker Multi-Stage Build feature

Confd is a binary written in Go that can help us make configuration files dynamic. It achieves this by providing a templating engine that is driven by backend data stores like etcd, Consul, DynamoDB, Redis, Vault and ZooKeeper.

https://github.com/kelseyhightower/confd

A few days ago I started putting together a BYO load-balancing PoC where I wanted to use confd and Nginx. I realised however that some features that I needed from confd were not yet released. Not a problem; I was able to compile the master branch and package the resulting binary into an Nginx container all in one go, and without even having Golang installed on my machine. Here is how:

confd + Nginx with Docker Multi-Stage builds

First I will create my container startup script docker-confd/nginx-confd.sh.
This script launches nginx and confd in the container, but tracks both processes so that I can exit the container if either of them fails.

Normally you want to have only one process per container. In my particular case I have inter-process signaling between confd and Nginx, and therefore it is easier for me to keep both processes together.

Now I create my Multi-Stage build Dockerfile:  docker-confd/Dockerfile
I denote a build stage by using the AS <STAGE-NAME> keyword: FROM golang:1.8.3-alpine3.6 AS confd-build-stage. I can reference the stage by name further down when I am copying the resulting binary into the Nginx container.
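
The Dockerfile isn’t reproduced here, but its shape is roughly the sketch below. The FROM lines and stage name follow the description above, while the go get/make steps, binary path and startup command are assumptions.

# Build stage: compile confd from the master branch
FROM golang:1.8.3-alpine3.6 AS confd-build-stage
RUN apk add --no-cache git make
RUN go get -d github.com/kelseyhightower/confd && \
    cd /go/src/github.com/kelseyhightower/confd && \
    make build

# Final stage: package the resulting binary on top of the Nginx alpine image
FROM nginx:alpine
COPY --from=confd-build-stage /go/src/github.com/kelseyhightower/confd/bin/confd /usr/local/bin/confd
COPY nginx-confd.sh /usr/local/bin/nginx-confd.sh
CMD ["/usr/local/bin/nginx-confd.sh"]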

Now I build my image by executing docker build -t confd-nginx-local docker-confd-nginx.

DONE! Just about 15 MB extra on top of the Nginx base Alpine image.

Read more about the Multi-Stage Build feature on the Docker website.

Happy hacking.

Swashbuckle Pro Tips for ASP.NET Web API – Content Types

Open API 2.0 (aka Swagger) is a de-facto standard for documenting a Web API. For ASP.NET Web API applications, Swashbuckle helps developers build the Swagger definition a lot more easily. As Swashbuckle hasn’t fully implemented the Swagger specification, we need to develop some extensions using a few interfaces provided by Swashbuckle. In this post we’re going to talk about a couple of extensions to make the Swagger definition more complete.

The sample code used in this post can be found here.

Acknowledgement

The sample application uses the following spec:

Defining consumes and produces in Operation Object

In Swagger, an HTTP verb like GET, POST, PUT, PATCH or DELETE is referred to as an Operation Object. Each Operation Object can define which content types are accepted in requests (consumes) and which content types are returned in responses (produces). Therefore, with Swashbuckle, the Swagger document page is produced like:

In other words, Swashbuckle assumes those five content types as default for requests – application/json, text/json, application/xml, text/xml, and application/x-www-form-urlencoded. And those four content types are the default response ones – application/json, text/json, application/xml and text/xml. Here’s a part of the Swagger definition automatically generated.

If we want to globally apply those content types, that can be done within the global configuration. Here’s the sample OWIN configuration:

It clears everything and adds only one content type: application/json. By doing so, we can globally set one content type for both requests and responses. If we want more control over each endpoint, the IOperationFilter interface of Swashbuckle gives us that flexibility.

Implementing SwaggerConsumesAttribute Decorator

First of all, we need to write a simple decorator called SwaggerConsumesAttribute, which handles the consumes field of the Operation Object.

Through this decorator, we simply define a number of content types to pass. How is it applied?

Let’s move on.

Implementing Consumes Filter

We now write the Consumes filter class by implementing the IOperationFilter interface like:

What it does is:

  • check the content type values passed from the SwaggerConsumesAttribute decorator, and
  • add all the content types passed from the decorator to operation.consumes.

This needs to be added to the Swashbuckle configuration like:

Now run the Web API again and we’ll see the result like:

Implementing SwaggerProducesAttribute and Produces

Both the SwaggerProducesAttribute decorator and the Produces filter can be written in the same way as we did above for consumes.

The SwaggerProducesAttribute decorator can be applied to:

The picture below is the Swagger definition based on the extensions above:

We’ve so far had a look at how to extend Swashbuckle to fill in the missing parts of a Swagger definition. In the next post, we’ll walk through another extension for the Swagger definition.