When it comes to Microsoft’s Forefront Identity Manager (FIM), I sometimes run into ‘religious arguments’ with fellow FIM consultants about the ‘correct’ or ‘right’ way to architect FIM to implement identity business rules in a brand new FIM deployment. Typically the argument arises at the very start of a project: whether to base the FIM code base on ‘classical’ rules extensions written in VB.NET or C#, or to use FIM R2’s Management Policy Rules (MPRs), Sets, Sync Rules and Workflows to implement all business rules (an approach I’ve seen referred to as either ‘Declarative’ or ‘Codeless’). Microsoft are obviously keen to have as many of their customers as possible use Declarative provisioning.
Discussing this topic with my fellow Kloudies, it turns out that most of Kloud’s FIM projects end up using a mixture of both; what varies is where on the pendulum swing a given project lands – mostly Classical or mostly Declarative. Often the user base of a FIM project determines which model is chosen, as Classical can generally process business rules (in the form of attribute manipulation, or provisioning/de-provisioning) a lot faster and without creating what I term ‘object bloat’ (and therefore SQL database size bloat, unless you perform significant pruning and log truncation). For small user populations – let’s say 5,000 users or fewer – I feel an opportunity is lost if no attempt is made to achieve a 100% Declarative code base, particularly as administrators in smaller organisations generally don’t like supporting a system that requires extensive coding knowledge (especially .NET code!).
I recently had the opportunity to attempt an almost 100% Declarative (Codeless) FIM architecture for a customer, as I felt their skill set was better aligned to managing objects through the FIM portal than to modifying existing C# code. With any FIM project, the trickiest part is the ‘handover’ stage, where you have to train up an on-premises administrator on all the intricate workings of FIM’s objects, attributes and data flow.
I feel there’s also an opportunity to demonstrate the power of FIM’s Declarative engine – and how quickly a new business rule can be implemented – when a FIM consultant presents the FIM portal to Business Analysts and IT Architects.
I almost achieved this vision of a ‘100% Declarative FIM architecture’, but I fell short in a few select areas that I thought would be interesting to blog about and that will hopefully generate discussion in our community. The key success was not having a single piece of C# code in any Metaverse Rules Extension. I did, however, use some Management Agent Rules Extensions on the Active Directory Management Agent and my ‘source of truth’ Management Agents.
The following is a list of outcomes that I achieved using Declarative provisioning:
1. Provisioning and deprovisioning person, location and organisational unit objects to SQL, AD, Exchange, Lync and CSV files using Sets and mostly ‘Transition In’ MPRs, including:
a. Using Detected Rule Entries (DREs) to calculate Set membership. Part of the challenge in my environment was working out, timing-wise, how to provision Active Directory (AD), Exchange and Lync accounts to an AD domain in one ‘complete FIM cycle’. I managed to get AD and Exchange provisioned at the same time thanks to the native AD Management Agent; however, I was using a PowerShell Management Agent to provision the Lync account, which requires the AD account to already exist. I ended up using a DRE on the Outbound Sync Rule for AD, and then using the existence of that DRE (and a calculated Set membership) to trigger the Outbound Sync Rule for Lync. There are of course many ways of detecting whether an AD account is ‘active’; I just wanted something that would work both for existing AD accounts and for ones provisioned by FIM. Also, the only system that contributes DREs is Active Directory, which avoids ‘sticky values’, where the FIM portal keeps sending old values to the Metaverse because it is the last contributing system for that attribute.
b. As part of my Active Directory outbound provisioning rule, I used a combination of multiple Action Workflows to generate the right values, including calculating a unique ‘sAMAccountName’ via a Global Catalog lookup against Active Directory. I also used ‘Parameters’ in my outbound Sync Rule to send initial flow values as variables to the Sync Rule for all attributes of a person, because the Person object’s attribute precedence often did not have Active Directory as the source of truth for many attributes, which meant the user would not get any initial values on the first export. I also did not want to use ‘equal precedence’ for those attributes, as the business wanted set ‘sources of truth’ for each attribute rather than the ‘last update wins’ behaviour you get with equal attribute precedence.
2. Linking a user object to a location object in the portal to extract location attributes from that location object and store them in a user’s object. This was achieved using an Action ‘lookup’ workflow and XPath filters.
3. Managing the update of all object attributes in the FIM portal, including exposing new object types such as ‘Location’ and ‘Organisational Unit’ with custom FIM portal extensions, and writing RCDC XML files for each attribute (time-consuming, but it showcases the power of the FIM portal as a place to manage some objects exclusively).
4. Using native FIM Sync Rules to achieve the following types of attribute manipulation, usually in the inbound direction only:
a. Converting string attribute values into GUID attributes and vice versa
b. Using ‘LeftPad’ and ‘RightPad’ to pad values out with ‘0’ characters to the required attribute length. I only had to perform this for data-cleansing purposes, not for the Production system. (NOTE: as mentioned later in this post, I had to implement the ‘leftpad’ and ‘rightpad’ logic in C# for joining rules, as you cannot manipulate joining rules with Sync Rules using any of FIM’s native functions.)
5. Manipulating attribute values with an Action Workflow using native FIM functions such as ‘LowerCase’, ‘UpperCase’ and ‘ProperCase’ (e.g. turning ‘michael pearn’ into ‘Michael Pearn’), and the ‘Word’ function to split values from one attribute into many attributes. I would often use an Action Workflow to set values AFTER an object was imported into the FIM portal, primarily because if there are data-quality issues and you have those same functions on an inbound Sync Rule instead, the object hits a DLL error in the inbound sync function and is not imported at all. Using an Action Workflow function instead of an inbound Sync Rule function guarantees that all objects reach the FIM portal regardless of data-quality issues.
6. Using ‘Temporal Sets’ and ‘Transition In’ triggers to send out reminder emails about accounts that are due to expire. Part of the power of FIM’s Declarative engine, I feel, is that it can achieve this very simply, without any C# code, by using Sets based on ‘X number of days after today’s date’.
7. Using MPRs for object permissions. Part of the FIM portal’s power is being able to determine, based on your Set membership, which rights you have to different objects, down to the attribute level (read, write, delete). I feel this is a major feature for delegating the FIM portal to different areas of the business to control the updating of different objects.
8. Triggering attribute values in the Metaverse or the FIM portal from a ‘disconnect’ state: one of my sources of truth was a CSV file, and objects would frequently disappear from this file, leaving a disconnected object in the Connector Space. I then needed to set values on the object indicating it was no longer ‘active’ so I could keep track of its status in the FIM portal. I flowed a static string value of ‘true’ to a boolean attribute called ‘isConnected’ in the FIM portal, with this Management Agent as the only contributing system for the attribute. The key was not having an inbound flow on that attribute from the FIM portal, to avoid the portal setting what I term a ‘sticky value’ in the Metaverse. I then used ‘Transition In’ Sets to determine when that attribute was no longer ‘true’ for an object, and an Action Workflow to stamp ‘not active’ values onto it.
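To make the function syntax in items 4 and 5 concrete, FIM custom expressions of this kind look roughly as follows. The attribute names are examples only, and the exact signatures should be checked against the FIM functions reference:

```
LeftPad(employeeID, 6, "0")
ProperCase(displayName)
Word(displayName, 1, " ")
```

Here ‘LeftPad’ pads ‘123’ out to ‘000123’, ‘ProperCase’ turns ‘michael pearn’ into ‘Michael Pearn’, and ‘Word’ extracts the first space-delimited word of displayName.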
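For the Set definitions behind items 1a and 6, the XPath filters are along the lines of the two sketches below. The Sync Rule display name and the ‘ExpirationTime’ attribute are illustrative assumptions, and the temporal function syntax should be verified against the FIM XPath filter dialect before use:

```
/Person[DetectedRulesList = /SynchronizationRule[DisplayName = 'AD Outbound Sync Rule']]

/Person[ExpirationTime <= op:add-dayTimeDuration-to-dateTime(fn:current-dateTime(), xs:dayTimeDuration('P7D'))]
```

The first filter calculates membership from the presence of the AD Detected Rule Entry; the second is the shape of a temporal ‘expiring within 7 days’ Set.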
The following is a list of tasks that, no matter how hard I tried, I just could not achieve using FIM’s Declarative model, and instead had to resort to C# (‘Classical’) rules extensions for:
1. Advanced joining rules: in my system I was using ‘employeeID’ as the joining attribute; however, the format of these values differed between my HR SQL system and Active Directory. As an example, the HR system held ‘000123’ while AD held ‘123’. FIM’s Sync Rules cannot apply any value manipulation to joining rules through their interface, so I had to write C# code in the source Management Agent’s rules extension to apply ‘leftpad’ logic (i.e. add ‘000’ to the start of the ‘123’ value) so that the objects would join. Once an object was joined, I then pushed the full padded value back to Active Directory so that the attribute was exactly the same in all source systems.
2. Working with multi-valued attributes: this is next to impossible with the functions available in FIM Sync Rules. The only way I could add items to a multi-valued attribute was with C# attribute import code.
3. Converting binary values to ISO8601 datetime: I ended up using some borrowed C# code to convert Active Directory’s ‘lastLogonTimestamp’ value (binary) to ISO8601 datetime format. Flowing an ISO8601 datetime into the FIM portal then allows ‘temporal set’ calculations, such as sending reminder emails that an account is about to expire.
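The join fix in item 1 boils down to a single padding transform inside the Management Agent rules extension. A minimal sketch of that transform, shown in Python for brevity rather than as the C# extension itself, with the six-character target length assumed:

```python
def normalise_employee_id(ad_value: str, length: int = 6, pad_char: str = "0") -> str:
    """Left-pad the AD employeeID so it matches the HR system's format.

    The production logic lives in a C# rules extension on the source
    Management Agent; this function only illustrates the transform.
    """
    return ad_value.rjust(length, pad_char)

# AD holds '123'; HR holds '000123'. After padding, both systems join on the same value.
print(normalise_employee_id("123"))  # → 000123
```

Once joined, flowing the padded value back out to AD keeps the attribute identical in all source systems, as described above.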
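For item 2, the import-code workaround amounts to appending values to a multi-valued attribute while skipping duplicates, which the declarative functions cannot express. A rough sketch of that logic, with Python standing in for the C# import code and the case-insensitive comparison being my own assumption:

```python
def merge_multivalue(existing: list[str], new_values: list[str]) -> list[str]:
    """Append new values to a multi-valued attribute, skipping duplicates
    (case-insensitively) and preserving the existing order."""
    seen = {v.lower() for v in existing}
    merged = list(existing)
    for value in new_values:
        if value.lower() not in seen:
            merged.append(value)
            seen.add(value.lower())
    return merged

print(merge_multivalue(["ProxyA"], ["proxya", "ProxyB"]))  # → ['ProxyA', 'ProxyB']
```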
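For item 3, ‘lastLogonTimestamp’ is a Windows FILETIME: a 64-bit count of 100-nanosecond intervals since 1 January 1601 (UTC). In C# the conversion can lean on DateTime.FromFileTimeUtc; the equivalent arithmetic, sketched in Python with the portal’s date format assumed to be ‘yyyy-MM-ddTHH:mm:ss.fff’:

```python
from datetime import datetime, timedelta, timezone

# Windows FILETIME epoch: 1 January 1601 UTC, counted in 100-ns ticks.
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_iso8601(filetime: int) -> str:
    """Convert an AD lastLogonTimestamp value to an ISO8601 datetime string
    (sub-second part truncated to .000)."""
    dt = FILETIME_EPOCH + timedelta(microseconds=filetime // 10)
    return dt.strftime("%Y-%m-%dT%H:%M:%S.000")

print(filetime_to_iso8601(125911584000000000))  # → 2000-01-01T00:00:00.000
```

With the value in this format, the portal can evaluate it in temporal Sets such as the expiry reminders described earlier.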
As always, I’m open to feedback from fellow FIM experts on their attempts to achieve a 100% Declarative model, and on whether they’ve achieved any of the items above using FIM’s Sync Rules, Workflows or other ‘codeless’ methods. Feel free to get in contact with me if you’d like more information on how I approach the use of FIM’s Declarative rules.
This project was also run in conjunction with a SharePoint and Nintex solution, so a future blog will outline how we approached having FIM and SharePoint work in harmony for business processes.