WorkdayAPI PowerShell Module

Obtaining Workday HR Supervisory Hierarchy, Provisioning Flags and Photos with PowerShell

A few weeks back I posted this regarding using PowerShell and the Granfeldt PowerShell Management Agent to interface Microsoft Identity Manager with Workday HR. The core of that functionality is the WorkdayAPI PowerShell Module, which I forked from Nathan's original and extended with additional functionality.

New WorkdayAPI PowerShell Module Cmdlets

This post details additional functionality I've added to the WorkdayAPI PowerShell Module. The updates include the following additional cmdlets:

Get-WorkdayWorkerProvData

Implementations of Workday obviously vary from organisation to organisation. Workday HR being an authoritative source of identity means it is also often the initiator of Identity Management processes. For one of our customers, base entitlements are derived from the Workday Job Profile and may specify an Active Directory Account and an Office 365 License.

The Get-WorkdayWorkerProvData cmdlet is invoked when calling Get-WorkdayWorkerAdv with the -IncludeWork flag. The result now returns ProvisioningGroup information as part of the returned PowerShell object. As shown below, Workday has indicated my identity requires Active Directory and Office 365 (email).

Workday Provisioning Group.PNG
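If you're calling the combined cmdlet, a minimal sketch looks like this (the parameter names mirror Get-WorkdayWorkerProvData below, and the ProvisioningGroup property name follows the screenshot above, so confirm both against your module version):

# Retrieve the worker with work-related data included
$worker = Get-WorkdayWorkerAdv -WorkerId 181123 -WorkerType Employee_ID -IncludeWork
# The provisioning flags are surfaced on the returned object
$worker.ProvisioningGroup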

If you just want to return the Account Provisioning information you can with Get-WorkdayWorkerProvData.
Get-WorkdayWorkerProvData -WorkerId 181123 -WorkerType Employee_ID
Provisioning_Group Status   Last_Changed
------------------ ------   ------------
Office 365 (email) Assigned 1/05/2017 9:31:21 PM
Active Directory   Assigned 1/05/2017 9:57:37 PM

Get-WorkdayWorkerMgmtData

The Get-WorkdayWorkerMgmtData cmdlet is invoked when calling Get-WorkdayWorkerAdv with the -IncludeWork flag. The result now returns MgmtData information as part of the returned PowerShell object. The collection is an ordered list of the Supervisory Hierarchy for the object.

Mgmt Data.PNG

Expanding the collection we can see the Supervisory Hierarchy. Note: the top of the hierarchy is listed twice because in this organisation the person at the top reports to himself.

Mgmt Heirarchy 2.PNG

If you just want to return the Management Hierarchy you can with Get-WorkdayWorkerMgmtData.

Get-WorkdayWorkerMgmtData -WorkerId 1234 -WorkerType Employee_ID
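To walk the returned hierarchy programmatically, something like the following sketch works; the property names on each entry depend on the module version, so inspect the first element before relying on them:

# Ordered supervisory hierarchy for the worker
$hierarchy = Get-WorkdayWorkerMgmtData -WorkerId 1234 -WorkerType Employee_ID
$level = 0
foreach ($entry in $hierarchy) {
    # Render each level; swap Out-String for the actual name property once confirmed
    Write-Output ("Level {0}: {1}" -f $level, ($entry | Out-String).Trim())
    $level++
}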

Get-WorkdayWorkerPhoto

The Get-WorkdayWorkerPhoto cmdlet is invoked when calling Get-WorkdayWorkerAdv with the -IncludePhoto and -PhotoPath flags. The photo is then written out to the path provided in the -PhotoPath option.

If you just want to export the Photo for a single user you can with Get-WorkdayWorkerPhoto.

Get-WorkdayWorkerPhoto -WorkerId 1234 -WorkerType Employee_ID -PhotoPath 'C:\temp\workday'

Summary

Using the WorkdayAPI PowerShell Module we can now access the information needed to drive the provisioning process, as well as understand an identity's placement in the Supervisory Hierarchy. We can also obtain their Workday profile photo and sync it to other places if required.

Lifecycle Management of Identities in SailPoint IdentityNow via API and PowerShell

Introduction

If you've been following along, I've been posting about leveraging the SailPoint IdentityNow API for searching/reporting on entities and for authoring them.

Now that I've covered Searching and Authoring, all that is left is lifecycle management, and that's what I'll cover in this post: updating and deleting entities via the API.

Updating SailPoint IdentityNow Entities

If you have not read the first post in this series, start there as ‘updating’ builds on top of Search/Reporting. It also covers enabling the API.

My quick start guide to updating IdentityNow Entities starts with searching to find the Entities (probably Users) you want to update. In my example below I’m searching for all objects on a Source. Then I iterate through the results and update them. I’m updating the Country attribute.

When updating an entity (e.g. a User) you need to perform a PATCH web request specifying the underlying ID (objectID) of the object. The URI format looks like;

https://orgName.api.identitynow.com/v2/accounts/2c91808365bd1f010165caf761625bcd?org=orgName

Example Script

Here is an example script; a minimal sketch of the same pattern follows the list below. As per the previous two posts, change all the lines for your tenant and your API details.

  • Line 16 is the query for objects to update
  • Lines 39-41 are the attribute to update
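As a hedged sketch of that pattern (the Basic Authorization header and the sourceId query parameter are assumptions carried over from the earlier posts in this series; substitute the search you built there):

$orgName      = "yourOrgName"
$clientID     = "yourClientID"       # v2 API credentials as per the first post
$clientSecret = "yourClientSecret"
$auth    = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("${clientID}:${clientSecret}"))
$headers = @{ Authorization = "Basic $auth" }

# Find all accounts on a Source (the equivalent of the query on line 16)
$searchUri = "https://$orgName.api.identitynow.com/v2/accounts?org=$orgName&sourceId=yourSourceID"
$accounts  = Invoke-RestMethod -Method Get -Uri $searchUri -Headers $headers

foreach ($account in $accounts) {
    # PATCH each object via its underlying objectID (the 'id' property name is an
    # assumption; inspect the search response), updating the Country attribute (lines 39-41)
    $updateUri = "https://$orgName.api.identitynow.com/v2/accounts/$($account.id)?org=$orgName"
    $body      = @{ country = "Australia" } | ConvertTo-Json
    Invoke-RestMethod -Method Patch -Uri $updateUri -Headers $headers -Body $body -ContentType "application/json"
}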

Updating Manager

For manager, the attribute is a reference on the IdentityNow Source to the Manager. On my "External Entities" Source I locate the object representing the manager, obtain their accountId (which in my case is firstname.lastname) and use that as the ManagerID. I then find the users I want to update for this manager and update them as we did in the previous example, but with the Manager attribute set to the accountId of the manager.

NOTE: When querying IdentityNow via the API the syntax is very important, especially when incorporating variables. If I have a variable $manager holding a displayName value, it will normally contain a space, so we need to quote the whole string. In order to query for $manager = "Rick Sanchez" in PowerShell, that would be:

$queryManager = "attributes.displayName:"+'"'+"$($manager)"+'"'

which gives us attributes.displayName:"Rick Sanchez" and returns, in my case, the single object for Rick Sanchez rather than a list of objects that reference Rick Sanchez.
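Putting that query to work looks roughly like this; the search step is the one from the first post, and the 'manager' attribute key is an assumption that must match however the Manager reference is named on your Source:

$manager = "Rick Sanchez"
$queryManager = "attributes.displayName:" + '"' + $manager + '"'
# Run your search (as per the first post) with $queryManager to locate the manager's
# object, then take its accountId (firstname.lastname in my case)...
$managerAccountId = "rick.sanchez"
# ...and PATCH each direct report with a reference to that accountId
# ($headers and $updateUri as built in the update sketch above)
$body = @{ manager = $managerAccountId } | ConvertTo-Json
Invoke-RestMethod -Method Patch -Uri $updateUri -Headers $headers -Body $body -ContentType "application/json"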

Deleting SailPoint IdentityNow Entities

Deleting is very similar to updating. Again the easiest method is to search for and obtain the object(s) to be deleted, then delete each via a DELETE web request specifying the underlying ID (objectID) of the object. The URI looks like;

https://orgName.api.identitynow.com/v2/accounts/2c91808565bd1f110165cb628d1a702f?org=orgName
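With the objectID in hand from a search, the delete itself is a one-liner (reusing $orgName and $headers from the update sketch above):

$objectID  = "2c91808565bd1f110165cb628d1a702f"
$deleteUri = "https://$orgName.api.identitynow.com/v2/accounts/${objectID}?org=$orgName"
Invoke-RestMethod -Method Delete -Uri $deleteUri -Headers $headers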

Example Script

Here is an example script. It searches IdentityNow based on object naming (see line 14), then filters to the objects connected to the Source we wish to delete from. In this example that Source is the "External Entities" one I created in the last post. Update it for the name of your Source (line 25).

Summary

Using the API we can search for identities, author them, update them, and now delete them too.

Authoring Identities in SailPoint IdentityNow via the API and PowerShell

Introduction

A key aspect of any Identity Management project is having an authoritative source for identity. Typically this is a Human Resources system. But what about identity types that aren't in the authoritative source? External vendors, contingent contractors, and the identities used by End User Computing systems, such as privileged accounts, service accounts and training accounts.

Some Identity Management solutions allow you to author identity through their portals, providing a nice GUI to create a user/training/service account. SailPoint IdentityNow, however, doesn't have that functionality. It does have an API though, and in this post I'll show you how you can use it to author identity into IdentityNow.

Overview

So, now you're thinking: great, I can author identity into IdentityNow via the API. But am I supposed to get managers to interface with an API to kick off a workflow to create identities? Um, no. I don't think we want to be giving them API access into our Identity Management solution.

The concept is this: an Identity Request WebApp collects the necessary information for the identities to be authored and facilitates their creation in IdentityNow via the API. SailPoint kindly provide a sample project that does just that; it is available on GitHub here. By looking at this project and the IdentityNow API, I worked out how to author identity via the API using PowerShell. There were a few gotchas I had to work through, so I'm providing a working example for you to base a solution around.

Getting Started

There are a couple of things to note.

  • Obviously you’ll need API access
  • You’ll want to create a Source that is of the Flat File type (Generic or Delimited File)
    • We can’t create accounts against Directly Connected Sources
  • There are a few attributes that are mandatory for the creation
    • At a minimum I supply id, name, givenName, familyName, displayName and e-mail
    • At an absolute bare minimum you need the following. Otherwise you will end up with an account in IdentityNow that will show as “Identity Exception”
      • id, name, givenName, familyName, e-mail*

* see note below on e-mail/email attribute format based on Source type

Creating a Flat File Source to be used for Identity Authoring

In the IdentityNow Portal under Admin => Connections => Sources select New.

Create New Source.PNG

I'm using Generic as the Source Type. Give it a name and description, then select Continue.

New Generic Source.PNG

Assign an Owner for the Source and check the Accounts checkbox. Select Save.

New Source Properties.PNG

From the end of the URL of the newly saved Source, get and record the SourceID. This is required so that when we create users via the API, they are created against this Source.

SourceID.PNG

If we look at the Accounts on this Source we see there are obviously none.

Accounts.PNG

We'd better create some. But first you need to complete the configuration for this Source: go and create an Identity Profile for it, and configure your Identity Mappings as per your requirements. This is the same as you would for any other IdentityNow Source.

Authoring Identities in IdentityNow with PowerShell

The following script is the bare minimum to use PowerShell to create an account in IdentityNow; a sketch of it follows the list below. Change;

  • line 2 for your Client ID
  • line 4 for your Client Secret
  • line 8 for your Tenant Org Name
  • line 12 for your Source ID
  • the body of the request for the account to be created (lines 16-21)
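Here is a minimal sketch of that script. The POST endpoint and the Basic Authorization header construction are assumptions based on the v2 accounts URI used in the companion posts, so verify them against your tenant:

# Client ID (line 2) and Client Secret (line 4)
$clientID     = "yourClientID"
$clientSecret = "yourClientSecret"
# Tenant Org Name (line 8) and Source ID (line 12)
$orgName  = "yourOrgName"
$sourceID = "yourSourceID"
$auth    = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("${clientID}:${clientSecret}"))
$headers = @{ Authorization = "Basic $auth" }

# The account to be created (lines 16-21). On a Generic Source the mail attribute
# is 'email'; on a Delimited Source it is 'e-mail' (see the NOTE below).
$account = @{
    id          = "rick.sanchez"
    name        = "rick.sanchez"
    givenName   = "Rick"
    familyName  = "Sanchez"
    displayName = "Rick Sanchez"
    email       = "rick.sanchez@yourdomain.com"
} | ConvertTo-Json

$uri = "https://$orgName.api.identitynow.com/v2/accounts?sourceId=$sourceID&org=$orgName"
Invoke-RestMethod -Method Post -Uri $uri -Headers $headers -Body $account -ContentType "application/json"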

NOTE: By default on the Generic Source the email attribute is 'email'; by default on the Delimited Source it is 'e-mail'. If, after executing the script and a correlation, your identities are showing as 'Identity Exception', it's probably because this field is incorrect for the Source type. If in doubt, check the Account Schema on the Source.

Execute the script and refresh the Accounts page. You’ll see we now have an account for Rick.

Rick Sanchez.PNG

Expanding Rick’s account we can see the full details.

Rick Full Details.PNG

Testing it out for a Bulk Creation

A few weeks ago I wrote this post about generating user data from public datasets. I'm going to take that and generate 50 accounts. I've added additional attributes to the Account Schema (for suburb, state, postcode, street). Here is a script combining the two.
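The bulk version is simply the single-account sketch above in a loop. The CSV path and column names here are hypothetical, chosen to match the extra schema attributes:

# Assumes $headers, $orgName and $sourceID as set in the single-account sketch
$users = Import-Csv -Path 'C:\temp\generatedUsers.csv'
foreach ($user in $users) {
    $account = @{
        id          = $user.id
        name        = $user.id
        givenName   = $user.givenName
        familyName  = $user.familyName
        displayName = "$($user.givenName) $($user.familyName)"
        email       = $user.email
        suburb      = $user.suburb
        state       = $user.state
        postcode    = $user.postcode
        street      = $user.street
    } | ConvertTo-Json
    $uri = "https://$orgName.api.identitynow.com/v2/accounts?sourceId=$sourceID&org=$orgName"
    Invoke-RestMethod -Method Post -Uri $uri -Headers $headers -Body $account -ContentType "application/json"
}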

Running the script creates our 50 users, in addition to the couple I already had present.

Bulk Accounts Created.PNG

Summary

Using the IdentityNow API we can quickly author identity into SailPoint IdentityNow. That's the easy bit sorted. Now to come up with a pretty UI and a UX that passes the end-user usability tests. I'll leave that with you.

 

Set your eyes on the Target!

So in my previous posts I've discussed a couple of key points in what I define as the basic principles of Identity and Access Management: requirements, the source of your data, and the Identity Vault.

Now that we have all the information needed, we can start to look at your target systems. In the simplest terms this could be your local Active Directory (authentication domain), but it could be anything; and with the adoption of cloud services, these target systems are often what drives the need for robust IAM services.
Something we are often asked as IAM consultants is: why? Why should the corporate applications be integrated with any IAM service? That's a valid question. Sometimes, depending on what the system is and what it does, integrating with an IAM system isn't a practical solution, but more often there are many benefits to having your applications integrated with an IAM system. These benefits include:

  1. Automated account provisioning
  2. Data consistency
  3. Central authentication services (where supported)

Requirements
With any target system, much like the IAM system itself, the one thing you must know before you go into any detail is the requirements. Every target system will have individual requirements. Some could be as simple as just needing basic information: first name, last name and date of birth. But for most applications there is a lot more to it, and the requirements will be derived largely from the application vendor, and to a lesser extent the application owners and business requirements.
IAM systems are for the most part extremely flexible in what they can do; they are built to be customised to an enormous degree, and the target systems used by the business will play a large part in determining the amount of customisation within the IAM system.
This could be as simple as requiring additional attributes that are not standard within either the IAM system or your source systems, or it could be the way in which you want the IAM system to interact with the application, i.e. utilising web services and building custom Management Agents to connect and synchronise data sets between the two.
But at the root of all this is the constant flow of data through the IAM system, all stored within the "Vault". This helps ensure that any change to a user flows to all systems, and not just the phone book. It also ensures that changes are tracked through the governance processes that have been established and implemented as part of the IAM system. Changes made to a user's identity information within a target application can be easily identified, to the point of saying this change was made on this date/time because a change to this person's data occurred within the HR system at that time.
Integration
Most IAM systems will have management agents or connectors (the terms vary depending on the vendor) built for the typical "out of the box" systems, and these will for the most part satisfy the requirements of many, so you don't tend to have to worry so much about those. But if you have "bespoke" systems that have been developed and built up over the years for your business, this is where custom management agents play a key part, and how they are built will depend on the applications themselves. In a Microsoft IAM service, custom management agents are built using an Extensible Connectivity Management Agent (ECMA). How you build and develop management agents for FIM or MIM is quite an extensive discussion and something better suited to a separate post.
One of the "sticky" points here is that most of the time, in order to integrate applications, you need elevated access to the application's back end to be able to push data to and pull data from the application. The way this is done through any IAM system is via specific service accounts that are restricted to performing only the functions the integration requires.
Authentication and SSO
Application integration tightens the security of data, with access to applications controlled through various mechanisms, and authentication plays a large part in the IAM process.
During the provisioning process, passwords are usually set when an account is created, either by using random password generators (preferred) or by setting a specific temporary password. Either way, it's always done with the intent of the user resetting their password when they first log on. The self-service functionality that can be introduced to do this enables the user to reset their password without ever having to know what the initial password was.
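As a trivial illustration of the random-generation approach (this relies on a .NET Framework API, so Windows PowerShell is assumed):

# Generate a 16-character initial password containing at least 3 non-alphanumeric characters
Add-Type -AssemblyName System.Web
$initialPassword = [System.Web.Security.Membership]::GeneratePassword(16, 3)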
Depending on the application, separate passwords might be created that need to be managed. In most cases IAM consultants/architects will try to minimise this to the point of it not being required at all, but that isn't always possible. In these situations the IAM system has methods to manage this as well. In the Microsoft space this can be controlled through password synchronisation using the Password Change Notification Service (PCNS): if a user changes their main password, that change can be propagated to all the systems that hold separate passwords.
Most applications today use standard LDAP authentication to provide access to their application services, which keeps the password management process much simpler. Cloud services, however, generally need to be set up to do one of two things.

  1. Store local passwords
  2. Utilise Single Sign-On Services (SSO)

SSO uses standards-based protocols to allow users to authenticate to applications with managed accounts and credentials that you control. Examples of these standard protocols are the likes of SAML, OAuth, WS-Fed/WS-Trust and many more.
There is, however, a growing shift in the industry for these to be cloud services, such as Microsoft Azure Active Directory or any number of other services available today.
The obvious benefit of SSO is that you have a single username and password to remember. It also greatly reduces the security risk your business carries: from an auditing and compliance perspective, having a single authentication directory can help reduce the overall exposure your business has to compromise from external or internal threats.
Well, that about wraps it up. IAM is for the most part an enabler: it enables your business to be adequately prepared for the consumption of cloud services and cloud enablement, which can help reduce the overall IT spend your business has over the coming years. But one thing I think I've highlighted throughout this particular series is requirements, requirements, requirements… repetitive I know, but for IAM so crucially important.
If you have any questions about this post or any of my others please feel free to drop a comment or contact me directly.
 

The Vault!

The vault, or more precisely the "Identity Vault", is a single-pane view of all the collated data about your users from the various data source repositories. This sounds like a lot of jargon but it's quite simple really.
In the diagram below we look at a really simple attribute: firstName (givenName within AD).
As you will see, at the centre is the attribute, and branching off it are all the connected systems, e.g. Active Directory. What this doesn't illustrate very well is the specific data flow: where the data is coming from and where it's going to. That comes down to import and export rules, as well as any precedence rules you need to put in place.
The Identity Vault, or central data repository, provides a central store of an identity's information, aggregated from a number of sources. It's also able to identify the data that exists within each of the connected systems, from which it either collects the identity information or to which it provides the information as a target system. Sounds pretty simple, right?
Further to the basics described above, each object in the Vault has a unique identifier, or anchor. This is a unique value that is automatically generated when the user is created, to ensure that regardless of what happens to the user's details throughout the lifecycle of the user object, we are able to track the user and apply changes accordingly. This is particularly useful when you have multiple users with the same name, for example; it avoids the wrong person being updated when changes occur.

Attribute    User 1     User 2
---------    ------     ------
FirstName    John       John
LastName     Smith      Smith
Department   Sales      Sales
UniqueGUID   10294132   18274932

So the table above shows the simplest form of a user's identity profile, whereas a complete identity profile will consist of many more attributes, some of which may be custom attributes for specific purposes, as in the example demonstrated below;

Attribute              ContributingMA           Value
---------              --------------           -----
AADAccountEnabled      AzureAD Users            TRUE
AADObjectID            AzureAD Users            316109a6-7178-4ba5-b87a-24344ce1a145
accountName            MIM Service              jsmith
cn                     PROD CORP AD             Joe Smith
company                PROD CORP AD             Contoso Corp
csObjectID             AzureAD Users            316109a6-7178-4ba5-b87a-24344ce1a145
displayName            MIM Service              Joe Smith
domain                 PROD CORP AD             CORP
EXOPhoto               Exchange Online Photos   System.Byte[]
EXOPhotoChecksum       Exchange Online Photos   617E9052042E2F77D18FEFF3CE0D09DC621764EC8487B3517CCA778031E03CEF
firstName              PROD CORP AD             Joe
fullName               PROD CORP AD             Joe Smith
mail                   PROD CORP AD             joe.smith@contoso.com.au
mailNickname           PROD CORP AD             jsmith
o365AccountEnabled     Office365 Licensing      TRUE
o365AssignedLicenses   Office365 Licensing      6fd2c87f-b296-42f0-b197-1e91e994b900
o365AssignedPlans      Office365 Licensing      Deskless, MicrosoftCommunicationsOnline, MicrosoftOffice, PowerAppsService, ProcessSimple, ProjectWorkManagement, RMSOnline, SharePoint, Sway, TeamspaceAPI, YammerEnterprise, exchange
o365ProvisionedPlans   Office365 Licensing      MicrosoftCommunicationsOnline, SharePoint, exchange
objectSid              PROD CORP AD             AQUAAAAAAAUVAAAA86Yu54D8Hn5pvugHOA0CAA==
sn                     PROD CORP AD             Smith
source                 PROD CORP AD             WorkDay
userAccountControl     PROD CORP AD             512
userPrincipalName      PROD CORP AD             jsmith@contoso.com.au

So now we have a more complete picture of the data, where it comes from, and how we connect it to a user's identity profile, and we can start to look at how we synchronise that data to any and all managed targets. It's very important to control this flow, though; to do so we need to have strict governance controls in place about what data is to be distributed throughout the environment.
One practical approach to managing this is a data exchange agreement. This helps the organisation have a more defined understanding of what data is being used by which application and for what purpose. It also helps define strict controls on what the application owners can do with the data being consumed, for example strictly prohibiting the application owners from sharing that data with anyone without the written consent of the data owners.
In my next post we will start to discuss how we then manage target systems, how we use the data we have to provision services and manage the user information through what’s referred to as synchronisation rules.
As with all my posts, if you have any questions please drop me a note.
 

Where's the source!

In this post I will talk about data (aka the source)! In IAM there's really one simple concept that is often misunderstood or ignored: the data going out of any IAM solution is only as good as the data going in. This may seem simple enough, but if not enough attention is paid to the data source and data quality then the results are going to be unfavourable at best and catastrophic at worst.
With most IAM solutions, data is going to come from multiple sources. Most IAM professionals will agree the best place to source the majority of your user data is the HR system. Why? Simply put, it's where all the important information about the individual is stored and, for the most part, kept up to date. For example, if you were to change positions within the same company, the HR system would be updated to reflect the change to your job title, as well as any direct report changes that may come as a result.
I also said that data can, and normally will, come from multiple sources. A typical example: generally speaking, temporary and contract staff will not be managed within the central HR system; simply put, the HR team don't care about contractors. So where do they come from, and how are they managed? For smaller organisations this is usually something that's done manually in AD with no real governance in place. For larger organisations this is less than ideal: it can be a nightmare for the IT team to manage and can create quite a large security risk to the business. So a primary data source for contractors becomes necessary. What that is, is entirely up to the business and what works for them; I have seen a standard SQL web application being used to populate a database, I've seen ITSM tools being used, and, less commonly, the IAM system itself being used to manage contractor accounts (within MIM 2016 this is through the MIM Portal).
There are many other examples of how different corporate applications can be used to augment the identity information of your user data, such as email and phone systems and, to a lesser extent, physical security systems (building access and datacentre access), but we will try and keep it simple for the purpose of this post. The following diagram helps illustrate the data flow for the different user types.
IAM Diagram
What you will notice from the diagram above is that even though an organisation will have data coming from multiple systems, it all comes together and is stored in a central repository, or "Identity Vault". This keeps an accurate record of the information coming from multiple sources, compiled into the user's complete identity profile. From this we can then start to manage what information flows to downstream systems when provisioning accounts, and we can also ensure that if any information changes, it is updated in the user's profile in any attached system managed through the enterprise IAM services.
In my next post I will go into the finer details of the central repository, or the "Identity Vault".
So, in summary, the source of data is very important in defining an IAM solution; it ensures you have the right data being distributed to any managed downstream systems, regardless of what type of user base you have. My next post will dig into the central repository, or Identity Vault. It will cover how we set precedence on data from specific systems, to ensure that if there is a difference in the data coming from the different sources, only the highest precedence is applied. We will also discuss how we augment the data sets to ensure we only collect the information necessary for managing that user and the applications they use within your business.
As per usual, if you have any comments or questions on this post of any of my previous posts then please feel free to comment or reach out to me directly.

An Identity Consultant's Summary of the recent Cloud Identity Summit 2017

I've just returned from Chicago and the Cloud Identity Summit, held at the Sheraton Grand Chicago. It was my first CIS conference, and it reminded me a lot of the now defunct Quest Experts Conference and The Burton Group Conference, both in terms of content and scale. It definitely had a more intimate feel than a massive event of the Microsoft Ignite category, which attracts 25k+ attendees. The 1400 attendees at CIS was a record for this event, but it still meant you got 1:1 time with vendors and speakers, which is fantastic.

Just like the Quest "The Experts Conference" (TEC) and The Burton Group Conference, if you pick your sessions based on the synopsis and the speaker, the sessions can be highly technical (400+ level) and worthy of the 30-hour journey to get to the conference. I focused on my particular subject of identity, so this summary is biased towards that track.

A summary of the takeaways that I'll briefly detail in this post:

  • ID Pro
  • Strong Authentication / Goodbye Passwords
  • PAM and IGA
  • SCIM 2.0
  • FIDO 2.0

And before I forget: CIS is dead; long live CIS, now known as Identiverse, which will be in Boston in June (24-27) 2018. Ping Identity have renamed the conference going forward.

 

ID Pro

In his keynote on Tuesday, Ian Glazer announced what has been missing from the IDAM community: a professional organisation to represent it. Named ID Pro, with all the details available here, it is a professional organisation for IDAM exponents. Join now here for US$150. Supported by the Kantara Initiative, this organisation already has the support and backing of the industry.

 

Strong Authentication / Goodbye Passwords

There were numerous sessions around this topic, and it was fantastic to see that the ecosystem to support the holy grail future of no passwords, but strong authentication, is now present. Alex Simons summed it up nicely in his keynote on Wednesday by setting the goal of 1 billion logins (without passwords) by 2018, launching the hashtag #1Billionby2018 to go with it. Check out the FIDO 2.0 summary further below.

 

Privileged Account Management and Identity Governance & Access

Privileged Account Management and Identity Governance & Access are better together. We knew this anyway, and I've been approaching it this way with my solutions. It was therefore refreshing to be entertained by Kelly Grizzle in his session on PAM meeting IGA through their mutual friend SCIM. In essence, SailPoint have been working heavily on their IGA offering, but with the help of SCIM, now at 2.0, they've been working with PAM vendors such as CyberArk to provide the integration and visibility the two need. Kelly's entertaining and informative presentation can be found here.

 

SCIM 2.0

Mentioned above in the PAM and IGA summary, SCIM 2.0 is now ready for prime time. Whilst SCIM has been around for some time, it hasn't seen widespread adoption in my circles. But that's about to change. Microsoft have been using it as a primary integration method with Azure AD, with the likes of Facebook for Work. Microsoft also have a SCIM MA for Microsoft Identity Manager. I'll be experimenting with it in the near future.

 

FIDO 2.0

FIDO first came onto my radar about 4 years ago. It is in a lot of the workflows we do every day (if you have a modern operating system – Windows 8+ – and biometrics on your device, or a FIDO-compliant token). With FIDO 2.0, and U2F v1.1 and UAF v1.1, now complete, the foundation and enabling services for strong authentication are ready to go.

 

Summary of the Summary

I’ve tried hard to not make this too wordy, but the takeaway is this. Identity is the foundation of who you are and what you do. With all the other disruption in the IT industry around cloud and mobility, identity is always the enabler. Get it right and you can make life easier for your users, more visible for your information security officers and auditable for your compliance requirements. Just keep up, as it’s moving very fast.

It all comes down to "Requirements"

In my last post, I discussed the basic concepts of Identity Management. In this post I'm going to talk about the need to clearly identify the business requirements for IAM as a whole, and not just for a specific technical need. More often than not, IAM projects are spawned to satisfy a specific need, and realistically these days that's around the adoption of specific cloud technologies, with Office 365 the most obvious one.
The issue with doing this is that when you implement a solution for a single purpose, the "what else" question is always overlooked. Over the course of my career I've discovered that when the next requirement comes up within the business that requires an Identity Management solution, organisations are often left in a situation that either requires them to completely rebuild the solution that was built to provide the initial service offering, or to set up a completely separate environment, which complicates and increases the overall support burden for solutions that could, more often than not, have been delivered within a single system.
This brings us back to the title: requirements! So, what are requirements? And why are they so important? Simply put, requirements tell you what you want to achieve within the business with the technology you have or are about to invest in.
When thinking of requirements for any IAM solution, there are basic principles that you should always take into consideration. An IAM solution is built to satisfy 4 basic functions:

  1. Authentication
  2. Authorization
  3. User Management
  4. Central User Store

When establishing requirements within your business for any IAM solution, it's important to understand these basic functions and what they could mean for your business. So let's break them down.

Authentication

Most would say this is pretty self-explanatory, but there's often a lot more to it than many think. For example, you want to enable a SaaS application such as TechnologyOne Finance. Do you want to provide a seamless authentication model with Single Sign-On (SSO), or do you want to keep the authentication separate from your local security domains to provide a higher level of security? Or do you want to provide more of a same sign-on solution, where users use their local username and password but are forced to log in every time, which can be a less inviting end-user experience?

Authorization

Authorization within IAM, simply put, provides approval workflows for requests for access to resources, or for the creation of new resources, managed by the IAM solution. Authorization ensures that access compliance and governance processes are followed for all managed resources.

User Management

User management is simply that: it manages the user objects it knows about, including any adds, moves or changes that occur to these objects throughout their lifecycle. When a user leaves the organisation, for whatever reason, they are terminated through the standard business processes established as part of the defined requirements.

Central User Repository

This is effectively where everything is stored. In the Microsoft space it's referred to as the Metaverse; in the Novell space it's referred to as the Identity Vault. But it's simply a central repository of all user objects, as well as configuration items such as workflows, policy rules and various other settings.
A common misconception with requirements is that technical teams will look for the technical requirements needed to satisfy the solution being built. But this is where things often go wrong: business requirements come down to the simple business logic of what the primary business objectives are. These business requirements will often create a bunch of technical requirements, but it must start with the basic business requirements.

Summary

So, to summarise, complex IAM solutions all come down to basic principles: what am I trying to achieve as a business? It is for this reason you need to start with the basics and understand the purpose of the solution being built. You wouldn't build a house without plans, and you wouldn't build a road without understanding where it's going. So why would you do the same with your Identity Management solutions?

It's time to get your head out of the clouds!

For those of you who know me, you are probably thinking "Why on earth would we want to get our heads out of the 'cloud' when all you've been telling me for years is the need to adopt cloud!"
This is true for the most part, but my point here is that many businesses are being flooded by service providers in every direction to adopt or subscribe to their "cloud" based offerings; furthermore, ICT budgets are being squeezed, forcing organisations into SaaS applications. The question that is often forgotten is the how. What do you need to access any of these services? And the answer is: an identity!
It is for this reason I make the title statement. For many organisations, preparing to utilise a cloud offering can require months if not years of planning and implementing organisational changes to better enable the ICT teams to offer such services to the business. Historically, "enterprises" used local datacentres and infrastructure to provide all business services, where everything was centrally managed and controlled locally. Controlling access to applications was as easy as adding a user to a group, for example. But to utilise any cloud service you need to take a step back, get your head back down to ground zero, and look at whether your environment is suitably ready. Do you have an automated, simple method of providing seamless access to these services? Do you need to invest in getting your business "cloud ready"?
This is where identity comes to the front and centre, and it's really the first time in a long while that Identity Management has been a centrepiece of the way a business can operate. This identity provides your business with the "how" factor in a more secure model.
So, if you have an existing identity management platform, great; but that's not to say it's ready to consume any of these offerings, it just says you have the potential means. For those who don't have something in place: welcome! Now, I'm not going to tell you which products you should or shouldn't use; that is dictated by your requirements. There is no "one size fits all" platform out there; in fact, some businesses may need a multitude of applications to deliver on their requirements. But one thing every business needs is a strategy: a roadmap of how to get to that elusive finish line!
A strategy is not just about the technical changes that need to be made, but also the organisational changes: the processes and policies that will inevitably change as a consequence of adopting a cloud model. One example of this is the service management model for consumption-based services: the way you manage services provided on a consumption basis is not the same as a typical managed service model.
And these strategies start from the ground up; they work with the business to determine its requirements and use cases. Those two aspects alone can make or break an IAM service. If you do not have a thorough and complete understanding of the requirements your business needs, then whatever solution is built will never succeed.
So now you may be asking: what next? What do we as an organisation need to do to get ourselves ready for adopting cloud? All in all, there are basically 5 principles you need to undertake to set the foundation of being "cloud ready":

  1. Understand your requirements
  2. Know your source of truth/s (there could be more than one)
  3. Determine your central repository (Metaverse/Vault)
  4. Know your targets
  5. Know your universal authentication model

Now, I'm not going to cover all of these in this one post, otherwise you would be reading a novel; I will touch on them separately. I would point out that you have probably already read about many of these: they are common discussion points online, and the top 2 topics are discussed extensively by my peers. The point of difference I will be making is around the cloud: determining the best approach to these discussions so that when you do have your head in the clouds, you're not going to get any unexpected surprises.

Using an Azure Function to search the FIM/MIM Metaverse, create a Set and update the Set membership in the FIM/MIM Service

Introduction

This is the third and last post in this series of integrating Microsoft Identity Manager with Azure Functions.
The first detailed how to use an Azure Function to retrieve data from the MIM Service Server. The second detailed how to use an Azure Function to retrieve data from the MIM Sync (Metaverse) Server.
This third post combines the two and then performs an action in the MIM Service. The practical purpose of this could be functions like “find all users in location y” and “enable them for entitlement x” or “add an attribute value on each of their objects”.

Overview

The reasoning for the two-stage approach is that, in my experience, it is a lot easier to search the Metaverse than the MIM Service to find objects. The Metaverse also has all the information about objects, whereas the MIM Service is a ShadowVerse of the Metaverse, containing a subset of the managed objects' metadata.
Moving forward then the architecture is a hybrid of the first two posts that introduced the concepts associated with integrating MIM with Azure Functions. As per the other two posts this is a base working example and concept.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Overview.png

Prerequisites

The prerequisites are the same as for the 1st and 2nd posts in this series. You'll need to work through those examples to set up the dependencies and prerequisites. From there you can create one more Azure Function that brings everything together; that's what I'm covering in this post.
Therefore the prerequisites are;

  • Azure Tenant and a Function Plan
  • Microsoft Identity Manager implementation
  • Remote Powershell configured for your MIM Sync Server
  • Lithnet FIM MIIS Automation Powershell Module installed on your MIM Sync Server
  • The necessary Firewall Rules on your MIM Sync Server and your Azure Network Security Group (assuming your MIM Infrastructure is in Azure) to allow Azure Functions to communicate with MIM Sync and Service Servers

 

This Example performs the following

In this example the HTTP Trigger Azure Function;

  • Takes input for ObjectType, Attribute, AttributeValue, SetName
  • Searches the MIM Sync Metaverse for the input ObjectType, with the AttributeValue in the Attribute
  • Connects to the MIM Service
  • Creates the Set based on the input SetName if it doesn't exist
  • Adds the objects from the search to the Set
  • Returns the objects added to the Set

In a real-world implementation you'd do the above with a criteria-based Set. This post is a concept of search and find, then create and update. That has many practical applications.

Create your new Azure Function

Just like the other two posts, we’re going to create a new Powershell HTTP Trigger Azure Function.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Create1.png
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Create2.png
Upload the Lithnet RMA PS Module to your new Azure Function (as per blog post 1 in this series). You should also be using protected credentials now, so upload your username/password encryption key as well.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Files.png
 
Here is the Azure Function PowerShell script that performs the process detailed above.
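A condensed sketch of that function is below. The Lithnet cmdlets (New-MVQuery, Get-MVObject, Set-ResourceManagementClient, Get-Resource, New-Resource, Save-Resource) are the ones introduced in the first two posts, but the credential handling, property paths and Set membership handling are assumptions to validate in your environment:

# Read the POSTed input: ObjectType, Attribute, AttributeValue, SetName
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# 1. Search the Metaverse on the MIM Sync Server via Remote PowerShell (post 2 pattern)
#    $credential is built from the protected username/password files uploaded earlier
$session = New-PSSession -ComputerName "yourMIMSyncServer" -Credential $credential
$mvObjects = Invoke-Command -Session $session -ScriptBlock {
    param($objectType, $attribute, $value)
    Import-Module LithnetMiisAutomation
    $query = New-MVQuery -Attribute $attribute -Operator Equals -Value $value
    Get-MVObject -ObjectType $objectType -Queries $query
} -ArgumentList $requestBody.ObjectType, $requestBody.Attribute, $requestBody.AttributeValue
Remove-PSSession $session

# 2. Connect to the MIM Service with the Lithnet RMA (post 1 pattern)
Import-Module LithnetRMA
Set-ResourceManagementClient -BaseAddress "http://yourMIMServiceServer:5725" -Credentials $credential

# 3. Create the Set from the input SetName if it doesn't exist
$set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue $requestBody.SetName
if (-not $set) {
    $set = New-Resource -ObjectType Set
    $set.DisplayName = $requestBody.SetName
    Save-Resource $set
    $set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue $requestBody.SetName
}

# 4. Add each found object to the Set's explicit membership and return the members
foreach ($mvObject in $mvObjects) {
    # Mapping the Metaverse object to a MIM Service resource via accountName is an
    # assumption; the attribute/property path may differ in your deployment
    $person = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue $mvObject.Attributes["accountName"].Values.ValueString
    if ($person) { $set.ExplicitMember.Add($person.ObjectID) }
}
Save-Resource $set
Out-File -Encoding Ascii -FilePath $res -InputObject ($set.ExplicitMember | ConvertTo-Json)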

Test it out. Looks good. 88 users matched the value of Sydney in their location attribute.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Test.png
Verify that the Set was created and the membership updated.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Verify.png

Test calling the Azure Function remotely

Now that it is all working in the Azure Function, let's try doing it from PowerShell remotely. This time I'm again looking for Person objects that have Sydney in their location attribute, and I'll create a Set named Sydney-NSW and put them in it.
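The remote call itself is a simple Invoke-RestMethod. The function name in the URL below is hypothetical, so copy the real URL (including its code key) from the Azure Portal; the body properties match the inputs listed earlier:

$functionUri = 'https://yourfunctionapp.azurewebsites.net/api/SearchAndCreateSet?code=yourFunctionKey'
$body = @{
    ObjectType     = 'person'
    Attribute      = 'location'
    AttributeValue = 'Sydney'
    SetName        = 'Sydney-NSW'
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Uri $functionUri -Body $body -ContentType 'application/json'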
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/TestRemote.png
Brilliant, that works nicely. Let’s verify that the Set was created and has the correct number of users in it. Yes, a perfect match.
https://dl.dropboxusercontent.com/u/76015/BlogImages/MIMFunctionSearchModify/Verify2.png

Summary

Putting Azure Functions and Powershell together along with the Lithnet Powershell Modules opens up a world of possibilities for automation and integration of the MIM Service without the need for any additional infrastructure or any considerable effort.
Experiment and let me know what you do with this style of integration.
Follow Darren on Twitter @darrenjrobinson
