Scripting the generation & creation of Microsoft Identity Manager Sets/Workflows/Sync & Management Policy Rules with the Lithnet Resource Management PowerShell Module

Introduction

Yes, that title is quite a mouthful. And this post is going to be quite long. But worth the read if you are having to create a number of rules in Microsoft/Forefront Identity Manager, or even more so the same rule in multiple environments (eg. Dev, Staging, Production).

My colleague David Minnelli recently introduced the Lithnet RMA PowerShell Module and the Import-RMConfig cmdlet for bulk creation of MIM Sets and MPRs. David's post has a lot of the background on Import-RMConfig and getting started with it. Give that a read for a more detailed background.

In this post I detail using Import-RMConfig to create a Set, Workflow, Synchronization Rule and Management Policy Rule to populate a Development AD Domain with Users from a Production AD Domain. This process is designed to run on a combined MIM Service/Sync Server. If your roles are separated (as they likely will be in a Production environment) you will need to run these scripts on the MIM Sync Server (so they can query the Management Agents), and you will need to add a line at the beginning of the script to connect to the MIM Service (e.g. Set-ResourceManagementClient), as sketched below.
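
If you do need that connection line, it's a one-liner; the service address below is a placeholder for your MIM Service host.

# Hypothetical example - connect the Lithnet RMA module to a remote MIM Service
Set-ResourceManagementClient -BaseAddress http://mimservice.yourdomain.com:5725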

In my environment I have two Active Directory Management Agents, each connected to an AD Forest as shown below.

On each of the AD MA’s I have a Constant Flow Attribute (named Source) configured to flow in a value representing the source AD Forest. I’m doing this in my environment as I have more than one production forest (hence the need for automation). You could simply use the Domain attribute for the Set criteria. That attribute is used in the Set later on. Mentioning it up front so it make sense.

Overview

The Import-RMConfig cmdlet uses XML and XOML files that contain the configuration required to create the Set, Workflow, Sync Rule and MPR in the FIM/MIM Service. The order in which I approach the creation is: Sync Rule, Workflow, Set and finally the MPR.

Each of these objects, as indicated above, leverages an XML and/or XOML input file. I've simplified the base templates and included them in the scripts.

The Sync Rule Script includes a prompt to choose a folder (you can create one through the GUI presented) in which to store the XML/XOML files so that Import-RMConfig can use them. Once generated, you can simply reference the files with Import-RMConfig to replicate the creation in another environment.
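
As a sketch of that replay step (the file name and path here are hypothetical), you would validate with -Preview first and then commit:

# Validate the generated configuration without changing anything
Import-RMConfig -File 'C:\ProvisionDevAD\ProvisionDevAD.xml' -Preview -Verbose
# Commit the configuration to the FIM/MIM Service
Import-RMConfig -File 'C:\ProvisionDevAD\ProvisionDevAD.xml' -Verbose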

Creating the Synchronization Rule

For creation of the Sync Rule we need to define which Management Agent will be the target for our Sync Rule. In my script I’ve automated that too (as I have a number to do). I’m querying the MIM Sync Server for all its Active Directory MA’s and then providing a dialog to allow you to choose the target MA for the Sync Rule. That dialog simply looks like the one below.
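
Behind that dialog, the MA discovery amounts to something like the following sketch. It assumes the Lithnet MIIS Automation module and that the MA objects expose a Category property; check both against your install.

# Query the Sync Server for its Active Directory Management Agents (sketch)
Import-Module LithnetMiisAutomation
$adMAs = Get-ManagementAgent | Where-Object { $_.Category -eq 'AD' }
# Out-GridView stands in here for the dialog used in my script
$targetMA = $adMAs | Select-Object Name | Out-GridView -Title 'Select the target Management Agent' -OutputMode Single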

Creating the Sync Rule will finally ask you to give the Rule a name. This name is then used as the base Display Name for the Set, MPR and Workflow (and a truncated version as the Rule IDs).

The script below in the $SyncRuleXML section defines the rules of the Sync Rule. Mine is an Outbound Sync Rule, with a base set of attributes, transforming the user's UPN and DN (for the differing Development AD namespace). Update lines 42 and 45 with the UPN and DN suffixes for your namespace.

Creating the Workflow

The Workflow script is pretty self-explanatory: a simple Action-based workflow, shown below.

Creating the Set

The Set is the group of objects that will be synchronized to the target management agent. As my Sync Rule is only for Users, my Set also only contains users. As stated in the Overview, I have an attribute that defines the authoritative source for the objects, and I'm using that in my Set criteria.
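
For illustration, the criteria boils down to an XPath filter along these lines; the attribute name Source matches the constant flow described in the Overview, and the value is a placeholder for your environment.

# Hypothetical Set criteria - all users whose Source attribute was flowed from the production forest
/Person[Source = 'PRODUCTION']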

Creating the Management Policy Rule

The MPR ties everything together. Here’s that part of the script.

Tying them all together

Here is the end-to-end automation, and the raw script that you could use as the basis for automating similar rule generation. The Sync Rule could easily be updated for Contacts or Groups. Remember, the attributes and object classes are case sensitive.

  • Through the Browse for Folder dialog I created a new folder named ProvisionDevAD

  • I provided a Display Name for the rules

  • I chose the target Management Agent

  • The SyncRule, Workflow, Set and MPR are created. The whole thing takes seconds.

  • Script Complete

Let’s take a look at the completed objects through the MIM Portal.

Sync Rule

The Sync Rule is present as we named it, including the !__ prefix so it appears at the top of the list.

An Outbound Sync Rule based on an MPR, Set and Workflow

The resources will be created and, if deleted, deprovisioned.

And our base attribute flows.

Set

Our Set was created.

Our naming aligns with what we input

And a criteria-based Set. As per the Overview, I have an attribute populated by a constant flow rule that I based my Set on. You'll want to update this for your environment.

Workflow

The Action Workflow was created

All looks great

And it applies our Sync Rule

MPR

And finally our MPR, created as a Transition In MPR with an Action Workflow

Set Transition and naming all aligned

The Transition Set configured for the Set that was created

And the Workflow configured with the Workflow that was just created

Summary

When you have a lot of Sync Rules to create, or you know you will need to re-create them numerous times, potentially in different environments, automation is key. This just scratches the surface of what can be achieved, and it is made so much easier using the Lithnet PowerShell Modules.

Here is the full script. Note: You'll need to make a couple of minor changes as indicated earlier, but you should be able to create a Provisioning Rule end to end pretty quickly to validate the process. Then customize accordingly for your environment and requirements. Enjoy.

Automating Source IP Address updates on an Azure Network Security Group RDP Access Rule

Recently I've migrated a bunch of VirtualBox Virtual Machines to Azure, as detailed here. These VM's are in Resource Groups, each with an associated Network Security Group that restricts RDP access to them based on a source TCP/IP address. All good practice. However, from a usability perspective, when I want to use these VM's I'm not always in the same location, and rarely on a connection with a static IP address.

This post details a simple little script that;

  • Has a couple of variables associated with a Resource Group, Network Security Group, Virtual Machine Name and an RDP Configuration File associated with the VM
  • Gets the public IP Address of the machine I’m running the script from
  • Prompts for Authentication to Azure, and retrieves the NSG associated with the Resource Group
  • Compares the Source IP Address in the ‘RDP’ Inbound Rule to my current IP Address. If they aren’t a match it updates the Source IP Address to be my current public IP Address
  • Starts the Virtual Machine configured at the start of the script
  • Launches Remote Desktop using the RDP Configuration file

The Script

Here's the raw script. Update lines 2-8 for your environment and away you go. Simple but useful, as is often the way.
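
If you can't grab the embedded script, here is a minimal sketch of the same flow using the AzureRM cmdlets of the era. The resource names, rule name and RDP file path are placeholders; the rule is assumed to already exist on the NSG.

# Placeholders - update for your environment
$resourceGroup = 'MyResourceGroup'
$nsgName       = 'MyNSG'
$vmName        = 'MyVM'
$ruleName      = 'RDP'
$rdpFile       = 'C:\RDP\MyVM.rdp'

# Get the public IP address of the machine running the script
$myIP = (Invoke-RestMethod -Uri 'https://api.ipify.org?format=json').ip

# Authenticate to Azure, then retrieve the NSG and its RDP rule
Login-AzureRmAccount
$nsg  = Get-AzureRmNetworkSecurityGroup -Name $nsgName -ResourceGroupName $resourceGroup
$rule = Get-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name $ruleName

# Update the Source IP Address on the rule only if it differs from our current IP
if ($rule.SourceAddressPrefix -ne $myIP) {
    Set-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name $ruleName `
        -Access Allow -Direction Inbound -Priority $rule.Priority -Protocol Tcp `
        -SourceAddressPrefix $myIP -SourcePortRange '*' `
        -DestinationAddressPrefix '*' -DestinationPortRange 3389 | Out-Null
    Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg | Out-Null
}

# Start the VM and launch Remote Desktop with the configuration file
Start-AzureRmVM -ResourceGroupName $resourceGroup -Name $vmName
mstsc $rdpFile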

Diagnosing FIM/MIM ‘kerberos-no-logon-server’ error on an Active Directory Management Agent

Overview

I have a complex customer environment where Microsoft Identity Manager is managing identities across three Active Directory Forests. The Forests all serve different purposes and are contained in different network zones. Accordingly there are firewalls between the zone where the MIM Sync Server is located and two of the other AD Forests as shown in the graphic below.

As part of the project the maintainers of the network infrastructure had implemented rules to allow the MIM Sync server to connect to the other two AD Forests. I had successfully been able to create the Active Directory Management Agents for each of the Forests and perform synchronization imports.

The Error ‘kerberos-no-logon-server’

Everything was going well right up to the point I went to export changes to the two AD Forests that were separated by firewalls. I received the 'kerberos-no-logon-server' error shown below in the run profile output.

I started investigating the error as I hadn't encountered this one before. There were a few posts on the possibilities, mainly dealing with properties of the AD MA's configuration. But I did stumble on a mention of Kerberos being used when provisioning users to Active Directory and setting the initial password. That aligned with what I was doing. I had provided the networking engineers with my firewall port requirements. Those are (no PCNS required for this implementation);

  • 389 TCP – LDAP
  • 636 TCP – LDAPS
  • 88 TCP – Kerberos
  • 464 TCP/UDP – Kerberos
  • 53 TCP – DNS
  • 3268 TCP/UDP – Global Catalog
  • 3269 TCP/UDP – Global Catalog
  • 135 TCP – RPC

My old-school immediate thought was to telnet to each of the ports to see if the firewall was allowing me through. But with a couple of forests to test against, and UDP ports as well, it wasn't going to be that easy. I found a nice little Test-Port function that did both TCP and UDP. I already had an older script for testing TCP ports via PowerShell, so I combined them.

Identifying the cause

As suspected, connectivity to the forest where my MIM Sync Server was located was all good. The other two, not so much. GC connectivity wouldn't give me the Kerberos error, but not having Kerberos port 464 certainly would.

In the backwards and forwards with the networking team I had to test connectivity many times, so I added a running output as well as a summary output, the running output highlighting ports that weren't accessible.

Here's the raw script if you're in a similar situation; a simplified sketch of its core loop follows. Get the Test-Port function from the URL in line 1 to test UDP port connectivity. Add additional ports to the arrays if required (e.g. for PCNS), and update the forest names in lines 21-23.
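
The core of the script is a nested loop like this sketch. Test-Port is the community function referenced above (its exact parameter names vary between versions, so treat those as assumptions), and the forest names are placeholders.

# Placeholder forest FQDNs - update for your environment
$forests  = @('prod.forest1.com', 'prod.forest2.com', 'dev.forest3.com')
$tcpPorts = @(53, 88, 135, 389, 464, 636, 3268, 3269)
$udpPorts = @(464, 3268, 3269)

foreach ($forest in $forests) {
    foreach ($port in $tcpPorts) {
        # Running output - flag TCP ports that aren't accessible
        if (-not (Test-Port -ComputerName $forest -Port $port -TCP)) {
            Write-Host "$forest : TCP $port NOT accessible" -ForegroundColor Red
        }
    }
    foreach ($port in $udpPorts) {
        # Running output - flag UDP ports that aren't accessible
        if (-not (Test-Port -ComputerName $forest -Port $port -UDP)) {
            Write-Host "$forest : UDP $port NOT accessible" -ForegroundColor Red
        }
    }
}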

Summary

I'm sure this is going to become more relevant in a Cloud/Hybrid world where MIM Servers will be in Azure, and Active Directory Forests will be in different networks, separated by firewalls and Network Security Groups.

Adapting to the changes in the AzureAD Preview PowerShell Module ADAL Helper Library

I'm a big proponent of using PowerShell for integration and automation of Azure Active Directory services using the Azure AD GraphAPI. You may have seen many of my posts leverage the evolving Azure AD Preview PowerShell Module helper libraries. Lines in my scripts that use this look like the one below, in this case using preview version 2.0.0.52.

# the default path to where the ADAL GraphAPI PS Module puts the Libs
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureADPreview\2.0.0.52\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

The benefit of using this library is the simplification of Authentication to AzureAD, from which we can then receive a token and interact with the GraphAPI via PowerShell using Invoke-RestMethod.
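
To illustrate, here's a minimal sketch of that older pattern against v2.0.0.52. The tenant, username and password are placeholders; the client ID is the well-known Azure AD PowerShell application ID.

# Load the ADAL library shipped with AzureADPreview v2.0.0.52
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureADPreview\2.0.0.52\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

$tenant   = 'yourtenant.onmicrosoft.com'
$resource = 'https://graph.windows.net'
$clientId = '1b730954-1685-4b74-9bfd-dac224a7b894'

# Authenticate and acquire a token (the pre-v2.0.0.55 synchronous method)
$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.microsoftonline.com/$tenant")
$credential  = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential('user@yourtenant.onmicrosoft.com', 'password')
$authResult  = $authContext.AcquireToken($resource, $clientId, $credential)

# Use the token with Invoke-RestMethod against the GraphAPI
$headers = @{ Authorization = $authResult.CreateAuthorizationHeader() }
$users   = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users?api-version=1.6" -Headers $headers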

Earlier this week it was brought to my attention that implementations of some of my scripts were failing when using the latest v2 releases of the AzureAD PowerShell Module (v2.0.0.98). Looking into it, the last version I had working is v2.0.0.52; v2.0.0.55 doesn't work with my scripts either. So with anything after v2.0.0.52, the following will not work.

What’s Changed?

First up, the PowerShell Module has been renamed. It is no longer AzureADPreview, it is just AzureAD. So the path it gets installed into (depending on the version you have) is now;

'C:\Program Files\WindowsPowerShell\Modules\AzureAD\2.0.0.98\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

Looking into the updated PowerShell Module there has been a change to the Microsoft.IdentityModel.Clients.ActiveDirectory.dll library.

A number of the methods in the library have changed. I believe this is part of Microsoft transitioning the endpoint to use GraphAPI. With that understanding I approached using PowerShell to integrate with the GraphAPI more akin to the way I do when not using the helper library.

Using PowerShell and the ADAL Helper Library to connect to AzureAD via the GraphAPI

Here is the updated script to connect (and retrieve a batch of users). You will need to update lines 4, 17 & 18 for your Tenant name and the username and password (non-MFA enabled) you will be connecting with.
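
If you just want the shape of the change, a hedged sketch of the updated pattern is below. The key difference is the asynchronous AcquireTokenAsync extension method and the UserPasswordCredential class; placeholders are as in the earlier sketch.

# Load the ADAL library from the renamed AzureAD module
Add-Type -Path 'C:\Program Files\WindowsPowerShell\Modules\AzureAD\2.0.0.98\Microsoft.IdentityModel.Clients.ActiveDirectory.dll'

$tenant   = 'yourtenant.onmicrosoft.com'
$resource = 'https://graph.windows.net'
$clientId = '1b730954-1685-4b74-9bfd-dac224a7b894'

$authContext = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext("https://login.microsoftonline.com/$tenant")
$credential  = New-Object Microsoft.IdentityModel.Clients.ActiveDirectory.UserPasswordCredential('user@yourtenant.onmicrosoft.com', 'password')

# AcquireTokenAsync returns a Task - block on .Result in a simple script
$authResult = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContextIntegratedAuthExtensions]::AcquireTokenAsync($authContext, $resource, $clientId, $credential).Result

# Retrieve a batch of users via the GraphAPI
$headers = @{ Authorization = $authResult.CreateAuthorizationHeader() }
$users   = Invoke-RestMethod -Uri "https://graph.windows.net/$tenant/users?`$top=100&api-version=1.6" -Headers $headers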

Bulk create and update related configuration objects in FIM/MIM using the Lithnet Import-RMConfig cmdlet

Working on a FIM implementation for a customer, I needed to bulk create and update a number of related Sets and MPRs which granted permissions to users. I could have performed this task a number of ways:

  • Manually create and update all objects
  • Scripted in PowerShell using FIM Automation
  • Scripted using the Lithnet FIM/MIM Service PowerShell Module

I've been successfully using the Lithnet FIM/MIM Service PowerShell Module in a number of scripts to query and bulk create objects in the FIM Service, which has greatly improved the quality and simplicity of my PowerShell scripts compared to using the FIM Automation module. What I hadn't used to date was the Lithnet module's Configuration Management features for FIM, so I set myself a challenge to come up to speed with this feature and determine whether it was a quicker method than using FIM Automation. I knew it was going to be quicker using the Lithnet module, but since I needed to do some research and testing, compared to already knowing how to tackle it with FIM Automation, I needed to be 100% convinced.

This blog assumes a very good working knowledge of Microsoft Forefront Identity Manager, now known as Microsoft Identity Manager. If you haven't used the Lithnet module yet, what are you waiting for! Ryan Newington has developed and released to the community this PowerShell module, which cuts down the amount of scripting required to manage FIM configuration and objects within the FIM Service. The Configuration Management features were a perfect fit for my scenario, which I will now describe.

Scenario

This customer had a delegated permissions model where each of their 13 agencies had administrators able to manage a fixed set of user attributes just for their own users. Users from other agencies were not visible within the FIM Portal, and to support this model each agency has linked FIM Set and MPR objects providing the delegated administration. My engagement was to update this existing FIM implementation to add custom telephone-based attributes against each user and create a new phone object type. The existing permissions needed to be updated to include the new telephony attributes, and new permissions were required for the phone objects linked to each agency. The beauty was the naming convention the FIM solution had for Sets and MPRs: the names of the common objects were the same, with only the agency name differing. For example, each agency administrator Set object was called '_Agency (<agency>) Administrators'. This was the same for MPRs, so my task was made easier as I just needed to develop a templated XML configuration file which I could easily update per agency. Follow me so far?

What I liked about the Configuration Management feature was the ability to easily reference other objects without needing to separately query the FIM Service to return an ‘ObjectID’ which you can then supply to your configuration. The Lithnet module takes care of the referencing.

The Lithnet module is located here, and the quickest way to learn about this Configuration Management feature is to read the Lithnet documentation here. As suggested there, watch the recording from FIM contributor Ike Ugochuku and download his sample XML configuration here, which is a great starting point.

Required Actions

For my scenario, I needed to perform the following:

  • Create a Set object to group all phone objects for each agency
  • Create a permissions-granting MPR object giving each agency Modify permissions to one attribute for phone objects, a 'visibility' boolean attribute. You'll notice in the XML sketch after this list that the Set created above is referenced, showing the power of this Lithnet module
  • Create a permissions-granting MPR object giving each agency Read permissions to all attributes for phone objects
  • Update an existing permissions-granted MPR object giving each agency Modify permissions to one attribute for user objects, against the ‘visibility’ attribute
  • Update an existing permissions-granted MPR object giving each agency Read permissions to the new phone related attributes for user objects
  • Update an existing permissions-granted MPR object, removing the Modify permissions for user self-service which had been assigned in error some time ago
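
My full sample configuration is linked at the end of this post, but a stripped-down sketch of the pattern, reconstructed from the Lithnet documentation, looks like the following. The agency name, attribute names and IDs are placeholders, and the element names should be checked against the documentation for the module version you use.

<?xml version="1.0" encoding="utf-8"?>
<Lithnet.ResourceManagement.ConfigSync>
  <Operations>
    <!-- Set grouping the agency's phone objects -->
    <ResourceOperation operation="Add Update" resourceType="Set" id="AgencyPhoneSet">
      <AnchorAttributes>
        <AnchorAttribute>DisplayName</AnchorAttribute>
      </AnchorAttributes>
      <AttributeOperations>
        <AttributeOperation operation="replace" name="DisplayName">_Agency (AgencyA) Phones</AttributeOperation>
        <AttributeOperation operation="replace" name="Filter" type="filter">/Phone[Agency = 'AgencyA']</AttributeOperation>
      </AttributeOperations>
    </ResourceOperation>
    <!-- MPR referencing the Set above by its id - no separate ObjectID lookup required -->
    <ResourceOperation operation="Add Update" resourceType="ManagementPolicyRule" id="AgencyPhoneModifyMPR">
      <AnchorAttributes>
        <AnchorAttribute>DisplayName</AnchorAttribute>
      </AnchorAttributes>
      <AttributeOperations>
        <AttributeOperation operation="replace" name="DisplayName">_Agency (AgencyA) Modify Phone Visibility</AttributeOperation>
        <AttributeOperation operation="replace" name="ResourceFinalSet" type="ref">AgencyPhoneSet</AttributeOperation>
      </AttributeOperations>
    </ResourceOperation>
  </Operations>
</Lithnet.ResourceManagement.ConfigSync>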

Using the -Preview switch against a configuration XML file (sample listed below) for the first agency, the changes can be tested before applying.

On a second agency with some of the updates already applied, I performed a preview. It showed that all objects existed and which individual changes would be applied.

As a test against the second agency, I changed the resource operation from 'Add Update' to 'Add'. When attempting the preview again, it showed it would create the Sets even though they already exist. This is not ideal for verification, so I changed it back to 'Add Update', which I recommend using.

Removing ‘Preview’ and committing the change against the first agency shows what is changed and committed to FIM.

Validating the changes with a preview once applied confirmed that all changes had been committed.

Although I could have been smarter and scripted the process for the remaining agencies, I did a search-and-replace of the agency name in the configuration XML file, saved it and re-ran the import.

The last step was to check the FIM Service requests to confirm they had completed.

My sample XML configuration file for an agency change is located here.

Tips

Some simple recommended tips based on my use of this Configuration Management feature:

  • ID is a required tag even if you don’t use it as a reference within your XML configuration file
  • When creating FIM objects, use the XML tag 'Add Update' so that the Preview function can be used to validate the changes you have performed. If you leave it as 'Add', the Preview will simply report that the object 'would' be created if executed, rather than query and check that the object exists and no update is required
  • Watch the case sensitivity of the XML tags, as mentioned by Ike in his video. I didn't test changing the case, as the sample XML configuration files were a working configuration

So how long did it all take?

My goal was to make the changes in production against the 13 agencies after successfully testing the FIM configuration in a test environment. After watching Ike's video and reading Ryan's wiki, it took about 15 minutes to customise the sample XML files and update the configuration in the production FIM instance for all 13 agencies. A huge time saving. Other colleagues have started using this functionality both for speed of implementation and consistency in configuration, and my colleague Darren Robinson has also saved a great deal of effort as detailed in his blog here.

What scenarios have you performed with this functionality? Let us know in the comments below.

Dynamically rename an AWS Windows host deployed via a syspreped AMI

One of my customers presented me with a unique problem last week. They needed to rename a Windows Server 2016 host deployed using a custom AMI, without rebooting during the bootstrap process. This lack of a reboot rules out the simple option of using the PowerShell Rename-Computer cmdlet. While there are a number of methods to do this, the option we came up with is dynamically updating the sysprep unattended answer file using a PowerShell script prior to the unattended install running during first boot of a sysprepped instance.

To begin, we looked at the Windows Server 2016 sysprep process on an EC2 instance to get an understanding of the process required. It’s important to note that this process is slightly different to Server 2012 on AWS, as EC2Launch has replaced EC2Config. The high level process we need to perform is the following:

  1. Deploy a Windows Server 2016 AMI
  2. Connect to the Windows instance and customise it as required
  3. Create a startup PowerShell script to set the hostname that will be invoked on boot before the unattended installation begins
  4. Run InitializeInstance.ps1 -Schedule to register a scheduled task that initializes the instance on next boot
  5. Modify SysprepInstance.ps1 to replace the cmdline registry key after sysprep is complete and prior to shutdown
  6. Run SysprepInstance.ps1 to sysprep the machine.

Steps 1 and 2 are fairly self-explanatory, so let’s start by taking a look at step 3.

After a host is sysprepped, the value of the cmdline registry key located in HKLM:\System\Setup is populated with the windeploy.exe path, indicating that it should be invoked after reboot to initiate the unattended installation. Our aim is to ensure that the answer file (unattend.xml) is modified to include the computer name we want the host to have, prior to windeploy.exe executing. To do this, I've created the following PowerShell script named startup.ps1 and placed it in C:\Scripts, ensuring my hostname is based on the internal IP address of my instance.
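
A hedged sketch of what startup.ps1 can look like is below. It assumes the EC2Launch answer file contains a ComputerName element to overwrite (add one if your unattend.xml doesn't), and keep the 15-character NetBIOS name limit in mind for longer IP addresses.

# startup.ps1 (sketch) - name the host from its internal IPv4 address, e.g. IP-10-0-0-12
$ip       = (Get-NetIPConfiguration | Where-Object { $_.IPv4DefaultGateway } | Select-Object -First 1).IPv4Address.IPAddress
$hostname = 'IP-' + ($ip -replace '\.', '-')

# Write the computer name into the EC2Launch sysprep answer file before setup runs
$unattendPath = 'C:\ProgramData\Amazon\EC2-Windows\Launch\Sysprep\Unattend.xml'
[xml]$unattend = Get-Content -Path $unattendPath
$ns = New-Object System.Xml.XmlNamespaceManager($unattend.NameTable)
$ns.AddNamespace('u', 'urn:schemas-microsoft-com:unattend')
$unattend.SelectSingleNode('//u:ComputerName', $ns).InnerText = $hostname
$unattend.Save($unattendPath)

# Hand control back to Windows setup to run the unattended installation
& "$env:SystemRoot\System32\oobe\windeploy.exe"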

Once this script is in place, we can move to step 4, where we schedule the InitializeInstance.ps1 script to run on first boot. This can be done by running InitializeInstance.ps1 -Schedule, located in C:\ProgramData\Amazon\EC2-Windows\Launch\Scripts.

Step 5 requires a modification of the SysprepInstance.ps1 file to ensure the shutdown after Sysprep is delayed, allowing modification of the cmdline registry entry mentioned in step 3. The aim here is to replace the windeploy.exe path with our startup script, then shut down the host. The modifications to the PowerShell script to accommodate this are made after the “#Finally, perform Sysprep” comment.
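
A hedged sketch of the kind of addition made there: repoint the Setup cmdline registry value at startup.ps1, then shut the host down ourselves. This assumes the sysprep invocation earlier in the file has been altered so it no longer shuts the host down itself.

# Added after "#Finally, perform Sysprep" (sketch) - replace windeploy.exe with our script
Set-ItemProperty -Path 'HKLM:\System\Setup' -Name 'CmdLine' `
    -Value 'powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\startup.ps1'

# Shut down now that sysprep has completed and the cmdline has been repointed
Stop-Computer -Force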

Finally, run the SysprepInstance.ps1 script.

Once the host changes to the stopped state, you can now create an AMI and use this as your image.

A workaround for the Microsoft Identity Manager limitation of not allowing simultaneous Management Agents running Synchronisation Profiles

Why?

For those of you that may have missed it, in early 2016 Microsoft released a hotfix for Microsoft Identity Manager that included a change that removed the ability for multiple management agents on a Microsoft Identity Manager Synchronization Server to simultaneously run synchronization run profiles. I detailed the error you get in this blog post.

At the time it didn't hurt me too much, as I didn't require any other fixes that were incorporated into that hotfix (or the subsequent hotfix). That all changed with MIM 2016 SP1, which includes functionality that I do require. The trade-off however is that I am now coming up against the inability to perform simultaneous delta/full synchronization run profiles.

Having had this functionality in all the previous versions of the product I've worked with over the last 15+ years, you don't realise how much you need it until it's gone. I understand Microsoft introduced this constraint to protect the integrity of the system, but my argument is that it should be up to the implementer. Where I use this functionality a LOT is with Management Agents processing different object types from different connected systems (e.g. Users, Groups, Contacts, Photos, other entity types). For speed of operation I want my management agents running synchronization run profiles simultaneously.

Overview of my workaround

First up, this is a workaround, not a solution to actually running multiple sync run profiles at once. I really hope Microsoft removes this limitation in a future hotfix. What I'm doing is trying to have my run sequences execute as quickly as possible.

What I’m doing is:

  • splitting my run profiles into individual steps
    • e.g. instead of doing a Delta Import and a Delta Sync in a single multi-step run profile, I'm doing the Delta Import as one run profile and the sync as a separate one; same for Full Import and Full Synchronization
  • running my imports in parallel. Essentially the Delta or Full imports are all running simultaneously
  • utilising the Lithnet MIIS Automation PowerShell Module for the Start-ManagementAgent and Wait-ManagementAgent cmdlets to run the synchronisation run profiles

Example Script

This example script incorporates the three principles from above. The environment I built this example in has multiple Active Directory Forests. The example queries the Sync Server for all the Active Directory MA's, then performs the Imports and Syncs. This shows the concept; for your environment, potentially without multiple Active Directories, you will probably want to change the list of MA's you use as input to the execution script.

Prerequisites

You'll need the Lithnet MIIS Automation PowerShell Module (mentioned above) installed on your MIM Sync Server.

The following script takes each of the Active Directory Management Agents I have (via a dynamic query in Line 9) and performs a simultaneous Delta Import on them. If your Run Profiles for Delta Imports aren’t named DI then update Line 22 accordingly.

-RunProfileName DI

With the imports processed we want to kick off the synchronisation run profiles. Ideally we want to start these as soon as the first import has finished. The logic is;

  • We loop through each of the MA’s looking for the first MA that has finished its Import (lines 24-37)
  • We process the Sync on the first completed MA that completed its import and wait for it to complete (lines 40-44)
  • Enumerate through the remaining MA’s, verifying they’ve finished their Import and run the Sync Run Profile (lines 48-62)

If your Run Profiles for Delta Syncs aren't named DS then update lines 40 and 53 accordingly.

-RunProfileName DS
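
Condensing those pieces, the overall shape is like this sketch. It assumes the Lithnet module's Start-ManagementAgent supports -AsJob for the parallel imports, that it accepts an MA name, and that MA objects expose a Category property; check those against your module version.

# Sketch: parallel Delta Imports, then Delta Syncs as each import completes
Import-Module LithnetMiisAutomation

# All Active Directory MAs on the Sync Server
$adMAs = Get-ManagementAgent | Where-Object { $_.Category -eq 'AD' }

# Kick off all the Delta Imports simultaneously
$jobs = @{}
foreach ($ma in $adMAs) {
    $jobs[$ma.Name] = Start-ManagementAgent -MA $ma.Name -RunProfileName 'DI' -AsJob
}

# Run each MA's Delta Sync as soon as its import has finished; syncs run one at a time
$outstanding = New-Object System.Collections.ArrayList
$adMAs | ForEach-Object { [void]$outstanding.Add($_.Name) }
while ($outstanding.Count -gt 0) {
    foreach ($maName in @($outstanding)) {
        if ($jobs[$maName].State -ne 'Running') {
            Start-ManagementAgent -MA $maName -RunProfileName 'DS'
            Wait-ManagementAgent -MA $maName
            $outstanding.Remove($maName)
        }
    }
    Start-Sleep -Seconds 5
}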

Summary

While we no longer have the old luxury of choosing for ourselves when to execute synchronisation run profiles simultaneously (please allow us to, Microsoft), we can come up with band-aid workarounds to still get our synchronisation run profiles executing as quickly as possible.

Let me know what solutions you’ve come up with to workaround this constraint.

How to create a PowerShell FIM/MIM Management Agent for AzureAD Groups using Differential Sync and Paged Imports

Introduction

I've been working on a project where I must have visibility of a large number of Azure AD Groups in Microsoft Identity Manager.

In order to make this efficient I need to use the Differential Query function of the AzureAD Graph API. I've detailed that before in this post: How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries. Due to the number of groups and the number of members in the Azure AD Groups, I needed to implement Paged Imports on my favourite PowerShell Management Agent (the Granfeldt PowerShell MA). I've previously detailed that too, in How to configure Paged Imports on the Granfeldt FIM/MIM PowerShell Management Agent.

This post details using these concepts together specifically for AzureAD Groups.

Pre-Requisites

Read the two posts linked to above; they detail Differential Queries and Paged Imports. My solution also utilises another of my favourite PowerShell modules: the Lithnet MIIS Automation PowerShell Module. Download and install that on the MIM Sync Server where you will be creating the MA.

Configuration

Now that you're up to speed, all you need to do is create your Granfeldt PowerShell Management Agent. That's also covered in the post linked above, How to create an AzureAD Microsoft Identity Manager Management Agent using the MS GraphAPI and Differential Queries.

What you need is the Schema and Import PowerShell Scripts. Here they are.

Schema.ps1

There are two object classes on the MA, as we need to have the users that are members of the groups on the same MA, because membership is a reference attribute. When you bring the Groups through into the MetaVerse, and assuming you have an Azure AD Users MA using the same anchor attribute, you'll get the reference link for the members and their full object details.

Import.ps1

Here is my PSMA Import.ps1 that performs what is described in the overview: enumerate AzureAD for Groups, and import the active ones along with their group membership.
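
The heart of the differential query pattern it uses is roughly this sketch. $tenant and $headers are assumed to come from the ADAL authentication covered in the linked posts, and the deltaLink persistence path is hypothetical.

# Initial differential query for groups; an empty deltaLink returns the full set
$url     = "https://graph.windows.net/$tenant/groups?api-version=1.6&deltaLink="
$results = Invoke-RestMethod -Uri $url -Headers $headers
$groups  = $results.value

# While more pages remain the response carries 'aad.nextLink' (a skipToken link to
# follow for the next page - the basis of the paged import); the final page carries
# 'aad.deltaLink' instead
$nextLink  = $results.'aad.nextLink'
$deltaLink = $results.'aad.deltaLink'

# Persist the deltaLink and replay it on the next run so only changes come back
if ($deltaLink) { $deltaLink | Out-File 'C:\PSMA\groupsDeltaLink.txt' }   # hypothetical path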

Summary

This is one solution for managing a large number of Azure AD Groups with large memberships via a PS MA, with paged imports showing progress, and differential sync allowing for quick subsequent delta-sync run profiles.

I’m sure this will help someone else. Enjoy.

Follow Darren on Twitter @darrenjrobinson

An Azure Timer Function App to retrieve files via FTP and Remote PowerShell

Introduction

In an age of Web Services and API's, it's almost a forgotten world where FTP Servers exist. However most recently I've had to travel back in time and interact with an FTP server to get a set of files that are produced by other systems on a daily basis. These files are needed for some flat-file imports into Microsoft Identity Manager.

Getting files off an FTP server is pretty simple. But needing to do it across a number of different environments (Development, Staging and Production) meant I was looking for an easy approach that I could also replicate quickly across multiple environments. As I already had Remote PowerShell set up on my MIM Servers for other Azure Function Apps, I figured I'd use an Azure Function for obtaining the FTP files as well.

Overview

My PowerShell Timer Function App performs the following:

  • Starts a Remote PowerShell session to my MIM Sync Server
  • Imports the PSFTP PowerShell Module
  • Creates a local directory to put the files into
  • Connects to the FTP Server
  • Gets the files and puts them into the local directory
  • Ends the session

Pre-requisites

From the overview above there are a number of pre-requisites, and other blog posts I've written nicely detail the steps involved to appropriately set them up and configure them. So I'm going to link to those. Namely;

  • Configure your Function App for your timezone so the schedule is correct for when you want it to run. Check out the WEBSITE_TIME_ZONE note in this post.


  • You'll need to configure the Server that you are going to put the files onto for Remote PowerShell. Follow the Enable PowerShell Remoting on the FIM/MIM Sync Server section of this blogpost.
  • The credentials used to connect to the MIM Server are secured as detailed in the Using an Azure Function to query FIM/MIM Service section of this blog post.
  • Create a Timer PowerShell Function App. Follow the Creating your Azure App Service section of this post but choose a Timer Trigger PowerShell App.
    • I configured my Schedule for 1030 every day using the following CRON configuration
      0 30 10 * * *
  • On the Server you’ll be connecting to in order to run the FTP processes you’ll need to copy the PSFTP Module and files to the following directories. I unzipped the PSFTP files and copied the PSFTP folder and its contents to;
    • C:\Program Files\WindowsPowerShell\Modules
    • C:\Windows\System32\WindowsPowerShell\v1.0\Modules


Configuring the Timer Trigger Function App

With all the pre-requisites in place it’s time to configure the Timer Function App that you created in the pre-requisites.

The following settings are configured in the Function App Application Settings;

  • FTPServer (the server you will be connecting to, to retrieve files)
  • FTPUsername (username to connect to the FTP Server with)
  • FTPPassword (password for the username above)
  • FTPSourceDirectory (FTP directory to get the files from)
  • FTPTargetDirectory (the root directory under which the files will be put)


  • You’ll also need Application Settings for a Username and Password associated with a user that exists on the Server that you’ll be connecting to with Remote PowerShell. In my script below these application settings are MIMSyncCredUser and MIMSyncCredPassword

Function App Script

Finally, here is the raw script. You'll need to add appropriate error handling for your environment. You'll also want to change lines 48 and 51 for the naming of the files you are looking to acquire, and line 59 for the server name you'll be executing the process on.
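
As a guide to the shape of it, here's a condensed sketch with no error handling. The server name is a placeholder, the Application Setting names match those listed above, and the PSFTP property names should be checked against the module version you deploy.

# Build credentials for the Remote PowerShell session from the Application Settings
$secPass = ConvertTo-SecureString $env:MIMSyncCredPassword -AsPlainText -Force
$cred    = New-Object System.Management.Automation.PSCredential($env:MIMSyncCredUser, $secPass)

# Start the session on the MIM Sync Server (placeholder hostname)
$session = New-PSSession -ComputerName 'mimsyncserver.yourdomain.com' -Credential $cred

Invoke-Command -Session $session -ArgumentList $env:FTPServer, $env:FTPUsername, $env:FTPPassword, $env:FTPSourceDirectory, $env:FTPTargetDirectory -ScriptBlock {
    param($ftpServer, $ftpUser, $ftpPass, $ftpSource, $ftpTarget)
    Import-Module PSFTP

    # Local directory for this run's files
    $localDir = Join-Path $ftpTarget (Get-Date -Format 'dd-MM-yyyy')
    New-Item -ItemType Directory -Path $localDir -Force | Out-Null

    # Connect to the FTP Server and pull the files down
    $ftpSecPass = ConvertTo-SecureString $ftpPass -AsPlainText -Force
    $ftpCred    = New-Object System.Management.Automation.PSCredential($ftpUser, $ftpSecPass)
    Set-FTPConnection -Credentials $ftpCred -Server $ftpServer -Session FTPRun -UsePassive | Out-Null
    $ftp = Get-FTPConnection -Session FTPRun

    Get-FTPChildItem -Session $ftp -Path $ftpSource | ForEach-Object {
        Get-FTPItem -Session $ftp -Path $_.FullName -LocalPath $localDir | Out-Null
    }
}

# End the session
Remove-PSSession $session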

Summary

A pretty quick and simple little Azure Function App that will run each day and obtain daily/nightly extracts from an FTP Server. Cleanup of the resulting folders and files I’m doing with other on-box processes.


This post is cross-blogged on both the Kloud Blog and Darren’s Blog.

Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration

Last week in a customer development environment I had one of those 'oh shit' moments where I thought I'd lost a couple of weeks of work; a couple of weeks of development around multiple Management Agents, MV Schema changes etc. Luckily for me I was just connecting to an older VM image, but it got me thinking. It would be nice to have an automated process that each night would;

  • Export each Management Agent on a FIM/MIM Sync Server
  • Export the FIM/MIM Synchronisation Server Configuration
  • Take a copy of the Extensions Folder (where I keep my PowerShell Management Agents scripts)
  • Export the FIM/MIM Service Server Configuration

And that is what this post covers.

Overview

My automated process performs the following;

  1. An Azure PowerShell Timer Function WebApp is triggered at 2330 each night
  2. The Azure Function App initiates a Remote PowerShell session to my Dev MIM Sync Server (which is also a MIM Service Server)
  3. In the Remote PowerShell session the script;
    1. Creates a new subfolder under c:\backup with the current date and time (dd-MM-yyyy-hh-mm)
    2. Creates further subfolders for each of the backup elements
      1. MAExports
      2. ServerExport
      3. MAExtensions
      4. PortalExport
    3. Utilizes the Lithnet MIIS Automation PowerShell Module to;
      1. Enumerate each of the Management Agents on the FIM/MIM Sync Server and export each Management Agent to the MAExports folder
      2. Export the FIM/MIM Sync Server Configuration to the ServerExport folder
    4. Copies the Extensions folder and subfolder contents to the MAExtensions folder
    5. Utilizes the FIM/MIM Export-FIMConfig cmdlet to export the FIM Service Server Configuration to the PortalExport folder
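
A hedged sketch of those remote steps is below. For a self-contained illustration, the MA and server exports here shell out to the sync service's maexport.exe and svrexport.exe utilities rather than the Lithnet cmdlets the full script uses; the install path is the default and may differ in your environment.

# Sketch of the backup performed inside the Remote PowerShell session
$syncRoot = 'C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service'
$stamp    = Get-Date -Format 'dd-MM-yyyy-hh-mm'
$base     = "C:\backup\$stamp"
'MAExports','ServerExport','MAExtensions','PortalExport' |
    ForEach-Object { New-Item -ItemType Directory -Path (Join-Path $base $_) -Force | Out-Null }

# Export each Management Agent, then the Sync Server configuration
Import-Module LithnetMiisAutomation
foreach ($ma in Get-ManagementAgent) {
    & "$syncRoot\Bin\maexport.exe" $ma.Name (Join-Path "$base\MAExports" "$($ma.Name).xml")
}
& "$syncRoot\Bin\svrexport.exe" (Join-Path $base 'ServerExport')

# Copy the Extensions folder (PowerShell MA scripts etc.)
Copy-Item -Path "$syncRoot\Extensions" -Destination (Join-Path $base 'MAExtensions') -Recurse

# Export the FIM/MIM Service (Portal) configuration
Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue
Export-FIMConfig -PortalConfig -AllLocales |
    ConvertFrom-FIMResource -File (Join-Path "$base\PortalExport" 'PortalExport.xml')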

Implementing the FIM/MIM Backup Process

The majority of the setup to get this to work I’ve covered in other posts, particularly around Azure PowerShell Function Apps and Remote PowerShell into a FIM/MIM Sync Server.

Pre-requisites

  • I created a C:\Backup Folder on my FIM/MIM Server. This is where the backups will be placed (you can change the path in the script).
  • I installed the Lithnet MIIS Automation PowerShell Module on my MIM Sync Server
  • I configured my MIM Sync Server to accept Remote PowerShell sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file and uploaded it to my Function App, and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 2330 every night using the following CRON configuration

0 30 23 * * *

  • I set the Azure Function App Timezone to my timezone so that the nightly backup happened at the correct time relative to my timezone. I got my timezone index from here. I set the following variable in my Azure Function Application Settings to my timezone name, AUS Eastern Standard Time.

    WEBSITE_TIME_ZONE

The Function App Script

With all the pre-requisites met, the only thing left is the Function App script itself. Here it is. Update lines 2, 3 & 6 if your variables and password key file are different. The path to your password keyfile will be different on line 6 anyway.

Update line 25 if you want the backups to go somewhere else (maybe a DFS Share). If your MIM Service Server is not on the same host as your MIM Sync Server, change line 59 for the hostname. You'll need to get the FIM/MIM Automation PS Modules onto your MIM Sync Server too. Details on how to achieve that are here.

Running the Function App I have limited output, but enough to see it run. The first part of the script runs very quickly; the Export-FIMConfig is what takes the majority of the time. That said, it's less than a minute to get a nice point-in-time backup that is auto-magically executed nightly. Sorted.


Summary

The script itself can be run standalone and you could implement it as a Scheduled Task on your FIM/MIM Server. However I'm using Azure Functions for a number of things, and having something that is easily portable, repeatable and centralised with other functions (pun not intended) keeps things organised.

I now have a daily backup of the configurations associated with my development environment. I’m sure this will save me some time in the near future.

Follow Darren on Twitter @darrenjrobinson