
Simultaneously Start|Stop all Azure Resource Manager Virtual Machines in a Resource Group

Problem

How many times have you wanted to start or stop all Virtual Machines in an Azure Resource Group? For me it seems to be quite often, especially for development environment resource groups. It’s not that difficult though. You can just enumerate the VMs, then cycle through them and call ‘Start-AzureRMVM’ or ‘Stop-AzureRMVM’. However, because PowerShell runs those calls serially, the more VMs you have the longer that approach takes to complete. Go to the Portal and right-click on each VM and start|stop?

There has to be a way of starting or shutting down all VMs in a Resource Group in parallel via PowerShell, right?

Some searching suggests it’s common to use Azure Automation and Workflows to accomplish this. But I don’t want to run this on a schedule or necessarily mess around with Azure Automation for development environments, or have to connect to the portal and kick off the workflow.

What I wanted was a script that was portable. That led me to messing around with ‘ScriptBlocks’ and ‘Start-Job’ functions in PowerShell. Passing variables in for locally hosted jobs running against Azure was painful though. So I found a quick, clean way of doing it, which I detail in this post.

Solution

I’m using the brilliant Invoke-Parallel PowerShell script from Cookie.Monster to, in essence, multi-thread and run the Virtual Machine ‘start’ and ‘stop’ requests in parallel.

In my script at the bottom of this post I haven’t included ‘invoke-parallel.ps1’; the link for it is in the paragraph above. You’ll need to either reference it at the start of your script, or include it in your script if you want to keep it all together in a single file.

My rudimentary PowerShell script takes two parameters:

  1. Power state. Either ‘Start’ or ‘Stop’
  2. Resource Group. The name of the Azure Resource Group containing the Virtual Machines you want to start or stop, e.g. ‘RG01’

Example: .\AzureRGVMPowerGo.ps1 -power ‘Start’ -azureResourceGroup ‘RG01’ or PowerShell .\AzureRGVMPowerGo.ps1 -power ‘Start’ -azureResourceGroup ‘RG01’

Note: If you don’t have a session to Azure in your current environment, you’ll be prompted to authenticate.

Your VMs will start or stop simultaneously.

What’s it actually doing?

It’s pretty simple. The script enumerates the VMs in the Resource Group you’ve specified, and looks for those whose status (Running or Deallocated) is the inverse of the ‘Power’ state you’ve specified when running the script. It’ll start stopped VMs in the Resource Group when you run it with ‘Start’, or stop all running VMs in the Resource Group when you run it with ‘Stop’. Simples.
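For reference, here is a minimal sketch of that core logic, assuming the AzureRM module is loaded, an authenticated Azure session exists, and invoke-parallel.ps1 sits alongside the script (parameter names match the example earlier in the post). Each runspace needs the AzureRM cmdlets available, so depending on your module versions you may also need Invoke-Parallel’s -ImportModules switch.

param(
    [Parameter(Mandatory=$true)][ValidateSet('Start','Stop')][string]$power,
    [Parameter(Mandatory=$true)][string]$azureResourceGroup
)

# Dot-source Invoke-Parallel (or reference it at the start of your script)
. .\invoke-parallel.ps1

# Enumerate the VMs in the Resource Group along with their power state
$vms = Get-AzureRmVM -ResourceGroupName $azureResourceGroup -Status

if ($power -eq 'Start') {
    # Only act on VMs in the inverse state of the action requested
    $vms | Where-Object { $_.PowerState -eq 'VM deallocated' } | Invoke-Parallel -ImportVariables -ScriptBlock {
        Start-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name
    }
}
else {
    $vms | Where-Object { $_.PowerState -eq 'VM running' } | Invoke-Parallel -ImportVariables -ScriptBlock {
        Stop-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name -Force
    }
}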

This script could also easily be updated to do other similar tasks, such as deleting all VMs in a Resource Group.

Here it is

Enjoy.

Follow Darren Robinson on Twitter

An Office 365 public service announcement: the most important part of hybrid identity is not AADConnect or ADFS; rather it’s the existing environment.

Originally posted on Lucian’s blog at www.clouduccino.com. Follow Lucian on Twitter @LucianFrango for daily doses of cloud.


This is part frustrated (mild) rant, part helpful hint and, like the title says, part public service announcement. While I am most definitely a glass-half-full kind of person, and I don’t get stressed out or fazed by pressure much, I, like anyone, do get annoyed with certain things. Let’s start with a quick vent before continuing on with the public service announcement.

Rather than just have a rant or a whinge, let me explain the situation and by doing so I’ll most likely vent some frustration.

Deploying a Hybrid Exchange environment to integrate with Exchange Online in Office 365 can be a behemoth of a task if you lack certain things. While anyone can list off any number of criteria, important considerations, pre-requisites or requirements, I feel that there is only one thing that needs to be addressed; one thing that sticks in my mind as being the foundation of the endeavour.

Just like the old saying goes “you can’t build a strong house on weak foundations”; the same applies to that initial journey to the cloud.

I say initial journey because, for many organisations, after setting up a website (which can itself be the beginning of becoming a cloud-focused organisation), Office 365 is the first step I most often see taken to move infrastructure, processes and systems to the cloud.

Importantly, though, realise that as amazing and full of features as Office 365 is, when deploying a Hybrid environment to leverage what I consider the best of both worlds, a hybrid identity, all that matters is the existing Active Directory Domain Services (ADDS) environment. That is ALL THAT MATTERS.

Step aside Active Directory Federation Services (ADFS) and Azure AD Connect (AADConnect) or even Hybrid Exchange Server 2016 itself. All those components sit on top of the foundation of the existing identity and directory services that is the ADDS environment.

ADDS is so crucial as it is the key link in the chain, so much so that if it has issues, the entire project can easily and quickly run into trouble and delays. Delays lead to cost. Delays lead to unhappy management. Delays lead to unhappy users. Delays lead to the people working on the project going grey.

I remember a time when I had blonde curly hair. I grew out of that and as I got older my hair darkened to a rich, chocolate (at least 70% cocoa) brown. Now, as each project gets notched on my belt, slowly, the slick chocolate locks are giving way to the odd silky, white sign that there’s not enough emphasis on a well-managed and organised ADDS.


Delegate Mailbox Access using Groups in Exchange Online

A common misconception about granting mailbox access rights in Exchange Online is that you can only grant access to an individual and not a group. You may have opened the Exchange Admin Center (EAC), found the mailbox you wanted and looked at the delegated access tab, only to be presented with a list of eligible user identities, but none of the on-premises security groups that have been created. Fear not, the on-premises groups just need a little remediation to the correct flavour to be seen in the picker and then applied.
The key settings of a group to assign mailbox access in Exchange Online are:
  1. Universal Security
  2. Mail Enabled
Things in your environment may have gotten this way over time, as legacy on-premises Exchange Server versions weren’t as picky and would allow the permission to be granted to any kind of group you threw at them. Exchange Online has more sophisticated tastes that need to be catered for, due to its evergreen state with all the latest Exchange trends.
If you migrate on-premises mailboxes with these slightly incorrect permissions to Exchange Online, the permissions will be lost forever. So if you remedy the existing on-premises mailboxes before initiating a remote move request to Exchange Online, no further action will be required at completion.

The Process

Update the group to universal:
Get-Group $Name | Set-Group -Universal
Mail enable the group:
Enable-DistributionGroup -identity $Name
Finally, not compulsory but if you want the group to be hidden from your Exchange Address Book:
Get-DistributionGroup -identity $Name | Set-DistributionGroup -HiddenFromAddressListsEnabled:$True
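If you have a batch of groups to remediate, a quick sketch like the following runs all three steps in one pass (the groups.txt input file, one group name per line, is an assumption):

Get-Content .\groups.txt | ForEach-Object {
    # Convert to universal scope, mail-enable, then hide from the address book
    Get-Group $_ | Set-Group -Universal
    Enable-DistributionGroup -Identity $_
    Set-DistributionGroup -Identity $_ -HiddenFromAddressListsEnabled:$True
}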
Force a synchronisation of your group modifications with Azure Active Directory and you should find that the group is available in the Office 365 Exchange Admin Center. On the Azure AD Connect server:
CD C:\Program Files\Microsoft Azure AD Sync\Bin\
.\DirectorySyncClientCmd.exe delta
My final note, which I’d like to re-iterate, is that if you tidy this stuff up before migrating to Office 365 you will have far fewer problems when the mailbox becomes active in Office 365. A mailbox that has been migrated with a faulty group type will have permissions assigned on the object, but they will not resolve to an object and will therefore only show SID information. Firstly the permissions fail, but secondly, it becomes very difficult to find out what the group previously was so it can be re-added post migration.

Hot Tip

Fix the issue with your groups before you start the synchronisation of the mailbox to Exchange Online and the permissions will be stamped correctly during the initial phases of the move request; once finalised, they will work as expected.

Microsoft Azure Stack is the New Hybrid Cloud

Last week Microsoft released the public technical preview of the new Azure Stack. Azure Stack, along with its predecessor Windows Azure Pack, gives anyone the ability to extend Azure management capabilities to their on-premises datacentre.

Firstly, a bit of background.

With Windows Server 2012 R2, Microsoft made Windows Azure Pack available. Azure Pack offered an on-premises integration point between Windows Server, System Center, and SQL Server to offer a self-service portal and private cloud services including virtual machine provisioning and management (IaaS), database as a service (DBaaS), and scalable web application hosting (PaaS). Azure Pack can be integrated with Service Management Automation (SMA) and leverages Desired State Configuration (DSC) to manage IaaS deployments.

Microsoft built Windows Azure Pack on a common Service Management API based on the Azure Service Management (ASM) API, which is how the “Classic” Azure platform is managed. Azure Pack delivers a self-service portal based on System Center Service Provider Foundation (a subset of Orchestrator features) that utilises templates and other library resources from System Center to provision virtual networks, virtual machines, web apps and other IaaS and PaaS resources on-premises.

It’s about the Stack

The rapid rate of change in Azure and the introduction of a new management API (Azure Resource Manager – ARM) meant Azure Pack features fell behind what is possible in Azure today.

In 2015 Microsoft announced that with the release of Windows Server 2016 it would provide Azure Stack, the next iteration of hybrid cloud integration, looking to address Azure Pack’s shortcomings by providing an Azure-consistent API (using ARM) for managing the underlying resource providers (compute, network and storage) across public and private clouds.

By utilising the ARM API across both the Azure cloud and Hyper-V on-premises, Azure Stack users will be able to:

  • Deploy resources across public and private clouds from the same JSON templates and PowerShell cmdlets (see the sketch after this list)
  • Reduce reliance on System Center to manage underlying resources (ARM does not use System Center on-premises, though System Center may still be used to manage infrastructure)
  • Utilise the same resource management model over public and private clouds.
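To illustrate that consistency, here is a hedged sketch of deploying the same template to both clouds; the environment name and ARM endpoint below are placeholder assumptions, and the exact registration cmdlets may differ between preview builds:

# Register the Azure Stack ARM endpoint alongside public Azure (values are placeholders)
Add-AzureRmEnvironment -Name 'AzureStack' -ArmEndpoint 'https://management.local.azurestack.external'

# Same JSON template, private cloud target
Login-AzureRmAccount -EnvironmentName 'AzureStack'
New-AzureRmResourceGroupDeployment -ResourceGroupName 'RG01' -TemplateFile '.\template.json'

# Same JSON template, public cloud target
Login-AzureRmAccount -EnvironmentName 'AzureCloud'
New-AzureRmResourceGroupDeployment -ResourceGroupName 'RG01' -TemplateFile '.\template.json'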

Adopting a Hybrid Cloud approach marks a movement away from integrating hardware with software and towards software controlling the underlying hardware, treating it as nothing more than additional capacity.

Now, when we consider the design of an on-premises Private Cloud, we know it needs to take into account the Azure tenancy and the toolsets and processes that have been implemented or are planned for the near future. Having this inter-operability between platforms empowers us to extend a true ‘Hybrid Cloud’ for greater cloud consistency.

It is important to note that at launch only a fraction of the full Azure offerings are available on the Azure Stack preview. At this time this includes compute, storage and network resources (IaaS) plus some of the PaaS features such as web and mobile app services.

Aside from feature gaps, there are also markedly different use cases for deploying applications on-premises or to the cloud, such as:

  • Compliance related concerns as to where your customer or corporate data resides. For some industry sectors it can sometimes be a lengthy process to get cloud use approved by legal, regulatory, security and industry stakeholders.
  • Financial constraints can determine one approach or the other – if deploying on-premises infrastructure, you will have to deploy for peak demand and if this is only intermittent (e.g. monthly or quarterly usage spikes) this can provide a better case for scalable cloud services. On the other hand, for static loads that need to run 24/7, on-premises can be a more cost effective approach (although it can also be misleading to compare without including sunk costs of the datacentre vs. the prospective cost of a cloud service… but that’s an argument for another time).
  • The other big reason for deploying workloads on-premises is that your organisation’s journey to the cloud is going to be a long one, so while the aspiration is there, a pragmatic approach is to deploy the easy workloads to the cloud first and the more difficult workloads and applications remain on-premises.

As most large organisations face one or more of the scenarios above, Azure Stack delivers the benefits of cloud manageability, speed of deployment, and the economy and simplicity of having one management framework across the Public and Private clouds.

One of the great use cases that springs to mind is to deploy your dev/test workloads to Azure and take advantage of scaling up during heavy use and UAT, then when there is little activity simply turn off your servers and automate out-of-hours and weekend shutdowns to further reduce costs. When code is ready to be deployed to production you can then use all the same templates to deploy into your production systems that reside in your datacentre hosted on Azure Stack.

To get started, download and deploy Windows Server 2016 Technical Preview 4: https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-technical-preview.

Azure Stack can be downloaded by registering at the following link: https://azure.microsoft.com/en-us/overview/azure-stack/try/

And you will have to be mindful of the Azure Stack requirements: https://azure.microsoft.com/en-us/documentation/articles/azure-stack-deploy/

That’s it for now, but there will be much more to come on Azure Stack as we explore the platform and start rolling out with our customers.

Office 365 Import Service via PowerShell

If you’ve been involved in an Exchange migration to Office 365 of late, I’m sure you’ll be well aware of the new Microsoft Office 365 Import Service. Simply put, give them your PSTs via hard disk or over the network, and they’ll happily ingest them into the mailbox of your choice – what’s not to like! If you’ve been hiding under a rock, the details are available here!

Anyway, rather than marvelling at the new Microsoft offering, I thought I’d share my recent experience with the network route option.  When I first started playing [read testing] I soon noticed that the Portal GUI was, well, struggling to keep up with the requests.  So instead, I reached out to MS and found that this is all possible programmatically using PowerShell!

So first of all, you want to get the files into an Azure blob. That’s achieved via AzCopy, which is available at this link. Then you’ll want to grab the Azure Storage Explorer; granted, you could probably do this all via PowerShell, but when there is a good GUI sometimes it’s just easier. It’s available here.

When you fire up Storage Explorer, you’re looking for the Storage Key. This is available via the Portal | Admin | Import. From there, you’ll see a key icon referred to as ‘Copy Office 365 Storage Key and URL‘. Once you’re connected you should see some default folders, as in the screenshot below…

Blob Storage

Select one and click Security, select the Shared Access Key tab and generate a Signature. Now we’ve got the necessary items, we can plumb them into variables as I’ve demonstrated below…

cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"

$Source = "\\Server\Share"
$Destination = "https://55f6dd****49d9a5bcf36.blob.core.windows.net/ingestiondata/"
$StorageKey = "9XAsHc/OmkZgdrSXkE+***+njq9qQWvt*******rXysz//CvB2j6mlb6XSJgi4bfh+Br6Onn*****AEVtA=="
$LogFile = "C:\Users\dyoung\Documents\Uploads.log"

& .\AzCopy.exe /Source:$Source /Dest:$Destination /DestKey:$StorageKey /S /V:$LogFile

This will grab all the PSTs from the \\Server\Share UNC path and upload them to the ingestiondata directory. There is a log file which is pretty useful, and the PowerShell window will let you know once the upload is complete.

Now we’ve got our PSTs up in Azure, but we need to let MS know where to ingest them. For this, we need to generate a CSV. When I was creating the PST names I found it good to name them ADDisplayName.pst, simply so we can automate the CSV creation. I also added incremental numbers at the end of each file, as it’s possible to import multiple PSTs into a single mailbox…

e.g. ‘Dave Young.pst‘, ‘Dave Young1.pst‘ and ‘Dave Young2.pst‘ would all be imported into Dave Young’s mailbox.

I also decided to import to the user’s archive mailbox (as I’ve already given the users an archive) and I wanted the PSTs in the root directory… these options are all covered in the TechNet article here.

$FilePath = "/"

$AllPSTs = Get-ChildItem '\\Server\Share' -Filter "*.pst"

$CSVTable = @()
for ($Cycle = 0; $Cycle -lt $AllPSTs.Count; $Cycle++)
{
    # Strip any trailing digits so 'Dave Young1.pst' still resolves to the 'Dave Young' mailbox
    $DisplayName = [io.path]::GetFileNameWithoutExtension($AllPSTs[$Cycle].Name)
    $DisplayName = $DisplayName -replace '\d+$'

    $Mailbox = Get-Mailbox $DisplayName
    # Primary SMTP address (not written to the CSV, but handy for verification)
    $SMTPAddress = $Mailbox.PrimarySmtpAddress.ToString().ToLower()
    $AzureBSAccountUri = 'https://55f6ddf********a5bcf36.blob.core.windows.net/ingestiondata/' + $AllPSTs[$Cycle].Name

    $CSVTable += [PSCustomObject]@{
        Workload                   = 'Exchange'
        FilePath                   = $FilePath
        Name                       = $AllPSTs[$Cycle].Name
        Mailbox                    = $Mailbox.UserPrincipalName
        AzureSASToken              = '?sv=2014-02-14&sr=c&sig=2dSSbF9CsQ******HpwHOJy4pmGFvsniv46hZh%2BA%3D&st=2016-02-01T13%3A30%3A00Z&se=2016-02-09T13%3A30%3A00Z&sp=r'
        AzureBlobStorageAccountUri = $AzureBSAccountUri
        IsArchive                  = 'TRUE'
        TargetRootFolder           = '/'
        SPFileContainer            = ''
        SPManifestContainer        = ''
        SPSiteUrl                  = ''
    }
}

$CSVTable | Export-Csv "PstImportMappingFile-$((Get-Date).ToString('hh-mm-dd-MM-yy')).csv" -NoTypeInformation

Hopefully the output of this script is a pretty decent CSV. Now it’s time to create the import jobs. You might be wondering: there is no ‘New-o365MailboxImportRequest‘ cmdlet, and you’d be right. I’ve used a prefix so I could load both on-premises and cloud cmdlets into the same PowerShell session; it’s achieved with ‘Import-PSSession $Session -Prefix o365′, as described in point 2 here.
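If you haven’t set that up yet, a minimal sketch for loading the Exchange Online cmdlets with the ‘o365’ prefix looks like this (standard Exchange Online remoting values at the time of writing):

# Connect to Exchange Online and load its cmdlets with an 'o365' prefix,
# leaving the on-premises cmdlet names untouched in the same session
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'https://outlook.office365.com/powershell-liveid/' -Credential (Get-Credential) -Authentication Basic -AllowRedirection
Import-PSSession $Session -Prefix o365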

Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
InitialDirectory = [Environment]::GetFolderPath('MyDocuments')
Filter = 'CSV (*.csv)|*.csv'
}
[void]$FileBrowser.ShowDialog()
$FileBrowser.FileNames

Import-Csv $FileBrowser.FileName | foreach {New-o365MailboxImportRequest -Mailbox $_.Mailbox -AzureBlobStorageAccountUri $_.AzureBlobStorageAccountUri -BadItemLimit unlimited -AcceptLargeDataLoss -AzureSharedAccessSignatureToken $_.AzureSASToken -TargetRootFolder $_.TargetRootFolder -IsArchive}

These jobs can of course be checked from PowerShell using the ‘Get-MailboxImportRequest’ & ‘Get-MailboxImportRequestStatistics’ cmdlets (with the ‘o365’ prefix if you’ve loaded both sessions as above).

And that’s pretty much it. Yes, it was quick, but I reckon it’s pretty self-explanatory and it’s considerably more efficient than doing it via the GUI – enjoy!

Cheers,
Dave

Dynamic Active Directory User Provisioning placement (OU) using the Granfeldt Powershell Management Agent

When using Forefront / Microsoft Identity Manager for provisioning users into Active Directory, determining which organisational unit (OU) to place the user in varies from customer to customer. Some AD OU structures are flat, others hierarchical based on business, departmental, functional role or geography. Basically every implementation I’ve done has been different.

That said, the most recent implementation I’ve done is for an organisation that is growing, and as such the existing structure is in flux and based on differing logic depending on who you talk to. They’re restructuring their OU structure soon, but not before MIM goes live. Damn.

This blog post details my interim solution to enable provisioning of new users into AD with an existing complex OU structure until the customer simplifies their hierarchy with upcoming work and without having to maintain a matrix of mapping tables.

Overview

In this blog post I’ll document how you can perform dynamic Active Directory user provisioning placement utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent.

Using the Synchronisation Rule for AD Outbound I land the new AD account in a landing OU. My Placement MA (using the PS MA) sees accounts in this landing OU, then queries back into AD using the new user’s Title (title) and State (st) attributes to find the locations (DNs) of other AD users with the same values. It orders the results and moves our new AD account to the location with the greatest number of users with the same metadata.

My User Placement MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

Getting Started with the Granfeldt PowerShell Management Agent

In case you haven’t read my other PS MA blog posts, here is the background on getting started with the Granfeldt PS MA. If you have, you can skip through this section.

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc are assumed knowledge, and if not are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Placement

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only using this MA to move the AD object, the schema only consists of the attributes needed to work out where the user should reside and to update them.
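As a hedged illustration, a schema.ps1 along these lines follows the Granfeldt PS MA convention of returning a PSCustomObject whose ‘Name|Type’ properties define the schema; the exact attribute list and sample values here are illustrative:

$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name 'Anchor-AccountName|String' -Value 'sAMAccountName'
$obj | Add-Member -MemberType NoteProperty -Name 'objectClass|String' -Value 'user'
$obj | Add-Member -MemberType NoteProperty -Name 'title|String' -Value 'Job Title'
$obj | Add-Member -MemberType NoteProperty -Name 'st|String' -Value 'State'
$obj | Add-Member -MemberType NoteProperty -Name 'targetOU|String' -Value 'OU=SomeOU,DC=domain,DC=com'
$obj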

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

The import script looks to see if the user is currently residing in the LandingOU that the AD MA puts new users into. If the user is, the import script gets the user’s ‘Title’ and ‘State’ attribute values and searches AD to find any other users with the same State and Title, pulling back their OUs. The script discounts any users that are in the Expired OU structure and any users (including the user being processed) that are in the Landing OU. It orders the list by occurrence and then selects the first entry. This becomes the location we will move the user to. We also update the targetOU custom attribute with where the user is once moved, to keep the synchronisation rule happy and no longer trigger a move on the user (unless someone moves the user back to the LandingOU). The location of the LandingOU and the Expired OU’s path are set near the top of the script. If we don’t find any matches then I have a catchall per State (st) where we put the user; they can then be moved manually later if required.

On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have filters on the MA. I have a few there as well, but limiting the size of the returned array makes full imports quicker to run.
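The heart of that lookup can be sketched as follows; the OU paths and the $adUser variable are assumptions, and the real import script wraps this in the PS MA import plumbing:

$landingOU = 'OU=Landing,DC=domain,DC=com'   # where the AD MA lands new accounts (assumption)
$expiredOU = 'OU=Expired,DC=domain,DC=com'   # discounted from the results (assumption)

# Find everyone with the same Title and State, extract each user's parent OU,
# discount the Landing and Expired OUs, then take the most common remaining OU
$targetOU = Get-ADUser -Filter "Title -eq '$($adUser.Title)' -and st -eq '$($adUser.st)'" |
    ForEach-Object { ($_.DistinguishedName -split ',', 2)[1] } |
    Where-Object { $_ -ne $landingOU -and $_ -notlike "*$expiredOU" } |
    Group-Object |
    Sort-Object Count -Descending |
    Select-Object -First 1 -ExpandProperty Name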

Export Script (export.ps1)

The export script basically performs the move of the user’s AD account.
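Stripped of the PS MA export plumbing, the move itself is a one-liner (variable names follow the import sketch above):

# Relocate the account; the confirming import then updates targetOU
Move-ADObject -Identity $adUser.DistinguishedName -TargetPath $targetOU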

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attributes, create the Set used for the transition, create your Synchronisation Rule, create your Placement Workflow, create your Placement MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flow is just bringing back in the DN to a custom MV attribute.

Synchronisation Rules

My Placement MA Outbound Sync rule relies on the value of the new OU that was brought in by the import script.

We flow out the new DN for the user object.

Set

I created a Set that I use as a transition to trigger the move of the user object on my Placement MA. My Set looks to see if the user account is in AD and active (I have a Boolean attribute, not shown, and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine with some logic to determine if the employment date as sourced from my HR Management Agent is current and the user should be active) and that the user is currently in the LandingOU.

Workflow

An action-based workflow that is used to trigger the Synchronisation Rule on the Placement MA.

MPR

Finally my MPR for moving the AD object is based on the transition set,

and my Placement Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to accomplish something that would have been a nightmare to try to do with declarative rules, or un-maintainable using a matrix of mapping tables. It would also be possible to use this as a basis to move users when their State or Title changes. This example could also be reworked easily to use different attributes.

Follow Darren on Twitter @darrenjrobinson

Managing AD Terminal Services Configuration with FIM / MIM using the Granfeldt PowerShell Management Agent

Forefront / Microsoft Identity Manager contains numerous Management Agents (MAs) out of the box. However, an MA for managing AD Terminal Services user configuration isn’t one of them. At first pass you’d think you could just manipulate a few attributes in AD on an AD MA, like you do for home directories (aside from creating the file and permissions on the filesystem), and you’d be done. Don’t worry, I made that wrong assumption too.

Overview

In this blog post I’ll document how you can enable Active Directory users with the necessary attributes and file system elements utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you how to do the minimum: provisioning a terminal services home directory and a terminal services profile directory, updating the associated user’s AD account with those paths and the terminal services drive letter, whilst also allowing the user to logon. Once you understand how this is done you can easily extend the functionality for lifecycle management (e.g. changing the location of terminal services directories, deleting them for de-provisioning). For the file system directories and permissions I’ve re-used logic from my AD home directory PS MA detailed here, which is based on Kent Nordström’s home directory MA stripped back with a few other subtle changes.

My Terminal Services MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

A note on managing Terminal Services Configuration

Terminal Services configuration properties are, historically, held in a single attribute, ‘userParameters’ (a BLOB, a Binary Large OBject). Inside the userParameters BLOB are the four attributes we’re interested in:

  • allowLogon
  • terminalServicesHomeDirectory
  • terminalServicesProfilePath
  • terminalServicesHomeDrive

What you see via LDAP in the userParameters attribute is either an empty attribute (no terminal services configuration) or an attribute full of encoded characters. We’ll use this as our trigger for the MIM rules to determine who has an existing configuration and who doesn’t.

Getting Started with the Granfeldt PowerShell Management Agent

In case you haven’t read my other PS MA blog posts, here is the background on getting started with the Granfeldt PS MA. If you have, you can skip through this section.

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc are assumed knowledge, and if not are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\TermServ

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing provisioning of terminal services configuration, the schema only consists of the attributes needed to create the terminal services directories, set the terminal services drive letter and update the user object in AD.

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

The import script looks to see if there is a terminal services config for the user (in the userParameters BLOB) and then uses the ADSI provider, which gives us access to the terminal services configuration attributes, to read and flow through the four attributes we’re interested in.
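Reading those values can be sketched like this (the DN is illustrative). The terminal services properties aren’t visible as normal LDAP attributes, hence psbase.InvokeGet, which decodes the userParameters BLOB via the IADsTSUserEx interface:

$adsiUser = [ADSI]"LDAP://CN=Jane Citizen,OU=Staff,DC=domain,DC=com"

# Decode the four values we're interested in from the userParameters BLOB
$allowLogon  = $adsiUser.psbase.InvokeGet('allowLogon')
$tsHomeDir   = $adsiUser.psbase.InvokeGet('terminalServicesHomeDirectory')
$tsHomeDrive = $adsiUser.psbase.InvokeGet('terminalServicesHomeDrive')
$tsProfile   = $adsiUser.psbase.InvokeGet('terminalServicesProfilePath')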

On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have filters on the MA. I have a few there as well, but limiting the size of the returned array makes full imports quicker to run.

Export Script (export.ps1)

The export script does a couple of things. It creates the terminal services home and profile directories, sets the necessary permissions on the home directory for the user, and then updates the AD object with the terminal services home and profile directory paths and the terminal services drive letter, whilst allowing the user to logon.
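A hedged sketch of those steps follows; the share paths, domain name, drive letter and the $sam/$userDN variables are assumptions:

$sam       = 'U12345'                        # assumption
$tsHome    = "\\fileserver\tshome$\$sam"     # assumption
$tsProfile = "\\fileserver\tsprofile$\$sam"  # assumption

# Create the directories and grant the user Modify rights on the home directory
New-Item -Path $tsHome -ItemType Directory -Force | Out-Null
New-Item -Path $tsProfile -ItemType Directory -Force | Out-Null
$acl  = Get-Acl $tsHome
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("DOMAIN\$sam", 'Modify', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl -Path $tsHome -AclObject $acl

# Write the four values back into the userParameters BLOB and allow logon
$adsiUser = [ADSI]"LDAP://$userDN"           # assumption
$adsiUser.psbase.InvokeSet('terminalServicesHomeDirectory', $tsHome)
$adsiUser.psbase.InvokeSet('terminalServicesHomeDrive', 'H:')
$adsiUser.psbase.InvokeSet('terminalServicesProfilePath', $tsProfile)
$adsiUser.psbase.InvokeSet('allowLogon', 1)
$adsiUser.psbase.CommitChanges()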

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attributes, create the Set used for the transition, create your Synchronisation Rule, create your Terminal Services Workflow, create your Terminal Services MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flow is just bringing back in the DN to a custom MV attribute.

Synchronisation Rules

My Terminal Services Outbound Sync rule doesn’t need to be very complex. All it is doing is sync’ing out the Terminal Services Profile and Directory paths, Terminal Services Drive Letter and Allow Logon attributes.

Set

I created a Set that I use as a transition to trigger provisioning to the Terminal Services MA. My Set looks to see if the user account is in AD and active (I have a Boolean attribute, not shown, and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine with some logic to determine if the employment date as sourced from my HR Management Agent is current and the user should be active) and that the Terminal Services config is not already set (this is also based on a Boolean attribute I set via an advanced flow rule on my AD MA, based on whether there is a value in the userParameters attribute). See my Exchange mailbox provisioning post using the PS MA for an example of setting a Boolean attribute based on an attribute being present or not.

Workflow

An action based workflow that is used to trigger the Synchronisation rule on the Terminal Services MA.

MPR

Finally my MPR for provisioning Terminal Services config is based on the transition set,

and my Terminal Services Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to provision a terminal services configuration to new users and permission the directories correctly. It is also easy to extend the solution to manage moving terminal services directories to different locations, deletion etc.

 

Follow Darren on Twitter @darrenjrobinson

Provisioning Home Directories for Active Directory Users with FIM / MIM using the Granfeldt PowerShell Management Agent

Forefront / Microsoft Identity Manager contains numerous Management Agents (MAs) out of the box. However, an MA for creating user home directories and setting the associated permissions isn’t one of them.

Over the years I’ve accomplished home directory provisioning and permissioning in Active Directory / Windows File Services and Novell eDirectory / Novell File Services using methods that aren’t strictly best practice / supported (e.g. calling native libraries from within a Management Agent Extension to create/manage/delete etc). Whilst this functionally works, the ability for end customers to maintain the implementation as things change is limited.

Overview

In this blog post I’ll document how you can enable Active Directory users with a home directory with the required permissions, and update their associated AD object with the home directory path and drive letter, utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you how to do the minimum: provisioning a home directory for a user. Once you understand how this is done you can easily extend the functionality for lifecycle management (e.g. changing the location of the home directory, adding sub-directories, deleting for de-provisioning). I didn’t start from scratch on this one, as Kent Nordström had most of what I needed, as detailed here. I stripped it back to do what I wanted with a few other subtle changes.

My Home Directory MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc are assumed knowledge, and if not are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\HomeDir

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account, this MA only handles provisioning of home directories. As such the schema only consists of the attributes needed to create the home directory and update the user object in AD.

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

Nothing too complex: the import script brings back in the two attributes that we export, to allow the confirming import to complete.
On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have filters on the MA. I have a few there as well, but limiting the size of the returned array makes full imports quicker to run.

Export Script (export.ps1)

The export script does a couple of things. It creates the home directory, sets the necessary permissions on it for the user, and then updates the AD object with the home directory path and the home directory drive letter.
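Stripped back, the core of that export logic can be sketched as follows (share path, domain and drive letter are assumptions):

$sam      = 'U12345'                     # assumption
$homePath = "\\fileserver\home$\$sam"    # assumption

# Create the directory and grant the user Modify rights
New-Item -Path $homePath -ItemType Directory -Force | Out-Null
$acl  = Get-Acl $homePath
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule("DOMAIN\$sam", 'Modify', 'ContainerInherit,ObjectInherit', 'None', 'Allow')
$acl.AddAccessRule($rule)
Set-Acl -Path $homePath -AclObject $acl

# Stamp the AD object with the path and drive letter
Set-ADUser -Identity $sam -HomeDirectory $homePath -HomeDrive 'H:'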

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attributes, create the Set used for the transition, create your Synchronisation Rule, create your Home Directory Workflow, create your Home Directory MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified, but as we’re not doing password management it’s empty, as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flow is just bringing back in the DN to a custom MV attribute.

Synchronisation Rules

My Home Directory Outbound Sync rule doesn’t need to be very complex. All it is doing is sync’ing out the Home Directory Drive Letter and Home Directory Path attributes.

Set

I created a Set that I use as a transition to trigger provisioning of the Home Directory. My Set looks to see if the user account is in AD and active (I have a Boolean attribute, not shown, and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine with some logic to determine if the employment date as sourced from my HR Management Agent is current and the user should be active) and that the Home Folder Path is not already set.

Workflow

An action-based workflow that is used to trigger the Synchronisation Rule on the Home Directory MA.

MPR

Finally my MPR for provisioning Home Directories is based on the transition set,

and my Home Directory Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to provision home directories to new users and permission them correctly. It is also easy to extend the solution to manage moving home directories to different locations, deletion etc.

Follow Darren on Twitter @darrenjrobinson

Provisioning Users for Lync / Skype for Business with FIM / MIM using the Granfeldt PowerShell Management Agent

Forefront / Microsoft Identity Manager contains numerous Management Agents (MAs) out of the box. However, an MA for Lync / Skype for Business isn’t one of them.

Over the years I’ve accomplished lifecycle management for users in Lync via FIM using methods that aren’t strictly best practice / supported (e.g. calling PowerShell from within a Management Agent Extension and going out-of-bounds to enable/disable/manage policies). Whilst this functionally works, the ability for end customers to maintain the implementation as things change is limited.

Overview

In this blog post I’ll document how you can enable an Active Directory user for Lync / Skype for Business utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you how to do the minimum: enabling a user. Once you understand how this is done you can easily extend the functionality for lifecycle management (e.g. changing mobility, federation or voice policies for users based on managed attributes, and de-provisioning).

My Lync / Skype for Business PS MA is used in conjunction with an Active Directory MA and Declarative Rules in the MIM Portal.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc are assumed knowledge, and if not are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the service account to the appropriate Lync role group for user management)
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use “dir /x” to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Lync

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing provisioning, the schema only consists of the attributes needed to know the state of the user with respect to enablement and the attributes needed to enable an account.

Password Script (password.ps1)

Just a placeholder file as detailed above.

Import Script (import.ps1)

I’m using msDS-cloudExtensionAttribute20 as a breadcrumb attribute. I’m not exporting anything else on the Lync MA, but we need to export something to trigger the export script. It exists in the import script as well for the confirming import.
On the $request line you’ll see I have a filter for the User Object Class and users with an account name starting with ‘U’. For the customer I wrote this for, all users start with U. You can obviously also have filters on the MA. I have a few there as well, but limiting the size of the returned array makes full imports quicker to run.

Export Script (export.ps1)

In the working example below I’ve hardcoded the RegistrarPool into the export script. If your environment contains more than one Registrar Pool, you could instead flow the desired pool via a rule on the AD MA that creates the AD users, making it an initial-flow-only attribute.

A key item is that I’m also using remote PowerShell for Lync. That way I don’t need the PowerShell module for Lync on the MIM server itself.
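A hedged sketch of that remote session and the enable call; the pool FQDN, SIP domain and the $sam/$creds variables are assumptions:

# Remote PowerShell to Lync, so the Lync module isn't needed on the MIM server
$session = New-PSSession -ConnectionUri 'https://lyncpool.domain.com/OcsPowerShell' -Credential $creds
Import-PSSession $session -AllowClobber | Out-Null

# Enable the user against the hardcoded Registrar Pool
Enable-CsUser -Identity $sam -RegistrarPool 'lyncpool.domain.com' -SipAddressType SamAccountName -SipDomain 'domain.com'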

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attribute, create the Set used for the transition, create your Synchronisation Rule, create your Lync Workflow, create your Lync MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified, but as we’re not doing password management it’s empty, as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flows are a combination of logic used for other parts of my solution, and a Boolean flag to determine whether the user is enabled for Lync or not (used for my Transition Set and my export script).

Synchronisation Rules

My Lync Outbound Sync rule doesn’t need to be very complex. All it is doing is sync’ing out my breadcrumb attribute: I’m flowing the objectGUID out to msDS-cloudExtensionAttribute20.

Set

I created a Set that I use as a transition to trigger provisioning to Lync. My Set looks to see if the user account is in AD and active (I have a Boolean attribute, not shown, and another rule in the MIM Portal that is set based on an advanced flow rule in the Sync engine with some logic to determine if the employment date as sourced from my HR Management Agent is current and the user should be active).

Workflow

An action-based workflow that is used to trigger the Synchronisation Rule on the Lync MA.

MPR

Finally my MPR for provisioning Lync is based on the transition set,

and my Lync Workflow.

Summary

Using the Granfeldt PowerShell MA it was very easy to provision Lync / Skype for Business to users. It is also easy to extend to manage voice policies etc.

Follow Darren on Twitter @darrenjrobinson

 

Provision Users for Exchange with FIM/MIM 2016 using the Granfeldt PowerShell MA, avoiding the AD MA (no-start-ma) error

Forefront / Microsoft Identity Manager provides Exchange mailbox provisioning out of the box on the Active Directory Management Agent. I’ve used it in many, many implementations over the years. However, in my first MIM 2016 implementation in late 2015 I ran into issues with something I’d done successfully many times before.

I was getting “no-start-ma” on the AD MA on export to AD, at the point where the MA sets up its connection to the Exchange environment. After some searching I found Thomas’s blog detailing the problem and a solution: in short, update the MIM Sync Server to .NET 4.6. For me this was no joy. However, when MS released the first rollup update for MIM in December, everything fired up and worked as normal.

Step forward a month. As I was finalising development of the MIM solution I was building for my customer, my “no-start-ma” error was back when I re-enabled mailbox provisioning. Deselect the Exchange Provisioning option on the AD MA and all is good; re-enable it and it fails. With one week of dev left and mailbox provisioning needed, it was time for a workaround whilst I lodged a Premier Support ticket.

So how can I get mailbox provisioning working reliably and quickly? I was already using Søren Granfeldt’s PowerShell MA for managing users’ Terminal Services configuration, home directories and Lync/Skype for Business. What’s one more? Look out for the blog posts on using the PS MA to perform those other functions, which I’ll be posting in the coming weeks.

Using the Granfeldt PowerShell Management Agent to Provision Exchange Mailboxes

In this blog post I’ll document how you can enable mailbox provisioning in Exchange utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. I’ll show you how to do the minimum: enabling a user with a mailbox. Once you understand how this is done you can easily extend the functionality for lifecycle management (e.g. changing account settings for POP/IMAP/ActiveSync, and de-provisioning).

My Exchange PS MA is used in conjunction with an Active Directory MA and Declarative Provisioning Rules in the MIM Portal. Essentially all the AD MA does when you enable Exchange Provisioning (when it works) is call the ‘update-recipient’ cmdlet to finish off the mailbox provisioning. My Exchange PS MA does the same thing.

Overview

There are three attributes you need to supply values for in order to provision a mailbox (on top of having an Active Directory account, of course):

  • mailNickName
  • homeMDB
  • homeExchangeServerName

The latter two I’m flowing appropriate values for using my Active Directory MA. I’m setting those attributes on the AD MA, as I’m provisioning the AD account on that MA, which lets me set those two attributes as initial flow only. I’m doing that because over time it is highly likely that those attribute values will change with normal business-as-usual messaging admin tasks. I don’t want my Exchange MA stomping all over them.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM, and this blog post is no different. Configuration tasks like adding additional attributes to the User Object Class in the MIM Portal, updating MPRs, flow rules, Workflows, Sets etc are assumed knowledge, and if not are easily Bing’able for you to work out.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run the scripts as need to be in the format of just ‘accountname’ NOT ‘domain\accountname’. I’m using the service account that I’ve used for the Active Directory MA. The target system is the same directory service and the account has the permissions required (you’ll need to add the management agent account to the appropriate Exchange role group for user management)
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\Exchange

Schema Script (schema.ps1)

As I’m using the OOTB (out of the box) Active Directory MA to provision the AD account and only showing mailbox provisioning, the schema only consists of the attributes needed to know the state of the user with respect to enablement and the attributes associated with enabling and confirming a user for a mailbox.

https://gist.github.com/darrenjrobinson/ae46cdfccb825dce69b3

Password Script (password.ps1)

Empty as described above.

Import Script (Import.ps1)

Import values for attributes defined in the schema.

Export Script (Export.ps1)

The business part of the MA: take the mailNickName attribute value flowed from FIM (the other required attributes are populated via the AD MA) and call update-recipient to provision the mailbox.
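A hedged sketch of that call; the Exchange server URI and the $userDN variable are assumptions, and as with my Lync MA, remote PowerShell avoids needing the Exchange snap-in on the MIM server:

# Remote PowerShell to an on-premises Exchange server
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri 'http://exchange01.domain.com/PowerShell/' -Authentication Kerberos
Import-PSSession $session -CommandName Update-Recipient -AllowClobber | Out-Null

# Complete mailbox provisioning; mailNickName, homeMDB and
# homeExchangeServerName are already present on the AD object
Update-Recipient -Identity $userDN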

Wiring it all together

In order to wire the functionality all together there are the usual number of configuration steps to be completed. Below I’ve shown a number of the key points associated with making it all work.

Basically create the PS MA, import attributes from the PS MA, add any additional attributes to the Portal Schema, update the Portal Filter to allow Administrators to use the attribute, update the Synchronisation MPR to allow the Sync Engine to flow in the new attribute, create the Set used for the transition, create your Synchronisation Rule, create your Mailbox Workflow, create your Mailbox MPR, create your MA Run Profiles and let it loose.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the same service account as my AD MA.

Password script must be specified, but as we’re not doing password management it’s empty, as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

My join rule is simple. AccountName (which as you’ll see in the Import.ps1 is aligned with sAMAccountName) to AccountName in the MetaVerse.

My import flows are a combination of logic used for other parts of my solution, a Boolean flag & Mailbox GUID to determine if the user has a mailbox or not (used for my Transition Set and my Export script).

Below is my rules extension that sets a boolean value in the MV and then flowed to the MIM Portal that I use in my Transition Set to trigger my Synchronisation Rule.

Synchronisation Rules

My Exchange Outbound Sync rule isn’t complex. All it is doing is sync’ing out the mailNickName attribute and applying the rule based on an MPR, Set and Workflow.

For this implementation my outbound attribute flow for mailnickName is a simple firstname.lastname format.

Set

I have a Set that I use as a ‘transition set’ to trigger provisioning to Exchange. My Set looks to see if the user account exists in AD (I flow in the AD DN to an attribute in the Portal) and checks the mailbox status (set by the advanced flow rule shown above). I also have (not shown in the screenshot) a Boolean attribute in the MIM Portal that is set based on an advanced flow rule on the AD MA, with some logic to determine if the employment date as sourced from my HR Management Agent is current and the user should be active.

Workflow

An action-based workflow that is used to trigger the Synchronisation Rule for Exchange mailbox creation.

MPR

Finally my MPR for provisioning mailboxes is based on the transition set,

and my Mailbox Workflow.

Summary

Using the Granfeldt PowerShell MA I was able to quickly abstract Mailbox Provisioning from the AD Management Agent and perform the functionality on its own MA.

 

Follow Darren on Twitter @darrenjrobinson