A tool to find mailbox permission dependencies

First published at https://nivleshc.wordpress.com

When planning to migrate mailboxes to Office 365, a lot of care must be taken around which mailboxes are moved together. The rule of thumb is “those that work together, move together”. The reason for this approach is that some permissions do not work cross-premises and can cause issues. For instance, if a mailbox has delegate permissions to another mailbox (permissions that have been assigned using the Outlook email client) and one is migrated to Office 365 while the other remains on-premises, the delegate capability is broken as it does not work cross-premises.

During the recent Microsoft Ignite, it was announced that there are a lot of features coming to Office 365 which will help with the cross-premises access issues.

I have been using Roman Zarka’s Export-MailboxPermissions.ps1 (part of https://blogs.technet.microsoft.com/zarkatech/2015/06/11/migrate-mailbox-permissions-to-office-365/ bundle) script to export all on-premises mailboxes permissions then using the output to decide which mailboxes move together. Believe me, this can be quite a challenge!

Recently, while having a casual conversation with one of my colleagues, I was introduced to an Excel spreadsheet that he had created. Being the Excel guru that he is, he was doing various VLOOKUPs into the outputs from Roman Zarka’s script, to find out if the mailboxes he was intending to migrate had any permission dependencies with other mailboxes. I just stared at the spreadsheet with awe, and uttered the words “dude, that is simply awesome!”

I was hooked on that spreadsheet. However, I soon started craving more from it, so I decided to take it upon myself to add some more features. Not being too savvy with Excel, I decided to use PowerShell instead. Thus was born Find_MailboxPermissions_Dependencies.ps1

I will now walk you through the script and explain what it does.

 

  1. The first pre-requisite for Find_MailboxPermissions_Dependencies.ps1 is the set of four output files from Roman Zarka’s Export-MailboxPermissions.ps1 script (MailboxAccess.csv, MailboxFolderDelegate.csv, MailboxSendAs.csv, MailboxSendOnBehalf.csv)
  2. The next pre-requisite is details about the on-premises mailboxes. The on-premises Exchange environment must be queried and the details output into a csv file with the name OnPrem_Mbx_Details.csv. The csv must contain the following column headings: DisplayName, UserPrincipalName, PrimarySmtpAddress, RecipientTypeDetails, Department, Title, Office, State, OrganizationalUnit. A query along the lines of the following sketch could be used to generate this file
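    # One possible way to generate OnPrem_Mbx_Details.csv, run from the on-premises
    # Exchange Management Shell (a sketch only; Department, Title and State live on
    # the user object rather than the mailbox, hence the Get-User lookup)
    Get-Mailbox -ResultSize Unlimited | ForEach-Object {
        $user = Get-User -Identity $_.Identity
        [PSCustomObject]@{
            DisplayName          = $_.DisplayName
            UserPrincipalName    = $_.UserPrincipalName
            PrimarySmtpAddress   = $_.PrimarySmtpAddress
            RecipientTypeDetails = $_.RecipientTypeDetails
            Department           = $user.Department
            Title                = $user.Title
            Office               = $_.Office
            State                = $user.StateOrProvince
            OrganizationalUnit   = $_.OrganizationalUnit
        }
    } | Export-Csv -NoTypeInformation -Path OnPrem_Mbx_Details.csv
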
  3. The last pre-requisite is information about mailboxes that are already in Office 365. Use PowerShell to connect to Exchange Online and then run the following command (where O365_Mbx_Details.csv is the output file)
    Get-Mailbox -ResultSize unlimited | Select DisplayName,UserPrincipalName,EmailAddresses,WindowsEmailAddress,RecipientTypeDetails | Export-Csv -NoTypeInformation -Path O365_Mbx_Details.csv 

    If there are no mailboxes in Office 365, then create a blank file and put the following column headings in it “DisplayName”, “UserPrincipalName”, “EmailAddresses”, “WindowsEmailAddress”, “RecipientTypeDetails”. Save the file as O365_Mbx_Details.csv

  4. Next, put the above files in the same folder and then update the variable $root_dir in the script with the path to the folder (the path must end with a backslash), for example:
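    # hypothetical example value; note the trailing backslash
    $root_dir = "C:\MigrationPlanning\"
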
  5. It is assumed that the above files have the following names
    • MailboxAccess.csv
    • MailboxFolderDelegate.csv
    • MailboxSendAs.csv
    • MailboxSendOnBehalf.csv
    • O365_Mbx_Details.csv
    • OnPrem_Mbx_Details.csv
  6. Now that all the inputs have been taken care of, run the script.
  7. The first task the script does is to validate if the input files are present. If any of them are not found, the script outputs an error and terminates.
  8. Next, the files are read and stored in memory
  9. Now for the heart of the script. It goes through each of the mailboxes in the OnPrem_Mbx_Details.csv file and finds the following
    • all mailboxes that have been given SendOnBehalf permissions to this mailbox
    • all mailboxes that this mailbox has been given SendOnBehalf permissions on
    • all mailboxes that have been given SendAs permissions to this mailbox
    • all mailboxes that this mailbox has been given SendAs permissions on
    • all mailboxes that have been given Delegate permissions to this mailbox
    • all mailboxes that this mailbox has been given Delegate permissions on
    • all mailboxes that have been given Mailbox Access permissions on this mailbox
    • all mailboxes that this mailbox has been given Mailbox Access permissions on
    • whether the mailbox that this mailbox has given the above permissions to, or has permissions on, has already been migrated to Office 365
  10. The results are then output to a csv file (the name of the output file is of the format Find_MailboxPermissions_Dependencies_{timestamp of when script was run}_csv.csv)
  11. The columns in the output file are explained below

  • PermTo_OtherMbx_Or_FromOtherMbx? – Y if the mailbox has given permissions to, or has permissions on, other mailboxes; N if there are no permission dependencies for this mailbox
  • PermTo_Or_PermFrom_O365Mbx? – TRUE if the mailbox that this mailbox has given permissions to, or has permissions on, is already in Office 365
  • Migration Readiness – a color code based on the migration readiness of this permission (explained further below)
  • DisplayName – the display name of the on-premises mailbox for which the permission dependency is being found
  • UserPrincipalName – the userprincipalname of the on-premises mailbox for which the permission dependency is being found
  • PrimarySmtp – the primary SMTP address of the on-premises mailbox for which the permission dependency is being found
  • MailboxType – the mailbox type of the on-premises mailbox for which the permission dependency is being found
  • Department – the department the on-premises mailbox belongs to (inherited from the Active Directory object)
  • Title – the title that this on-premises mailbox has (inherited from the Active Directory object)
  • SendOnBehalf_GivenTo – email address of the mailbox that has been given SendOnBehalf permissions to this on-premises mailbox
  • SendOnBehalf_GivenOn – email address of the mailbox that this on-premises mailbox has been given SendOnBehalf permissions on
  • SendAs_GivenTo – email address of the mailbox that has been given SendAs permissions to this on-premises mailbox
  • SendAs_GivenOn – email address of the mailbox that this on-premises mailbox has been given SendAs permissions on
  • MailboxFolderDelegate_GivenTo – email address of the mailbox that has been given Delegate access to this on-premises mailbox
  • MailboxFolderDelegate_GivenTo_FolderLocation – the folders of the on-premises mailbox that the delegate access has been given to
  • MailboxFolderDelegate_GivenTo_DelegateAccess – the type of delegate access that has been given on this on-premises mailbox
  • MailboxFolderDelegate_GivenOn – email address of the mailbox that this on-premises mailbox has been given Delegate access to
  • MailboxFolderDelegate_GivenOn_FolderLocation – the folders that this on-premises mailbox has been given delegate access to
  • MailboxFolderDelegate_GivenOn_DelegateAccess – the type of delegate access that this on-premises mailbox has been given
  • MailboxAccess_GivenTo – email address of the mailbox that has been given Mailbox Access to this on-premises mailbox
  • MailboxAccess_GivenTo_DelegateAccess – the type of Mailbox Access that has been given on this on-premises mailbox
  • MailboxAccess_GivenOn – email address of the mailbox that this mailbox has been given Mailbox Access to
  • MailboxAccess_GivenOn_DelegateAccess – the type of Mailbox Access that this on-premises mailbox has been given
  • OrganizationalUnit – the Organizational Unit of the on-premises mailbox

The color codes in the column Migration Readiness correspond to the following

  • LightBlue – this on-premises mailbox has no permission dependencies and can be migrated
  • DarkGreen  – this on-premises mailbox has got a Mailbox Access permission dependency to another mailbox. It can be migrated while the other mailbox can remain on-premises, without experiencing any issues as Mailbox Access permissions are supported cross-premises.
  • LightGreen – this on-premises mailbox can be migrated without issues as the permission dependency is on a mailbox that is already in Office 365
  • Orange – this on-premises mailbox has SendAs permissions given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the SendAs capability will be broken. Lately, it has been noticed that this capability can be restored by re-applying the SendAs permissions to both the migrated and on-premises mailbox post migration
  • Pink – the on-premises mailbox has FolderDelegate given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the FolderDelegate capability will be broken. A possible workaround is to replace the FolderDelegate permission with Full Mailbox access as this works cross-premises, however there are privacy concerns around this workaround as this will enable the delegate to see all the contents of the mailbox instead of just the folders they had been given access on.
  • Red – the on-premises mailbox has SendOnBehalf permissions given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the SendOnBehalf capability will be broken. A possible workaround could be to replace SendOnBehalf with SendAs however the possible implications of this change must be investigated

Yay, the output has now been generated. All we need to do now is to make it look pretty in Excel 🙂

Carry out the following steps

  • Import the output csv file into Excel, using the semi-colon “;” as the delimiter (I couldn’t use commas as the delimiter because fields such as department and title sometimes contain them, and this causes issues with the output file)
  • Create Conditional Formatting rules for the column Migration Readiness so that the fill color of this cell corresponds to the word in this column (for instance, if the word is LightBlue then create a rule to apply a light blue fill to the cell)

That’s it folks! The mailbox permissions dependency spreadsheet is now ready. It provides a single-pane view of all the permissions across your on-premises mailboxes, and gives a color coded analysis of which mailboxes can be migrated on their own without any issues and which might experience issues if they are not migrated in the same batch as the ones they have permission dependencies on.

In the output file, for each on-premises mailbox, each line represents a permission dependency (unless the column PermTo_OtherMbx_Or_FromOtherMbx? is N). If more than one set of permissions applies to an on-premises mailbox, these are displayed on consecutive lines.

It is imperative that the migration readiness of the mailbox be evaluated based on the migration readiness of all the permissions associated with that mailbox.

Find_MailboxPermissions_Dependencies.ps1 can be downloaded from GitHub

A sample of the spreadsheet that was created using the output from the Find_MailboxPermissions_Dependencies.ps1 script can be downloaded from https://github.com/nivleshc/arm/blob/master/Sample%20Output_MailboxPermissions%20Dependencies.xlsx

I hope this script comes in handy when you are planning your migration batches and helps alleviate some of the headache that this task brings with it.

Till the next time, have a great day 😉

[Updated] How are email addresses created for Office 365 Mailboxes?

First published at https://nivleshc.wordpress.com

Background

Over the past few weeks, I have been doing some Cloud-Only Office 365 deployments using Azure AD Connect. As you might imagine, this deployment is a bit different to the Hybrid Office 365 deployment.

One of the things that got me thinking was, how are the email addresses created for my Office 365 mailboxes? As I was synchronising objects from my on-premises Active Directory, this question held the answer to what values I needed to change in my on-premises Active Directory user object, to get the desired email addresses populated in the Office 365 mailbox that will be created for it.

Preparation

To find the above answer, I devised a simple experiment.

I decided to create four on-premises Active Directory user objects, with different combinations of Netbios, UserPrincipalName (UPN), Mail attribute and ProxyAddresses and trace what happens to these values as they are used to create their corresponding Azure AD object. I will then assign an Office 365 Exchange Online Plan 2 license to these Azure AD objects and see what email addresses got assigned to the resulting Office 365 mailbox.

Simple? Let’s begin.

The four user accounts I created in on-premises Active Directory had the following properties (ProxyAddresses were populated using ADSIEdit). The domains used by the UPN, Email and proxyaddresses attributes are all internet routable domains.

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
Netbios       :CONTOSO\R.McCary
UPN           :Ross.McCary@contoso.com
Email         :{blank}
ProxyAddresses:{blank}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
Netbios       :CONTOSO\A.Jones
UPN           :Angela.Jones@contoso.com
Email         :An.Jones@contoso.com
ProxyAddresses:{blank}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
Netbios       :CONTOSO\Z.Daley
UPN           :Zada.Daley@contoso.com
Email         :Zada.Daley@tailspintoys.com
ProxyAddresses:{blank}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
Netbios       :CONTOSO\B.Brown
UPN           :Bob.Brown@contoso.com
Email         :Bob.Brown@contoso.com
ProxyAddresses:SMTP:Bo.Brown@contoso.com
               smtp:Bob@contoso.com
               smtp:Bobi@tailspintoys.com

The Experiment

I initiated an Azure AD Connect delta synchronisation cycle and waited. After a few minutes, I saw new objects created in my Office 365 tenant’s Azure AD that corresponded to the ones that I had created in the on-premises Active Directory.

Here are the values that Azure AD Connect (AADC) added for the newly created AAD objects (tenantid is the id for the Office 365 tenant)

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
SignInName    :Ross.McCary@contoso.com
UPN           :Ross.McCary@contoso.com
ProxyAddresses:{blank}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
SignInName    :Angela.Jones@contoso.com
UPN           :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
SignInName    :Zada.Daley@contoso.com
UPN           :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
SignInName    :Bob.Brown@contoso.com
UPN           :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

Now, this was quite interesting. AADC added proxyaddresses only for those Azure AD (AAD) objects that had the email field populated in their corresponding on-premises Active Directory user objects. Also, for Bob Brown, two additional proxy addresses had been added (smtp:Bo.Brown@tenantid.onmicrosoft.com and smtp:Bob.Brown@contoso.com). Interesting indeed!

I then proceeded to assign an Office 365 Exchange Online Plan 2 license to all the above AAD objects so that a mailbox would be provisioned for each of them. After waiting a few minutes, I checked to confirm that the mailboxes had been successfully provisioned. I then went back to Azure AD and checked the attributes again.

Below is what I saw (note the additional values that were added after the license assignment)

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
SignInName    :Ross.McCary@contoso.com
UPN           :Ross.McCary@contoso.com
ProxyAddresses:{SMTP:Ross.McCary@tenantid.onmicrosoft.com,
                smtp:Ross.McCary@contoso.com}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
SignInName    :Angela.Jones@contoso.com
UPN           :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com,
                smtp:Angela.Jones@contoso.com,
                smtp:An.Jones@tenantid.onmicrosoft.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
SignInName    :Zada.Daley@contoso.com
UPN           :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com,
                smtp:Zada.Daley@contoso.com,
                smtp:Zada.Daley@tenantid.onmicrosoft.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
SignInName    :Bob.Brown@contoso.com
UPN           :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

For the mailboxes, below is what I saw

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
UserID        :Ross.McCary@contoso.com
ProxyAddresses:{SMTP:Ross.McCary@tenantid.onmicrosoft.com,
                smtp:Ross.McCary@contoso.com}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
UserID        :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com,
                smtp:Angela.Jones@contoso.com,
                smtp:An.Jones@tenantid.onmicrosoft.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
UserID        :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com,
                smtp:Zada.Daley@contoso.com,
                smtp:Zada.Daley@tenantid.onmicrosoft.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
UserID        :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

Quite interesting (hmm well not really 😉 ). The newly created mailboxes had the same proxyaddresses as those assigned to their corresponding AAD objects.

The Result

After looking at the results, I could easily see a pattern between the on-premises Active Directory user object’s email and proxyaddresses values and the Azure AD and Office 365 mailbox email addresses.

From my experiment, I deduced the following process that is used to create email addresses for the Office 365 mailboxes

  1. When provisioning Azure AD objects, AADC first checks to see if the on-premises AD user object has proxyaddresses assigned. If there are any, these are added as proxyaddresses for the Azure AD object (the proxyaddress with the SMTP: prefix (uppercase SMTP) becomes the primarySMTP for the AAD object). If the email attribute is not part of the proxyaddresses, it is added as an additional proxyaddress. Next, the part of the primarySMTP before the @ is combined with the tenant email domain to create the mailbox’s routing email address (for instance Ross.McCary@tenantid.onmicrosoft.com), which is added as an additional proxyaddress
  2. If the on-premises Active Directory user object does not have any proxyaddresses, then the email attribute is assigned as the primarySMTP address for the AAD object. If the email attribute is not the same as the UPN, then the UPN is added as an additional proxyaddress. Next, the part of the primarySMTP before the @ is combined with the tenant email domain to create the mailbox’s routing email address (for instance Ross.McCary@tenantid.onmicrosoft.com), which is added as an additional proxyaddress
  3. In cases where both the email attribute and proxyAddresses are blank, the part of the UPN before the @ is prefixed to the tenant email domain to create the mailbox’s routing email address. In this instance, the routing email address is set as the primarySMTP address. The UPN is then added as an additional proxyaddress.

An interesting thing to note is that even before assigning an Office 365 license, you can see what email addresses will be assigned, by using PowerShell to check the proxyaddresses attribute of the Azure AD object.
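
For example, using the MSOnline PowerShell module (after running Connect-MsolService; Ross.McCary is the test user from the experiment above), something along these lines will show them:

# list the proxy addresses stamped on the Azure AD object
Get-MsolUser -UserPrincipalName "Ross.McCary@contoso.com" | Select-Object -ExpandProperty ProxyAddresses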

I hope the above provides some clarity around how email addresses are created for Office 365 mailboxes and helps with your Cloud-Only Office 365 architectures.

Have a great day 😉

[Update] There could be instances where, by mistake

  • (a) a new user is assigned an email attribute that is already attached to an existing mailbox in Office 365
  • (b) a new user is assigned a proxyaddress that is already attached to an existing mailbox in Office 365

For issue (a), AADC will not create the new user in AAD and will instead display an error in the AADC console when exporting to AAD. The error will be similar to the one below

“Unable to update this object because the following attributes associated with this object have values that may already be associated with another object in your local directory services: [ProxyAddresses xxxxxxxxx;]. Correct or remove the duplicate values in your local directory.”

The error will provide detailed information regarding the values that are causing the issue. It will also contain the ObjectIdInConflict. This is the id of the existing Object that the new user is in conflict with.

Using the ObjectIdInConflict value, search the AADC Metaverse with the clause “cloudAnchor contains ObjectIdInConflict“. (replace ObjectIdInConflict with the ObjectIdInConflict value as shown in the error). This will show the metaverse record of the object that the new user is conflicting with. In this case, remediate the issue in the on-premises Active Directory and then initiate another AADC delta synchronisation cycle.

 

For issue (b), AADC will create the new user in AAD; however, it will remove the proxyAddress that is causing the conflict from the new user object in AAD. It will also create a record in the Office 365 Admin Center, under Settings\DirSync errors, with details of the new user, the existing user that it is in conflict with and also the attribute that is in conflict. In this case, remediate the issue in the on-premises Active Directory and initiate another AADC delta synchronisation cycle.

Note: In both cases above, the technical contact for the Office 365 tenant gets sent an email with details of the errors.

Deploying Cloud-only mailboxes in Office 365 using On-Premises Directory objects

First published at https://nivleshc.wordpress.com

In this blog, I will show you how to create Cloud-only mailboxes in Exchange Online (Exchange Online is the messaging part of Office 365) that are bound to objects synchronised from your on-premises Active Directory. The Cloud-only approach is different to the Hybrid approach because you do not need an Exchange server deployed in your on-premises environment.

There are a few reasons why you would want to link your Cloud-only mailboxes to your on-premises Active Directory. The most important reason is to ensure you don’t have multiple identities for the same user. Another reason is to provide the notion of single-sign-on. This can be established by using the password synchronisation feature of Azure AD Connect (this will be discussed a bit later).

Ok, let’s get started.

The diagram below shows what we will be doing. In a nutshell, we will replicate our on-premises Active Directory objects to Azure AD (these will be filtered so that only required objects are synchronised to Azure AD) using Azure AD Connect Server. Once in Azure AD, we will appropriately license the objects using the Office 365 Admin portal (any license bundle that contains the Exchange Online Plan 2 license is fine. Even Exchange Online Plan 2 by itself is sufficient).

Onpremise AD Objects Synchronised AAD

Prepare your Office 365 Tenant

Once you have obtained your Office 365 tenant, add and verify the domain you will be using for your email addresses (for instance, if your email address will be tom.jones@contoso.com, then add contoso.com in Office 365 Admin Center under Setup\Domains). You will be provided with a TXT entry that you will need to use to create a DNS entry under the domain, to prove ownership.

Once you have successfully verified the domain ownership, you will be provided with the MX entry value for the domain. This must be used to create an MX entry DNS record for the domain so that all emails are delivered to Office 365.

Prepare your on-premises Active Directory

You must check to ensure your on-premises Active Directory does not contain any objects that are incompatible with Azure AD. To perform this check, run idFix in your environment.

Note of advice - by default, idFix runs across all your Active Directory objects. You do not have to fix objects that you won't be synchronising to Azure AD

It is highly recommended that your on-premises Active Directory user objects have their userPrincipalName (UPN) matched to their primary email address. This will remove any confusion that users might face when accessing Office 365 services via a web browser, as the Office 365 login pages refer to the username as “email address”.

Next, ensure that the E-mail field for all users in Active Directory contains the UPN for the user.

ADUser

Deploy and Configure Azure AD Connect Server

Ensure all the prerequisites have been met, as outlined at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-prerequisites

Next, follow the article at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-select-installation to deploy and configure your Azure AD Connect (AADC) Server.

During the configuration of AADC, you will be asked to specify which on-premises Active Directory objects should be synchronised to Azure AD. Instead of synchronising all your on-premises Active Directory objects, choose the Organisational Unit that contains all the users, groups and contacts you want to synchronise to Azure AD.

Choose the Password Synchronisation option while installing the AADC server. This will synchronise your on-premises password hashes to Azure AD, enabling users to use their on-premises credentials to access Office 365 services.

At this stage, your AADC server would have already done an initial run, which would have created objects in Azure AD. These are visible using the Office 365 Admin Center.

After the initial sync, AADC runs an automatic synchronisation to Azure AD every 30 minutes.
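
If you do not want to wait for the next scheduled run, a delta synchronisation cycle can be triggered manually on the AADC server. A minimal sketch, using the ADSync module that ships with Azure AD Connect:

# trigger a delta synchronisation cycle immediately
Import-Module ADSync
Start-ADSyncSyncCycle -PolicyType Delta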

Provision Mailboxes

Now that everything has been done, open the Office 365 Admin Center. Under Users\Active Users you will see all the on-premises users that have been synchronised.

Click on each of the users and then in the next screen click Edit beside Product licenses and select the location of the user and also the combination of license options you want to assign the user. Ensure you select at least Exchange Online (Plan 2) as this is needed to provision a user mailbox. Click on Save.

As soon as you assign the Exchange Online (Plan 2) license, the mailbox provisioning starts. This shouldn’t take more than 10 minutes to finish. You can check the progress by clicking the user in Office 365 Admin Center and then Mail Settings at the bottom of the screen. Once the mailbox has been successfully provisioned, the We are preparing a mailbox for this user message will disappear and instead details about the user mailbox will be shown.

Once the mailbox has been provisioned, open the Exchange Admin Center and then click on recipients from the left menu. In the right hand side screen, click mailboxes. This will show you details about the mailboxes that have been provisioned so far. The newly created user mailbox should be listed there as well.

That’s it folks! You have successfully created an Exchange Online mailbox that is attached to your on-premises Active Directory user object.

Any changes to the Office 365 object (display name etc) will have to be done via the on-premises Active Directory. These changes will be synchronised to Azure AD every 30 minutes and will be reflected in the Exchange Online mailbox.

If you try to modify any of the attributes via the Office 365 or Exchange Online Admin Center, you will receive the following error

The action '<cmdlet>', '<property>', can't be performed on the object '<name>' because the object is being synchronized from your on-premises organisation.

Some Additional Information

Please note that the following is not supported by Microsoft.

There are times when you need to have additional email aliases attached to a user mailbox. To do this, follow the steps below (a PowerShell alternative is sketched after the screenshot)

  1. Open Active Directory Users and Computers in your on-premises Active Directory
  2. In the top menu, click View and then select Advanced Features
  3. Navigate to the user object that you want to add additional email aliases to and double click to open its settings
  4. From the tabs click on Attribute Editor
  5. Under Attributes locate proxyAddresses and click on Edit (or double click) to open it
  6. In the values field, first enter the current email address, prefixed with SMTP: (ensure the SMTP is in upper case).
  7. Then add all the email aliases that you want to assign to this user, each prefixed with a lower case smtp:. The email domain for the aliases has to be a domain that is already added and verified in Office 365 Admin Center.
  8. If you need to change the reply-to (primary smtp) address for the user, then remove the value that currently has the upper case SMTP: assigned to it and re-add it prefixed with a lower case smtp:. Then remove the alias that you want to assign as the reply-to (primary smtp) and re-add it prefixed with an upper case SMTP:

ADUser_ProxyAddresses
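
For those who prefer PowerShell over ADSI Edit, a rough equivalent of the above steps using the ActiveDirectory module is shown below (the identity and alias here are hypothetical examples):

# add an extra alias (lower case smtp:) to a user's proxyAddresses attribute
Import-Module ActiveDirectory
Set-ADUser -Identity "B.Brown" -Add @{proxyAddresses = "smtp:bob.alias@contoso.com"}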

I hope the blog helps out those who might be wanting to use the Cloud Only instead of the Hybrid deployment approach to Office 365.

Have a great day 😉

Continuous Credential Prompt when accessing MIM Password Registration Portal

First published at https://nivleshc.wordpress.com

Recently I was at a customer site, setting up a Microsoft Identity Manager (MIM) 2016 environment, which included the deployment of the Self Service Password Registration and Self Service Password Reset portals. For additional security, I was using Kerberos instead of the default NTLM.

I finished installing the MIM Portal, Service, Password Registration and Password Reset Portals without any issues.

I then proceeded to secure all HTTP endpoints by enabling them for SSL and then removing the HTTP bindings, so that the MIM Portal, Password Registration and Password Reset Portals could only be accessed via HTTPS. No issues there either.

By this time I was pretty pleased with myself 😉 Everything was going as planned, no issues faced at all. Finally, lady luck was showering me with her blessings.

Having finished the installation and configuration, I proceeded to testing the solution.

The first thing to check was the MIM Portal site. I opened up a web browser and navigated to the Microsoft Identity Manager Portal. When prompted, I logged in with the mimadmin domain account credentials. I was successfully logged in and could access all the parts without any hitch.

Now kids, if you are faint at heart, be very wary of what happens next (hint. this is the time when you cover your eyes with your hands when watching a horror movie).

I then tried accessing the Self Service Password Registration Portal and got prompted for credentials.

SSPRegistration_CredentialPrompt

I entered the mimadmin account credentials and pressed enter. Just as I thought I had successfully logged in, the credential prompt returned! hmm, that is weird. I was pretty sure I had typed the username and password correctly. Oh well, maybe I didn’t.

I typed the credentials again and pressed enter. Quick as a flash, the credential prompt returned! Uh? What was happening here?{scratching my head} Hmm, I seem to be making a lot of typos today. I carefully entered the username and password again, taking my time this time, to ensure I was entering it correctly. I then pressed enter and waited.

Well, I didn’t have to wait for long since within a second, I got greeted with the Not Authorized screen!

SSPRegistration_NotAuthorised

Fascinating. It seems that lady luck had flown away, because here indeed was an issue with the Self Service Password Registration Portal! Ok, Mister. Let’s have a look at what’s causing this kerfuffle!

I opened up the event viewer on the Self Service Password Registration server and went through each of the logged events in the Application and System logs, however I couldn’t find any clue as to why the credentials were not working. I secretly had suspicions that the issue could be due to Kerberos token errors, however I couldn’t find anything in the event logs to substantiate my suspicion. Hmm, the plot was indeed getting thicker!

I next started doing some Google searches, thinking that someone else might have encountered the same issue. Alas, it seemed that I was alone in my woes as the results seemed to be quite thin in regards to any possible solution for my issue.

Finally, I decided to follow my dear ol’ Sherlock’s advice “when you have eliminated the impossible, whatever remains, however improbable, must be the truth”

I went through the whole Self Service Password Registration setup process, checking each and every part of the configuration, to ensure that the values were as expected. After 10 minutes, I was almost done checking and no clues so far 😦

Lastly, I opened IIS Manager and checked all the settings. Nothing here as well. Hey back up a bit. What is this?

The Self Service Password Registration Portal site had its useAppPoolCredentials set to False.

SSPRegistration_useAppPoolCredentialsFalse

Now, this should be True! Is this what was causing the issue?

I quickly changed the value for useAppPoolCredentials from False to True.

SSPRegistration_useAppPoolCredentialsTrue
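
The same change can also be made from PowerShell. A minimal sketch, assuming the WebAdministration module is available and substituting your actual site name for the placeholder below:

# set useAppPoolCredentials to True for the Password Registration Portal site
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'Password Registration Portal' `
    -Filter 'system.webServer/security/authentication/windowsAuthentication' `
    -Name 'useAppPoolCredentials' -Value $true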

I then opened my web browser again and navigated to the Self Service Password Registration Portal. Once again the familiar credential prompt came up. I entered the same credentials as before and pressed enter.

Woo hoo!! This time around I was successfully logged in.

I sincerely hope that this post helps others who might be encountering the same error.

Have a great day 😉

Ok Google Email me the status of all vms – Part 2

First published at https://nivleshc.wordpress.com

In my last blog, we configured the backend systems necessary for accomplishing the task of asking Google Home “OK Google Email me the status of all vms” and it sending us an email to that effect. If you haven’t finished doing that, please refer back to my last blog and get that done before continuing.

In this blog, we will configure Google Home.

Google Home uses Google Assistant to do all the smarts. You will be amazed at all the tasks that Google Home can do out of the box.

For our purposes, we will be using the platform If This Then That, or IFTTT for short. IFTTT is a very powerful platform as it lets you create actions based on triggers. This combination of a trigger and an action is called a recipe.

Ok, let’s dig in and create our IFTTT recipe to accomplish our task

1.1   Go to https://ifttt.com/ and create an account (if you don’t already have one)

1.2   Login to IFTTT and click on My Applets menu from the top

IFTTT_MyApplets_Menu

1.3   Next, click on New Applet (top right hand corner)

1.4   A new recipe template will be displayed. Click on the blue +this to choose a service

IFTTT_Reicipe_Step1

1.5   Under Choose a Service type “Google Assistant”

IFTTT_ChooseService

1.6   In the results, Google Assistant will be displayed. Click on it

1.7   If you haven’t already connected IFTTT with Google Assistant, you will be asked to do so. When prompted, login with the Google account that is associated with your Google Home and then approve IFTTT to access it.

IFTTT_ConnectGA

1.8   The next step is to choose a trigger. Click on Say a simple phrase

IFTTT_ChooseTrigger

1.9   Now we will put in the phrases that Google Home should trigger on.

IFTTT_CompleteTrigger

For

  • What do you want to say? enter “email me the status of all vms”
  • What do you want the Assistant to say in response? enter “no worries, I will send you the email right away”

All the other sections are optional, however you can fill them if you prefer to do so

Click Create trigger

1.10   You will be returned to the recipe editor. To choose the action service, click on +that

IFTTT_That

1.11  Under Choose action service, type webhooks. From the results, click on Webhooks

IFTTT_ActionService

1.12   Then for Choose action click on Make a web request

IFTTT_Action_Choose

1.13   Next the Complete action fields screen is shown.

For

  • URL – paste the webhook url of the runbook that you had copied in the previous blog
  • Method – change this to POST
  • Content Type – change this to application/json

IFTTT_CompleteActionFields

Click Create action

1.14   In the next screen, click Finish

IFTTT_Review

 

Woo hoo. Everything is now complete. Let’s do some testing.

Go to your Google Home and say “email me the status of all vms”. Google Home should reply by saying “no worries. I will send you the email right away”.

I have noticed some delays in receiving the email, however the most I have had to wait is 5 minutes. If this is unacceptable, in the runbook script, modify the Send-MailMessage command by adding the parameter -Priority High, as shown below. This sends all emails with high priority, which should make things faster. Also, the runbook is currently running in Azure. Better performance might be achieved by using Hybrid Runbook Workers
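
For instance, the send command from the Part 1 runbook would become (all other parameters unchanged):

# send the report email with high priority
Send-MailMessage -Credential $mailerCred -From $fromAddr -To $toAddr -Subject $subject -Attachments $outputFile -SmtpServer $smtpServer -UseSsl -Priority High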

To monitor the status of the automation jobs, or to access their logs, in the Azure Automation Account, click on Jobs in the left hand side menu. Clicking on any one of the jobs shown will provide more information about that particular job. This can be helpful during troubleshooting.

Automation_JobsLog

There you go. All done. I hope you enjoy this additional task you can now do with your Google Home.

If you don’t own a Google Home yet, you can do the above automation using Google Assistant as well.

Ok Google Email me the status of all vms – Part 1

First published at https://nivleshc.wordpress.com

Technology is evolving at a breathtaking pace. For instance, the phone in your pocket has more grunt than the desktop computers of 10 years ago!

One of the upcoming areas in Computing Science is Artificial Intelligence. What seemed science fiction in the days of Isaac Asimov, when he penned I, Robot, seems closer to reality now.

Lately the market is popping up with virtual assistants from the likes of Apple, Amazon and Google. These are “bots” that use Artificial Intelligence to help us with our daily lives, from telling us about the weather, to reminding us about our shopping lists or letting us know when our next train will be arriving. I still remember my first virtual assistant Prody Parrot, which hardly did much when you compare it to Siri, Alexa or Google Assistant.

I decided to test drive one of these virtual assistants, and so purchased a Google Home. First impressions, it is an awesome device with a lot of good things going for it. If only it came with a rechargeable battery instead of a wall charger, it would have been even more awesome. Well maybe in the next version (Google here’s a tip for your next version 😉 )

Having played with Google Home for a bit, I decided to look at ways of integrating it with Azure, and I was pleasantly surprised.

In this two-part blog, I will show you how you can use Google Home to send an email with the status of all your Azure virtual machines. This functionality can be extended to stop or start all virtual machines, however I would caution against doing this in your production environment, in case you turn off a machine that is running critical workloads.

In this first blog post, we will setup the backend systems to achieve the tasks and in the next blog post, we will connect it to Google Home.

The diagram below shows how we will achieve what we have set out to do.

Google Home Workflow

Below is a list of tasks that will happen

  1. Google Home will trigger when we say “Ok Google email me the status of all vms”
  2. As Google Home uses Google Assistant, it will pass the request to the IFTTT service
  3. IFTTT will then trigger the webhooks service to call a webhook url attached to an Azure Automation Runbook
  4. A job for the specified runbook will then be queued up in Azure Automation.
  5. The runbook job will then run, and obtain a status of all vms.
  6. The output will be emailed to the designated recipient

Ok, enough talking 😉 let’s get cracking.

1. Create an Azure AD Service Principal Account

In order to run our Azure Automation runbook, we need to create a security object for it to run under. This security object provides permissions to access the azure resources. For our purposes, we will be using a service principal account.

Assuming you have already installed the Azure PowerShell module, run the following in a PowerShell session to login to Azure

Import-Module AzureRm
Login-AzureRmAccount

Next, to create an Azure AD Application, run the following command

$adApp = New-AzureRmADApplication -DisplayName "DisplayName" -HomePage "HomePage" -IdentifierUris "http://IdentifierUri" -Password "Password"

where

DisplayName is the display name for your AD Application eg “Google Home Automation”

HomePage is the home page for your application eg http://googlehome (or you can ignore the -HomePage parameter as it is optional)

IdentifierUri is the URI that identifies the application eg http://googleHomeAutomation

Password is the password you will give the service principal account

Now, let’s create the service principal for the Azure AD Application

New-AzureRmADServicePrincipal -ApplicationId $adApp.ApplicationId

Next, we will give the service principal account read access to the Azure tenant. If you need something more restrictive, please find the appropriate role from https://docs.microsoft.com/en-gb/azure/active-directory/role-based-access-built-in-roles

New-AzureRmRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName $adApp.ApplicationId

Great, the service principal account is now ready. The username for your service principal is actually the ApplicationId suffixed by your Azure AD domain name. To get the Application ID run the following by providing the identifierUri that was supplied when creating it above

Get-AzureRmADApplication -IdentifierUri {identifierUri}

Just to be pedantic, let’s check to ensure we can login to Azure using the newly created service principal account and the password. To test, run the following commands (when prompted, supply the username for the service principal account and the password that was set when it was created above)

$cred = Get-Credential 
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId {TenantId}

where Tenantid is your Azure Tenant’s ID

If everything was setup properly, you should now be logged in using the service principal account.

2. Create an Azure Automation Account

Next, we need an Azure Automation account.

2.1   Login to the Azure Portal and then click New

AzureMarketPlace_New

2.2   Then type Automation and click search. From the results click the following.

AzureMarketPlace_ResultsAutomation

2.3   In the next screen, click Create

2.4   Next, fill in the appropriate details and click Create

AutomationAccount_Details

3. Create a SendGrid Account

Unfortunately Azure doesn’t provide relay servers that can be used by scripts to email out. Instead you have to either use EOP (Exchange Online Protection) servers or SendGrid to achieve this. SendGrid is an Email Delivery Service that Azure provides, and you need to create an account to use it. For our purposes, we will use the free tier, which allows the delivery of 2500 emails per month, which is plenty for us.

3.1   In the Azure Portal, click New

AzureMarketPlace_New

3.2   Then search for SendGrid in the marketplace and click on the following result. Next click Create

AzureMarketPlace_ResultsSendGrid

3.3   In the next screen, for the pricing tier, select the free tier and then fill in the required details and click Create.

SendGridAccount_Details

4. Configure the Automation Account

Inside the Automation Account, we will be creating a Runbook that will contain our PowerShell script that will do all the work. The script will be using the Service Principal and SendGrid accounts. To ensure we don’t expose their credentials inside the PowerShell script, we will store them in the Automation Account under Credentials, and then access them from inside our PowerShell script.

4.1   Go into the Automation Account that you had created.

4.2   Under Shared Resource click Credentials

AutomationAccount_Credentials

4.3    Click on Add a credential and then fill in the details for the Service Principal account. Then click Create

Credentials_Details

4.4   Repeat step 4.3 above to add the SendGrid account

4.5   Now that the Credentials have been stored, under Process Automation click Runbooks

Automation_Runbooks

Then click Add a runbook and in the next screen click Create a new runbook

4.6   Give the runbook an appropriate name. Change the Runbook Type to PowerShell. Click Create

Runbook_Details

4.7   Once the Runbook has been created, paste the following script inside it, click on Save and then click on Publish

Import-Module Azure
$cred = Get-AutomationPSCredential -Name 'Service Principal account'
$mailerCred = Get-AutomationPSCredential -Name 'SendGrid account'

Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantID {tenantId}

$outputFile = $env:TEMP + "\AzureVmStatus.html"
$vmarray = @()

#Get a list of all vms 
Write-Output "Getting a list of all VMs"
$vms = Get-AzureRmVM
$total_vms = $vms.count
Write-Output "Done. VMs Found $total_vms"

$index = 0
# Add info about VM's to the array
foreach ($vm in $vms){ 
 $index++
 Write-Output "Processing VM $index/$total_vms"
 # Get VM Status
 $vmstatus = Get-AzurermVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Status

# Add values to the array:
 $vmarray += New-Object PSObject -Property ([ordered]@{
 ResourceGroupName=$vm.ResourceGroupName
 Name=$vm.Name
 OSType=$vm.StorageProfile.OSDisk.OSType
 PowerState=(get-culture).TextInfo.ToTitleCase(($vmstatus.statuses)[1].code.split("/")[1])
 })
}
$vmarray | Sort-Object PowerState,OSType -Desc

Write-Output "Converting Output to HTML" 
$vmarray | Sort-Object PowerState,OSType -Desc | ConvertTo-Html | Out-File $outputFile
Write-Output "Converted"

$fromAddr = "senderEmailAddress"
$toAddr = "recipientEmailAddress"
$subject = "Azure VM Status as at " + (Get-Date).toString()
$smtpServer = "smtp.sendgrid.net"

Write-Output "Sending Email to $toAddr using server $smtpServer"
Send-MailMessage -Credential $mailerCred -From $fromAddr -To $toAddr -Subject $subject -Attachments $outputFile -SmtpServer $smtpServer -UseSsl
Write-Output "Email Sent"

where

  • ‘Service Principal Account’ and ‘SendGrid Account’ are the names of the credentials that were created in the Automation Account (include the ‘ ‘ around the name)
  • senderEmailAddress is the email address that the email will show it came from. Keep the domain of the email address the same as your Azure domain
  • recipientEmailAddress is the email address of the recipient who will receive the list of vms

4.8   Next, we will create a Webhook. A webhook is a special URL that will allow us to execute the above script without logging into the Azure Portal. Treat the webhook URL like a password since whoever possesses the webhook can execute the runbook without needing to provide any credentials.

Open the runbook that was just created and from the top menu click on Webhook

Webhook_menu

4.9   In the next screen click Create new webhook

4.10  A security message will be displayed informing that once the webhook has been created, the URL will not be shown anywhere in the Azure Portal. IT IS EXTREMELY IMPORTANT THAT YOU COPY THE WEBHOOK URL BEFORE PRESSING THE OK BUTTON.

Enter a name for the webhook and when you want the webhook to expire. Copy the webhook URL and paste it somewhere safe. Then click OK.

Once the webhook has expired, you can’t use it to trigger the runbook, however before it expires, you can change the expiry date. For security reasons, it is recommended that you don’t keep the webhook alive for a long period of time.

Webhook_details

That’s it folks! The stage has been set and we have successfully configured the backend systems to handle our task. Give yourselves a big pat on the back.

Follow me to the next blog, where we will use the above with IFTTT, to bring it all together so that when we say “OK Google, email me the status of all vms”, an email is sent out to us with the status of all the vms 😉

I will see you in Part 2 of this blog. Ciao 😉

MfaSettings.xml updates not taking effect

First published at https://nivleshc.wordpress.com

Last week, I was at a client site, extending their Microsoft Identity Manager (MIM) 2016 Self Service Password Reset Solution so that it could use Azure MultiFactor Authentication (MFA). This is an elegant solution since instead of using Questions and Answers to authenticate yourself when trying to reset your password, you can use One Time Passwords (OTP), sent as a security code via a text message to your registered mobile device.

I followed the steps as outlined in https://github.com/Microsoft/MIMDocs/blob/master/MIMDocs/DeployUse/working-with-self-service-password-reset.md to enable Azure MFA, and everything went smoothly.

I then proceeded to testing the solution.

Using the Password Registration Portal, I registered my mobile number against my test user account.

I then opened the Password Reset Portal, entered my test user username and proceeded to wait for the text message from Microsoft Azure with the security code, so that I could enter it in the next screen.

MIM_Verify_MobilePhoneVerification

I waited and waited (for at least 5 min), unfortunately the text message didn’t arrive 😦

Ok, troubleshooting time.

On my Microsoft Identity Manager 2016 Service Server, I opened the Windows EventLogs viewer and then expanded the section for Forefront Identity Manager event logs. Aha, I was on the right track as I saw a lot of errors reported.

MIMServiceServer_EventLogs

I went through the event log entries and found one which looked a bit odd. The error essentially said that the certificate path contained illegal characters.

MIMServiceServer_Error_CertificatePath

I couldn’t make much sense of this error, so I opened the MfaSettings.xml file to check, and I quickly realised my mistake. I had included the certificate file path within quotes!

I quickly removed the unnecessary quotes, saved the MfaSettings.xml file and restarted my testing process.

I went through the password reset process again, and yet again, I didn’t receive any text message from Microsoft Azure with the security code 😦

I re-checked the eventlogs and noticed the same Exception: Illegal characters in path for the Certificate File Path error. Thinking that I might have forgotten to save the previous modification to the MfaSettings.xml file, I opened the file to confirm. The quotes were nowhere to be seen! Alas, the plot thickens my dear Watson!

I couldn’t find any explanation for this behaviour. Then, thinking that maybe the MIM server was having issues accessing the long filepath for the Certificate file, I moved the certificate file to a folder that was closer to the root of the C:\ drive, updated the MfaSettings.xml file appropriately and repeated my testing.

Again, no text message 😦

Checking the event logs, I noticed the same dreaded Exception: Illegal characters in path for the Certificate File Path error again.

However, looking closer at the error, I realised that the file path was reported as C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\MfaCerts\cert_key.p12, which wasn’t correct since I had moved the certificate file to another folder and updated the MfaSettings.xml file accordingly!

Suddenly I had that light bulb moment 😉 Updates to the MfaSettings.xml file were not being read by the MIM Server! This could only mean that it wasn’t monitoring this file for any changes, quite the opposite to what I had initially assumed!

To force MIM to re-read the MfaSettings.xml file, I restarted the Forefront Identity Manager Service service and went through my password reset testing process again.
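
If you prefer PowerShell, the underlying Windows service is named FIMService, so the restart is a one-liner:

# restart the MIM/FIM Service so that MfaSettings.xml is re-read
Restart-Service -Name FIMService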

Eureka! This time around, I received the text message from Microsoft Azure with the security code! Checking the Event logs, I couldn’t find any new occurrences of the Exception: Illegal characters in path for the Certificate File Path error. Hurray!

I completed the password reset process and confirmed that the password for my test account had indeed been changed.

I hope this post helps others.

BTW, below is a sample of the MFASettings.xml file (for security reasons, the keys have been scrambled, however as seen below, none of the values need quotes)

<?xml version="1.0" encoding="utf-8" ?>
<SubscriberKeys>
<LICENSE_KEY>1A3FED2C1BZA</LICENSE_KEY>
<GROUP_KEY>a1b234c567890e123456gh1234567eij</GROUP_KEY>
<CERT_PASSWORD>1ABDCDEF1GHDAVWA</CERT_PASSWORD>
<CertFilePath>C:\Program Files\Microsoft Forefront Identity Manager\2010\Service\MfaCert\cert_key.p12</CertFilePath>
<Username>john.doe</Username>
<DefaultCountryCode>61</DefaultCountryCode>
</SubscriberKeys>

Re-execute the UserData script in an AWS Windows Instance

First published at https://nivleshc.wordpress.com

Bootstrapping is an awesome way of customising your instances in AWS (similar capability exists in Azure).

To enable bootstrapping, while configuring the launch instance, in Step 3: Configure Instance Details scroll down to the bottom and then expand Advanced Details.

You will notice a User data text box. This is where you can provide your bootstrap script. The script will be run when your instance is first launched.

AWS_BootstrapScript
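
For a Windows instance, the user data is wrapped in <script> or <powershell> tags. A trivial, hypothetical example:

<powershell>
# hypothetical bootstrap: leave a marker file to prove the script ran
New-Item -Path C:\Bootstrap -ItemType Directory -Force
Get-Date | Out-File C:\Bootstrap\launched.txt
</powershell>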

I went ahead and entered my script in the text box and proceeded to complete my instance configuration. Once my instance was running, I initiated a Remote Desktop connection to it, to confirm that my script had run. Unfortunately, I couldn’t see any customisations (which meant my script didn’t run)

Thinking that the instance had not been able to access the user data, I opened up Internet Explorer and then browsed to the following url (this is an internal url that can be used to access the user-data)

http://169.254.169.254/latest/user-data/

I was able to successfully access the user-data, which meant that there were no issues with that. However when checking the content, I noticed a typo! Aha, that was the reason why my customisations didn’t happen.

Unfortunately, according to AWS, user-data is only executed during launch (for those that would like to read, here is the official AWS documentation). To get the fixed bootstrap script to run, I would have to terminate my instance and launch a new one with the corrected script (I tried rebooting my Windows instance after correcting my typo, however it didn’t run).

I wasn’t very happy about terminating my current instance and then launching a new one, since, for those that might not be aware, AWS EC2 compute charges are rounded up to the next hour. This means that if I terminated my current instance and launched a new one, I would be charged for 2 x 1 hour sessions instead of just 1 x 1 hour!

So I set about trying to find another solution. And guess what, I did find it 🙂

Reading through the volumes of documentation on AWS, I found that when Windows Instances are provisioned, the service that does the customisations using user-data is called EC2Config. This service runs the initial startup tasks when the instance is first started and then disables them. HOWEVER, there is a way to re-enable the startup tasks later on 🙂 Here is the document that gives more information on EC2Config.

The Amazon Windows AMIs include a utility called EC2ConfigService Settings. This allows you to configure EC2Config to execute the user-data on next service startup. The utility can be found under All Programs (or you can search for it).

[Screenshot: AWS_EC2ConfigSettings_AllApps]

[Screenshot: AWS_EC2ConfigSettings_Search]

Once open, under General, you will see the following option

Enable UserData execution for next service start (automatically enabled at Sysprep) eg. <script></script> or <powershell></powershell>

[Screenshot: AWS_EC2ConfigSettings]

Tick this option and then press OK. Then restart your Windows Instance.

After your Windows instance restarts, EC2Config will execute the userData (bootstrap script) and then automatically remove the tick from the above option, so that the userData is not executed on subsequent restarts (or service starts).
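As an alternative to the GUI, the same option can be re-enabled by editing EC2Config's Config.xml directly. Below is a rough sketch, assuming the default EC2Config installation path; verify the path and plugin name on your own instance before relying on it.

# Sketch: re-enable the Ec2HandleUserData plugin in EC2Config's Config.xml
# (assumes the default EC2Config installation path - verify on your instance).
$configPath = 'C:\Program Files\Amazon\Ec2ConfigService\Settings\Config.xml'
[xml]$config = Get-Content -Path $configPath
$plugin = $config.Ec2ConfigurationSettings.Plugins.Plugin | Where-Object { $_.Name -eq 'Ec2HandleUserData' }
$plugin.State = 'Enabled'
$config.Save($configPath)

# Restart the instance so EC2Config runs the user-data on service start.
Restart-Computer -Force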

There you go. A simple way to re-run your bootstrap scripts on an AWS Windows instance without having to terminate the current instance and launch a new one.

There are other options available in the EC2ConfigService Settings that you can explore as well 🙂

Error rebuilding MIMWAL – File MicrosoftServices.IdentityManagement.WorkflowActivityLibrary.dll not found

First published on Nivlesh’s blog at https://nivleshc.wordpress.com

A few days ago, I was going through the steps for compiling MIMWAL, as listed at http://ithinkthereforeidam.com/installing-the-mimwal/ and came across an interesting problem.

After I had rebuilt the solution in Visual Studio, I went to run Sign.cmd and kept getting the following error message

[Screenshot: Signcmd_Error]

Error: File “MicrosoftServices.IdentityManagement.WorkflowActivityLibrary.dll” Not Found. You need to compile WAL solution first! Make sure you use REBUILD Solution menu. Aborting script execution…

This was quite bizarre as I had not deviated from the steps listed in the above mentioned article. It was time to put on my Sherlock hat and find the culprit behind this error!

I opened the SolutionOutput folder and compared its contents to what was shown in the article, and found something interesting: the dll mentioned in the error was indeed missing!

The file MicrosoftServices.IdentityManagement.WorkflowActivityLibrary.pdb was also missing.

This meant that there must have been an error when rebuilding the solution in Visual Studio. I alt+tabbed to my Visual Studio window and, in the output pane, saw something interesting. It showed that there had been issues while copying MicrosoftServices.IdentityManagement.WorkflowActivityLibrary.dll to the SolutionOutput folder.

[Screenshot: VisualStudioOutputPane_Error]

The error

WorkflowActivityLibrary -> C:\MIMWAL-2.16.1028.0\src\WorkflowActivityLibrary\bin\Release\MicrosoftServices.IdentityManagement.WorkflowActivityLibrary.dll
1> Does C:\MIMWAL-2.16.1028.0\src\SolutionOutput specify a file name
1> or directory name on the target
1> (F = file, D = directory)? ?

seemed to indicate that when Visual Studio was trying to copy the two missing files, it hadn't been able to determine whether the destination SolutionOutput was a directory or a file, and as a result it skipped copying them. Doing some investigation, I found that the MIMWAL source package didn't contain a .\src\SolutionOutput folder. This explained why Visual Studio was showing the above warnings.

Based on these findings, there are two solutions that resolve the issue.

Solution 1

Rebuild the solution in Visual Studio again. On the second try, since the SolutionOutput directory now exists, the files will be copied successfully.

Solution 2

Before rebuilding the MIMWAL solution, create a subfolder called SolutionOutput inside the src folder, as shown in the sketch below.
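For example, using the source path shown in the build output above:

# Create the missing SolutionOutput folder before rebuilding.
New-Item -ItemType Directory -Path 'C:\MIMWAL-2.16.1028.0\src\SolutionOutput' -Force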

My preference is Solution 2, as it avoids the copy errors altogether.

After successfully rebuilding the MIMWAL package, I ran sign.cmd and this time around – Success! I got the expected result.

[Screenshot: VisualStudioOutputPane_Success]

[Screenshot: Signcmd_Successful]

I hope this blog helps anyone else who might be having issues compiling MIMWAL and running Sign.cmd.

 

Automate Secondary ADFS Node Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

Additional nodes in an ADFS farm provide redundancy in case your primary ADFS node goes offline, ensuring your ADFS service stays up and continues to service incoming requests. Additional nodes also help load balance the incoming traffic, which provides a better user experience during periods of high authentication traffic.

Overview

Once an ADFS farm has been created, adding additional nodes is quite simple and mostly relies on the same concepts used to create the farm. I would suggest reading my previous blog, Automate ADFS Farm Installation and Configuration, as some of the steps used here were documented in it.

In this blog, I will show how to automatically provision a secondary ADFS node to an existing ADFS farm. The learnings in this blog can be easily used to deploy more ADFS nodes automatically, if needed.

Install ADFS Role

After provisioning a new Azure virtual machine, we need to install the Active Directory Federation Services role on it. To do this, we will use the same Desired State Configuration (DSC) script that was used in the blog Automate ADFS Farm Installation and Configuration. Please refer to the section Install ADFS Role in that blog for the steps to create the DSC script file InstallADFS.ps1.

Add to an existing ADFS Farm

Once the ADFS role has been installed on the virtual machine, we will create a Custom Script Extension (CSE) to add it to the ADFS farm.

In order to do this, we need the following

  • the certificate that was used to create the ADFS farm
  • the ADFS service account credentials that were used to create the ADFS farm

Once the above prerequisites have been met, we need a method for making the files available to the CSE. I documented a neat trick to “sneak-in” the certificate and password files onto the virtual machine by using Desired State Configuration (DSC) package files in my previous blog. Please refer to Automate ADFS Farm Installation and Configuration under the section Create ADFS Farm for the steps.

Also note that for adding the node to the ADFS farm, the domain user credentials are not required. The certificate file will be named adfs_certificate.pfx and the file containing the encrypted ADFS service account password will be named adfspass.key.

Assuming that the prerequisites have been satisfied, and the files have been "sneaked" onto the virtual machine, let's proceed to creating the CSE.

Open Windows PowerShell ISE and paste the following.

param (
  $DomainName,
  $PrimaryADFSServer,
  $AdfsSvcUsername
)

The above shows the parameters that need to be passed to the CSE, where

$DomainName is the name of the Active Directory domain
$PrimaryADFSServer is the hostname of the primary ADFS server
$AdfsSvcUsername is the username of the ADFS service account

Save the file with a name of your choice (do not close the file as we will be adding more lines to it). I named my script AddToADFSFarm.ps1.

Next, we need to define a variable that will contain the path to the directory where the certificate file and the encrypted ADFS service account password file are stored. We also need a variable to hold the key that was used to encrypt the ADFS service account password, as this will be required to decrypt it.

Add the following to the CSE file
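The original snippet is not reproduced here, but a minimal sketch looks like the following; both $localpath and the AES key are placeholder values, and must match the folder the DSC package copied the files to and the key that was used to encrypt the password.

# Sketch only - adjust these values to match your environment.
# Folder where the certificate and encrypted password files were "sneaked" in (placeholder).
$localpath = 'C:\Temp\adfs\'

# AES key used to encrypt the ADFS service account password
# (placeholder - must be identical to the key used during encryption).
[Byte[]]$key = (1..16)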

Next, we need to decrypt the encrypted adfs service account password.
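A sketch of this step, reusing $localpath and $key from above, and building a credential object for the ADFS service account for later use:

# Read the encrypted password and convert it back into a SecureString.
$adfsSvcPassword = Get-Content -Path ($localpath + 'adfspass.key') | ConvertTo-SecureString -Key $key

# Build a credential object for the ADFS service account.
$adfsSvcCredential = New-Object System.Management.Automation.PSCredential("$DomainName\$AdfsSvcUsername", $adfsSvcPassword)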

Now, we need to import the certificate into the local computer certificate store. To make things simple, when the certificate was exported from the primary ADFS server, it was encrypted using the adfs service account password.

After importing the certificate, we will read it to get its thumbprint.
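Both steps might be sketched as follows; Import-PfxCertificate returns the imported certificate object, so the thumbprint can be read straight off it:

# Import the certificate into the local computer's personal store.
# The pfx was protected with the ADFS service account password.
$cert = Import-PfxCertificate -FilePath ($localpath + 'adfs_certificate.pfx') -CertStoreLocation Cert:\LocalMachine\My -Password $adfsSvcPassword

# The thumbprint is needed when joining the ADFS farm.
$certThumbprint = $cert.Thumbprint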

Up until now, the steps are very similar to creating an ADFS farm. However, below is where they diverge.

Add the following lines to add the virtual machine to the existing ADFS farm
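A sketch using the ADFS module's Add-AdfsFarmNode cmdlet (the -OverwriteConfiguration switch is optional, and is useful if the node had previously been configured):

# Join this server to the existing ADFS farm as a secondary node.
Add-AdfsFarmNode -CertificateThumbprint $certThumbprint `
                 -PrimaryComputerName $PrimaryADFSServer `
                 -ServiceAccountCredential $adfsSvcCredential `
                 -OverwriteConfiguration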

You now have a custom script extension file that will add a virtual machine as a secondary node to an existing ADFS Farm.

The full CSE is simply the parameter block followed by the snippets above, saved as AddToADFSFarm.ps1.

All that is missing now is the method to bootstrap the scripts described above (InstallADFS.ps1 and AddToADFSFarm.ps1) using Azure Resource Manager (ARM) templates.

Below is part of an ARM template that can be added to your existing template to install the ADFS role on a virtual machine and then add the virtual machine as a secondary node to the ADFS farm

In the above ARM template, the parameter ADFS02VMName refers to the hostname of the virtual machine that will be added to the ADFS Farm.

Listed below are the variables that have been used in the ARM template above

The above method can be used to add as many nodes to the ADFS farm as needed.

I hope this comes in handy when creating an ARM template to automatically deploy an ADFS Farm with additional nodes.