Provisioning Hybrid Exchange/Exchange Online Mailboxes with Microsoft Identity Manager

Introduction

Working for Kloud, all our projects involve cloud services and every customer has their own unique requirements. Recently one of our customers embarked on their migration from on-premises Exchange to Exchange Online. Nothing groundbreaking there, however they had a number of specific requirements, including management of Litigation Hold, and these needed to be integrated with their existing Microsoft Identity Manager implementation (which currently provisions new users to their Exchange 2013 environment). They also required that management of the Exchange environment still be possible via the Exchange Management Console against a local Exchange server. This post details how I integrated the environments using MIM.

Overview

To integrate the provisioning and lifecycle management of Exchange Online mailboxes in a Hybrid Exchange topology with Microsoft Identity Manager, I created a custom PowerShell Management Agent, simply because it provided the flexibility I needed.

Provisioning is based on the following process;

  1. MIM Creates new user in Active Directory (no changes to existing MIM provisioning process)
  2. Azure Active Directory Connect synchronises the user to Azure Active Directory
  3. The Exchange Online MIM Management Agent sees the corresponding AAD account for the new user
  4. MIM Declarative Rules trigger the creation of a new Remote Mailbox for the AD/AAD user against the local Exchange 2013 On Premise Server. This allows the EMC to be used to manage mailboxes On Premise even though the mailbox resides in Office365/Exchange Online
  5. AADC/Exchange synchronises the information as part of the Hybrid Exchange topology
  6. MIM sees the EXO Mailbox configuration for the new user and enables Litigation Hold against the EXO Mailbox (if required)

The following diagram graphically depicts this process.

EXO IDM Provisioning Solution.png

Exchange Online PowerShell MA

As always I’m using my favourite PowerShell Management Agent, the Granfeldt PowerShell MA, available on GitHub here.

Schema Script

The Schema script configures the schema required for current and future EXO management requirements. The schema is based on a single object class, “MailUser”, but pulls the information from a combination of the Azure AD User and Exchange Online Mailbox object classes for an associated account. Azure AD User attributes are prefixed with ‘AAD’; attributes without the AAD prefix are EXO mailbox attributes.
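
I won’t paste the full script here, but a minimal Granfeldt PSMA schema script follows the pattern below: a PSCustomObject whose note properties use the “Name|Type” naming convention, with the anchor attribute prefixed “Anchor-”. The attribute names shown are illustrative placeholders rather than the complete schema:

$obj = New-Object -TypeName PSCustomObject
# Anchor and object class
$obj | Add-Member -MemberType NoteProperty -Name "Anchor-ObjectId|String" -Value "5f8a1d2c-0000-0000-0000-000000000000"
$obj | Add-Member -MemberType NoteProperty -Name "objectClass|String" -Value "MailUser"
# Azure AD User attributes (prefixed with AAD)
$obj | Add-Member -MemberType NoteProperty -Name "AADUserPrincipalName|String" -Value "user@contoso.com"
$obj | Add-Member -MemberType NoteProperty -Name "AADAccountEnabled|Boolean" -Value $true
# Exchange Online mailbox attributes (no prefix)
$obj | Add-Member -MemberType NoteProperty -Name "RecipientTypeDetails|String" -Value "UserMailbox"
$obj | Add-Member -MemberType NoteProperty -Name "LitigationHoldEnabled|Boolean" -Value $true
$obj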

Import Script

The Import script connects to both Azure AD and Exchange Online to retrieve Azure AD User accounts and if present the associated mailbox for a user.

It retrieves all Member AAD User Accounts and puts them into a Hash Table. Connectivity to AAD is via the AzureADPreview PowerShell module. It retrieves all Mailboxes and puts them into a Hash Table. It then processes all the mailboxes first including the associated AAD User account (utilising a join via userPrincipalName).

After all the mailboxes have been processed, the remaining AAD accounts (those without mailboxes) are processed.
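
As a rough sketch of that join logic (attribute names, filtering and error handling are simplified, and an authenticated AzureADPreview session plus an EXO remote PowerShell session are assumed):

# Build the lookup tables
$aadUsers = @{}
Get-AzureADUser -All $true | Where-Object { $_.UserType -eq "Member" } | ForEach-Object { $aadUsers[$_.UserPrincipalName] = $_ }
$mailboxes = @{}
Get-Mailbox -ResultSize Unlimited | ForEach-Object { $mailboxes[$_.UserPrincipalName] = $_ }

# Mailboxes first, joined to their AAD account on userPrincipalName
foreach ($upn in $mailboxes.Keys) {
    $obj = @{}
    $obj.Add("objectClass", "MailUser")
    $obj.Add("AADUserPrincipalName", $upn)
    $obj.Add("LitigationHoldEnabled", $mailboxes[$upn].LitigationHoldEnabled)
    if ($aadUsers.ContainsKey($upn)) { $obj.Add("AADAccountEnabled", $aadUsers[$upn].AccountEnabled) }
    $obj   # each hashtable is returned to the MA as one connector space object
}
# ...the remaining AAD accounts (with no mailbox) are then emitted in the same way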

Export Script

The Export script performs the necessary integration against the on-premises Exchange Server 2013 for provisioning, and against Exchange Online for the rest of the management tasks. Both use Remote PowerShell. It also leverages the Lithnet MIIS Automation PowerShell module to query the Metaverse and validate current object statuses.
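
Conceptually, the export side boils down to something like the following sketch (the remote routing address, variable names and session handling are placeholders, and the real script also checks the Metaverse via the Lithnet module before acting):

# Provisioning: create a Remote Mailbox against the on-premises Exchange 2013 server
Enable-RemoteMailbox -Identity $samAccountName -RemoteRoutingAddress "$alias@customertenant.mail.onmicrosoft.com"

# Ongoing management: enable Litigation Hold against the Exchange Online mailbox
Set-Mailbox -Identity $userPrincipalName -LitigationHoldEnabled $true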

Wiring it all up

The scripts above allow you to integrate a FIM/MIM implementation with AAD/EXO for management of users’ EXO mailboxes. You’ll need connectivity from the MIM Sync Server to AAD/O365 in order to manage them. Everything else I wired up using a few Sets, Workflows, Sync Rules and MPRs.

 

Seamless Multi-identity Browsing for Cloud Consultants

If you’re a technical consultant working with cloud services like Office 365 or Azure on behalf of various clients, you have to deal with many different logins and passwords for the same URLs. This is painful, as your default browser instance doesn’t handle multiple accounts and you generally have to resort to InPrivate (IE) or Incognito (Chrome) modes which mean a lot of copying and pasting of usernames and passwords to do your job. If this is how you operate today: stop. There is an easier way.

Two tools for seamless logins

OK, the first one is technically a feature. The most important part of removing the login bottleneck is Chrome Profiles. This essential feature of Chrome lets you maintain completely separate profiles for Chrome, including saved passwords, browser cache, bookmarks, plugins, etc. Fantastic.

Set one up for each customer that you have a dedicated account for. Once you log in once, the credentials will be cached and you’ll be able to pass through seamlessly.

This is obviously a great improvement, but only half of the puzzle. It’s when Profiles are combined with another tool that the magic happens…

SlickRun your Chrome sessions

If you haven’t heard of the venerable SlickRun (which must be pushing 20 years if it’s a day) – download it right now. This gives you the godlike power of being able to launch any application or browse to any Url nearly instantaneously. Just hit ALT-Q and input the “magic word” (which autocompletes nicely) that corresponds to the command you want to execute and Bob’s your Mother’s Brother! I tend to hide the SlickRun prompt by default, so it only shows up when I use the global ALT-Q hotkey.

First we have to set up our magic word. If you simply put a URL into the ‘Filename or URL’ box, SlickRun will open it using your default browser. We don’t want that. Instead put ‘chrome.exe’ in the box and use the ‘--profile-directory’ command line switch to target the profile you want, followed by the URL to browse to.

N.B. You don’t seem to be able to reference the profiles by name. Instead you have to put “Profile n” (where n is the number of the profile in the order you created it).
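
For example, a magic word for a client’s Office 365 portal might be configured with a command line like this (the profile number and URL are placeholders):

chrome.exe --profile-directory="Profile 2" https://portal.office.com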

SlickRun-MagicWord

That’s all there is to it. Once you’ve set up your magic words for the key web apps you need to access for each client (I go with a naming convention of ‘clientappname‘ and extend that further if I have multiple test accounts I need to log in as, etc), you can get to any of them in seconds, usually as seamlessly as single sign-on would provide.

This is hands-down my favourite productivity trick and yet I’ve never seen anyone else do it, or seen a better solution to the multiple logins problem. Hence this post! Hope you find it as awesome a shortcut as I do…

Till next time!

Moving an Office 365 domain to a new tenant

First published at https://nivleshc.wordpress.com

There are times when companies acquire other companies to increase their portfolio. When this happens, the acquired company’s IT infrastructure normally gets merged with the parent company. The opposite of this situation is also true. There are also times when a company sells off one of its acquired companies or divisions. In this scenario, the company that is being sold off must have all its IT infrastructure separated from the parent company.

In this blog I will show the steps that can be used to separate a previously acquired company’s email system from its parent company, after the child company has been sold off.

Background

To provide some context, here are some details about the parent company and the child company that is being sold off:

  • The parent company is called Tail Spin Toys and owns the domain name tailspintoys.com
  • Tail Spin Toys have their own Active Directory Forest called tailspintoys.com and this is synchronised to Azure Active Directory using Azure Active Directory Connect (AADC) server.
  • Tail Spin Toys has a hybrid Exchange Online setup. Their Office 365 tenant is called tailspintoys and has the tenant address of tailspintoys.onmicrosoft.com
  • Tail Spin Toys users have their userprincipalname and primary email address in the form of firstname.lastname@tailspintoys.com
  • The previously acquired company (referred to as child company in this blog) is trading as Wing Tip Toys and owns the domain name wingtiptoys.com. This domain has been added to Tail Spin Toys Office 365 tenant as an additional domain.
  • Wing Tip Toys have their own Active Directory Forest called wingtiptoys.net and all their users, computers and servers are part of this Active Directory Forest.
  • Wing Tip Toys users have their mailboxes provisioned in Tail Spin Toys Office 365 tenant (the process will be explained in the next bullet point). Wing Tip Toys only use email services in Office 365.
  • Each Wing Tip Toys user has two identities. The first is in the wingtiptoys.net Active Directory Domain, which is used to login to their computers and to access their local resources (fileshares, printers etc).
  • The other identity is a user object created in Tail Spin Toys Active Directory Domain. This is used to provision a mailbox in Tail Spin Toys Office 365 tenant. Unfortunately, for Wing Tip Toys users, their Office 365 experience is not very elegant since their userprincipalname for Office 365 is of the form firstname.lastname@tailspintoys.com but their primary email address is of the form firstname.lastname@wingtiptoys.com. The users login to their computers using the wingtiptoys.net samaccountname. However, all users are aware of this and it doesn’t affect them too much.
  • To ensure users are not burdened with having to manage both the user object passwords, Password Change Notification System has been implemented. This replicates any password changes in the wingtiptoys.net Active Directory domain to the respective user object in tailspintoys.com Active Directory domain.

Requirement

Now that Wing Tip Toys has been sold off, all of its data must be separated from Tail Spin Toys. The following will be done for its emails.

  • A new Office 365 tenant will be provisioned for Wing Tip Toys
  • All mailbox data will be migrated to the new Office 365 tenant
  • Once all mailbox data has been migrated, after a few weeks, the Wing Tip Toys mailboxes in Tail Spin Toys Office 365 tenant will be disabled (or deleted)

Migration Plan

OK, let’s get started.

  • Due to the small size of Wing Tip Toys, it has been decided to use a Cloud-Only Office 365 deployment (details on how to configure a Cloud-Only Office 365 deployment can be found here). A hybrid Exchange Online environment will still work with the following steps, however a few extra steps will have to be added.
  • BitTitan MigrationWiz will be used to copy the mailbox data from Tail Spin Toys Office 365 tenant to Wing Tip Toys Office 365 tenant

Here are the steps

  1. Add an additional UPN suffix (wingtiptoys.com) to the Wing Tip Toys Active Directory Domain
  2. Configure all Wing Tip Toys users in the wingtiptoys.net domain to have a userprincipalname of the form firstname.lastname@wingtiptoys.com
  3. For all the Wing Tip Toys users in the Tail Spin Toys Office 365 tenant, export their email addresses (we are after the proxyaddresses). Ensure the X500 addresses are also exported.
  4. In the wingtiptoys.net Active Directory domain, for each user, find their email addresses from the export that was done in step 3 and import them into the proxyaddresses attribute of their Active Directory user object (a rough PowerShell sketch covering steps 3 and 4 is shown after this list). Ensure only email addresses for the wingtiptoys.com domain are imported (if you import any email address for a domain that hasn’t been added to the wingtiptoys Office 365 tenant, it will display as the default Office 365 tenant address [@wingtiptoys.onmicrosoft.com] instead of the actual email address). Also make sure the X500: addresses are imported. Check that the proxyaddress firstname.lastname@wingtiptoys.com is prefixed with an uppercase SMTP:, as this will make it the primary smtp address.
  5. Create a new Office 365 tenant for Wing Tip Toys. For simplicity, we will assume that the tenant name is wingtiptoys and the tenant address is wingtiptoys.onmicrosoft.com. Since the domain wingtiptoys.com is already attached to the Tail Spin Toys Office 365 tenant, we won’t be able to attach it to the new tenant just yet.
  6. Install Azure Active Directory Connect (AADC) Server and configure it to synchronise objects from the wingtiptoys.net Active Directory domain to the Wing Tip Toys Azure AD. For Single Sign On, we will enable AADC server for password hash synchronisation however Active Directory Federation Services (ADFS) servers can also be used.
  7. If required, configure the transport rules in Wing Tip Toys Office 365 tenant to match those in Tail Spin Toys Office 365 tenant. For example, to apply disclaimers to all outgoing emails.
  8. Export configuration of all Distribution groups in Tail Spin Toys Office 365 tenant that belong to Wing Tip Toys.
  9. Create a user in Tail Spin Toys Office 365 tenant with the upn migrationwiz@tailspintoys.onmicrosoft.com. For all Wing Tip Toys mailboxes in Tail Spin Toys Office 365 tenant, give migrationwiz@tailspintoys.onmicrosoft.com full mailbox access. This account will be used by Bittitan MigrationWiz.
  10. Create a user in Wing Tip Toys Office 365 tenant with the upn migrationwiz@wingtiptoys.onmicrosoft.com and give it Global Administrator access. This account will be used by Bittitan MigrationWiz.
  11. Provision Office 365 licenses (at least Exchange Online Plan 2) to all user objects in Wing Tip Toys Office 365 tenant that have been synchronised from the on-premise Active Directory domain. This will create mailboxes for them. At this stage, they will only have the email address firstname.lastname@wingtiptoys.onmicrosoft.com (this is because the domain wingtiptoys.com hasn’t been added to this tenant).
  12. Create an account at Bittitan and purchase licenses for MigrationWiz. One license is required for one mailbox data migration, so purchase the required number of licenses ( when you create an account, you are provided with 3 trial licenses. You can use these to test out the migration process).
  13. Use the steps listed at https://help.bittitan.com/hc/en-us/articles/115008106827-Office-365-to-Office-365-Migration-Guide-While-Keeping-the-Same-Domain-Name to configure Bittitan MigrationWiz. The source and destination mailboxes within MigrationWiz must be specified using the Office 365 tenant address, for example in our case, the source mailboxes will be firstname.lastname@tailspintoys.onmicrosoft.com and the destination mailboxes will be firstname.lastname@wingtiptoys.onmicrosoft.com
  14. Pre-Stage the mailbox data for the Wing Tip Toys mailboxes by using MigrationWiz to copy everything except the last 20 days of data from the Tail Spin Toys Office 365 tenant to the Wing Tip Toys Office 365 tenant mailboxes (the more data you pre-stage, the faster the cut-over will be).
  15. Add the wingtiptoys.com domain to the Wing Tip Toys Office 365 tenant. You will get a TXT record, which must be used to create a DNS entry in the wingtiptoys.com domain to prove domain ownership. Create the DNS record and have the TTL set to 300s (5 min). You won’t be able to verify the domain in the Wing Tip Toys Office 365 tenant yet as it is currently attached to the Tail Spin Toys Office 365 tenant.
  16. Change the TTL for the MX DNS entry for wingtiptoys.com to 300s (5min). Make a note of the existing MX entries.
  17. At this stage, all the preparation has been done and we are ready for the cut-over. I would suggest planning the cutover out of office hours, preferably on a Friday evening. Send out user communications at least a week prior to cut-over so that everyone is aware of the change. In the communications, ensure you state that users will not be able to receive new emails in their Mobile and Outlook clients until it has been reconfigured to use the new Office 365 tenant. Provide instructions on how users can reconfigure their Outlook and Mobile clients and also provide the Outlook Web App address (this would be similar to https://outlook.office.com/wingtiptoys.com/ )
  18. Twenty minutes prior to cut-over time, change the MX for wingtiptoys.com to invalid.outlook.com. Setting the MX record to an invalid domain just ahead of the migration causes the sender’s email server to queue the emails and retry later. This will reduce the possibility of Office 365 sending a “recipient not found” Non Delivery Receipt if the email is received for a Wing Tip Toys user after the wingtiptoys.com domain has been removed from the Tail Spin Toys Office 365 tenant but hasn’t been added to the Wing Tip Toys Office 365 tenant.
  19. At cut-over time, remove the domain wingtiptoys.com from Tail Spin Toys Office 365 tenant. A message will be displayed stating that user logins and email addresses will need to be reconfigured. It will also state that the automatic domain removal process will change them to the default domain (tailspintoys.com) and that they will no longer receive emails at wingtiptoys.com. Click Remove. The time taken to remove the domain depends on the amount of mailboxes that have to be reconfigured to use the tenant email address.
  20. Add wingtiptoys.com to the Wing Tip Toys Office 365 tenant. If you receive an error stating that the domain is already attached to another Office 365 tenant, wait for a few minutes and then try again.
  21. Run a Full Synchronisation using Azure Active Directory Connect server and wait for it to complete. Confirm that the userprincipalname and primary smtp address for the users in Wing Tip Toys Office 365 tenant has now changed to the format firstname.lastname@wingtiptoys.com.
  22. Change the MX for wingtiptoys.com to the value shown in the Wing Tip Toys Office 365 Portal under Domain Settings.
  23. Run a full migration using MigrationWiz so that the remaining mailbox data is copied across.
  24. Create a DNS entry in the internal and external DNS server for autodiscover.wingtiptoys.com. The value for this is shown in the Wing Tip Toys Office 365 Portal, under the Domains section.
  25. Test the following scenarios
    • you can login using Outlook Web App to a mailbox in the Wing Tip Toys Office 365 tenant and access all emails.
    • you can configure Outlook client and a Mobile email client to access a mailbox in the new Wing Tip Toys Office 365 tenant using autodiscover (this depends on the autodiscover.wingtiptoys.com dns record. If there are errors, check to ensure this record has been correctly populated and is pointing to the value shown in the Office 365 portal)
    • emails from external senders are successfully received in a wingtiptoys mailbox in the Wing Tip Toys Office 365 tenant.
    • emails sent from a wingtiptoys user in the Wing Tip Toys Office 365 tenant to an external recipient is successfully received
    • emails sent between wingtiptoys users in the Wing Tip Toys Office 365 tenant is successfully received
  26. Import the Distribution groups that were exported from Tail Spin Toys Office 365 tenant
  27. The cut-over is now complete.
  28. By now, users should be able to access their emails using Outlook Web App. If they have followed the instructions, their Outlook and mobile clients should be working as well.
  29. The mailboxes belonging to the Wing Tip Toys users in the Tail Spin Toys Office 365 tenant can be kept active for at least 30 days (this will come in handy for those cases where a user reports that some emails were not copied across). After this, the mailboxes can be backed up and then deleted (If the Wing Tip Toys user mailboxes in the Tail Spin Toys Office 365 tenant had other email aliases, on which emails were being received, it would be a good idea to configure an Out of Office rule to state that all emails sent to this user should instead be sent to firstname.lastname@wingtiptoys.com)
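
The following is a minimal sketch of how steps 3 and 4 could be scripted. The file paths, the filter and the mapping from userprincipalname to the wingtiptoys.net account are simplified assumptions and would need adjusting for a real environment:

# Step 3 - export the Wing Tip Toys proxyaddresses from the Tail Spin Toys tenant (Exchange Online session)
Get-Mailbox -ResultSize Unlimited |
    Where-Object { $_.EmailAddresses -like "*@wingtiptoys.com" } |
    Select-Object UserPrincipalName, @{Name = "Proxies"; Expression = { $_.EmailAddresses -join ";" }} |
    Export-Csv -NoTypeInformation -Path .\WingTipToys_ProxyAddresses.csv

# Step 4 - import into the wingtiptoys.net Active Directory, keeping only wingtiptoys.com and X500 addresses
Import-Csv .\WingTipToys_ProxyAddresses.csv | ForEach-Object {
    $keep = $_.Proxies -split ";" | Where-Object { $_ -like "*@wingtiptoys.com" -or $_ -like "X500:*" }
    # Assumes the wingtiptoys.net samaccountname matches the prefix of the userprincipalname
    Set-ADUser -Identity ($_.UserPrincipalName -split "@")[0] -Add @{ proxyAddresses = [string[]]$keep }
}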

 

That’s it folks. After following the above steps, you will have moved an additional Office 365 domain to a new Office 365 tenant, along with its mailbox data.

I hope the above helps those that are looking at getting this done.

Have a great day 😉

Migrating Sharepoint 2013 on prem to Office365 using Sharegate

Recently I completed a migration project which brought a number of sub-sites within SharePoint 2013 on-premises to the cloud (SharePoint Online). We decided to use Sharegate as the primary tool due to its simplicity.

Although it might sound like a straightforward process, there are a few things worth checking pre and post migration, and I have summarized them here. I found it easier to have this information recorded in a spreadsheet with different tabs:

Pre-migration check:

  1. First thing, Get Site Admin access!

    This is the first and most important step: get yourself admin access. It can be a lengthy process, especially in a large corporate environment. The best level of access is Site Collection Admin for all sites, but sometimes this might not be possible. Getting Site Administrator access is the bare minimum for the migration to work.

    You will likely be granted Global Admin on the new tenant in most cases, but if not, ask for it!

  2. List down active site collection features

    Whatever features are activated on the source site need to be activated on the destination site as well, so we need to record what has been activated on the source site. If any third-party feature is activated, you will need to liaise with the relevant stakeholders on whether it is still required on the new site. If it is, it is highly likely that a separate license is required, as the new environment is cloud based rather than on-premises. Take Nintex Workflow for example: Nintex Workflow Online is a separate license compared to Nintex Workflow 2013.

  3. Segregate the list of sites, inventory analysis

    I found it important to list all the sites you are going to migrate and distinguish whether they are site collections or just subsites. What I did was put each site under a new tab, with all its site contents listed. Next to each list/library, I have fields for the list type, number of items and a comment (if any).

    Go through each piece of content, preferably sitting down with the site owner to get the details. Some useful questions to ask:

  • Is this still relevant? Can it be deleted or skipped for the migration?
  • Is this heavily used? How often does it get accessed?
  • Does this list have custom edit/new forms? Sometimes owners might not even know, so you might have to take an extra look by scanning through the forms.
  • Check if pages have custom script with site URL references as this will need to be changed to accommodate the new site url.

It would also be useful to get a comprehensive picture of how much storage each site holds. This can help you work out which site has the most content and is hence likely to take the longest time during the migration. Sharegate has an inventory reporting tool which can help, but it requires Site Collection Admin access.

  4. Discuss some of the limitations

    Pages library

    The Pages library under each site needs specific attention, especially if you don’t have site collection admin access! Pages which inherit a content type or master page from the parent site will not have these migrated across by Sharegate, meaning these pages will either not be created at the new site, or they will simply show as using the default master page. This needs to be communicated and discussed with each owner.

    External Sharing

    External users will not be migrated across to the new site! These are users who won’t be provisioned in the new tenant but still require access to SharePoint. They will need to be added (invited) manually to a site using their O365 email account or a Microsoft account.

    An O365 account would be whatever account they have been using to get on to their own SharePoint Online. If they don’t have one, they will need to use their Microsoft account, which would be a Hotmail/Outlook account. Once they have been invited, they need to respond to the email by signing into the portal in order to get provisioned. The new SPO site collection will need to have external sharing enabled before external access can happen (a quick PowerShell sketch for enabling it is shown at the end of this pre-migration section). For more information, refer to: https://support.office.com/en-us/article/Manage-external-sharing-for-your-SharePoint-Online-environment-C8A462EB-0723-4B0B-8D0A-70FEAFE4BE85

    What can’t Sharegate do?

    Some of the following minor things cannot be migrated to O365:

  • User alerts – user will need to reset their alerts on new site
  • Personal views – user will need to create their personal views again on new site
  • Web part connections – any web part connections will not be preserved

For more, refer: https://support.share-gate.com/hc/en-us/categories/115000076328-Limitations
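
If you have the SharePoint Online Management Shell connected to the new tenant, external sharing can be enabled per site collection along the following lines (the URLs and the sharing level are placeholders; pick whichever level your governance allows):

Connect-SPOService -Url https://yourtenant-admin.sharepoint.com
Set-SPOSite -Identity https://yourtenant.sharepoint.com/sites/ClientSite -SharingCapability ExternalUserSharingOnly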

Performing the migration:

  1. Pick the right time

    Doing the migration during a low-activity period would be ideal. User communications informing everyone of the actual date should be sent out as early as possible. I tend to stick to the middle of the week, as that way we still have a couple of days left to solve any issues, instead of doing it on a Friday or Saturday.

  2. Locking old sites

    During the migration, we do not want any users making changes to the old site. If you are migrating site collections, fortunately there’s a way to lock them down, provided you have access to the central admin portal (see https://technet.microsoft.com/en-us/library/cc263238.aspx and the PowerShell sketch at the end of this section).

    However, if you are migrating sub-sites, there’s no way to lock down a single sub-site other than changing its site permissions. Changing the site permissions risks losing the existing permission information, so it would be ideal to record these permissions before making any changes. Also, take extra note of lists or libraries with unique permissions: they do not inherit site permissions and hence won’t be “locked” unless they are manually changed as well.

  3. Beware of O365 traffic jam

    Always stick to the Insane mode when running the migration in Sharegate. Insane mode makes use of the new Office 365 Migration API, which is the fastest way to migrate huge volumes of data to Office 365. While exporting the data to Office 365 is fast, I did find a delay waiting for Office 365 to import it into the SharePoint tenant. Sometimes it could sit there for an hour before continuing with the import. Also, avoid running too many sessions if your VM is not powerful enough.

  4. Delta migration

    The good thing with Sharegate is that you can do a delta migration, which means you only migrate files that have been modified or added since the last migration. However, it doesn’t handle deletion! If any files have been removed since you last migrated, running a delta sync will not delete these files from the destination end. Therefore, best practice is still to delete the list from the destination site and re-create it using the Site Object wizard.
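
For site collections, the lock mentioned in step 2 can be applied from the on-premises SharePoint 2013 Management Shell, roughly as follows (the URL is a placeholder; ReadOnly keeps content visible but blocks changes):

# Lock the source site collection for the duration of the migration
Set-SPSite -Identity "https://intranet.contoso.com/sites/teamsite" -LockState ReadOnly
# Unlock it again if you need to roll back
Set-SPSite -Identity "https://intranet.contoso.com/sites/teamsite" -LockState Unlock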

Post-migration check:

Things to check:

  • Users can still access relevant pages, list and libraries
  • Users can still CRUD files/ items
  • Users can open Office web apps (there can be a different experience related to authentication when opening Office files; in most cases, users should only get prompted the very first time they open a file)

Exchange Online & Splunk – Automating the solution

NOTES FROM THE FIELD:

I have recently been consulting on what I think is a pretty cool engagement: integrating some Office 365 mailbox data into the Splunk reporting platform.

I initially thought about using a .csv export methodology however through trial & error (more error than trial if I’m being honest), and realising that this method still required some manual interaction, I decided to embark on finding a fully automated solution.

The final solution comprises the below components:

  • Splunk HTTP event collector
    • Splunk hostname
    • Token from HTTP event collector config page
  • Azure automation account
    • Azure Run As Account
    • Azure Runbook
    • Exchange Online credentials (registered to the Azure automation account)

I’m not going to run through the creation of the automation account or the required credentials, as these had already been created. However, there is a great guide to configuring the solution I have used for this customer at https://www.splunk.com/blog/2017/10/05/splunking-microsoft-cloud-data-part-3.html

What the PowerShell script we are using will achieve is the following:

  • Connect to Azure and Exchange Online – Azure run as account authentication
  • Configure variables for connection to Splunk HTTP event collector
  • Collect mailbox data from the Exchange Online environment
  • Split the mailbox data into parts for faster processing
  • Specify SSL/TLS protocol settings for self-signed cert in test environment
  • Create a JSON object to be posted to the Splunk environment
  • HTTP POST the data directly to Splunk

The Code:

#Clear Existing PS Sessions
Get-PSSession | Remove-PSSession | Out-Null

#Create Split function to break the mailbox collection into smaller parts
function Split-Array {
    param($inArray, [int]$parts, [int]$size)
    if ($parts) {
        $PartSize = [Math]::Ceiling($inArray.Count / $parts)
    }
    if ($size) {
        $PartSize = $size
        $parts = [Math]::Ceiling($inArray.Count / $size)
    }
    $outArray = New-Object 'System.Collections.Generic.List[psobject]'
    for ($i = 1; $i -le $parts; $i++) {
        $start = (($i - 1) * $PartSize)
        $end = (($i) * $PartSize) - 1
        if ($end -ge $inArray.Count) { $end = $inArray.Count - 1 }
        $outArray.Add(@($inArray[$start..$end]))
    }
    return , $outArray
}

function Connect-ExchangeOnline {
    param(
        $Creds
    )
    #Connect to Exchange Online and import only the cmdlets we need
    $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection
    $Commands = @("Add-MailboxPermission", "Add-RecipientPermission", "Remove-RecipientPermission", "Remove-MailboxPermission", "Get-MailboxPermission", "Get-User", "Get-DistributionGroupMember", "Get-DistributionGroup", "Get-Mailbox")
    Import-PSSession -Session $Session -DisableNameChecking:$true -AllowClobber:$true -CommandName $Commands | Out-Null
}

#Create Variables
$SplunkHost = "Your Splunk hostname or IP Address"
$SplunkEventCollectorPort = "8088"
$SplunkEventCollectorToken = "Splunk Token from Http Event Collector"
$servicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
$credentials = Get-AutomationPSCredential -Name 'Exchange Online'

#Connect to Azure using the Run As account's service principal
Add-AzureRMAccount -ServicePrincipal -Tenant $servicePrincipalConnection.TenantID -ApplicationId $servicePrincipalConnection.ApplicationID -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

#Connect to Exchange Online
Connect-ExchangeOnline -Creds $credentials

#Collect the mailbox data
$mailboxes = Get-Mailbox -ResultSize Unlimited | Select-Object -Property DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Get Current Date & Time
$time = Get-Date -Format s

#Convert Timezone to Australia/Brisbane
$bnetime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($time, [System.TimeZoneInfo]::Local.Id, 'E. Australia Standard Time')

#Add a Time column to the output
$mailboxes = $mailboxes | Select-Object @{expression = {$bnetime}; Name = 'Time'}, DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Split the mailbox collection into 5 parts for faster processing
$recipients = Split-Array -inArray $mailboxes -parts 5

#Create JSON objects and HTTP POST them to the Splunk HTTP Event Collector
foreach ($recipient in $recipients) {
    foreach ($r in $recipient) {
        #SSL validation bypass for the self-signed certificate used in testing
        $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
        [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
        [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
        #Build the JSON string to post to Splunk
        $StringToPost = "{ `"Time`": `"$($r.Time)`", `"DisplayName`": `"$($r.DisplayName)`", `"PrimarySMTPAddress`": `"$($r.PrimarySmtpAddress)`", `"IsMailboxEnabled`": `"$($r.IsMailboxEnabled)`", `"ForwardingSmtpAddress`": `"$($r.ForwardingSmtpAddress)`", `"GrantSendOnBehalfTo`": `"$($r.GrantSendOnBehalfTo)`", `"ProhibitSendReceiveQuota`": `"$($r.ProhibitSendReceiveQuota)`", `"AddressBookPolicy`": `"$($r.AddressBookPolicy)`" }"
        $uri = "https://" + $SplunkHost + ":" + $SplunkEventCollectorPort + "/services/collector/raw"
        $header = @{"Authorization" = "Splunk " + $SplunkEventCollectorToken}
        #Post to the Splunk HTTP Event Collector
        Invoke-RestMethod -Method Post -Uri $uri -Body $StringToPost -Headers $header
    }
}

#Clean up the remote PowerShell session
Get-PSSession | Remove-PSSession | Out-Null

 

The final output that can be seen in Splunk looks like the following:

11/13/17 12:28:22.000 PM
{
    AddressBookPolicy:
    DisplayName: Shane Fisher
    ForwardingSmtpAddress:
    GrantSendOnBehalfTo:
    IsMailboxEnabled: True
    PrimarySMTPAddress: shane.fisher@xxxxxxxx.com.au
    ProhibitSendReceiveQuota: 50 GB (53,687,091,200 bytes)
    Time: 11/13/2017 12:28:22
}

I hope this helps some of you out there.

Cheers,

Shane.

 

 

 

Connect SharePoint Online and SQL Server On-Premises with BCS/SharePoint Apps using Hybrid Connection and WCF Services

SharePoint Online cannot directly connect to on-premises data sources such as SQL Server. A recommended approach is to use a hybrid deployment with SharePoint 2013/2016, but this adds the overhead of infrastructure and maintenance costs. To overcome this, in this blog I am going to describe how to use Azure PaaS workloads to connect to on-premises data sources using BCS.

Using an Azure Hybrid Connection (refer to this post) and BCS with an Azure Web App hosting a WCF endpoint, we can now expose on-premises SQL data to SharePoint Online and the cloud via external content types (ECTs) or SharePoint Hosted Apps.

Below are two approaches by which BCS can connect these data sources to SharePoint.
1. Azure Web App hosting WCF Service and External Lists
2. Azure Web App hosting WCF Data Service and Hosted Apps

Azure WCF Service Web App and External Lists
SPOAzureBCSHybrid
Pros: The advantage of using this approach is the reusability of External Content Types (ECT). ECTs can be used across multiple lists and sites in the same site collection. ECTs can also be used for complex associations across multiple types of data.

Cons: Some shortcomings of this approach are:
– Dependency on pass through authentication for users and/or implement custom authentication to authenticate with WCF by passing SQL authentication
– Added development effort because of WCF build and hosting

High-Level Steps:
1. Create a WCF Solution using Visual Studio
2. Use ADO.Net and WCF Service calls to fetch data using web methods. Implement at least two web methods – one to return all items and one to return a specific item
3. Update Web.Config of the WCF service with required configuration for data calls
4. Create an Azure Web App
5. Publish the WCF Service to Azure Web App and get the single wsdl signature from the WCF service
6. Create an External Content Type using SharePoint Designer using the WSDL signature
7. Add GetItems and GetItem finder to ECT
8. Create an External List from ECT

Azure Web App hosting WCF Data Service and Hosted Apps
SPOAzureAppsHybrid
Pros: The advantage of using a WCF Data Service is that the OData endpoint maps directly to the schema of the SQL table, which makes it easy to build and maintain. Additionally, using SharePoint hosted apps isolates the CRUD operations from the Host Web, decreasing the overhead of external content types and external lists.

Cons: The disadvantage of this approach is that the data is scoped within the app and cannot be exposed to Host Web components, making interaction limited to the App Web only. There is a customization requirement to expose and operate on this data in the App Web.

High-Level Steps:
1. Create a WCF service project using Visual Studio
2. Install the EntityFramework Nuget package
3. Add a WCF data service file and implement EntityFrameworkDataService instead of DataService
4. Override the “InitializeService” method to set the entity set access rules for the data service
5. Add an ADO.Net Entity Data Model project and configure it to fetch data from SQL Tables you want
6. Update Web.config with required configuration for data calls
7. Create an Azure Web App and enable SSL on it
8. Publish the WCF Service to Azure Web App
9. Next create a new SharePoint hosted app solution in Visual Studio
10. In the SharePoint hosted app solution, add an External Content type and select the Azure Web Application hosting the WCF data service as source
11. After the External Content type is created, then create an External List using ECT created above
12. The external list is now added to the Hosted app which can then be referenced in the app default page and app part


In this blog we have seen two choices for hosting BCS connectivity services via Azure PaaS workloads, the advantages and disadvantages of each, and the broad steps to configure them.

A tool to find mailbox permission dependencies

First published at https://nivleshc.wordpress.com

When planning to migrate mailboxes to Office 365, a lot of care must be taken around which mailboxes are moved together. The rule of thumb is “those that work together, move together”. The reason for taking this approach is that there are some permissions that do not work cross-premises and can cause issues. For instance, if a mailbox has delegate permissions to another mailbox (permissions that have been assigned using the Outlook email client) and one is migrated to Office 365 while the other remains on-premises, the delegate capability is broken as it does not work cross-premises.

During the recent Microsoft Ignite, it was announced that there are a lot of features coming to Office 365 which will help with the cross-premises access issues.

I have been using Roman Zarka’s Export-MailboxPermissions.ps1 script (part of the https://blogs.technet.microsoft.com/zarkatech/2015/06/11/migrate-mailbox-permissions-to-office-365/ bundle) to export all on-premises mailbox permissions, and then using the output to decide which mailboxes move together. Believe me, this can be quite a challenge!

Recently, while having a casual conversation with one of my colleagues, I was introduced to an Excel  spreadsheet that he had created. Being the Excel guru that he is, he was doing various VLOOKUPs into the outputs from Roman Zarka’s script, to find out if the mailboxes he was intending to migrate had any permission dependencies with other mailboxes. I just stared at the spreadsheet with awe, and uttered the words “dude, that is simply awesome!”

I was hooked on that spreadsheet. However, I started craving for it to do more, so I decided to take it upon myself to add some more features. Not being too savvy with Excel, I decided to use PowerShell instead. Thus was born Find_MailboxPermissions_Dependencies.ps1

I will now walk you through the script and explain what it does

 

  1. The first pre-requisite for Find_MailboxPermissions_Dependencies.ps1 is the four output files from Roman Zarka’s Export-MailboxPermissions.ps1 script (MailboxAccess.csv, MailboxFolderDelegate.csv, MailboxSendAs.csv, MailboxSendOnBehalf.csv)
  2. The next pre-requisite is details about the on-premises mailboxes. The on-premises Exchange environment must be queried and the details output into a csv file with the name OnPrem_Mbx_Details.csv. The csv must contain the following information (with the following column headings): "DisplayName, UserPrincipalName, PrimarySmtpAddress, RecipientTypeDetails, Department, Title, Office, State, OrganizationalUnit"
  3. The last pre-requisite is information about mailboxes that are already in Office 365. Use PowerShell to connect to Exchange Online and then run the following command (where O365_Mbx_Details.csv is the output file)
    Get-Mailbox -ResultSize unlimited | Select DisplayName,UserPrincipalName,EmailAddresses,WindowsEmailAddress,RecipientTypeDetails | Export-Csv -NoTypeInformation -Path O365_Mbx_Details.csv 

    If there are no mailboxes in Office 365, then create a blank file and put the following column headings in it “DisplayName”, “UserPrincipalName”, “EmailAddresses”, “WindowsEmailAddress”, “RecipientTypeDetails”. Save the file as O365_Mbx_Details.csv

  4. Next, put the above files in the same folder and then update the variable $root_dir in the script with the path to the folder (the path must end with a backslash)
  5. It is assumed that the above files have the following names
    • MailboxAccess.csv
    • MailboxFolderDelegate.csv
    • MailboxSendAs.csv
    • MailboxSendOnBehalf.csv
    • O365_Mbx_Details.csv
    • OnPrem_Mbx_Details.csv
  6. Now that all the inputs have been taken care of, run the script.
  7. The first task the script does is to validate if the input files are present. If any of them are not found, the script outputs an error and terminates.
  8. Next, the files are read and stored in memory
  9. Now for the heart of the script. It goes through each of the mailboxes in the OnPrem_Mbx_Details.csv file and finds the following
    • all mailboxes that have been given SendOnBehalf permissions to this mailbox
    • all mailboxes that this mailbox has been given SendOnBehalf permissions on
    • all mailboxes that have been given SendAs permissions to this mailbox
    • all mailboxes that this mailbox has been given SendAs permissions on
    • all mailboxes that have been given Delegate permissions to this mailbox
    • all mailboxes that this mailbox has been given Delegate permissions on
    • all mailboxes that have been given Mailbox Access permissions on this mailbox
    • all mailboxes that this mailbox has been given Mailbox Access permissions on
    • if the mailbox that this mailbox has given the above permissions to or has got permissions on has already been migrated to Office 365
  10. The results are then output to a csv file (the name of the output file is of the format Find_MailboxPermissions_Dependencies_{timestamp of when script was run}_csv.csv)
  11. The columns in the output file are explained below
  • PermTo_OtherMbx_Or_FromOtherMbx? – This is Y if the mailbox has given permissions to or has permissions on other mailboxes. It is N if there are no permission dependencies for this mailbox
  • PermTo_Or_PermFrom_O365Mbx? – This is TRUE if the mailbox that this mailbox has given permissions to or has permissions on is already in Office 365
  • Migration Readiness – This is a color code based on the migration readiness of this permission. This will be further explained below
  • DisplayName – The display name of the on-premises mailbox for which the permission dependency is being found
  • UserPrincipalName – The userprincipalname of the on-premises mailbox for which the permission dependency is being found
  • PrimarySmtp – The primarySmtp of the on-premises mailbox for which the permission dependency is being found
  • MailboxType – The mailbox type of the on-premises mailbox for which the permission dependency is being found
  • Department – The department the on-premises mailbox belongs to (inherited from the Active Directory object)
  • Title – The title that this on-premises mailbox has (inherited from the Active Directory object)
  • SendOnBehalf_GivenTo – emailaddress of the mailbox that has been given SendOnBehalf permissions to this on-premises mailbox
  • SendOnBehalf_GivenOn – emailaddress of the mailbox that this on-premises mailbox has been given SendOnBehalf permissions to
  • SendAs_GivenTo – emailaddress of the mailbox that has been given SendAs permissions to this on-premises mailbox
  • SendAs_GivenOn – emailaddress of the mailbox that this on-premises mailbox has been given SendAs permissions on
  • MailboxFolderDelegate_GivenTo – emailaddress of the mailbox that has been given Delegate access to this on-premises mailbox
  • MailboxFolderDelegate_GivenTo_FolderLocation – the folders of the on-premises mailbox that the delegate access has been given to
  • MailboxFolderDelegate_GivenTo_DelegateAccess – the type of delegate access that has been given on this on-premises mailbox
  • MailboxFolderDelegate_GivenOn – emailaddress of the mailbox that this on-premises mailbox has been given Delegate Access to
  • MailboxFolderDelegate_GivenOn_FolderLocation – the folders that this on-premises mailbox has been given delegate access to
  • MailboxFolderDelegate_GivenOn_DelegateAccess – the type of delegate access that this on-premises mailbox has been given
  • MailboxAccess_GivenTo – emailaddress of the mailbox that has been given Mailbox Access to this on-premises mailbox
  • MailboxAccess_GivenTo_DelegateAccess – the type of Mailbox Access that has been given on this on-premises mailbox
  • MailboxAccess_GivenOn – emailaddress of the mailbox that this mailbox has been given Mailbox Access to
  • MailboxAccess_GivenOn_DelegateAccess – the type of Mailbox Access that this on-premises mailbox has been given
  • OrganizationalUnit – the Organizational Unit for the on-premises mailbox

The color codes in the column Migration Readiness correspond to the following

  • LightBlue – this on-premises mailbox has no permission dependencies and can be migrated
  • DarkGreen  – this on-premises mailbox has got a Mailbox Access permission dependency to another mailbox. It can be migrated while the other mailbox can remain on-premises, without experiencing any issues as Mailbox Access permissions are supported cross-premises.
  • LightGreen – this on-premises mailbox can be migrated without issues as the permission dependency is on a mailbox that is already in Office 365
  • Orange – this on-premises mailbox has SendAs permissions given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the SendAs capability will be broken. Lately, it has been noticed that this capability can be restored by re-applying the SendAs permissions to both the migrated and on-premises mailbox post migration
  • Pink – the on-premises mailbox has FolderDelegate given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the FolderDelegate capability will be broken. A possible workaround is to replace the FolderDelegate permission with Full Mailbox access as this works cross-premises, however there are privacy concerns around this workaround as this will enable the delegate to see all the contents of the mailbox instead of just the folders they had been given access on.
  • Red – the on-premises mailbox has SendOnBehalf permissions given to/or on another on-premises mailbox. If both mailboxes are not migrated at the same time, the SendOnBehalf capability will be broken. A possible workaround could be to replace SendOnBehalf with SendAs however the possible implications of this change must be investigated

Yay, the output has now been generated. All we need to do now is to make it look pretty in Excel 🙂

Carry out the following steps

  • Import the output csv file into Excel, using the semi-colon “;” as the delimiter (I couldn’t use commas as the delimiter because fields such as department and title sometimes contain them, which causes issues with the output file)
  • Create Conditional Formatting rules for the column Migration Readiness so that the fill color of this cell corresponds to the word in this column (for instance, if the word is LightBlue then create a rule to apply a light blue fill to the cell)

That’s it folks! The mailbox permissions dependency spreadsheet is now ready. It provides a single-pane view of all the permissions across your on-premises mailboxes and gives a color-coded analysis of which mailboxes can be migrated on their own without any issues, and which might experience issues if they are not migrated in the same batch as the ones they have permission dependencies on.

In the output file, for each on-premises mailbox, each line represents a permission dependency (unless the column PermTo_OtherMbx_Or_FromOtherMbx? is N). If there are more than one set of permissions applicable to an on-premises mailbox, these are displayed consecutively underneath each other.

It is imperative that the migration readiness of the mailbox be evaluated based on the migration readiness of all the permissions associated with that mailbox.

Find_MailboxPermissions_Dependencies.ps1 can be downloaded from  GitHub

A sample of the spreadsheet that was created using the output from the Find_MailboxPermissions_Dependencies.ps1 script can be downloaded from https://github.com/nivleshc/arm/blob/master/Sample%20Output_MailboxPermissions%20Dependencies.xlsx

I hope this script comes in handy when you are planning your migration batches and helps alleviate some of the headache that this task brings with it.

Till the next time, have a great day 😉

Global Navigation and Branding for Modern Site using SharePoint Framework Extensions

Last month at Microsoft Ignite 2017, SharePoint Framework Extensions became generally available. This gives us whole new capabilities for customizing Modern Team sites and Communication sites.

Even though there are lots of PnP examples of SPFx extensions, while presenting at the Office 365 Bootcamp in Melbourne and running the hands-on lab, I realised not many people are aware of the new capabilities that SPFx extensions provide. One of the burning questions we often get from clients is whether we can have a custom header, footer and global navigation in modern sites, and the answer is YES. Here is an example where the global navigation has been derived from Managed Metadata:

Communication Site with header and footer:

Modern Team Site with header and footer (same navigation):

With the latest Yeoman SharePoint generator, alongside the SPFx Web Part we now have options to create extensions:

To create header and footer for the modern site, we need to select the Application Customizer extension.

After the solution has been created, one noticeable difference is that TenantGlobalNavBarApplicationCustomizer extends BaseApplicationCustomizer rather than BaseClientSideWebPart.

export default class TenantGlobalNavBarApplicationCustomizer
extends BaseApplicationCustomizer

Basic Header and Footer

Now to create a very basic Application Customizer with header/footer, make sure to import the React, ReactDom, PlaceholderContent and PlaceholderName:

import * as ReactDom from 'react-dom';
import * as React from 'react';
import {
  BaseApplicationCustomizer,
  PlaceholderContent,
  PlaceholderName
} from '@microsoft/sp-application-base';

In the onInit() function, the top(header) and the bottom (footer) placeholders need to be created:

const topPlaceholder =  this.context.placeholderProvider.tryCreateContent(PlaceholderName.Top);
const bottomPlaceholder =  this.context.placeholderProvider.tryCreateContent(PlaceholderName.Bottom);

Create the appropriate elements for the header and footer:

const topElement =  React.createElement('h1', {}, 'This is Header');
const bottomElement =   React.createElement('h1', {}, 'This is Footer');

Those elements can be render within placeholder domElement:

ReactDom.render(topElement, topPlaceholder.domElement);
ReactDom.render(bottomElement, bottomPlaceholder.domElement);

If you now run the solution:

  • gulp serve --nobrowser
  • Copy the id from src\extensions\.manifest.json file e.g. “7650cbbb-688f-4c62-b5e3-5b3781413223”
  • Open a modern site and append the following URL (change the id as per your solution):
    ?loadSPFX=true&debugManifestsFile=https://localhost:4321/temp/manifests.js&customActions={"7650cbbb-688f-4c62-b5e3-5b3781413223":{"location":"ClientSideExtension.ApplicationCustomizer"}}

The above 6 lines code will give the following outcome:

Managed Metadata Navigation

Now back to the first example. That solution has been copied from SPFx Sample and has been updated.

To get the above header and footer:

Go to command line and run

  • npm i
  • gulp serve --nobrowser
  • To see the code running, go to the SharePoint Online modern site and append the following to the site URL:

    ?loadSPFX=true&debugManifestsFile=https://localhost:4321/temp/manifests.js&customActions={"b1efedb9-b371-4f5c-a90f-3742d1842cf3":{"location":"ClientSideExtension.ApplicationCustomizer","properties":{"TopMenuTermSet":"TopNav","BottomMenuTermSet":"Footer"}}}

Deployment

Create an Office 365 CDN

Update config\write-manifests.json file “cdnBasePath” to the new location. E.g.

"cdnBasePath":"https://publiccdn.sharepointonline.com/<YourTenantName>.sharepoint.com//<YourSiteName>//<YourLibName>/<FoldeNameifAny>"

In the command line, run

  • gulp bundle --ship
  • gulp package-solution --ship

and upload all artefacts from \temp\deploy\ to the CDN location

Upload \sharepoint\solution.sppkg to the App Library

Go to the Modern Site > Site Content > New > App > select and add the app:

Hopefully this post has given an overview of how to implement SPFx Application Customizers. There are many samples available in the SharePoint GitHub repositories.

Royal Flying Doctor Service empowers employees with Office 365

Customer Overview

The Royal Flying Doctor Service (Queensland) is a part of one of the largest and most comprehensive aero-medical organisations in the world. It provides emergency and primary health care services for those living in rural, remote and regional Queensland. A not-for-profit organisation, the Flying Doctor relies on the support of the community and in Queensland the Service needs to raise over $12 million per year to keep the Flying Doctor flying.

Business Situation

The Royal Flying Doctor Service (Queensland) provides patient care to 95,000 people in remote communities as well as life-saving transfers to those in need of organ transplants and heart surgery. Their 19 aircraft fly more than 22,000 hours per year, operating 24 hours a day and spanning thousands of kilometres.

Their internal infrastructure was dated, unreliable and expensive, which hindered the quality of the emergency evacuation and transportation services they provide. Issues with network performance and security provisioning led to staff complaints and impacted service delivery.

RFDS had recently adopted a cloud-first strategy to encourage innovation and harness technology to enhance the delivery of emergency and primary health services for patients. RFDS wanted to empower their employees with 24/7 access to find information within seconds to help them fix a problem or receive important updates. The solution would need to maintain a central point of administration for their electronic flight bag (EFB) devices, which ensure flight crew can perform flight management tasks and access critical aircraft documentation.

Charged with a mission to transform their workplace through a modern set of collaboration tools, Kloud partnered with RFDS to deliver a solution to provide flexible, transparent and simplified communications and connections across their business.

Solution

Kloud proposed a hybrid approach for deployment and adoption of Office 365 services including Exchange Online, OneDrive for Business, Skype for Business and Yammer. Utilising Telstra’s Cloud Gateway, RFDS is able to leverage Microsoft Azure to host their ADFS farm and Azure AD Connect server, which is integral to this operating model. Transitioning RFDS (Queensland) to Office 365 and Azure has enabled the business to provide staff with a more robust, up-to-date messaging system and collaboration services that will lead to improved productivity and user satisfaction.

The change to Office 365 provides RFDS (Queensland) with an evergreen environment that offers more functionality and features than they ever had previously, resolves many operational and infrastructure concerns, and is more cost effective than running their own Exchange service. Ultimately, the solution means critical patient and business information is centrally maintained in a secure, cloud-first infrastructure with multiple copies of data, whether in transit or at rest. In parallel with this, an additional program of work has commenced to migrate RFDS’ existing SharePoint services to SharePoint Online, which will continue to improve the employee experience through an enhanced collaboration and sharing network.

Technologies used in the solution include:

  • Skype for Business Online
  • Office 365 Exchange Online 2016
  • OneDrive for Business
  • Azure Active Directory
  • Azure AD authentication system
  • Exchange Server Hybrid Configuration
  • Azure Tenancy
  • Telstra Cloud Gateway
  • Azure ExpressRoute

Benefits

  • Highly available “Evergreen” cloud based solution
  • Reduced administrative overhead for IT teams
  • Better collaboration amongst users and teams
  • Efficiency gains by use of Skype for Business
  • Larger user mailboxes and personal archive mailboxes in Exchange Online

Customer Testimonial 

The technical resources provided have been outstanding, as well as from a teaming perspective. Kloud have often gone the extra mile for us. The RFDS business in Queensland will now benefit from utilising Office 365 as it matures into the new set of products and features. – Colin Denzel, ICT Portfolio Manager

 

[Updated] How are email addresses created for Office 365 Mailboxes?

First published at https://nivleshc.wordpress.com

Background

Over the past few weeks, I have been doing some Cloud-Only Office 365 deployments using Azure AD Connect. As you might imagine, this deployment is a bit different to a Hybrid Office 365 deployment.

One of the things that got me thinking was: how are the email addresses created for my Office 365 mailboxes? As I was synchronising objects from my on-premises Active Directory, the answer would tell me which values I needed to change on an on-premises Active Directory user object to get the desired email addresses populated in the Office 365 mailbox that would be created for it.

Preparation

To find the above answer, I devised a simple experiment.

I decided to create four on-premises Active Directory user objects with different combinations of the NetBIOS name, UserPrincipalName (UPN), Mail attribute and ProxyAddresses, and to trace what happens to these values as they are used to create the corresponding Azure AD objects. I would then assign an Office 365 Exchange Online Plan 2 license to each of these Azure AD objects and see what email addresses were assigned to the resulting Office 365 mailboxes.

Simple? Let’s begin.

The four user accounts I created in on-premises Active Directory had the following properties (ProxyAddresses were populated using ADSIEdit). The domains used by the UPN, Email and proxyaddresses attributes are all internet routable domains.

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
Netbios       :CONTOSO\R.McCary
UPN           :Ross.McCary@contoso.com
Email         :{blank}
ProxyAddresses:{blank}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
Netbios       :CONTOSO\A.Jones
UPN           :Angela.Jones@contoso.com
Email         :An.Jones@contoso.com
ProxyAddresses:{blank}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
Netbios       :CONTOSO\Z.Daley
UPN           :Zada.Daley@contoso.com
Email         :Zada.Daley@tailspintoys.com
ProxyAddresses:{blank}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
Netbios       :CONTOSO\B.Brown
UPN           :Bob.Brown@contoso.com
Email         :Bob.Brown@contoso.com
ProxyAddresses:SMTP:Bo.Brown@contoso.com
               smtp:Bob@contoso.com
               smtp:Bobi@tailspintoys.com
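
For repeatability, the same test accounts could also be created with the ActiveDirectory PowerShell module rather than through the GUI and ADSIEdit. Below is a minimal sketch for the “Bob Brown” account; the OU path and password prompt are placeholders and not part of the original setup.

# Sketch only - creates the 'Bob Brown' test account with the mail and proxyAddresses values above.
# The OU path and password prompt are placeholders.
Import-Module ActiveDirectory

New-ADUser -Name 'Bob Brown' -GivenName 'Bob' -Surname 'Brown' -DisplayName 'Bob Brown' `
    -SamAccountName 'B.Brown' -UserPrincipalName 'Bob.Brown@contoso.com' `
    -EmailAddress 'Bob.Brown@contoso.com' -Path 'OU=TestUsers,DC=contoso,DC=com' `
    -AccountPassword (Read-Host -AsSecureString 'Password') -Enabled $true

# proxyAddresses is not exposed as a New-ADUser parameter, so add it afterwards
Set-ADUser -Identity 'B.Brown' -Add @{
    proxyAddresses = 'SMTP:Bo.Brown@contoso.com','smtp:Bob@contoso.com','smtp:Bobi@tailspintoys.com'
}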

The Experiment

I initiated an Azure AD Connect delta synchronisation cycle and waited. After a few minutes, I saw new objects created in my Office 365 tenant’s Azure AD that corresponded to the ones that I had created in the on-premises Active Directory.
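
For reference, a delta synchronisation cycle can also be kicked off manually on the AADC server rather than waiting for the scheduler; a minimal example, assuming the ADSync module that ships with Azure AD Connect:

# Run on the Azure AD Connect server (the ADSync module is installed with AADC)
Import-Module ADSync
Start-ADSyncSyncCycle -PolicyType Delta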

Here are the values that Azure AD Connect (AADC) added for the newly created AAD objects (tenantid is the id for the Office 365 tenant):

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
SignInName    :Ross.McCary@contoso.com
UPN           :Ross.McCary@contoso.com
ProxyAddresses:{blank}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
SignInName    :Angela.Jones@contoso.com
UPN           :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
SignInName    :Zada.Daley@contoso.com
UPN           :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
SignInName    :Bob.Brown@contoso.com
UPN           :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

Now, this was quite interesting. AADC added proxyaddresses only for those Azure AD (AAD) objects that had the email field populated in their corresponding on-premises Active Directory user objects. Also, for Bob Brown, two additional proxy addresses had been added: smtp:Bo.Brown@tenantid.onmicrosoft.com and smtp:Bob.Brown@contoso.com. Interesting indeed!

I then proceeded to assigning an Office 365 Exchange Online Plan 2 license to all the above AAD objects so that a mailbox would be provisioned for them. After waiting a few minutes, I checked to confirm that the mailboxes had been successfully provisioned. I then went back to Azure AD and checked the attributes again.
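
For what it’s worth, the license assignment can also be scripted; the sketch below uses the AzureAD module and assumes the tenant’s Exchange Online Plan 2 SKU part number is EXCHANGEENTERPRISE and that a UsageLocation has already been set on the user.

# Minimal sketch using the AzureAD module (assumes UsageLocation is already set and the
# Exchange Online Plan 2 SKU part number in the tenant is EXCHANGEENTERPRISE)
Connect-AzureAD
$sku = Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq 'EXCHANGEENTERPRISE' }

$license = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicense
$license.SkuId = $sku.SkuId
$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.AddLicenses = $license

Set-AzureADUserLicense -ObjectId 'Ross.McCary@contoso.com' -AssignedLicenses $licenses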

Below is what I saw. Comparing each object with the previous listing shows which values were added after the license assignment.

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
SignInName    :Ross.McCary@contoso.com
UPN           :Ross.McCary@contoso.com
ProxyAddresses:{SMTP:Ross.McCary@tenantid.onmicrosoft.com,
                smtp:Ross.McCary@contoso.com}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
SignInName    :Angela.Jones@contoso.com
UPN           :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com,
                smtp:Angela.Jones@contoso.com,
                smtp:An.Jones@tenantid.onmicrosoft.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
SignInName    :Zada.Daley@contoso.com
UPN           :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com,
                smtp:Zada.Daley@contoso.com,
                smtp:Zada.Daley@tenantid.onmicrosoft.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
SignInName    :Bob.Brown@contoso.com
UPN           :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

For the mailboxes, below is what I saw

FirstName     :Ross
LastName      :McCary
DisplayName   :Ross McCary
UserID        :Ross.McCary@contoso.com
ProxyAddresses:{SMTP:Ross.McCary@tenantid.onmicrosoft.com,
                smtp:Ross.McCary@contoso.com}

FirstName     :Angela
LastName      :Jones
DisplayName   :Angela Jones
UserID        :Angela.Jones@contoso.com
ProxyAddresses:{SMTP:An.Jones@contoso.com,
                smtp:Angela.Jones@contoso.com,
                smtp:An.Jones@tenantid.onmicrosoft.com}

FirstName     :Zada
LastName      :Daley
DisplayName   :Zada Daley
UserID        :Zada.Daley@contoso.com
ProxyAddresses:{SMTP:Zada.Daley@tailspintoys.com,
                smtp:Zada.Daley@contoso.com,
                smtp:Zada.Daley@tenantid.onmicrosoft.com}

FirstName     :Bob
LastName      :Brown
DisplayName   :Bob Brown
UserID        :Bob.Brown@contoso.com
ProxyAddresses:{SMTP:Bo.Brown@contoso.com,
                smtp:Bob@contoso.com,
                smtp:Bobi@tailspintoys.com,
                smtp:Bo.Brown@tenantid.onmicrosoft.com,
                smtp:Bob.Brown@contoso.com}

Quite interesting (hmm, well, not really 😉). The newly created mailboxes had the same proxyaddresses as those assigned to their corresponding AAD objects.

The Result

After looking at the results, I could easily see a pattern between the on-premises Active Directory user object’s email and proxyaddresses values and the Azure AD and Office 365 mailbox email addresses.

From my experiment, I deduced the following process for creating email addresses for Office 365 mailboxes (a rough PowerShell sketch of this logic follows the list):

  1. When provisioning Azure AD objects, AADC first checks to see if the on-premises AD user object has proxyaddresses assigned. If there are any, these are added as proxyaddresses for the Azure AD object (the proxyaddress with the SMTP: prefix (uppercase smtp) will be the primarySMTP for the AAD object). If the email attribute is not part of the proxyaddresses, it is added as an additional proxyaddress. Next, the primarySMTP’s part before the @ is prefixed to the tenant email domain to create the mailbox’s routing email address and then added as an additional proxyaddress (for instance Ross.McCary@tenantid.onmicrosoft.com)
  2. If the on-premises Active Directory user object does not have any proxyaddresses, then the email attribute is assigned as the primarySMTP address for the AAD object. If the email attribute is not the same as the UPN, then the UPN is added as an additional proxy address. Next, the primarySMTP’s part before the @ is prefixed to the tenant email domain to create the mailbox’s routing email address and then added as an additional proxyaddress (for instance Ross.McCary@tenantid.onmicrosoft.com)
  3. In cases where both the email attribute and proxyAddresses are blank, the part of the UPN before the @ is prefixed to the tenant email domain to create the mailbox’s routing email address. In this instance, the routing email address is set as the primarySMTP address. The UPN is then added as an additional proxyaddress.
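
As mentioned above, here is a rough PowerShell illustration of that deduced logic. The function name and parameters are mine and purely illustrative; it only models the address calculation and is not anything AADC actually exposes.

# Illustrative only - models the deduced address logic, not an actual AADC function.
# $TenantDomain would be something like 'tenantid.onmicrosoft.com'.
function Get-ExpectedProxyAddresses {
    param(
        [string]$Upn,
        [string]$Mail,
        [string[]]$ProxyAddresses,
        [string]$TenantDomain
    )

    $result = @()

    if ($ProxyAddresses) {
        # Rule 1: on-premises proxyAddresses win; the 'SMTP:' entry stays primary
        $result += $ProxyAddresses
        if ($Mail -and -not ($ProxyAddresses -match [regex]::Escape($Mail))) {
            $result += "smtp:$Mail"
        }
    }
    elseif ($Mail) {
        # Rule 2: the mail attribute becomes the primary SMTP address
        $result += "SMTP:$Mail"
        if ($Upn -and ($Upn -ne $Mail)) { $result += "smtp:$Upn" }
    }
    else {
        # Rule 3: UPN prefix + tenant routing domain becomes the primary SMTP address
        $prefix = ($Upn -split '@')[0]
        return @("SMTP:$prefix@$TenantDomain", "smtp:$Upn")
    }

    # Rules 1 and 2 also add the tenant routing address based on the primary SMTP prefix
    $primary = ($result | Where-Object { $_ -cmatch '^SMTP:' } | Select-Object -First 1) -replace '^SMTP:'
    $prefix  = ($primary -split '@')[0]
    $result += "smtp:$prefix@$TenantDomain"

    return $result
}

# Example: Bob Brown's on-premises values reproduce the observed mailbox addresses
Get-ExpectedProxyAddresses -Upn 'Bob.Brown@contoso.com' -Mail 'Bob.Brown@contoso.com' `
    -ProxyAddresses 'SMTP:Bo.Brown@contoso.com','smtp:Bob@contoso.com','smtp:Bobi@tailspintoys.com' `
    -TenantDomain 'tenantid.onmicrosoft.com'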

An interesting thing to note is that even before assigning an Office 365 license, you can see what email addresses will be assigned, by using PowerShell to check the proxyaddresses attribute of the Azure AD object.
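
A minimal example of that check, using the AzureAD module and one of the test accounts from above:

# Inspect the proxy addresses on the synchronised Azure AD object before any license is assigned
Connect-AzureAD
Get-AzureADUser -ObjectId 'Angela.Jones@contoso.com' | Select-Object -ExpandProperty ProxyAddresses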

I hope the above provides some clarity around how email addresses are created for Office 365 mailboxes and helps with your Cloud-Only Office 365 architectures.

Have a great day 😉

[Update] There could be instances where, by some mistake:

  • (a) a new user is assigned an email attribute that is already attached to an existing mailbox in Office 365
  • (b) a new user is assigned a proxyaddress that is already attached to an existing mailbox in Office 365

For issue (a), AADC will not create the new user in AAD and will instead display an error in the AADC console when exporting to AAD. The error will be similar to the one below

“Unable to update this object because the following attributes associated with this object have values that may already be associated with another object in your local directory services: [ProxyAddresses xxxxxxxxx;]. Correct or remove the duplicate values in your local directory.”

The error will provide detailed information regarding the values that are causing the issue. It will also contain the ObjectIdInConflict, which is the id of the existing object that the new user is in conflict with.

Using the ObjectIdInConflict value, search the AADC Metaverse with the clause “cloudAnchor contains ObjectIdInConflict” (replacing ObjectIdInConflict with the value shown in the error). This will show the metaverse record of the object that the new user is conflicting with. In this case, remediate the issue in the on-premises Active Directory and then initiate another AADC delta synchronisation cycle.
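
If you also want to confirm which cloud object holds the conflicting value, the ObjectIdInConflict can be resolved directly with the AzureAD module; a small sketch (the placeholder below needs to be replaced with the id from the error):

# Resolve the conflicting cloud object from the ObjectIdInConflict value in the export error
Connect-AzureAD
$objectIdInConflict = '<ObjectIdInConflict from the error>'
Get-AzureADObjectByObjectId -ObjectIds $objectIdInConflict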

 

For issue (b), AADC will create the new user in AAD; however, it will remove the conflicting proxyAddress from the new user object in AAD. It will also create a record in the Office 365 Admin Center, under Settings\DirSync errors, with details of the new user, the existing user it is in conflict with, and the attribute that is in conflict. In this case, remediate the issue in the on-premises Active Directory and initiate another AADC delta synchronisation cycle.
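
As part of the remediation for either case, it can help to locate which on-premises account already carries the clashing address; a minimal sketch using the ActiveDirectory module (the address below is a placeholder):

# Find on-premises accounts that already hold the conflicting proxy address
Import-Module ActiveDirectory
Get-ADUser -Filter "proxyAddresses -like '*conflicting.alias@contoso.com*'" -Properties proxyAddresses |
    Select-Object SamAccountName, UserPrincipalName, proxyAddresses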

Note: In both cases above, the technical contact for the Office 365 tenant gets sent an email with details of the errors.