Deploying Cloud-only mailboxes in Office 365 using On-Premises Directory objects

First published at https://nivleshc.wordpress.com

In this blog, I will show you how to create Cloud-only mailboxes in Exchange Online (Exchange Online is the messaging part of Office 365) that are bound to objects synchronised from your on-premises Active Directory. The Cloud-only approach is different to the Hybrid approach because you do not need an Exchange server deployed in your on-premises environment.

There are a few reasons why you would want to link your Cloud-only mailboxes to your on-premises Active Directory. The most important is to ensure you don’t end up with multiple identities for the same user. Another is to provide single sign-on, which can be established using the password synchronisation feature of Azure AD Connect (this will be discussed a bit later).

OK, let’s get started.

The diagram below shows what we will be doing. In a nutshell, we will replicate our on-premises Active Directory objects to Azure AD (these will be filtered so that only required objects are synchronised to Azure AD) using Azure AD Connect Server. Once in Azure AD, we will appropriately license the objects using the Office 365 Admin portal (any license bundle that contains the Exchange Online Plan 2 license is fine. Even Exchange Online Plan 2 by itself is sufficient).

Onpremise AD Objects Synchronised AAD

Prepare your Office 365 Tenant

Once you have obtained your Office 365 tenant, add and verify the domain you will be using for your email addresses (for instance, if your email address will be tom.jones@contoso.com, then add contoso.com in Office 365 Admin Center under Setup\Domains). You will be provided with a TXT entry that you will need to use to create a DNS entry under the domain, to prove ownership.

Once you have successfully verified the domain ownership, you will be provided with the MX entry value for the domain. This must be used to create an MX entry DNS record for the domain so that all emails are delivered to Office 365.
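As a hypothetical illustration (the TXT value and MX target below are placeholders; always use the exact values shown in your own Office 365 Admin Center), the resulting DNS records for contoso.com might look like this zone-file fragment:

```
; hypothetical values for contoso.com - use the values from your own tenant
contoso.com.    3600  IN  TXT  "MS=ms12345678"
contoso.com.    3600  IN  MX   0  contoso-com.mail.protection.outlook.com.
```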

Prepare your on-premises Active Directory

You must check to ensure your on-premises Active Directory does not contain any objects that are incompatible with Azure AD. To perform this check, run idFix in your environment.

A note of advice: by default, idFix runs across all your Active Directory objects. You only need to fix the objects that you will be synchronising to Azure AD.

It is highly recommended that your on-premises Active Directory user objects have their userPrincipalName (UPN) matched to their primary email address. This removes any confusion users might face when accessing Office 365 services via a web browser, as the Office 365 login pages refer to the username as an “email address”.

Next, ensure that the E-mail field for all users in Active Directory contains the UPN for the user.
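If you have many users to update, this can be scripted. The sketch below (assuming the ActiveDirectory RSAT module and a hypothetical OU path) copies each user’s UPN into the E-mail (mail) attribute:

```powershell
# Sketch only - assumes the ActiveDirectory module; the OU below is a placeholder
Import-Module ActiveDirectory

$ou = "OU=SyncedUsers,DC=contoso,DC=com"   # hypothetical OU - use your own

Get-ADUser -Filter * -SearchBase $ou -Properties EmailAddress |
    Where-Object { $_.EmailAddress -ne $_.UserPrincipalName } |
    ForEach-Object {
        # Stamp the E-mail field with the user's UPN
        Set-ADUser -Identity $_ -EmailAddress $_.UserPrincipalName
    }
```

Run this against a test OU first to confirm the results before applying it broadly.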

ADUser

Deploy and Configure Azure AD Connect Server

Ensure all the prerequisites have been met, as outlined at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-prerequisites

Next, follow the article at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-select-installation to deploy and configure your Azure AD Connect (AADC) Server.

During the configuration of AADC, you will be asked to specify which on-premises Active Directory objects should be synchronised to Azure AD. Instead of synchronising all your on-premises Active Directory objects, choose the Organisational Unit that contains all the users, groups and contacts you want to synchronise to Azure AD.

Choose the Password Synchronisation option while installing the AADC server. This will synchronise your on-premises password hashes to Azure AD, enabling users to use their on-premises credentials to access Office 365 services.

At this stage, your AADC server will already have done an initial run, which will have created objects in Azure AD. These are visible in the Office 365 Admin Center.

After the initial sync, AADC runs an automatic synchronisation to Azure AD every 30 minutes.
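If you don’t want to wait for the next scheduled cycle, a sync can be triggered manually on the AADC server using the ADSync module that ships with Azure AD Connect:

```powershell
# Run on the Azure AD Connect server
Import-Module ADSync

# Delta sync - pushes only the changes made since the last run
Start-ADSyncSyncCycle -PolicyType Delta

# Use a full (initial) sync instead if you have changed the sync configuration
# Start-ADSyncSyncCycle -PolicyType Initial
```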

Provision Mailboxes

Now that everything has been done, open the Office 365 Admin Center. Under Users\Active Users you will see all the on-premises users that have been synchronised.

Click each user, and in the next screen click Edit beside Product licenses. Select the location of the user and the combination of license options you want to assign. Ensure you select at least Exchange Online (Plan 2), as this is needed to provision a user mailbox. Click Save.
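Licensing can also be scripted. The sketch below uses the MSOnline module; the user and SKU names are placeholders, and SKU identifiers vary by tenant (EXCHANGEENTERPRISE usually corresponds to Exchange Online Plan 2, but verify with Get-MsolAccountSku in your own tenant):

```powershell
Import-Module MSOnline
Connect-MsolService                         # sign in with a tenant admin account

Get-MsolAccountSku                          # list the SKUs available in your tenant

$upn = "tom.jones@contoso.com"              # placeholder user
$sku = "contoso:EXCHANGEENTERPRISE"         # placeholder SKU - use your own

# A usage location must be set before a license can be assigned
Set-MsolUser -UserPrincipalName $upn -UsageLocation "AU"
Set-MsolUserLicense -UserPrincipalName $upn -AddLicenses $sku
```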

As soon as you assign the Exchange Online (Plan 2) license, the mailbox provisioning starts. This shouldn’t take more than 10 minutes to finish. You can check the progress by clicking the user in Office 365 Admin Center and then Mail Settings at the bottom of the screen. Once the mailbox has been successfully provisioned, the We are preparing a mailbox for this user message will disappear and instead details about the user mailbox will be shown.

Once the mailbox has been provisioned, open the Exchange Admin Center and then click on recipients from the left menu. In the right hand side screen, click mailboxes. This will show you details about the mailboxes that have been provisioned so far. The newly created user mailbox should be listed there as well.

That’s it folks! You have successfully created an Exchange Online mailbox that is attached to your on-premises Active Directory user object.

Any changes to the Office 365 object (display name, etc.) will have to be made via the on-premises Active Directory. These changes will be synchronised to Azure AD every 30 minutes and reflected in the Exchange Online mailbox.

If you try to modify any of the attributes via the Office 365 or Exchange Online Admin Center, you will receive the following error:

The action '<cmdlet>', '<property>', can't be performed on the object '<name>' because the object is being synchronized from your on-premises organisation.

Some Additional Information

Please note that the following is not supported by Microsoft.

There are times when you need to attach additional email aliases to a user mailbox. To do this, follow the steps below:

  1. Open Active Directory Users and Computers in your on-premises Active Directory
  2. In the top menu, click View and then select Advanced Features
  3. Navigate to the user object that you want to add additional email aliases to and double click to open its settings
  4. From the tabs click on Attribute Editor
  5. Under Attributes locate proxyAddresses and click on Edit (or double click) to open it
  6. In the values field, first enter the current email address, prefixed with SMTP: (ensure this prefix is in upper case).
  7. Then add all the email aliases that you want to assign to this user, each prefixed with a lower-case smtp:. The email domain for the aliases has to be a domain that is already added and verified in the Office 365 Admin Center.
  8. If you need to change the reply-to (primary SMTP) address for the user, remove the value that currently has the upper-case SMTP: prefix and re-add it with a lower-case smtp: prefix. Then remove the alias that you want to assign as the reply-to (primary SMTP) address and re-add it with an upper-case SMTP: prefix.
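The same proxyAddresses changes can be scripted with Set-ADUser, which avoids typos in the Attribute Editor. In this sketch the username and addresses are placeholders; remember that only the primary address carries the upper-case SMTP: prefix:

```powershell
# Sketch only - assumes the ActiveDirectory module; names/addresses are placeholders
Import-Module ActiveDirectory

$user = "tjones"

# Add the current primary address (SMTP:) and an extra alias (smtp:)
Set-ADUser -Identity $user -Add @{
    proxyAddresses = "SMTP:tom.jones@contoso.com", "smtp:tjones@contoso.com"
}

# To change the reply-to address, remove both values and re-add them
# with the SMTP:/smtp: prefixes swapped
Set-ADUser -Identity $user -Remove @{
    proxyAddresses = "SMTP:tom.jones@contoso.com", "smtp:tjones@contoso.com"
}
Set-ADUser -Identity $user -Add @{
    proxyAddresses = "smtp:tom.jones@contoso.com", "SMTP:tjones@contoso.com"
}
```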

ADUser_ProxyAddresses

I hope this blog helps out those who want to use the Cloud-only deployment approach to Office 365 instead of the Hybrid approach.

Have a great day 😉

Integrating Yammer data within SharePoint web-part using REST API

Background

We were developing a SharePoint application for one of our clients and had some web parts that needed to retrieve data from Yammer. As we were developing on SharePoint Online (SPO) using the SharePoint Framework (SPFx), for most of our engagement we were developing with a client-side library named React to deliver what was required of us.

To integrate the client’s Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.

Scenario

We had around 7-8 different calls to the Yammer API in different web parts to extract data from Yammer on behalf of a logged-in user. For each API call, the user has to be authenticated before the call is made, and this has to happen without the user being redirected to Yammer to log in, or being presented with a popup or a button to log in first.

If you simply follow Yammer’s JavaScript SDK instructions, you will not meet our client’s requirement of not sending the user to Yammer first (as this would change their user flow) and not showing a login/sign-in pop-up dialog.

Approach

After looking on the internet to fulfil the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app. In our case this wasn’t sufficient, as many users would be accessing the SharePoint home page for the first time and had never accessed Yammer before.

What we did was split our API login calls into two groups. One randomly chosen call logs the user in to Yammer in the background, obtains an access token and caches it with the Yammer API; the other API login calls wait for that first login to complete and then use the Yammer API to log in.

Step-1

This function uses the standard Yammer API to check the login status. If successful, it proceeds with issuing the API data-retrieval calls; if the first check fails, it waits and checks again every 2 seconds until it times out after 30 seconds.

  public static loginToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((firstLoginStatusResponse) => {
        if (firstLoginStatusResponse.authResponse) {
          callback(yam);
        }
        else {
          // poll the login status every 2 seconds
          let timerId = setInterval(() => {
            yam.getLoginStatus((secondLoginStatusResponse) => {
              if (secondLoginStatusResponse.authResponse) {
                clearInterval(timerId);
                callback(yam);
              }
            });
          }, 2000);

          // stop polling after 30 seconds, whether or not the login succeeded
          setTimeout(() => {
            clearInterval(timerId);
            yam.getLoginStatus((timeoutLoginStatusResponse) => {
              if (!timeoutLoginStatusResponse.authResponse) {
                console.error("iFrame - user could not log in to Yammer even after waiting");
              }
            });
          }, 30000);
        }
      });
    });
  }

Step-2

This method again uses the standard Yammer API to check the login status. It then tries to log the user in in the background using the iframe approach called out in the PnP sample; if that approach doesn’t work either, it redirects the user, in the same window, to the Smart URL to obtain consent for the Yammer app, with the redirect URI set to the home page of the SharePoint site where the web parts using the Yammer API are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        }
        else if (requestLogin) {
          this._iframeAuthentication()
              .then((res) => {
                callback(yam);
              })
              .catch((e) => {
                window.location.href="https://www.yammer.com/[your-yammer-network-name]/oauth2/authorize?client_id=[your-yammer-app-client-id]&response_type=token&redirect_uri=[your-sharepoint-home-page-url]";
                console.error("iFrame - user could not log in to Yammer due to error. " + e);
              });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }

The function _iframeAuthentication is copied from the PnP sample, with some modifications to fit the client requirements we were developing against.


  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");
        document.body.appendChild(element);

        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.platform.setAuthToken(token);
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(res);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

        let url: string = `https://www.yammer.com/${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
      });
    }

Conclusion

This resulted in authenticating the Office 365 tenant user within the same window as the SharePoint home page, either with the help of an iframe (where the user had consented to the Yammer app before) or by obtaining Yammer app consent from the user without redirecting them to Yammer for OAuth-based authentication (where the user is accessing the Yammer-integrated web parts for the first time).

We hope future releases of the Yammer API will cater for seamless integration among Office 365 products, without the hassle of acquiring access tokens in the way described in this post.

Sharing a report using a Power BI app

History

You have created reports and built dashboards in Power BI Desktop to surface your data from multiple data sources; now it is time to share those dashboards with a wider audience in your organisation, and you are looking for how to do it. The Power BI service offers the powerful feature of Power BI apps to cater for such scenarios.

If you have not yet created reports, or have not set up a gateway for leveraging your on-premises data, please follow my earlier posts Setup a Power BI Gateway and Create reports using a Power BI Gateway to do so.

Approach

Sharing and collaborating in the Power BI service is a three-step process, and each step is explained in this blog post. At a surface level, the tasks are as follows:

  1. Creation of an App Workspace
  2. Publishing reports to an App Workspace
  3. Publishing a Power BI App

A typical usage scenario for Power BI apps in Office 365 services is depicted below:

1) Create an App Workspace

An App Workspace is a new concept introduced in Power BI with which you can collaborate on datasets, reports and dashboards (authored by members), and build/package Power BI apps to be distributed to your wider audience.

  1. Log in to your Power BI service https://app.powerbi.com and click on your Workspace list menu on the left

    If this is your first-time login, you need to create a new app workspace (it’s just a new name for group workspaces).

  2. Fill in the form in your Office 365 Power BI service to create the workspace; a unique name is required for each app workspace
  3. Whilst creating the workspace, you need to set the privacy, which can’t be changed later – so please decide carefully.
  4. You also need to set permission levels for your workspace accordingly; please only add members who can edit content, as viewers can be added later when publishing your Power BI app.

  5. The next step is to add users to it and set admins for the workspace. (The default role is Member; change it to Owner for users you intend to give administrator permissions.) Note: you can only add individual users to this list; security group and modern group support was not yet available at the time of writing this post.
  6. Upon reaching this step, your app workspace has been created successfully and is ready for use.

2) Publishing Reports to an App Workspace

A Power BI app workspace is a collaboration tool; any member can create a model using their Power BI Desktop and then publish it to a workspace so members can take advantage of existing datasets, reports and dashboards. Follow the steps listed below to share your model in an app workspace.

  1. Open the Power BI Desktop file (*.pbix) you created earlier and hit the Publish button
  2. Select the app workspace you want to publish your reports to; Power BI Desktop will then start publishing the reports to your Office 365 Power BI service
  3. Subsequent publishing to the same app workspace will remind you if your dataset already exists.
  4. Depending on the size of your data and your internet speed, it may take some time to publish reports to the Power BI service. Sooner or later you will receive a success message
  5. Upon reaching this step your reports, datasets and dashboards are published and available in your Power BI service.

3) Publishing Power BI App

  1. Log in to your Power BI service, go to your app workspaces list and select your newly created workspace from the list
  2. At the top right, you will see a button to publish an app
  3. Provide a description for the app in the ‘Details’ tab, as your Power BI app will get the same name as your app workspace
  4. In the next tab, ‘Content’, you will see a list of all the content within the app workspace that will be published within this app. In this step, you can set the landing page of the Power BI app, which users will see when they click on it. I have selected a specific dashboard to be shown
  5. You will then need to set the audience for your app in the ‘Access’ tab; it can be either the whole organisation or a combination of users and groups. In the top right corner, it will show you how many artefacts will be published within this Power BI app.
  6. Once you publish it, the Power BI service will advise you of the URL of your app as shown below:

AppSource and Power BI

Power BI users intending to use apps shared by other users or organisations must first get the apps in order to use the dashboards and reports in them.

  1. Go to the ‘Apps’ menu in the Power BI service (in the left menu)
  2. Selecting Apps from the menu lists the apps you are subscribed to; if you are using it for the first time the list is usually empty, and you need to click ‘Get apps’ to get Power BI apps from the AppSource store
  3. You can then select which apps you want to subscribe to from the list; they are listed by category

Behind the Scenes

The moment a user creates an app workspace, an Office 365 group is created in the background with the same name as the app workspace, and its users are maintained as Office 365 group users.

  • Admins of the workspace will become Owners of the group
  • Members of the workspace will become Members of the group

A SharePoint site will be created as well, with the same members as the Power BI app workspace and Office 365 group.

You can see the details of the users (admins/members) by checking the ‘Site permissions’ menu under site settings.

Create reports using a Power BI Gateway

Background

Once you have a Power BI gateway set up to ensure data flows from your on-premises data sources to the Power BI service in the cloud, the next step is to use Power BI Desktop to build reports with data from multiple on-premises data sources.

Note: If you don’t have a gateway set up already, please follow my earlier post to set one up before you continue reading this post.

Scenario

All on-premises data is stored in SQL Server instances and spread across a few data warehouses and multiple databases built and managed by your internal IT teams.

Before building reports, you need to ensure the following key points:

  1. Each data source has connectivity to your gateway with minimal latency
  2. Every data source intended to be used within reports is configured within the gateway in the Power BI service
  3. The list of people who can publish reports using each data source is configured against it

An interaction between on-premises data sources and cloud services is depicted below:

Pre-requisites

Before you build reports, you need to setup on-premises data sources in the gateway to ensure Power BI service knows which data sources are allowed by gateway administrator to pull data from on-premises sources.

Log in to https://app.powerbi.com with your Power BI service administrator credentials.

  1. Click Manage gateways to modify settings
  2. You will see a screen with the gateway options that you set up earlier while configuring the gateway on-premises
  3. The next step is to set up gateway administrators, who will have permission to set up on-premises data sources as and when required
  4. After the gateway configuration, add the data sources one by one so published reports can use the on-premises data sources (pre-configured within the gateway)
  5. Against each data source within the gateway, set up the users who can use that data source to pull data from on-premises sources in their published reports
  6. Repeat the above steps for each of your on-premises data sources, selecting the appropriate data source type and allowing the users who can use them while building reports

Reports

Upon reaching this step, you are all good to create reports.

  1. Open Power BI Desktop
  2. Select the sources you want to retrieve data from
  3. While creating reports, ensure the data source details are the same as those configured in the Power BI service when you set up the data sources.
  4. Great! Once you publish the reports to your Power BI service, your gateway will be able to connect to the relevant on-premises data sources if you have followed the steps above.

 

Where’s the source!

In this post I will talk about data (aka the source)! In IAM there’s really one simple concept that is often misunderstood or ignored: the data going out of any IAM solution is only as good as the data going in. This may seem simple enough, but if not enough attention is paid to the data source and data quality, then the results are going to be unfavourable at best and catastrophic at worst.
With most IAM solutions, data is going to come from multiple sources. Most IAM professionals will agree that the best place to source the majority of your user data is the HR system. Why? Simply put, it’s where all the important information about the individual is stored and, for the most part, kept up to date. For example, if you were to change positions within the same company, the HR systems are going to be updated to reflect the change to your job title, as well as any direct-report changes which may come as a result.
I also said that data can, and normally will, come from multiple sources. A typical example: generally speaking, temporary and contract staff will not be managed within the central HR system – the HR team, simply put, don’t care about contractors. So where do they come from, and how are they managed? For smaller organisations this is usually something that’s done manually in AD with no real governance in place. For larger organisations this is less than ideal, can be a nightmare for the IT team to manage, and can create quite a large security risk to the business, so a primary data source for contractors becomes necessary. What this is is entirely up to the business and what works for them; I have seen a standard SQL web application being used to populate a database, I’ve seen ITSM tools being used, and, less commonly, the IAM system itself being used to manage contractor accounts (within MIM 2016 this is through the MIM Portal).
There are many other examples of how different corporate applications can be used to augment the identity information of your user data, such as email and phone systems and, to a lesser extent, physical security systems (building access and datacentre access), but we will try to keep it simple for the purpose of this post. The following diagram helps illustrate the data flow for the different user types.

IAM Diagram

What you will notice from the diagram above is that even though an organisation will have data coming from multiple systems, it all comes together and is stored in a central repository, or an “Identity Vault”. This keeps an accurate record of the information coming from multiple sources to compile the user’s complete identity profile. From this we can then start to manage what information flows to downstream systems when provisioning accounts, and we can also ensure that if any information changes, it is updated in the user’s profile in any attached system managed through the enterprise IAM services.
In my next post I will go into the finer details of the central repository, or the “Identity Vault”.

So, in summary, the source of data is very important in defining an IAM solution; it ensures you have the right data being distributed to any managed downstream systems, regardless of what type of user base you have. In my next post we will dig into the central repository, or Identity Vault. That post will cover how we set precedence for data from specific systems, so that if there is a difference in the data coming from the different sources, only the highest-precedence value is applied. We will also discuss how we augment the data sets to ensure we only collect the information necessary for the management of that user and the applications they use within your business.

As per usual, if you have any comments or questions on this post of any of my previous posts then please feel free to comment or reach out to me directly.

Setup a Power BI Gateway

Scenario

So, you have explored Power BI (free) and want to start some action in the cloud. Suddenly you realise that your data is stored in an on-premises SQL data source, but you still want to get insights up in the cloud and share them with your senior business management.

Solution

Microsoft’s on-premises data gateway is a bridge that can securely transfer your data to Power BI service from your on-premises data source.

Assumptions

  • Power BI Pro licenses have been procured already for the required number of users (this is a MUST)
  • Users are already part of Azure AD and can sign in to Power BI service as part of Office 365 offering

Pre-requisites

You can build and set up a machine to act as a gateway between your Azure cloud service and on-premises data sources. The following are the pre-requisites for that machine:

1) Server Requirements

Minimum Requirements:
  • .NET 4.5 Framework
  • 64-bit version of Windows 7 / Windows Server 2008 R2 (or later)

Recommended:

  • 8 Core CPU
  • 8 GB Memory
  • 64-bit version of Windows 2012 R2 (or later)

Considerations:

  • The gateway is supported only on 64-bit Windows operating systems.
  • You can’t install the gateway on a domain controller
  • Only one gateway can be installed on a single machine
  • Your gateway should not be turned off, disconnected from the Internet, or on a power plan that lets it sleep – in all cases, it should be ‘always on’
  • Avoid wireless network connection, always use a LAN connection
  • It is recommended to have the gateway as close to the data source as possible to avoid network latency. The gateway requires connectivity to the data source.
  • It is recommended to have good throughput for your network connection.

Notes:

  • Once a gateway is configured, if you need to change its name you will need to reinstall and configure a new gateway.

2) Service Account

If your company/client is using a proxy server and your gateway does not have a direct connection to the Internet, you may need to configure a Windows service account for authentication purposes and change the default log on credential (NT SERVICE\PBIEgwService) to a service account of your choice with the ‘Log on as a service’ right.

3) Ports

The gateway creates an outbound connection to Azure Service Bus and does not require inbound ports for communication. You will, however, need to whitelist the IP addresses listed in the Azure Datacentre IP List.

Installation

Once you have covered the pre-requisites listed in the previous section, you can proceed to the gateway installation.

  1. Log in to Power BI with your organisation credentials and download the data gateway setup
  2. While installing, you need to select the highlighted option so a single gateway can be shared among multiple users.
  3. As listed in the pre-requisites section, if your network has a proxy requirement you can change the service account for the following Windows service:

  4. You will notice the gateway is installed on your server
  5. Once you open the gateway application, you will see a success message

Configuration

After installation, you need to configure the gateway to be used within your organisation.

  1. Open the gateway application and sign in with your credentials
  2. You need to set a name and a recovery key for the gateway, which can be used later to reconfigure/restore the gateway
  3. Once it is configured successfully, you will see a message window saying that it is ready to use

  4. Switch to the gateway’s Network tab and you will see its status as Connected – great!
  5. You are all set; the gateway is up and running, and you can start building reports that use data from your on-premises server via the gateway you just configured.

 

Decommissioning Exchange 2016 Server

I have created many labs over the years and never really spent the time to decommission my environment; I usually just blow it away and start again.

So I finally decided to go through the process and decommission my Exchange 2016 server in my lab environment.

My lab consisted of the following:

  • Domain Controller (Windows Server 2012 R2)
  • AAD Connect Server
  • Exchange 2016 Server/ Office 365 Hybrid
  • Office 365 tenant

Being a lab I only had one Exchange server which had the mailbox role configured and was also my hybrid server.

Proceed with the steps below to remove the Exchange Online configuration:

  1. Connect to Exchange Online using PowerShell (https://technet.microsoft.com/en-us/library/jj984289(v=exchg.160).aspx)
  2. Remove the Inbound connector
    Remove-InboundConnector -Identity "Inbound from [Inbound Connector ID]"
  3. Remove the Outbound connector
    Remove-OutboundConnector -Identity "Outbound to [Outbound Connector ID]"
  4. Remove the Office 365 federation
    Remove-OrganizationRelationship "O365 to On-premises - [Org Relationship ID]"
  5. Remove the OAuth configuration
    Remove-IntraOrganizationConnector -Identity "HybridIOC - [Intra Org Connector ID]"
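For step 1, the classic remote PowerShell connection to Exchange Online looks like this:

```powershell
# Connect to Exchange Online via remote PowerShell
$creds = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
    -Credential $creds -Authentication Basic -AllowRedirection
Import-PSSession $session -AllowClobber
```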

Now connect to your Exchange 2016 server and conduct the following:

  1. Open Exchange Management Shell
  2. Remove the Hybrid Config
    Remove-HybridConfig
  3. Remove the federation configuration
    Remove-OrganizationRelationship "On-premises to O365 - [Org Relationship ID]"
  4. Remove OAuth configuration
    Remove-IntraorganizationConnector -Identity "HybridIOC - [Intra Org Connector ID]"
  5. Remove all mailboxes, mailbox databases, public folders etc.
  6. Uninstall Exchange (either via PowerShell or programs and features)
  7. Decommission server

I also confirmed Exchange was removed from Active Directory and I was able to run another installation of Exchange on the same Active Directory with no conflicts or issues.

Exchange 2010 Hybrid Auto Mapping Shared Mailboxes

Migrating shared mailboxes to Office 365 is one of those things that is starting to become easier over time, especially with full access permissions now working cross premises.

One little discovery that I thought I would share is that if you have an Exchange 2010 hybrid configuration, the auto mapping feature will not work cross premises. (Exchange 2013 and above, you are ok and have nothing to worry about).

This means if an on-premises user has access to a shared mailbox that you migrate to Office 365, it will disappear from their Outlook even though they still have full access.

Unless you migrate the shared mailbox and user at the same time, the only option is to manually add the shared mailbox back into Outlook.

Something to keep in mind especially if your user base accesses multiple shared mailboxes at a time.
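Because migrating the shared mailbox and its users at the same time is the cleanest option, it can be useful to list who has full access to a shared mailbox before migrating it, so those users can be moved in the same batch. A sketch, run from the on-premises Exchange Management Shell ("shared1" is an example identity):

```powershell
# List users with explicit full access to the shared mailbox,
# excluding inherited and system (NT AUTHORITY\SELF) entries.
Get-MailboxPermission -Identity shared1 |
    Where-Object { $_.AccessRights -contains 'FullAccess' -and
                   -not $_.IsInherited -and
                   $_.User -notlike 'NT AUTHORITY\*' } |
    Select-Object User, AccessRights
```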

Try/Catch works in PowerShell ISE and not in PowerShell console

I recently encountered an issue with one of my PowerShell scripts. It was a script to enable litigation hold on all mailboxes in Exchange Online.

I connected to Exchange Online via the usual means below.

$creds = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $creds -Authentication Basic -AllowRedirection
Import-PSSession $session -AllowClobber

I then attempted to execute the following with no success.

try
{
    Set-Mailbox -Identity $user.UserPrincipalName -LitigationHoldEnabled $true -ErrorAction Stop
}
catch
{
    Write-Host "ERROR!" -ForegroundColor Red
}

As a test I removed the “-ErrorAction Stop” switch and then added the following line to the top of my script.

$ErrorActionPreference = 'Stop'

That’s when I noticed it would work in the Windows PowerShell ISE but not in the Windows PowerShell console.

After many hours of troubleshooting I then discovered it was related to the implicit remote session established when connecting to Exchange Online.

To get my try/catch to work correctly I added the following line to the top of my script and all was well again.

$Global:ErrorActionPreference = 'Stop'
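Putting it together, the pattern that worked in both the ISE and the console looks roughly like this. This is a sketch based on the script described above; $users is assumed to be populated earlier (for example from Get-Mailbox):

```powershell
# Force terminating errors globally so try/catch also fires inside
# the proxy functions created by the implicit remoting session.
$Global:ErrorActionPreference = 'Stop'

foreach ($user in $users)
{
    try
    {
        Set-Mailbox -Identity $user.UserPrincipalName -LitigationHoldEnabled $true
    }
    catch
    {
        Write-Host "Failed for $($user.UserPrincipalName): $_" -ForegroundColor Red
    }
}
```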

Send mail to Office 365 via an Exchange Server hosted in Azure

Those of you who have attempted to send mail to Office 365 from Azure know that sending outbound mail directly from an email server hosted in Azure is not supported, due to the elastic nature of public cloud service IPs and the potential for abuse. As a result, the Azure IP address blocks are added to public block lists, with no exceptions to this policy.

To be able to send mail from an Azure-hosted email server to Office 365, you need to send mail via an SMTP relay. There are a number of different SMTP relays you can utilise, including Exchange Online Protection; more information can be found here: https://blogs.msdn.microsoft.com/mast/2016/04/04/sending-e-mail-from-azure-compute-resource-to-external-domains

To configure Exchange Server 2016 hosted in Azure to send mail to Office 365 via SMTP relay to Exchange Online Protection, you need to do the following:

  1. Create a connector in your Office 365 tenant
  2. Configure accepted domains on your Exchange Server in Azure
  3. Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

Create a connector in your Office 365 tenant

  1. Login to Exchange Online Admin Center
  2. Click mail flow | connector
  3. Click +
  4. Select from: “Your organisation’s email server” to: “Office 365”
  5. Enter a name for the connector | Click Next
  6. Select “By verifying that the IP address of the sending server matches one of these IP addresses that belong to your organization”
  7. Add the public IP address of your Exchange Server in Azure

Configure accepted domains on your Exchange Server in Azure

  1. Open Exchange Management Shell
  2. Execute the following PowerShell command for each domain you want to send mail to in Office 365:
  3. New-AcceptedDomain -DomainName Contoso.com -DomainType InternalRelay -Name Contoso

Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

  1. Execute the following PowerShell command:
  2. New-SendConnector -Name "My company to Office 365" -AddressSpaces * -CloudServicesMailEnabled $true -RequireTLS $true -SmartHosts yourdomain-com.mail.protection.outlook.com -TlsAuthLevel CertificateValidation
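Once the send connector is in place, mail flow can be verified from the Azure-hosted server itself. A hedged sketch using Send-MailMessage; the addresses are examples, and it assumes the server's default receive connector accepts the local submission:

```powershell
# Send a test message through the new connector; it should relay
# via the Exchange Online Protection smart host to Office 365.
Send-MailMessage -SmtpServer localhost `
    -From "test@yourdomain.com" `
    -To "tom.jones@yourdomain.com" `
    -Subject "Relay test" `
    -Body "Test message relayed via Exchange Online Protection"
```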