ADFS Service Communication Certificate Renewal Steps

Hi guys, the ADFS service comprises several certificates, each serving a different purpose for the federation service. In this blog post I will give a brief description of these certificates and their purposes, and discuss the renewal process for the service communication certificate.

 

Types of ADFS certificates and their purpose

 

Certificate Type | Description | Purpose
Service communication certificate | Standard Secure Sockets Layer (SSL) certificate that is used for securing communications between federation servers, clients, Web Application Proxy, and federation server proxy computers. | Ensures the identity of a remote computer; proves your identity to a remote computer
Encryption certificate | Standard X.509 certificate that is used to decrypt tokens sent to the federation service. | Token decryption
Signing certificate | Standard X.509 certificate that is used for securely signing all tokens. | Token signing

 

 

Renewal Steps

Service Communication certificate

This certificate is very similar to an IIS certificate used to secure a website. It is generally issued by a trusted certificate authority (CA) and can be either a SAN or a wildcard certificate. This certificate is installed on all ADFS servers in the farm, and the update procedure should be performed on the primary ADFS server. Below is the list of steps involved in renewal.

 

  1. Generate a CSR from the primary ADFS server. This can be done via IIS.
  2. Once the certificate is issued, add the new certificate to the certificate store.
  3. Verify the private key on the certificate. Make sure the new certificate has its private key.
  4. Assign permissions to the private key for the ADFS service account. Right-click the certificate, click Manage Private Keys, add the ADFS service account and assign permissions as shown in the screenshot below.

 


  5. From the ADFS console select “Set Service Communication Certificate”.
  6. Select the new certificate from the prompted list of certificates.
  7. Run Get-AdfsSslCertificate and make a note of the thumbprint of the new certificate.
  8. If it’s unclear which certificate is new, open the MMC snap-in, locate the new certificate and scroll down the list of properties to see the thumbprint.
  9. Run Set-AdfsSslCertificate with the new thumbprint, as sketched below.
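A minimal sketch of this step, assuming Set-AdfsSslCertificate is the cmdlet used to bind the renewed certificate (the thumbprint below is a placeholder for the value noted earlier):

  # Bind the new SSL certificate to the ADFS service (placeholder thumbprint)
  Set-AdfsSslCertificate -Thumbprint "0123456789ABCDEF0123456789ABCDEF01234567"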

 

  10. Restart the ADFS service.
  11. Copy and import the new certificate to the Web Application Proxy server(s).
  12. On each WAP server run the following cmdlet, as sketched below.
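A sketch of the WAP side, assuming Set-WebApplicationProxySslCertificate is the cmdlet intended (placeholder thumbprint again):

  # Update the SSL certificate used by the Web Application Proxy (placeholder thumbprint)
  Set-WebApplicationProxySslCertificate -Thumbprint "0123456789ABCDEF0123456789ABCDEF01234567"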

 

That’s it, you are all done. You can verify that the new certificate has been assigned to the ADFS service by running Get-AdfsSslCertificate. Another verification step is to open a browser and navigate to the federation page, where you should be able to see the new certificate. I will discuss the encryption and signing certificate renewal process in upcoming posts.

 

 

Exchange Online – MAPI over HTTP Transition

Microsoft has announced that from 31st October 2017, Outlook clients using the RPC over HTTP protocol to connect to Office 365 will no longer be supported; only MAPI over HTTP clients will work from then on. This announcement has left many administrators thinking: what exactly does that mean for my organization? What actions are required to avoid any business impact? Is it time to update Outlook clients, and up to what level? And last but not least, how can I verify that all necessary steps have been taken to ensure business as usual? Let’s try to answer these questions one by one.

So what does this announcement mean for an organization? In simple words, any Outlook client which still uses RPC over HTTP to connect to Office 365 will be retired and hence will need to be updated where possible. This means that Outlook 2007 and earlier versions will no longer be able to connect to Exchange Online. So, this requires the following actions from Office 365 administrators.

  1. Update Outlook 2007 or earlier versions of Outlook to the latest Outlook version.
  2. For Outlook 2010 and higher, the minimum required updates are the following:

Office version | Update | Build number
Office 2016 | The December 8, 2015 update | Subscription: 16.0.6568.20xx; MSI: 16.0.4312.1001
Office 2013 | Office 2013 Service Pack 1 (SP1) and the December 8, 2015 update | 15.0.4779.1002
Office 2010 | Office 2010 Service Pack 2 (SP2) and the December 8, 2015 update | 14.0.7164.5002

Note: The December 8, 2015, updates for Office are listed in Microsoft Knowledge Base article 3121650, “December 8, 2015, update for Office”. It is recommended that you keep Outlook clients updated with the most recent product updates, as several MAPI over HTTP issues have been fixed since December 2015.

Additionally, you may have to make sure that Outlook clients aren’t using a registry key to disable MAPI over HTTP. For more information, see Microsoft Knowledge Base article 2937684, “Outlook 2013 or 2016 may not connect using MAPI over HTTP as expected”.

Now, while you make all efforts to meet the deadline and take all necessary steps to update your environment, you also need assurance that you have completed the job. A simple report from Office 365 about the Outlook clients connecting to your tenant should do the job. Let’s get this report by following the steps below.

To retrieve this information, enable owner access auditing for each mailbox, and then query the audit log for the Outlook version that’s used to log on to the mailbox. To do this, follow these steps:

  1. Connect to Exchange Online using remote PowerShell.
  2. Enable mailbox auditing for the owner. To do this, run one of the following commands:
    • For one mailbox:
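A sketch, assuming the Exchange Online session from step 1 and a placeholder mailbox identity:

    Set-Mailbox -Identity user@contoso.com -AuditEnabled $true -AuditOwner MailboxLogin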

    • For all mailboxes:
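A similar sketch covering all user mailboxes:

    Get-Mailbox -ResultSize Unlimited -Filter { RecipientTypeDetails -eq "UserMailbox" } |
        Set-Mailbox -AuditEnabled $true -AuditOwner MailboxLogin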

Note: Mailbox auditing may take up to 24 hours to get enabled.

  3. Search the audit log. To do this, run one of the following commands:
    • For one mailbox:
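A sketch that pulls the owner logons and keeps the Outlook connection details; the mailbox identity and date range are placeholders, and ClientInfoString is where the client version shows up:

    Search-MailboxAuditLog -Identity user@contoso.com -LogonTypes Owner -ShowDetails `
        -StartDate (Get-Date).AddDays(-30) -EndDate (Get-Date) |
        Where-Object { $_.ClientInfoString -like "*Outlook*" } |
        Select-Object MailboxOwnerUPN, LastAccessed, ClientInfoString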

    • For all mailboxes, exporting the results to a .csv file:
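A sketch that loops over every mailbox and exports the results (the .csv path is a placeholder):

    Get-Mailbox -ResultSize Unlimited | ForEach-Object {
        Search-MailboxAuditLog -Identity $_.UserPrincipalName -LogonTypes Owner -ShowDetails |
            Where-Object { $_.ClientInfoString -like "*Outlook*" }
    } | Select-Object MailboxOwnerUPN, LastAccessed, ClientInfoString |
        Export-Csv -Path .\OutlookClientVersions.csv -NoTypeInformation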

The above PowerShell commands will produce a comprehensive report which you can use as a guideline to ensure that all your clients are ready for the switch to MAPI over HTTP. Here is a sample output.

Integrating Yammer data within SharePoint web-part using REST API

Background

We were developing a SharePoint application for one of our clients and had some web parts that had to retrieve data from Yammer. As we were developing on SharePoint Online (SPO) using the SharePoint Framework (SPFx), for most of our engagement we were developing with a client-side library named React to deliver what was required of us.

In order to integrate the client’s Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.

Scenario

We had around 7-8 different calls to the Yammer API across different web parts to extract data from Yammer on behalf of a logged-in user. For each API call, the user has to be authenticated before the call to the Yammer API is made, and this has to be done without the user being redirected to Yammer for login or being presented with a popup or a button to log in first.

If you follow Yammer’s JavaScript SDK instructions, you will not meet our client’s requirement of not sending the user to Yammer first (as this would change their user flow) or presenting a pop-up with a login/sign-in dialog.

Approach

After looking on the internet for something to fulfil the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app. In our case, this wasn’t possible, as many users would be accessing the SharePoint home page for the first time and had never accessed Yammer before.

What we did was split our API login calls into two groups. One of the calls was chosen to log the user in to Yammer, get an access token in the background and cache it with the Yammer API, while the other API login calls wait for the first login to complete and then use the Yammer API to log in.

Step-1

This function uses the standard Yammer API to check login status. If successful, it proceeds with issuing the API data-retrieval calls; but if the user could not be logged in the first time, it waits and checks again every 2 seconds until it times out after 30 seconds.

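  // Assumes: import { SPComponentLoader } from '@microsoft/sp-loader';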
  public static loginToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((FirstloginStatusResponse) => {
        if (FirstloginStatusResponse.authResponse) {
          callback(yam);
        }
        else {
          let timerId = setInterval(()=>{
              yam.getLoginStatus((SecondloginStatusResponse) => {
                if (SecondloginStatusResponse.authResponse) {
                  clearInterval(timerId);
                  callback(yam);
                }
              });
          }, 2000);

          setTimeout(() => {
              yam.getLoginStatus((TimeOutloginStatusResponse) => {
                if (TimeOutloginStatusResponse.authResponse) {
                  clearInterval(timerId);
                }
                else {
                  console.error("iFrame - user could not log in to Yammer even after waiting");
                }
              });
          }, 30000);
        }
      });
    });
  }

Step-2

This method again uses the standard Yammer API to check login status; it then tries to log the user in in the background using the iframe approach called out in the PnP sample. If that approach doesn’t work either, it redirects the user to the Smart URL in the same window to get user consent for the Yammer app, with the redirect URI set to the home page of your SharePoint site where the web parts using the Yammer API are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        }
        else if (requestLogin) {
          this._iframeAuthentication()
              .then((res) => {
                callback(yam);
              })
              .catch((e) => {
                window.location.href="https://www.yammer.com/[your-yammer-network-name]/oauth2/authorize?client_id=[your-yammer-app-client-id]&response_type=token&redirect_uri=[your-sharepoint-home-page-url]";
                console.error("iFrame - user could not log in to Yammer due to error. " + e);
              });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }

The function _iframeAuthentication is copied from the PnP sample, with some modifications to fit the client requirements we were developing against.


  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");
        document.body.appendChild(element);

        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.platform.setAuthToken(token);
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(res);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

        let url: string = `https://www.yammer.com/${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
      });
    }

Conclusion

This resulted in authenticating the Office 365 tenant user within the same window as the SharePoint home page, either with the help of an iframe (case: the user had consented to the Yammer app before) or by getting Yammer app consent from the Office 365 tenant user without being redirected to Yammer for OAuth-based authentication (case: the user is accessing the Yammer-integrated web parts for the first time).

We do hope future releases of the Yammer API will cater for seamless integration among O365 products without the hassle of getting access tokens in the way described in this post.

What’s a DEA?

In my last post I made a reference to a “Data Exchange Agreement”, or DEA, and I’ve since been asked about it a couple of times. So I thought it would be worthwhile writing a post about what it is and why it’s of value to you and to your business.

So what’s a DEA? Well, in simple terms it’s exactly what the name states: an agreement that defines the parameters by which data is exchanged between Service A and Service B, Service A being the producer of attributes X and Service B the consumer. Now, I’ve intentionally used a vague example here, as a DEA is used among many services in business and/or government and is not specifically related to IT or IAM services. But if your business adopts a controlled data governance process, it can play a pivotal role in how IAM services are implemented and adopted throughout the entire enterprise.

So what does a DEA look like? Well, in an IAM service it’s quite simple: you specify your “Source” and your “Target” services. An example could be the following:

Source
  • ServiceNow
  • AurionHR
  • PROD Active Directory
  • Microsoft Exchange

Target
  • PROD Active Directory
  • Resource Active Directory Domain
  • Microsoft Online Services (Office 365)
  • ServiceNow

As you can see, this only tells you where the data is coming from and where it’s going; it doesn’t go into any of the details around what data is being transported and in which direction. A separate section in the DEA details this, and an example is provided below:

MIM Attribute | Flow | ServiceNow Attribute | Source | User Types | Notes
accountName | –> | useraccountname | MIM | All |
employeeID | –> | employeeid | AurionHR | All |
employeeType | –> | employeetype | AurionHR | All |
mail | <– | email | Microsoft Exchange | All |
department | –> | department | AurionHR | All |
telephoneNumber | –> | phone | PROD AD | All |
o365SourceAnchor | –> | ImmutableID | Resource Domain | All |
employeeStatus | –> | status | AurionHR | All |
dateOfBirth | –> | dob | AurionHR | CORP Staff | yyyy-MM-dd
division | –> | region | AurionHR | CORP Staff |
firstName | –> | preferredName | AurionHR | CORP Staff |
jobTitle | –> | jobtitle | AurionHR | CORP Staff |
positionNumber | –> | positionNumber | AurionHR | CORP Staff |
legalGivenNames | <– | firstname | ServiceNow | Contractors |
locationCode | <– | location | ServiceNow | Contractors |
ManagerID | <– | manager | ServiceNow | Contractors |
personalTitle | <– | title | ServiceNow | Contractors |
sn | <– | sn | ServiceNow | Contractors |
department | <– | department | ServiceNow | Contractors |
employeeID | <– | employeeid | ServiceNow | Contractors |
employeeType | <– | employeetype | ServiceNow | Contractors |

This might seem like a lot of detail, but this is actually only a small section of what would be included in a DEA of this type, as the whole purpose of the agreement is to define which attributes are managed by which systems and flow to which target systems. As many IAM consultants can tell you, it would be substantially more than what’s provided in this example. And this is just an example for a single system; this is something that’s done for all applications that consume data related to your organisation’s staff members.

One thing that you might also notice is that I’ve highlighted two attributes in the sample above in bold. Why, you might ask? The point of including this was to highlight data sets that are considered “sensitive”; within the DEA you would classify these as sensitive data, with specific conditions around that data set. This is something your business would define and word appropriately, but it could be as simple as a section stating the following:

“Two attributes are classed as sensitive data in this list and cannot be reproduced, presented or distributed under any circumstances”

One challenge that is often confronted within any business is application owners wanting “ownership” of the data they consume. Utilising a DEA provides clarity over who owns the data and what your applications can do with the data they consume, removing any uncertainty.

To summarise: the point of this post wasn’t to provide you with a template or example DEA to use; it was to help explain what a DEA is, what it’s used for, and what its parts can look like. No two DEAs are the same, and providing you with a full example DEA would only make you end up recreating it from scratch anyway. But I do hope this helps you understand what is needed.

As with any of my posts if you have any questions please pop a comment or reach out to me directly.

 

Forecasting SharePoint Online storage requirements

Background

Office 365 users get quite a bit of storage in SharePoint Online for content, be it files, metadata, etc. But managing that storage, and forecasting when additional storage will have to be added, remains a challenge with the very limited analytics available in SharePoint Online. Since adding more storage costs money, buying it before you actually require it, or a bit too late, is not ideal.

SharePoint Online provides two ways to track storage: one from the admin center, and the other from within the site collection using storage metrics.

The SharePoint Online tenant admin center gives us a bar matrix with details about total storage, used storage and available storage. Here customers can manage the total amount of space allocated to each site collection, the total amount of space utilised and the space available.

To check storage metrics for a site collection, find the Storage Metrics link under Site Settings. Storage metrics provides the space-utilisation breakdown by each subsite, library and list in that site collection.

As shown in the image below, subsites, libraries, lists, etc. are listed with their size and % of parent. This can help in finding the resources which are consuming the most storage.

There are some third-party storage monitoring utilities available that connect to SharePoint Online to determine usage patterns and trends.

But if we need to forecast storage usage and find the site collections growing the most over a period of time, in a tenant with hundreds of site collections, it becomes a challenge to keep track.

Solution

We can use a PowerShell script to get the storage information from the tenant and store it in a SharePoint list on a periodic basis. Once this data is available, Power BI can be used on top of it to build a report which shows the growth % of each site collection over a period and gives some insight into the storage.

Using PowerShell we can get heaps of information from the SharePoint tenant with the Get-SPOSite command, but here we will use it to get the site collection list and storage information.

Get-SPOSite -Detailed -Limit All | select *

Then, once the information is retrieved from SPO, use PowerShell to iterate over the data and save the relevant fields to a SharePoint list, as sketched below.

SharePoint list columns: Site Name/Title, Site Url, Storage, Report Run Date.
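A minimal sketch of this step, assuming the PnP PowerShell module and a tracking list named "Storage Tracking"; the URLs and the internal column names are placeholders:

  # Connect to the SPO admin center (for Get-SPOSite) and to the site hosting the tracking list
  Connect-SPOService -Url https://tenant-admin.sharepoint.com
  Connect-PnPOnline -Url https://tenant.sharepoint.com/sites/reporting -UseWebLogin

  $runDate = Get-Date
  Get-SPOSite -Detailed -Limit All | ForEach-Object {
      # StorageUsageCurrent is reported in MB
      Add-PnPListItem -List "Storage Tracking" -Values @{
          "Title"         = $_.Title
          "SiteUrl"       = $_.Url
          "Storage"       = $_.StorageUsageCurrent
          "ReportRunDate" = $runDate
      }
  }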

Data in the SharePoint list can be filtered, grouped and sorted in list views to make it more usable for the business to analyse and predict storage growth and requirements.

In order to get detailed analytics on the data, use Power BI reports, which can help in putting the data together in the form of reports and predicting the growth percentage of each site collection. Power BI provides many connectors, one of which is for SharePoint Online, to easily connect with lists. To set up the connector in Power BI, check this Kloud blog.

Below is the Power BI report that was built on the SharePoint list data pulled in using PowerShell. Data in the report below is sorted by the sites with the maximum storage increase within a selected interval.

Using Power BI, more smarts can be added, like filters and graphs, to make the data more usable.

Report details

  • x-axis : List of site collections
  • y-axis : Storage values in MB

As we hover the cursor over the graph, it shows the max and min values for the storage over the selected period of time. The graph captures the sites in decreasing order of growth in storage.

Tabular view for the report.

The PowerShell script can be scheduled to run periodically and gather the data weekly or fortnightly, so we can calculate the average weekly or fortnightly storage requirement for the tenant.

Secondly, the captured data is used to forecast the weekly storage requirement. From the SPO tenant we get details about total storage, used storage and available storage.

The average weekly storage requirement can be taken from a list view grouped by week with an aggregate of the storage, as shown below in the image.

Once we have the above information, the average requirement per week comes to 24,349 MB, i.e. roughly 24 GB (the difference in storage growth between two weeks) in this case. This can be used to predict the weekly storage requirement, which helps forecast when the tenant storage is going to run out and additional storage would be required.

SharePoint Online external user access error “User Not in directory”

Background

Organizations want to share their SharePoint Online site collections and documents and collaborate with external partners, vendors or customers. By default, site collections are shared with internal users only, but access can be extended to authenticated external users, or with limited sharing to anonymous users. External users do not require a license for the Office 365 subscription; they are limited to basic collaboration tasks.

I recently enabled external access for a site collection on an SPO tenant, restricted to selected domains and authenticated external users. External sharing in SharePoint Online works well in most scenarios, but a few issues pop up when enabling access for external users, and with limited error details it becomes a bit challenging to understand the cause.

Problem

Error: “User Not in directory”

The error message which users get as they try to log in to the external SharePoint site is the quite generic “User not in directory”; it is not descriptive and does not point to the cause of the issue.

Solution

To troubleshoot access for the user, clear the browser cache or open an incognito or private session, then try the steps below.

First, check to make sure the account which was used to accept the email invitation to the site is the same account which is being used to log in later.

On the Office 365 login screen, if the prompt below pops up asking “Which account do you want to use?” when you sign in, it means that two different accounts have been configured with Microsoft using the same email address:

A “Work or school” account, which was probably created by your IT department

A “personal” account, which was probably created later on by the user

A personal account can be renamed, which means using a different email address to sign in to it. To fix this, follow this KB article (https://support.microsoft.com/en-us/help/11545/microsoft-account-rename-your-personal-account).

If the external user accepted the invite using the personal account and later tries to connect by selecting the work account, they will get the error “User not in directory”. This is the most common cause of the error. Make sure the user is using the same account to accept the invite and to log on to the site.

Secondly, if the account used for accepting the invite and for login is the same and the error screen still pops up, then the user account has to be set up again. But first we need to clean up the existing references to the user profile and remove the user from SharePoint, and then send a fresh invite. To remove the user and all references, follow the steps below.

External users are managed on a site-collection-by-site-collection basis. An external user account must be removed from each site collection that the user was granted access to.

Browse to each site collection that the user previously had access to, and then follow below steps:

  • In the site collection, edit the URL in the browser by appending the following string to the site address:
    _layouts/15/people.aspx?MembershipGroupId=0

  • Select the user from the list and click Delete. Once the user is removed, proceed to the next steps.

  • Start the SharePoint Online Management Shell.
  • Type the following cmdlet:
    $cred = Get-Credential
    In the Windows PowerShell Credential required dialog box, type your site collection admin account and password, and then click OK.
  • Connect to SharePoint Online, and then type the following cmdlet:
    Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred
  • Retrieve the external user’s record by using the following cmdlet:
    $ExtUser = Get-SPOExternalUser -Filter someone@example.com
  • Remove the user from the tenant by typing the following cmdlet:
    Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)
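Put together, the cleanup amounts to a short script (a sketch; the admin URL and email address are placeholders):

  $cred = Get-Credential
  Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred
  # Find the external user's record in the tenant and remove it
  $ExtUser = Get-SPOExternalUser -Filter someone@example.com
  Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)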

Then we can add the user back and resend the invite. This should fix the issue.

The last thing to check is that the user has a proper role assigned to their account under the user profile in the Office portal.

  • To check the role assigned to the user, go to the Office 365 admin center.
  • Sign in with a global administrator account.
  • Check the external user under Users > Active users, then check the roles of the external user and change them to User (no admin access).

Sharing a report using a Power BI app

History

You have created reports and built dashboards in Power BI Desktop to surface data from multiple data sources, and it is now time to share the dashboards with a wider audience in your organisation, so you are looking at how to do it. The Power BI service comes with a powerful feature, Power BI apps, to cater for such scenarios.

If you have not yet created reports or set up a gateway for leveraging your on-premises data, please follow my earlier posts, Setup a Power BI Gateway and Create reports using a Power BI Gateway, to do so.

Approach

Sharing and collaborating in the Power BI service is a three-step process, and each step is explained in this blog post. At a surface level, the tasks are as follows:

  1. Creation of an App Workspace
  2. Publishing reports to an App Workspace
  3. Publishing a Power BI App

A typical usage scenario for Power BI apps in Office 365 services is depicted below:

1) Create an App Workspace

An app workspace is a new concept introduced in Power BI with which you can collaborate on datasets, reports and dashboards (authored by members) and build/package Power BI apps to be distributed to your wider audience.

  1. Log in to your Power BI service at https://app.powerbi.com and click on your Workspace list menu on the left.

    If this is your first-time login, you need to create a new app workspace (it’s just a new name for group workspaces).

  2. A form needs to be filled in within your Office 365 Power BI service to create one, and a unique name is required for each app workspace.
  3. While creating the workspace, you need to set the privacy, which can’t be changed later, so please decide carefully.
  4. You also need to set permission levels for your workspace accordingly; please only add members who can edit content, as viewers can be added later when publishing your Power BI app.

  5. The next step is to add users and set admins for the workspace (the default role is Member; change it to Owner for users you intend to give administrator permissions). Note: you can only add individual users to this list; security groups and modern groups are not supported at the time of writing this post.
  6. Upon reaching this step, your app workspace has been created successfully and is ready for use.

2) Publishing Reports to an App Workspace

A Power BI app workspace is a collaboration tool; any member can create a model using Power BI Desktop and then publish it to a workspace so that members can take advantage of existing datasets, reports and dashboards. Follow the steps listed below to share your model in an app workspace.

  1. Open the Power BI Desktop file (*.pbix) you created earlier and hit the Publish button.
  2. Select the app workspace you want to publish your reports to, and Power BI Desktop will start publishing the reports to your Office 365 Power BI service.
  3. Subsequent publishing to the same app workspace will remind you if your dataset already exists.
  4. Depending on the size of your data and your internet speed, it may take some time to publish reports to the Power BI service. Sooner or later you will receive a success message.
  5. Upon reaching this step, your reports, datasets and dashboards are published and available in your Power BI service.

3) Publishing Power BI App

  1. Log in to your Power BI service, go to your app workspaces list and select your newly created workspace from the list.
  2. At the top right, you will see a button to publish an app.
  3. Provide a description for the app in the ‘Details’ tab, as your Power BI app will get the same name as your app workspace.
  4. In the next tab, ‘Content’, you will see a list of all the content within the app workspace that will be published within this app. In this step, you can also set the landing page of the Power BI app, which users will see when they click on it; I have selected a specific dashboard to be shown.
  5. You will then need to set the audience for your app in the ‘Access’ tab; it can be either the whole organisation or a combination of users and groups. In the top right corner, it will show you how many artefacts will be published within this Power BI app.
  6. Once you publish it, the Power BI service will advise you of the URL of your app, as shown below:

AppSource and Power BI

Power BI users intending to use apps shared by other users or the organisation must first get the apps in order to use the dashboards and reports within them.

  1. Go to the ‘Apps’ menu in the Power BI service (in the left menu).
  2. Selecting Apps from the menu will list the apps you are subscribed to; if you are using it for the first time the list is usually empty, and you need to click on ‘Get apps’ to get Power BI apps from the AppSource store.
  3. You can then select which apps you want to subscribe to from the list; they are listed by category.

Behind the Scenes

The moment users create an app workspace, an Office 365 group is created in the background with the same name as the app workspace, and the users are maintained as Office 365 group users.

  • Admins of the workspace will become Owners of the group
  • Members of the workspace will become Members of the group

A SharePoint site will be created as well, with the same members as the Power BI app workspace and Office 365 group.

You can see the details of the users (admins/members) by checking the ‘Site permissions’ menu under site settings.

Create reports using a Power BI Gateway

Background

Once you have a Power BI gateway set up to ensure data flows from your on-premises data sources to the Power BI service in the cloud, the next step is to create reports using Power BI Desktop and build them using data from multiple on-premises data sources.

Note: If you don’t have a gateway set up already, please follow my earlier post to set one up before you continue reading this post.

Scenario

All on-premises data is stored in SQL Server instances, spread across a few data warehouses and multiple databases built and managed by your internal IT teams.

Before building reports, you need to ensure the following key points:

  1. Each data source should have connectivity to your gateway with minimum latency.
  2. Every data source intended to be used within reports needs to be configured within a gateway in the Power BI service.
  3. The list of people who can publish reports using each data source needs to be configured against that data source.

An interaction between on-premises data sources and cloud services is depicted below:

Pre-requisites

Before you build reports, you need to set up the on-premises data sources in the gateway, to ensure the Power BI service knows which data sources the gateway administrator has allowed to pull data from on-premises sources.

Log in to https://app.powerbi.com with Power BI service administrator credentials.

  1. Click on Manage gateways to modify the settings.
  2. You will see a screen with the gateway options that you set up earlier while configuring the gateway on-premises.
  3. The next step is to set up gateway administrators, who will have permission to set up on-premises data sources as and when required.
  4. After the gateway configuration, you need to add data sources one by one so published reports can use the on-premises data sources (pre-configured within the gateway).
  5. You need to set up users against each data source within the gateway who can use that data source to pull data from on-premises sources within their published reports.
  6. Repeat the above steps for each of your on-premises data sources, selecting the appropriate data source type and allowing the users who can use them while building reports.

Reports

Upon reaching this step, you are all good to create reports.

  1. Open Power BI Desktop.
  2. Select the sources you want to retrieve data from.
  3. While creating reports, just ensure the data source details are the same as what was configured in the Power BI service while you were setting up the data sources.
  4. Great! Once you publish reports to your Power BI service, your gateway will be able to connect to the relevant on-premises data sources if you have followed the steps above.

 

Where’s the source!

In this post I will talk about data (aka the source)! In IAM there’s really one simple concept that is often misunderstood or ignored: the data going out of any IAM solution is only as good as the data going in. This may seem simple enough, but if not enough attention is paid to the data source and data quality, then the results are going to be unfavourable at best and catastrophic at worst.
With most IAM solutions, data is going to come from multiple sources. Most IAM professionals will agree the best place to source the majority of your user data is the HR system. Why? Well, simply put, it’s where all the important information about the individual is stored and, for the most part, kept up to date. For example, if you were to change positions within the same company, the HR systems are going to be updated to reflect the change to your job title, as well as any potential direct-report changes which may come as a result of this sort of change.
I also said that data can, and normally will, come from multiple sources. A typical example of this: generally speaking, temporary and contract staff will not be managed within the central HR system; simply put, the HR team doesn’t care about contractors. So where do they come from, and how are they managed? For smaller organisations this is usually something that’s done manually in AD with no real governance in place. For larger organisations this is less than ideal, can be a nightmare for the IT team to manage, and can create quite a large security risk to the business. So a primary data source for contractors becomes necessary. What this is is entirely up to the business and what works for them; I have seen a standard SQL web application being used to populate a database, I’ve seen ITSM tools being used, and, less commonly, the IAM system itself being used to manage contractor accounts (within MIM 2016 this is done through the MIM Portal).
There are many other examples of how different corporate applications can be used to augment the identity information of your user data, such as email and phone systems and, to a lesser extent, physical security systems (building access and datacentre access), but we will try to keep it simple for the purpose of this post. The following diagram helps illustrate the data flow for the different user types.

IAM Diagram

What you will notice from the diagram above is that even though an organisation will have data coming from multiple systems, it all comes together and is stored in a central repository, or “Identity Vault”. This keeps an accurate record of the information coming from multiple sources to compile the user’s complete identity profile. From this we can then start to manage what information flows to downstream systems when provisioning accounts, and we can also ensure that if any information changes, it is updated in the user’s profile in any attached system that is managed through the enterprise IAM services.
In my next post I will go into the finer details of the central repository, or the “Identity Vault”.

So in summary, the source of data is very important in defining an IAM solution; it ensures you have the right data being distributed to any managed downstream systems, regardless of what type of user base you have. In my next post we will dig into the central repository, or the Identity Vault. That post will go into detail around how we can set precedence for data from specific systems, to ensure that if there is a difference in the data coming from the different sources, only the highest precedence is applied. We will also discuss how we augment the data sets to ensure that we are only collecting the information necessary for the management of that user and of the applications used within your business.

As per usual, if you have any comments or questions on this post or any of my previous posts, then please feel free to comment or reach out to me directly.

Windows 10 Domain Join + AAD and MFA Trusted IPs

Background

Those who have rolled out Azure MFA (in the cloud) to non-administrative users are probably well aware of the nifty Trusted IPs feature. For those who are new to this, the short version is that this capability is designed to make the end-user experience a little easier by allowing you to define a set of ‘trusted locations’ (e.g. your corporate network) in which MFA is not required.

This capability works via two methods:

  • Defining a set of ‘trusted’ IP addresses. These IP addresses will be the public-facing IP addresses of your web proxies and/or network gateways and firewalls.
  • Utilising issued claims from federated users. This uses the insidecorporatenetwork = true claim, sent by ADFS, to determine that the user is coming from a ‘trusted location’. Enabling this capability is discussed in this article.

The Problem

Now, the latter of these is what needs further consideration when you are looking to move to the ‘modern world’ of Windows 10 and Azure AD (AAD). Unfortunately, due to some changes in the way that Win10 Domain Joined with AAD Registration (AAD+DJ) machines perform Single Sign On (SSO) with AAD, the method of utilising federated claims to determine a ‘trusted location’ for MFA will no longer work.

To understand why this is the case, I highly encourage you to first read Jairo Cadena‘s truly excellent blog series that discusses in detail how Win10 AAD SSO and its associated services work. The key takeaways from those posts are that Win10 now has the concept of a Primary Refresh Token (PRT), and with this approach to authentication you now have the following changes:

  • The PRT is what is used to obtain access tokens to AAD applications
  • The PRT is cached and has a sliding window lifetime from 14 days up to 90 days
  • The use of the PRT is built into the Windows 10 credential provider.  Both IE and Edge know to utilise the PRT when communicating with AAD
  • It effectively replaces the ADFS with Integrated Windows Auth (IWA) approach to achieve SSO with Azure AD
    • That is, the auth flow is no longer: Browser –> Login to AAD –> Redirect to ADFS –> Perform IWA SSO –> SAML Token provided with claims –> AAD grants access
    • Instead, the auth flow is a lot more streamlined:  Browser –> Login and provide PRT to AAD –> AAD grants access

Hopefully from this auth flow change you can see why Microsoft has done this. Because the old way relied on IWA to perform ‘seamless’ SSO, it only worked when the device was domain joined and had line of sight to a DC to perform Kerberos. So when connecting externally, you would always see the prompt from the ADFS forms-based authentication. In the new way, whenever an auth prompt comes in from AAD, the credential provider can see this and immediately provide the cached PRT, giving SSO regardless of your network location. It also means that you no longer need a domain-joined machine to achieve ‘seamless’ SSO!

The side effect, though, is that because the SAML token provided by ADFS is no longer involved in gaining access, Azure AD loses visibility of those context-based claims like insidecorporatenetwork, which subsequently means that the Trusted IPs feature no longer works. While this is most commonly used for MFA scenarios, be aware that this will also apply to any Azure AD Conditional Access rules you define that use the Trusted IPs criteria (e.g. block access to an app when external).

Side Note: If you want to confirm this behaviour yourself, simply use a Win10 machine that is both domain joined and AAD registered, perform a Fiddler capture, and compare the sign-in experience between IE and Edge (i.e. PRT aware) and Chrome (i.e. not PRT aware).

The Solution/Workaround?

So, you might ask, how do you fix this apparent gap in capability? Does this mean you’re going backwards now? For any enterprise customer of decent size, managing a set of IP address ranges may not be practical or desirable in order to drive MFA (or conditional access) behaviours between internal and external users. The federated user claim method was a simple, low-admin way of solving that problem.

To answer this question, I would actually take a step back and look at the underlying problem that you’re trying to solve.  If we remind ourselves of the MFA mantra, the idea is to ensure that the user provides “something they know” (e.g. a secret/password) and “something they have” (e.g. a mobile device) to prove their ‘trustworthiness’.

When we make a decision to allow an MFA bypass for internal users, we are predicating this on the fact that, from a security standpoint, they have met their ‘trustworthiness’ level through a separate means. This might be through a security access card that lets them into an office location, or utilising a corporate laptop that can perform a VPN connection. Both ultimately let them connect to the internal network, and thus that is what you use as your criteria for granting them the luxury of not having to perform another factor of authentication.

So with that in mind, what you could then do is expand that criteria to include domain-joined machines. That is, if a user is utilising a corporate-issued device that has been domain joined (and registered to AAD), this can now act as the “something you have” aspect of the MFA mantra to prove their trustworthiness, and so you no longer need to differentiate whether they are actually internal or external anymore.

To achieve this, you’ll need to use Azure AD Conditional Access policies, and modify your Access Grant rules to look something like that below:

Win10PRT1

You’ll also need to perform the steps outlined in the How to configure automatic registration of Windows domain-joined devices with Azure Active Directory article to ensure the devices properly identify themselves as being domain joined.

Side Note:  If you include the Workplace Join packages as outlined above, this approach can also expand to Windows 7 and Windows 8.1 devices.

Side Note 2: You can also include Intune-managed mobile devices in your ‘bypass criteria’ if you include the Require device to be marked as compliant criterion as well.

Fun Fact: You’ll note that in my image the (preview) reference for ‘require one of the selected controls’ is still there. This is because until recently (approx. May/June 2017), the MFA or domain-joined device criteria didn’t actually work, because of the behaviour/order of how the evaluations were being done. When AAD was evaluating the domain-joined criterion, if it failed it would immediately block access rather than trying the MFA approach next, thus preventing an ‘or’ scenario. This has now been fixed, and I expect the (preview) tag to be removed soon.

Summary

The move to the modern ‘anywhere, any device’ approach to end-user computing means that there is a need to start re-assessing how you approach security. Old-world views of security being defined by network boundaries will eventually disappear, and instead you’ll need to consider user- and device-based contexts to define when to initiate security controls.

With Windows 10’s approach to authentication with AAD, internal versus external access is no longer relevant and should not be used as your criteria for driving MFA or conditional access. Instead, use device-based conditions such as ‘device compliance’ or ‘domain join’ as one of your deciding factors.