Azure Log Analytics and Power BI Desktop for Advanced SharePoint Reporting

In a previous blog post we explored some of the basics around integration of OMS and Power BI to report on user activity. In this blog post we’ll look at this subject in more detail and show what can be achieved with Power BI Desktop, especially with the updates now available in Azure Log Analytics as part of the Operations Management Suite (OMS).

Power BI offers a wealth of data visualisation capability in two main flavours: the online service (https://app.powerbi.com/), which is geared toward sharing and collaboration, and Power BI Desktop, which is more of a high-powered data import and modelling tool. Both can be used to create visuals.

So, why are we looking at Power BI Desktop? Well, while the online flavour has some definite advantages – easy sharing and dashboards, plus the flexibility of being accessible to anyone with a browser – there are a few things I want to do with my data that need the Desktop version, specifically merging multiple queries to create some interesting insights into Office 365 activity.

Our Requirement: Show user activity in SharePoint Online and correlate and filter the data based on the users’ business unit and location.

Now, while the Office 365 Monitoring Solution in OMS provides detailed logging of user activity, aggregating this data to show interesting things like "how many users in each state are active" or "how often are users in the Finance business unit sharing documents externally" is a little difficult. The reason is that the Office 365 Activity API logs the user's User Principal Name (UPN), e.g. daisy.smith@contoso.com, but no other identifying information. We want to track internal users, as well as those from outside the organisation who have been invited to collaborate.

For those external users, we want to see what groups they are part of and what they are doing in our tenant.

So, if we want to report on activity filtered to an individual business unit or location, we need to gather data from the user's identity source and merge that with the OMS logs.

As we'll see from the example below, we can match the UPN in the Office 365 logs with the user's Azure AD account, and with that we can enrich our report with all the attributes of that user. And as we're dealing with two separate datasets, we're going to use Power BI Desktop and not the online version.

So, we have a few steps here to do:

  1. Get the OMS log query data into Power BI Desktop
  2. From Power BI Desktop, query the user’s account to gain attributes
  3. Create relationships between these datasets in Power BI, based on UPN
  4. Visualise the data
  5. Publish to Power BI Online for consumption

An overview of the moving parts:

‘The Overview’

Step 1 – get the Azure Log Analytics log query data into Power BI Desktop

Microsoft recently rolled out upgrades for Azure Log Analytics workspaces, and the new iteration integrates quite nicely with Power BI Desktop by exposing a REST API: api.loganalytics.io.

Using Power BI Desktop, we can drop a query straight into the tool and have it pull data directly from the Azure Log Analytics workspace.

Firstly, let's create a query to get some interesting data from Office 365. Let's say we want to find out what external users are up to, so here's a simple query to pull in all logs from external users. Note this is a fairly basic query; in your environment you may want to be more precise and/or aggregate the results into a smaller data set:

OfficeActivity | where UserId contains_cs "#ext#"

Now it's a couple of simple steps to get this query into Power BI. Just click the little Power BI button above where you enter the query; this lets you download the configuration for an M-language query you can use directly in Power BI Desktop.

Azure Log Analytics – Power BI Export

The instructions are included in this file, so I’ll just repeat them here:

The exported Power Query Formula Language (M Language) can be used with Power Query in Excel and Power BI Desktop.

For Power BI Desktop follow the instructions below:

1) Download Power BI Desktop from https://powerbi.microsoft.com/desktop/

2) In Power BI Desktop select: 'Get Data' -> 'Blank Query' -> 'Advanced Query Editor'

3) Paste the M Language script into the Advanced Query Editor and select ‘Done’
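
For reference, the exported script essentially boils down to a call against the api.loganalytics.io REST API. Here's a simplified, hand-written sketch of that shape, assuming the v1 query endpoint, a placeholder workspace ID and a 30-day timespan – the file you download from the portal already has your real values (plus some extra type-mapping logic), so treat this as illustration rather than a replacement for the export:

let
    // Placeholder workspace ID – the exported script has your real ID filled in
    WorkspaceId = "00000000-0000-0000-0000-000000000000",
    KustoQuery = "OfficeActivity | where UserId contains_cs ""#ext#""",

    // Call the Log Analytics query API; Power BI will prompt you to sign in
    // with an organisational account the first time
    Response = Json.Document(
        Web.Contents(
            "https://api.loganalytics.io/v1/workspaces/" & WorkspaceId & "/query",
            [Query = [#"query" = KustoQuery, #"timespan" = "P30D"]]
        )
    ),

    // The API returns a list of tables; take the first one and expand it
    FirstTable = Response[tables]{0},
    ColumnNames = List.Transform(FirstTable[columns], each [name]),
    Result = Table.FromRows(FirstTable[rows], ColumnNames)
in
    Result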

Step 2 – from Power BI Desktop, query the user's account to gain attributes

To get our user information from Azure Active Directory we're going to query Microsoft Graph. You can enter the following into the Advanced Query Editor, the same as above:

Power BI Advanced Editor

Note: you'll need to authenticate with a Global Admin account for this to work.
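
As a starting point, a minimal Microsoft Graph query in M looks something like the sketch below (it won't necessarily match the screenshot exactly). The $select list is just an example – swap in whichever Azure AD attributes your report needs – and this simple version only retrieves the first page of users rather than following the @odata.nextLink paging links:

let
    // Ask Microsoft Graph for users, selecting only the attributes we want
    // to join onto the activity data (adjust $select as needed)
    Response = Json.Document(
        Web.Contents(
            "https://graph.microsoft.com/v1.0/users",
            [Query = [#"$select" = "userPrincipalName,displayName,department,city,country"]]
        )
    ),

    // Turn the returned list of user records into a table, filling any
    // missing attributes with null
    Users = Table.FromRecords(
        Response[value],
        {"userPrincipalName", "displayName", "department", "city", "country"},
        MissingField.UseNull
    )
in
    Users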

Step 3 – create relationships between these datasets in Power BI, based on UPN

In the home screen of Power BI Desktop, select "Manage Relationships".

Select "New", choose your new Azure Log Analytics query and the Azure AD user query, and match on UserId and UserPrincipalName from the respective queries.
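
Alternatively, if you'd prefer to do the join in the Query Editor rather than in the model (the "merging multiple queries" approach mentioned earlier), a merge along the following lines achieves the same match on UPN. The query names LogAnalyticsQuery and AzureADUsers are assumptions here – substitute the names of your own two queries:

let
    // Left outer join: keep every activity row and attach matching user
    // attributes where the UPN lines up
    Merged = Table.NestedJoin(
        LogAnalyticsQuery, {"UserId"},
        AzureADUsers, {"userPrincipalName"},
        "UserDetails", JoinKind.LeftOuter
    ),

    // Flatten the attributes we want onto the activity rows
    Expanded = Table.ExpandTableColumn(
        Merged, "UserDetails",
        {"displayName", "department", "city", "country"}
    )
in
    Expanded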

Step 4 – visualise the data

There's no strict formula here: think about what questions to ask, play around a bit to find ways to show meaningful data, and use visual- and page-level filters.

Power BI Desktop

Step 5 – publish to Power BI Online for consumption

When you're happy with your report, click on the "Publish" button in Power BI Desktop, and select an appropriate workspace in Power BI Web to publish the report to. Once it's in Power BI Web, you can then create your dashboard.

To add pages of the report to the dashboard, open the page you want to pin and use the "pin live page" option at the top of the screen. Add the report pages you want to the same dashboard; this is the one I built reasonably quickly from our demo tenant:

Power BI Dashboard

Good luck, and happy reporting!

Monitor SharePoint Online Activity with OMS and Power BI

As more organisations move their data into the cloud, there is now a big focus on getting more insight and visibility into what data is being moved, where it is being stored, how it is being used, and by whom.

This post looks at how we can provide greater insight into SharePoint Online, but the approach is also applicable to other document management systems.

While Office 365 provides an audit capability out of the box to view user activity, this is not highly accessible to those who do not administer Office 365 directly. This blog post looks at what is possible using off-the-shelf Microsoft cloud apps that can provide this visibility to those in the business that require it.

Two main use cases are:

  1. Change and Adoption – tracking uptake of SharePoint and getting visibility into who in the business is using it the most/least
  2. Document Governance – track access, changes and external sharing of documents to find patterns and for forensic analysis

What is useful is to have a high-level dashboard view of trends with items of interest flagged, and the ability to then 'drill down' as required to get to the next levels of detail. We also need to be able to provide this capability outside of the Office 365 admin portal.

What is involved?

Firstly, there is your Office 365 tenant, where we’re going to draw the data from. As you may know, once auditing is enabled SharePoint makes audit logs available so that an administrator can see who has done what and when.

More details on enabling auditing can be found here: https://support.office.com/en-us/article/Configure-audit-settings-for-a-site-collection-a9920c97-38c0-44f2-8bcb-4cf1e2ae22d2

Now, these audit logs can be pulled into Microsoft Operations Management Suite (OMS), Microsoft's cloud-first Management-as-a-Service platform. From there we can start to get some useful views of the data, search and aggregate log data, and set alerts as desired.

Getting the log data into OMS is as simple as enabling the Office 365 solution from the OMS marketplace. Once this is enabled and you have configured OMS to connect to your Office 365 tenant, OMS will pull the audit log data from the Office 365 Management Activity API.

More information on the O365 Management API can be found here: https://msdn.microsoft.com/en-nz/office-365/office-365-managment-apis-overview

With all this useful information now in OMS, we can start building out custom solutions that give us a good high-level dashboard view of what's happening in SharePoint Online, providing a visual representation and helping to identify patterns and anomalies.

These views are based on log queries using the Office 365 Management API schema, for example:

Type=OfficeActivity OfficeWorkload=sharepoint

You can get a copy of the SPO Activity.omsview solution that I created (containing the views above) and import it directly into your OMS workspace from here:

https://github.com/ColdHarbour/OMS_SOLS

Reporting for the Business

Now, these OMS views and log searches are great for those in your business who are adept at using a SIEM (Security Information and Event Management) tool such as OMS, but for consumption of reports by a wider audience you may want to consider a different medium. In this case, Microsoft has you covered with Power BI.

OMS has native integration with Power BI: you just need to enable the feature and, from then on, any log search can be pushed across to Power BI.

Once the datasets are in Power BI, they can be used to create reports and dashboards that can be consumed by your stakeholders in the business, giving them a view into SharePoint usage as well.

How does it work?

The technical integration details of how all this works under the covers are pretty straightforward.

So that's a high-level view of what's possible using OMS and Power BI to pull log data from Office 365 and turn it into useful reports and dashboards.

Please let me know in the comments if you'd like to see more on OMS and how we're using it at Kloud (e.g. more O365, Azure, Windows Infrastructure, Containers, ServiceNow integration).

Microsoft Azure Stack is the New Hybrid Cloud

Last week Microsoft released the public technical preview of the new Azure Stack. Azure Stack, along with its predecessor Windows Azure Pack, gives anyone the ability to extend Azure management capabilities to their on-premises datacentre.

Firstly, a bit of background.

With Windows Server 2012 R2, Microsoft made Windows Azure Pack available. Azure Pack offered an on-premises integration point between Windows Server, System Centre, and SQL Server, providing a self-service portal and private cloud services including virtual machine provisioning and management (IaaS), database as a service (DBaaS), and scalable web application hosting (PaaS). Azure Pack can be integrated with Service Management Automation (SMA) and leverages Desired State Configuration (DSC) to manage IaaS deployments.

Microsoft built Windows Azure Pack using a common Service Management API based on the Azure Service Management (ASM) API, which is how the "Classic" Azure platform is managed. Azure Pack delivers a self-service portal based on System Centre Service Provider Foundation (a subset of Orchestrator features) that utilises templates and other library resources from System Centre to provision virtual networks, virtual machines, web apps and other IaaS and PaaS resources on-premises.

It’s about the Stack

The rapid rate of change in Azure and the introduction of a new management API (Azure Resource Manager – ARM) meant Azure Pack features fell behind what is possible in Azure today.

In 2015 Microsoft announced that, with the release of Windows Server 2016, it would provide Azure Stack, the next iteration of hybrid cloud integration, which looks to address Azure Pack's shortcomings by providing an Azure-consistent API (using ARM) for managing the underlying resource providers (compute, network and storage) across public and private clouds.

By utilising the ARM API across both the Azure cloud and Hyper-V on-premises, Azure Stack users will be able to:

  • Deploy resources across public and private clouds from the same JSON templates and PowerShell Cmdlets
  • Reduce reliance on System Centre to manage underlying resources (ARM does not use System Centre on-premises, though System Centre may still be used to manage infrastructure)
  • Utilise the same resource management model over public and private clouds.

Adopting a hybrid cloud approach marks a move away from tightly integrating hardware with software and towards having software control the underlying hardware, treating it as little more than additional capacity.

Now, when we consider the design of an on-premises private cloud, we know it needs to take into account the Azure tenancy and the toolsets and processes that have been implemented or are planned for the near future. Having this interoperability between platforms allows us to deliver a true 'hybrid cloud' with greater consistency across both.

It is important to note that at launch only a fraction of the full Azure offerings are available on the Azure Stack preview. At this time this includes compute, storage and network resources (IaaS) plus some of the PaaS features such as web and mobile app services.

Aside from feature gaps, there are also markedly different use cases for deploying applications on-premises versus to the cloud, such as:

  • Compliance-related concerns as to where your customer or corporate data resides. For some industry sectors it can be a lengthy process to get cloud use approved by legal, regulatory, security and industry stakeholders.
  • Financial constraints can determine one approach or the other – if deploying on-premises infrastructure, you have to build for peak demand, and if that peak is only intermittent (e.g. monthly or quarterly usage spikes) this makes a better case for scalable cloud services. On the other hand, for static loads that need to run 24/7, on-premises can be a more cost-effective approach (although it can also be misleading to compare without including sunk costs of the datacentre vs. the prospective cost of a cloud service… but that's an argument for another time).
  • The other big reason for deploying workloads on-premises is that your organisation's journey to the cloud is going to be a long one, so while the aspiration is there, a pragmatic approach is to deploy the easy workloads to the cloud first while the more difficult workloads and applications remain on-premises.

As most large organisations face one or more of the scenarios above, Azure Stack delivers the benefits of cloud manageability, speed of deployment, and the economy and simplicity of having one management framework across public and private clouds.

One of the great use cases that springs to mind is to deploy your dev/test workloads to Azure and take advantage of scaling up during heavy use and UAT, but when there is little activity simply turn off your servers and automate out-of-hours and weekend shutdowns to further reduce costs. When code is ready to be deployed to production, you can then use all the same templates to deploy into your production systems residing in your datacentre hosted on Azure Stack.

To get started, download and deploy Windows Server 2016 Technical Preview 4: https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-technical-preview.

Azure Stack can be downloaded by registering at the following link: https://azure.microsoft.com/en-us/overview/azure-stack/try/

And you will have to be mindful of the Azure Stack requirements: https://azure.microsoft.com/en-us/documentation/articles/azure-stack-deploy/

That’s it for now, but there will be much more to come on Azure Stack as we explore the platform and start rolling out with our customers.