A modern way to track FIM/MIM Attribute Value History utilizing Power BI

Introduction

Microsoft Identity Manager is fantastic for keeping data consistent between connected systems. Often, however, you want to know what a previous value of an attribute was. FIM/MIM can only tell you the current value, the Management Agent it was received on, and when it was received.

In the past, where I’ve had to provide a solution either to make sure an attribute keeps a unique value forever (e.g. email address or loginID, so they are never reused) or simply to keep attribute value history, I’ve used two different approaches:

  • Store previous values in an SQL Table and have an SQL MA that flows out the values
  • Store historical values in a Multi-Valued attribute on the user object in the Metaverse

Both are valid approaches but often fall down when you want to quickly get a report on that metadata.

Recently we had a similar request to be able to know when employees’ EndDates were updated in HR. This is specifically useful for contractors who have their contracts extended. Instead of stuffing the info into a Multi-Valued attribute or an SQL DB, this time I used Power BI. This provides the benefit of being able to quickly develop a graphical report and embed it in the FIM/MIM Portal.

Such a report looks like the screenshot below. Power BI Report

Using the filters on the right hand side of the report you can find a user (by EmployeeID or DisplayName), select them and see attribute value history details for that user in the main part of the report. As per the screenshot below Andrew’s EndDate was originally the 8th of December (as received on the 5th of November), but was changed to the 24th of November on the 13th of November.

End Date History

In this Post I describe how I quickly built the solution.

Overview

The process to do this involves:

  • creating a Power BI Application
  • creating a Power BI Dataset
  • creating a script to retrieve the data from the MV and inject it into the Power BI Dataset
  • creating a Power BI Report for the data
  • embedding the Report in the MIM Portal

Registering a Power BI Application

Head over to Power BI for Developers and Register an Application for Power BI. Log in to Power BI with an account for the tenant you’ll be reporting data for. Give your Application a name and choose Native Application. Set the Redirect URL to https://localhost

CreatePBIApp

Choose the permissions for your Application. As we’ll be writing data into Power BI, you’ll need a minimum of Read and Write all Datasets. Select Register App.

Create PBI App Permissions

Record your Client ID for your Application. We’ll need this to connect to Power BI.

Register the App

We need to authenticate to Power BI the first time using a UI to provide Authorization for our Application. In order to do that we need to add another Reply URL to our application. Head to the Apps Dev Portal, select your application and Edit the Application Manifest. Add an additional Reply URL for https://login.live.com/oauth20_desktop.srf as shown below.

Add Reply URL for AuthZ

The following PowerShell commands will then allow us to authenticate using the Power BI PowerShell module. If you don’t have the Power BI PowerShell module installed, un-comment Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force to install it.

Update the Client ID to that of the App you registered in the previous steps.

# Install-Module PowerBIPS -RequiredVersion 1.2.0.9 -Force
Import-Module PowerBIPS -RequiredVersion 1.2.0.9

# PowerBI App
$clientID = "4036df76-4de6-43cb-afe6-1234567890"

$authtoken = Get-PBIAuthToken -ClientId $clientID

Sign in with an account for the Tenant where you created the Power BI App.

Interactive Login for Dataset Creation

Accept the permissions you chose when registering the Power BI App.

Authorize PowerBI App

Creating the Power BI Dataset

Now we will create the Power BI Table (Dataset) that we will use when we insert the records.

My table is named Employee and the DataSet EmployeeEndDateReport. I’m keeping the table slim, with just enough info for our purpose: the date the row was added to the dataset, the employee’s AccountName, DisplayName, Active state, EndDate and EndDateReceived. A script along the lines of the sketch below will create the Dataset.
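This is a minimal sketch using the PowerBIPS New-PBIDataSet cmdlet. The column names mirror the table described above; the parameter names, the accepted dataType values, and the properties of the returned object should be checked against the PowerBIPS version you install.

# Define the dataset schema: one Employee table inside the EmployeeEndDateReport dataset
$dataSetSchema = @{
    name   = "EmployeeEndDateReport"
    tables = @(
        @{
            name    = "Employee"
            columns = @(
                @{ name = "DateAdded";       dataType = "Datetime" },
                @{ name = "AccountName";     dataType = "String" },
                @{ name = "DisplayName";     dataType = "String" },
                @{ name = "Active";          dataType = "Bool" },
                @{ name = "EndDate";         dataType = "Datetime" },
                @{ name = "EndDateReceived"; dataType = "Datetime" }
            )
        }
    )
}

# Create the dataset in the workspace the $authtoken was issued for
$dataSet = New-PBIDataSet -authToken $authtoken -dataSet $dataSetSchema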

Populating the Dataset

With our table created, let’s populate it with employees that have an EndDate. As this is the first time we run it, we set a watermark date to add people from; I’ve gone with the previous year. I then query the MV for Employees with an EndDate within the last 365 days, build a PowerShell object with the columns from our table and insert the rows into Power BI. I also set a watermark of the last time we had an EndDate received from the MA and output that to the watermark file. This means that next time we can quickly get only users whose EndDate was received since the last time we ran the process.

NOTE: for full automation you’ll need to replace the credentials line in the script with your secure method of choice for providing credentials to scripts.
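Here is a hedged sketch of the Power BI side of that script. It assumes $authtoken from the authentication step earlier and $dataSet from the dataset-creation sketch above; the hard-coded row is purely a stand-in for the objects built from your MV query, and the watermark path is an example.

Import-Module PowerBIPS -RequiredVersion 1.2.0.9

$watermarkFile = 'C:\Scripts\EndDateWatermark.txt'

# Stand-in for the rows built from the MV query described above
$rows = @(
    @{
        DateAdded       = (Get-Date -Format s)
        AccountName     = 'asmith'
        DisplayName     = 'Andrew Smith'
        Active          = $true
        EndDate         = '2017-12-08T00:00:00'
        EndDateReceived = '2017-11-05T00:00:00'
    }
)

# Push the rows into the Employee table; $dataSet.id assumes New-PBIDataSet returned the created dataset object
Add-PBITableRows -authToken $authtoken -dataSetId $dataSet.id -tableName 'Employee' -rows $rows

# Record the most recent EndDateReceived so the next run only picks up newer changes
$rows | ForEach-Object { $_.EndDateReceived } | Sort-Object | Select-Object -Last 1 |
    Out-File -FilePath $watermarkFile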

Create a Power BI Report

Now in Power BI select your Dataset and design your report. Here is a sample one that I’ve put together. I simply selected the columns from the dataset and updated the look and feel. I then added each of the AccountName, DisplayName and Active columns individually and set it as a filter, so that I have various ways of finding whoever I’m looking for.

Power BI Report.png

Once the process has run for a while and values have changed for the attribute you are keeping history for, selecting a user with changed values will show that history.

End Date History

Summary

To complete the solution you’ll want to automate the script that queries the MV for changes (probably after each run from the MA that provides the attribute you are recording history for), and you’ll want to embed the report in the MIM Portal. In this post here I detail how to do that step by step.

 

Azure AD Identity and Access Management & Features

I’ve been using Azure AD Identity for quite a while now. I thought it would be good to share a summary of Azure AD Identity features and gather some feedback.

Azure AD Identity

Azure Active Directory: A comprehensive identity and access management cloud solution for your employees, partners, and customers. It combines directory services, advanced identity governance, application access management, and a rich standards-based platform for developers.

Identity and access management license options: Azure Active Directory Premium P2 (E5) or P1 (E3)

“Identity as the Foundation of Enterprise Mobility”

Identity and access management

Protect at the front door: innovative, risk-based conditional access; protect your data against user mistakes; detect attacks before they cause damage

Identity and access management in the cloud:

  • 1000s of apps, 1 identity: Provide one persona to the workforce for SSO to 1000s of cloud and on-premises apps.
  • Enable business without borders: Stay productive with universal access to every app and collaboration capability.
  • Manage access at scale: manage identities and access at scale in the cloud and on-premises, with advanced user lifecycle management and identity monitoring tools
  • Cloud-powered protection: Ensure user and admin accountability with better security and governance

Azure AD portal:

Configure users and groups, configure SaaS application identities, publish on-premises applications with Application Proxy, manage licenses, configure password reset (including reset notifications and authentication methods), apply company branding, and control whether users can register/consent to applications, whether users (or guests) can invite external contacts, whether users can register devices with Azure AD, whether MFA is required, and whether to use pass-through authentication or federated authentication.

Azure AD application integration:

3 types of applications integration:

  • LOB applications: using Azure AD for authentication
  • SaaS applications: configure SSO
  • Azure AD Application Proxy: we can publish on-premises applications to the internet through the Azure AD Application Proxy.

Inbound/outbound user provisioning to SaaS apps

User experience with integrated apps: the Access Panel at https://myapps.microsoft.com. For custom branding, load it by appending your organization’s domain: https://myapps.microsoft.com/company_domain_name. From My Apps, users can change their password, edit password-reset settings, manage MFA, view account details, view and launch apps, and self-manage groups. Admins can configure apps to be self-service, so users add apps by themselves.

Authentication (Front End & Back End) & Reporting (reporting access & alerts, reporting API, MFA)

Front End Authentication 

Back End Authentication 

Pass-thru authentication (front end):

  • Traffic to the backend app NOT authenticated in Azure AD
  • Useful for NDES, CRLs, etc
  • Still has benefits of not exposing backend apps to http based attacks

Pass-thru authentication (back end):

  • Does not try and authenticate to the backend
  • Useful with forms based applications
  • Auth headers returned to client
  • Can be used with front-end pre-authentication

Pre-Authentication

  • Users must authenticate to AAD to access backend app
  • Allows ability to plug into AAD control plane
  • Can also be extended to provide true SSO to the backend app

Kerberos/IWA

  • Must use pre-authentication on front end
  • Allows for an SSO experience from AAD to the app
  • Support for SPNego (i.e. non AD Kerberos)

 

Azure AD Connect health

Monitor & Report on ADFS, AAD Sync, ADDS. Advanced logs for configuration troubleshooting.

Azure Identity protection (Azure AD premium P2)

  • AIP dashboard is a consolidated view to examine suspicious user activities and configuration vulnerabilities
  • Remediation recommendations
  • Risk Severity calculation
  • Risk-based policies for protection for future threats

If a user is at risk, we can either block the user or trigger MFA automatically

Azure Identity Protection can help identify spoofing attacks, leaked credentials, suspicious sign-in activities, infected devices and configuration vulnerabilities. For example, when a user signs in from an unfamiliar location, we can trigger a password reset, use the user-risk condition to allow access to corporate resources only after a password change, or block access straight away. Alternatively, we can configure the alert to send an approval request to an admin.

Identity protection risk types and reports generated:

Azure AD privileged Identity Management

For example, I am on leave for two days and I want a colleague to become global admin for only those two days. If I come back from leave and forget to remove the global admin permissions from that colleague, they will still be a global admin, which puts the company at risk, because either global admin password could potentially be compromised.

Just-in-time administrative access: we can use this to grant "global admin" access for only those two days.

Securing Privileged access: just in Time administration

  • Assume breach of existing AD forests may have occurred
  • Provide privileged access through a workflow
  • Access is limited in time and audited
  • Administrative account not used when reading mail/etc.

Result = limited in time & capability

 

 

 

Migrating Sharepoint 2013 on prem to Office365 using Sharegate

Recently I completed a migration project which brought a number of sub-sites within SharePoint 2013 on-premises to the cloud (SharePoint Online). We decided to use Sharegate as the primary tool due to its simplicity.

Although it might sound like a straightforward process, there are a few things worth checking pre- and post-migration, and I have summarized them here. I found it easier to record this information in a spreadsheet with different tabs:

Pre-migration check:

  1. First thing, Get Site Admin access!

    This is the first and most important step: get yourself admin access. It can be a lengthy process, especially in a large corporate environment. The best level of access is Site Collection Admin for all sites, but sometimes this might not be possible. Site Administrator access is the bare minimum for the migration to work.

    You will likely be granted Global Admin on the new tenant in most cases, but if not, ask for it!

  2. List down active site collection features

    Whatever features are activated on the source site will need to be activated on the destination site as well, so we need to record what has been activated on the source site. If any third-party feature is activated, you will need to liaise with the relevant stakeholders as to whether it is still required on the new site. If it is, it is highly likely that a separate license is required, as the new environment is cloud based rather than on-premises. Take Nintex Workflow for example: Nintex Workflow Online is a separate license compared to Nintex Workflow 2013.

  3. Segregate the list of sites, inventory analysis

    I found it important to list all the sites you are going to migrate and distinguish whether they are site collections or just sub-sites. What I did was put each site under a new tab, with all its site contents listed. Next to each list/library, I have fields for the list type, number of items and comments (if any).

    Go through each piece of content, preferably sitting down with the site owner to get the details. Some useful questions to ask:

  • Is this still relevant? Can it be deleted or skipped for the migration?
  • Is this heavily used? How often does it get accessed?
  • Does this list have custom edit/new forms? Sometimes owners might not even know, so you might have to take an extra look by scanning through the forms.
  • Check if pages have custom script with site URL references, as these will need to be changed to accommodate the new site URL.

It would also be useful to get a comprehensive picture of how much storage each site holds. This can help you work out which site has the most content, and hence is likely to take the longest during the migration. Sharegate has an inventory reporting tool which can help, but it requires Site Collection Admin access.

  4. Discuss some of the limitations

    Pages library

    The Pages library under each site needs specific attention, especially if you don’t have Site Collection Admin access! Pages which inherit a content type or master page from the parent site will not have these migrated across by Sharegate, meaning these pages will either not be created on the new site, or they will simply show as using the default master page. This needs to be communicated and discussed with each owner.

    External Sharing

    External users will not be migrated across to the new site! These are users who won’t be provisioned in the new tenant but still require access to SharePoint. They will need to be added (invited) manually to a site using their O365 email account or a Microsoft account.

    An O365 account would be whatever account they have been using to get onto their own SharePoint Online. If they haven’t had one, they would need to use their Microsoft account, which would be a Hotmail/Outlook account. Once they have been invited, they need to respond to the email by signing into the portal in order to get provisioned. The new SPO site collection will need to have external sharing enabled before external access can happen. For more information, refer to: https://support.office.com/en-us/article/Manage-external-sharing-for-your-SharePoint-Online-environment-C8A462EB-0723-4B0B-8D0A-70FEAFE4BE85

    What can’t Sharegate do?

    Some of the following minor things cannot be migrated to O365:

  • User alerts – user will need to reset their alerts on new site
  • Personal views – user will need to create their personal views again on new site
  • Web part connections – any web part connections will not be preserved

For more, refer: https://support.share-gate.com/hc/en-us/categories/115000076328-Limitations

Performing the migration:

  1. Pick the right time

    Doing the migration during a low-activity period would be ideal. User communications should be sent out as early as possible to inform people of the actual date. I tend to stick to the middle of the week, so that we still have a couple of days left to solve any issues, instead of doing it on a Friday or Saturday.

  2. Locking old sites

    During the migration, we do not want users making changes to the old site. If you are migrating site collections, fortunately there’s a way to lock them down, provided you have access to the central admin portal. See https://technet.microsoft.com/en-us/library/cc263238.aspx

    However, if you are migrating sub-sites, there’s no way to lock down a single sub-site except by changing its site permissions. Changing the site permissions risks losing that permission information, so it is ideal to record these permissions before making any changes. Also take extra note of lists or libraries with unique permissions; they do not inherit site permissions and so won’t be “locked” unless changed manually. A quick PowerShell sketch covering both the lock-down and the permission export follows below.
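    A rough sketch of both steps, run from the SharePoint 2013 Management Shell on the source farm; the URLs and the CSV output path are placeholders.

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # Lock an entire site collection read-only for the duration of the migration
    Set-SPSite -Identity "https://intranet.contoso.com/sites/finance" -LockState ReadOnly

    # For a sub-site, record its current permissions before changing them
    $web = Get-SPWeb "https://intranet.contoso.com/sites/finance/projects"
    $web.RoleAssignments | ForEach-Object {
        [PSCustomObject]@{
            Principal = $_.Member.Name
            Roles     = ($_.RoleDefinitionBindings | ForEach-Object { $_.Name }) -join '; '
        }
    } | Export-Csv -Path C:\Temp\projects-permissions.csv -NoTypeInformation
    $web.Dispose()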

  3. Beware of O365 traffic jam

    Always stick to Insane mode when running the migration in Sharegate. Insane mode makes use of the new Office 365 Migration API, which is the fastest way to migrate huge volumes of data to Office 365. While exporting the data to Office 365 is fast, I did find a delay waiting for Office 365 to import it into the SharePoint tenant. Sometimes it could sit there for an hour before continuing with the import. Also, avoid running too many sessions if your VM is not powerful enough.

  4. Delta migration

    The good thing about using Sharegate is that you can do delta migrations, which means you only migrate files that have been modified or added since the last migration. However, it doesn’t handle deletions! If any files have been removed since you last migrated, running a delta sync will not delete them from the destination. Therefore, best practice is still to delete the list from the destination site and re-create it using the Site Object wizard.

Post-migration check:


Things to check:

  • Users can still access relevant pages, list and libraries
  • Users can still CRUD files/ items
  • Users can open Office web apps (the authentication experience when opening Office files can vary; in most cases, users should only be prompted the very first time they open a file)

Create a PowerApp from SharePoint list

In my last post we explored PowerApps and the associated development tools available, like PowerApps desktop studio and PowerApps web studio. Without writing a single line of code we created our first basic three-screen PowerApp and were able to perform CRUD (create, read, update, delete) operations on our SharePoint data source.

In this post, we will try to create an app from a SharePoint list; specifically, we’ll create an app from a “Roster” SharePoint list.

First, we will see how PowerApps is integrated into SharePoint Online. Second, we will try to customize the basic app created by PowerApps.

Let’s get going…

To start here:

  • Log in to our SharePoint Online site.
  • Go to the list, “Roster” in our case.
  • The list has columns like Employee Name, Skills, Shift Timings (Choice), Image (Picture), Hours (Number), Shift Frequency and Overtime (Calculated).

PowerAppRosterList.png

To build an app, click PowerApps and then Create an app. In the right-hand pane, enter a name for the app, “RosterDetails”, then click Create.

Things to note:

  • The app will open in PowerApps web studio.
  • It will use our SharePoint list as the data source.

PowerAppCreatePowerAppFromList.png

So, after much hard work (yes, I know…) I get my Roster app as below, which to be honest looks very funny… and not very useful.

GeneratedApp.png

Let’s make it more presentable…

Let’s try quickly changing the theme:

  • From the left ‘Screens’ pane, select the thumbnail view and then select ‘BrowseScreen1’

ScreensThumbnail.png

  • Click Home -> Theme and select ‘Coral’ (that’s my choice; we can choose any we want…)

ScreensTheme.pngScreensCoralThemeView.png

Let’s add more information on the browse screen:

  • In the middle section, select the browse gallery -> Layouts -> and select the layout Image, title, subtitle and body
  • We set the values for each property, like Title4 as ‘Title’, Body3 as ‘Hours’ and so on…

ScreensUpdateBrowseView.png

Small Confession to make…

I added the images to a picture library on SharePoint and linked it to my custom list ‘Roster’… but GUESS WHAT… PowerApps didn’t load the images.

But when I linked the Image column to a public URL for the image, it worked. And yes… I did try the absolute URL, but no luck…

So, after updating my custom list, my RosterDetails app looks like:

ScreensNewBrowseView.png

Time to update the Detail screen

ScreensDetailsView.png

You know what, I really like this view, but I don’t want to show the Compliance Asset Id field, and the Overtime field seems to have far too many trailing zeros, so we need to fix it.

  • We select the DetailForm on the detail screen
  • Click Layouts on the right -> uncheck the checkbox next to the Compliance Asset Id field.
  • The field is no longer on the view.

ScreenDetailFormView.png

  • On the same screen press ‘…’ next to Overtime field -> click Advanced options -> click Unlock to change properties

ScreenDetailFormUnlockPropertiesView.png

  • Unlocking the card makes it a ‘custom card’; doing this makes all the properties of the field editable
  • Click the label showing the Overtime value, then in the right-hand panel click the Text property and update the formula next to it from

Parent.Default ——–> Value(Parent.Default)

So, in order to get rid of the trailing ‘0’ decimals, we used the Value function, which converts a string to a number.

ScreenDetailFormFormula.png

Point to note:

  • I tried using the formula Text(Parent.Default, "#.00"), but it doesn’t do anything; however, if I use the formula Text(2.0000000, "#.00") it gives 2.00.
  • '[$-en-US]' is the language placeholder. It can appear anywhere in the custom format, but only once. If we do not provide a language placeholder when writing a formula, the format string is ambiguous from a global standpoint, so the authoring tool automatically inserts the tag for your current language. [$-en-US] is assumed if this placeholder is not present when the app is run. NOTE: in a future version, the syntax of this placeholder may change to avoid confusion with a similar, but different, placeholder supported by Excel.

Time to make changes to Edit Screen

ScreenEditScreen.png

  • Skills can have more than one value, so let’s change Skills to a multi-line input field.

To do this, we just select Edit form -> click Layouts -> click ‘…’ next to Skills -> select Edit multi-line text.

ScreenEditScreenMultiline.png

  • Overtime needs to be the same as on the detail screen

We will follow exactly the same steps as we did for the Detail screen.

  • Remove Compliance Asset Id

We will follow exactly the same steps as we did for the Detail screen.

And, yes we are done!!!

 

Conclusion

We created a PowerApp from SharePoint in no time, updated all three screens with minimal effort and got ourselves a fine-looking app which displays all the records and allows adding and editing of items in SharePoint.


Exchange Online & Splunk – Automating the solution

NOTES FROM THE FIELD:

I have recently been consulting on what I think is a pretty cool engagement: integrating some Office 365 mailbox data into the Splunk reporting platform.

I initially thought about using a .csv export methodology; however, through trial and error (more error than trial, if I’m being honest), and realising that this method still required some manual interaction, I decided to find a fully automated solution.

The final solution comprises the below components:

  • Splunk HTTP event collector
    • Splunk hostname
    • Token from HTTP event collector config page
  • Azure automation account
    • Azure Run As Account
    • Azure Runbook
    • Exchange Online credentials (registered to the Azure automation account)

I’m not going to run through the creation of the automation account or the required credentials, as these had already been created; however, there is a great guide to configuring the solution I have used for this customer at https://www.splunk.com/blog/2017/10/05/splunking-microsoft-cloud-data-part-3.html
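Before wiring everything into the runbook it can be worth confirming that the event collector endpoint and token accept data; a quick test from any PowerShell session might look like the following (hostname, port and token are placeholders for your own values).

# Quick connectivity test against the Splunk HTTP Event Collector (placeholder values)
$uri    = "https://splunk.contoso.local:8088/services/collector/raw"
$header = @{ Authorization = "Splunk 00000000-0000-0000-0000-000000000000" }
$body   = '{ "Message": "HEC connectivity test from PowerShell" }'
Invoke-RestMethod -Method Post -Uri $uri -Headers $header -Body $body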

What the PowerShell script we are using will achieve is the following:

  • Connect to Azure and Exchange Online – Azure run as account authentication
  • Configure variables for connection to Splunk HTTP event collector
  • Collect mailbox data from the Exchange Online environment
  • Split the mailbox data into parts for faster processing
  • Specify SSL/TLS protocol settings for self-signed cert in test environment
  • Create a JSON object to be posted to the Splunk environment
  • HTTP POST the data directly to Splunk

The Code:

#Clear Existing PS Sessions
Get-PSSession | Remove-PSSession | Out-Null

#Create Split Function for the mailbox collection
function Split-Array {
    param($inArray, [int]$parts, [int]$size)
    if ($parts) {
        $PartSize = [Math]::Ceiling($inArray.Count / $parts)
    }
    if ($size) {
        $PartSize = $size
        $parts = [Math]::Ceiling($inArray.Count / $size)
    }
    $outArray = New-Object 'System.Collections.Generic.List[psobject]'
    for ($i = 1; $i -le $parts; $i++) {
        $start = (($i - 1) * $PartSize)
        $end = (($i) * $PartSize) - 1
        if ($end -ge $inArray.Count) { $end = $inArray.Count - 1 }
        $outArray.Add(@($inArray[$start..$end]))
    }
    return , $outArray
}

function Connect-ExchangeOnline {
    param(
        $Creds
    )
    #Connect to Exchange Online
    $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection
    $Commands = @("Add-MailboxPermission", "Add-RecipientPermission", "Remove-RecipientPermission", "Remove-MailboxPermission", "Get-MailboxPermission", "Get-User", "Get-DistributionGroupMember", "Get-DistributionGroup", "Get-Mailbox")
    Import-PSSession -Session $Session -DisableNameChecking:$true -AllowClobber:$true -CommandName $Commands | Out-Null
}

#Create Variables
$SplunkHost = "Your Splunk hostname or IP Address"
$SplunkEventCollectorPort = "8088"
$SplunkEventCollectorToken = "Splunk Token from HTTP Event Collector"
$servicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
$credentials = Get-AutomationPSCredential -Name 'Exchange Online'

#Connect to Azure
Add-AzureRMAccount -ServicePrincipal -Tenant $servicePrincipalConnection.TenantID -ApplicationId $servicePrincipalConnection.ApplicationID -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

#Connect to Exchange Online
Connect-ExchangeOnline -Creds $credentials

#Collect the mailbox data
$mailboxes = Get-Mailbox -ResultSize Unlimited | Select-Object -Property DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Get Current Date & Time
$time = Get-Date -Format s

#Convert Timezone to Australia/Brisbane
$bnetime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($time, [System.TimeZoneInfo]::Local.Id, 'E. Australia Standard Time')

#Add a Time Column to the Output
$mailboxes = $mailboxes | Select-Object @{expression = {$bnetime}; Name = 'Time'}, DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Split the Mailbox Collection into Parts for Faster Processing
$recipients = Split-Array -inArray $mailboxes -parts 5

#Create JSON objects and HTTP POST them to the Splunk HTTP Event Collector
foreach ($recipient in $recipients) {
    foreach ($r in $recipient) {
        #Create SSL Validation Bypass for Self-Signed Certificate in Testing
        $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
        [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
        [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
        #Build the JSON string to post to Splunk
        $StringToPost = "{ `"Time`": `"$($r.Time)`", `"DisplayName`": `"$($r.DisplayName)`", `"PrimarySMTPAddress`": `"$($r.PrimarySmtpAddress)`", `"IsMailboxEnabled`": `"$($r.IsMailboxEnabled)`", `"ForwardingSmtpAddress`": `"$($r.ForwardingSmtpAddress)`", `"GrantSendOnBehalfTo`": `"$($r.GrantSendOnBehalfTo)`", `"ProhibitSendReceiveQuota`": `"$($r.ProhibitSendReceiveQuota)`", `"AddressBookPolicy`": `"$($r.AddressBookPolicy)`" }"
        $uri = "https://" + $SplunkHost + ":" + $SplunkEventCollectorPort + "/services/collector/raw"
        $header = @{"Authorization" = "Splunk " + $SplunkEventCollectorToken}
        #Post to the Splunk HTTP Event Collector
        Invoke-RestMethod -Method Post -Uri $uri -Body $StringToPost -Headers $header
    }
}
Get-PSSession | Remove-PSSession | Out-Null

 

The final output that can be seen in Splunk looks like the following:

11/13/17
12:28:22.000 PM
{
AddressBookPolicy:
DisplayName: Shane Fisher
ForwardingSmtpAddress:
GrantSendOnBehalfTo:
IsMailboxEnabled: True
PrimarySMTPAddress: shane.fisher@xxxxxxxx.com.au
ProhibitSendReceiveQuota: 50 GB (53,687,091,200 bytes)
Time: 11/13/2017 12:28:22
}

I hope this helps some of you out there.

Cheers,

Shane.

 

 

 

Connect SharePoint Online and SQL Server On-Premises with BCS/SharePoint Apps using Hybrid Connection and WCF Services

SharePoint Online cannot directly connect to on-premises data sources such as SQL Server. A recommended approach is to use a hybrid deployment with SharePoint 2013/2016, but that adds the overhead of infrastructure and maintenance costs. To overcome this, in this blog I am going to describe how to use Azure PaaS workloads to connect to on-premises data sources using BCS.

Using an Azure Hybrid Connection (refer to this post) and BCS with an Azure Web App hosting a WCF endpoint, we can now expose on-premises SQL data to SharePoint Online and the cloud via external content types (ECTs) or SharePoint-hosted apps.

Below are two approaches by which BCS can connect these data sources to SharePoint.
1. Azure Web App hosting WCF Service and External Lists
2. Azure Web App hosting WCF Data Service and Hosted Apps

Azure WCF Service Web App and External Lists
SPOAzureBCSHybrid
Pros: The advantage of using this approach is the reusability of External Content Types (ECT). ECTs can be used across multiple lists and sites in the same site collection. ECTs can also be used for complex associations across multiple types of data.

Cons: Some shortcomings of this approach are:
– Dependency on pass-through authentication for users, and/or the need to implement custom authentication against the WCF service (for example, passing SQL authentication)
– Added development effort because of WCF build and hosting

High-Level Steps:
1. Create a WCF Solution using Visual Studio
2. Use ADO.Net and WCF Service calls to fetch data using web methods. Implement at least two web methods – one to return all items and one to return a specific item
3. Update Web.Config of the WCF service with required configuration for data calls
4. Create an Azure Web App
5. Publish the WCF Service to Azure Web App and get the single wsdl signature from the WCF service
6. Create an External Content Type using SharePoint Designer using the WSDL signature
7. Add GetItems and GetItem finder to ECT
8. Create an External List from ECT

Azure Web App hosting WCF Data Service and Hosted Apps
SPOAzureAppsHybrid
Pros: The advantage of using a WCF Data Service is that the OData endpoint maps directly to the schema of the SQL table, which makes it easy to build and maintain. Additionally, using SharePoint-hosted apps isolates the CRUD operations from the Host Web, decreasing the overhead of external content types and external lists.

Cons: The disadvantage of using this approach is that the data is scoped within the app and cannot be exposed to Host Web components, limiting interaction to the App Web only. There is a customization requirement to expose and operate on this data in the App Web.

High-Level Steps:
1. Create a WCF service project using Visual Studio
2. Install the EntityFramework Nuget package
3. Add a WCF data service file and implement EntityFrameworkDataService instead of DataService
4. Override the “InitializeService” method to set the required entity set access rules
5. Add an ADO.Net Entity Data Model project and configure it to fetch data from SQL Tables you want
6. Update Web.config with required configuration for data calls
7. Create an Azure Web App and enable SSL on it
8. Publish the WCF Service to Azure Web App
9. Next create a new SharePoint hosted app solution in Visual Studio
10. In the SharePoint hosted app solution, add an External Content type and select the Azure Web Application hosting the WCF data service as source
11. After the External Content type is created, then create an External List using ECT created above
12. The external list is now added to the Hosted app which can then be referenced in the app default page and app part


In this blog, we have seen two choices for hosting BCS connectivity services via Azure PaaS workloads, the advantages and disadvantages of each, and broad-level steps to configure them.

Resolving unable to access App published with Barracuda WAF over Azure Express Route

Recently, one of our customers reported that they couldn’t access any of the UAT apps from their Melbourne office, although it worked fine from other offices. When they tried to access the UAT app domains, they were getting the error below: “The request service is temporarily unavailable. It is either overloaded or under maintenance. Please try later.”

WAF error

Due to the UAT environment IP restrictions on the WAF, it is normal behaviour for me to get the error message, because our Kloud office’s public IPs are not in the WAFs’ whitelist. This error confirmed the web traffic did hit the WAFs. Pinging the URL hostname returned the correct IP without any DNS problems, which means the web traffic did go to the correct WAF farm, considering the customer has a couple of other WAF farms in other countries. So we could now focus on the AU WAFs for the troubleshooting.

I pulled out all the WAF access logs and planned to go through them to verify whether the web traffic was hitting the AU WAFs or going somewhere else. I did a log search based on the public IPs provided by the customer; no results were returned for the last 7 days.
Search Result 1

Interesting. Did it mean no traffic from the Melbourne office came in? I did another search based on my public IPs, and it clearly returned a couple of access logs related to my testing: correct time, correct client IP, correct WAF domain hostname, method GET, status 503, which is correct because my office IP is restricted.

Search Result 2

Since the customer mentioned all other offices had no problem accessing the UAT app environment, I asked them to provide one public IP from another office. We tested again and verified that people in the Indian office could successfully open the web app, and I could see their web traffic appear in the WAF logs as well. I believed that when Melbourne staff tried to browse the web app, the traffic should go to the same WAF farm, because the DNS hostname resolved to the same IP whether in Melbourne or in India.

The question is what exactly happened and what was the root cause? :/

In order to capture another good example, I noted down the time and asked the customer to browse the website again. This time I did an access log search based on the time instead of Melbourne public IPs. I got a couple of results returned with some unknown IPs.

Search result 3

I googled the unknown IPs, and it turned out they are Microsoft Australian data centre IPs. Now I suspected there was a routing or NAT issue in the customer’s network. I contacted the customer with the unknown IPs; they did a bit of investigation and advised that those IPs are the public IPs of their Azure Express Route interfaces. It made sense now: because the customer hadn’t whitelisted their new Azure public IPs, when web traffic came from those unknown source IPs (the Azure public IPs), the WAF didn’t recognise them and blocked them, just like me. Once I added the new Azure IPs into the app’s whitelist, all the access issues were resolved.

MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, and it also contains all data changes, not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration“, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (Check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great, doesn’t require administrative access to install/use, doesn’t impact other installation of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
New-Item -ItemType Directory -Path C:\MIMBackup\MIMConfig
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config user.name "Service Account"
git config user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in Git GUI.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script
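The complete script lives in the gist cloned during the pre-requisites. To give an idea of its shape, here is a minimal sketch of the MIM Service export and commit steps, using the module locations set up above; the export file names are illustrative, and the Synchronisation Service export (via the Lithnet MIIS Automation module) is omitted for brevity.

$backupRoot = 'C:\MIMBackup'
$repoPath   = Join-Path -Path $backupRoot -ChildPath 'MIMConfig'
$git        = Join-Path -Path $backupRoot -ChildPath 'PortableGit\cmd\git.exe'

# Make the locally saved modules available to this session
$env:PSModulePath = "$backupRoot\Modules;$env:PSModulePath"
Import-Module -Name FIMAutomation, LithnetRMA, LithnetMIISAutomation

# Export the MIM Service policy, portal and schema configuration to XML files
Export-FIMConfig -PolicyConfig -PortalConfig -Uri 'http://localhost:5725' |
    ConvertFrom-FIMResource -File (Join-Path -Path $repoPath -ChildPath 'PolicyConfig.xml')
Export-FIMConfig -SchemaConfig -Uri 'http://localhost:5725' |
    ConvertFrom-FIMResource -File (Join-Path -Path $repoPath -ChildPath 'SchemaConfig.xml')

# Commit whatever changed since the last run
Set-Location -Path $repoPath
& $git add --all
& $git commit --message "MIM config backup $(Get-Date -Format s)"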

Building my first PowerApp, a basic roster, pulling data from SharePoint Online

What is PowerApps?

PowerApps is a set of services and apps that enable power users to build line-of-business applications rapidly. It can connect to the cloud services and data sources we may already be using.

Why PowerApps?

PowerApps gives power users the ability to quickly build apps that suit specific needs. They can share apps instantly with their team across the web, tablets, and mobile devices. A few of the advantages of PowerApps are:

  • Simple and fast – capable of producing an app in minutes that pulls in data from Excel or a cloud service.

  • Can be integrated with Microsoft Flow, making it possible to trigger workflows from within apps.

  • Robust and enterprise-grade, so can be used for complex requirements.

What you need to get started?

We can choose from two options:

  • PowerApps Studio for web

  1. Go to the url: https://web.powerapps.com

  2. You need to use your tenant account which would be something like xxx@yyy.onmicrosoft.com

  3. Choose from any of the start options
  • PowerApps Studio for Windows

  1. Go to the url: https://powerapps.microsoft.com/en-us/downloads/

  2. Download the App for windows.

  3. You need to login using your tenant account which would be something like xxx@yyy.onmicrosoft.com

We would need to use web.powerapps.com to configure and manage data connections and on-premises gateways, and to work with the Common Data Service.

Few points to note:

After we create an app, we can administer it in the admin center.

We run apps in a browser from Microsoft Dynamics 365 or by using PowerApps Mobile, which is available for Windows, iOS, and Android devices.

PowerApps components:

  • web.powerapps.com – manage and share the apps we build
  • PowerApps Studio – build powerful apps with easy to use visual tools
  • PowerApps Mobile – run apps on Windows, iOS, and Android devices
  • PowerApps admin center – administer PowerApps environments and other components

PowerApps Studio

PowerApps Studio has three panels and a ribbon, which gives a view similar to creating a PowerPoint presentation:

  1. Left navigation bar: shows thumbnail
  2. Middle pane: shows the screen that you’re working on
  3. Right-hand pane: shows options such as layout and data sources
  4. Property drop-down list: where you select the properties that formulas apply to
  5. Formula bar: where you add formulas
  6. Ribbon: where you add controls and customize design elements

PowerAppStudio.png

PowerApps Mobile

PowerApps Mobile for Windows, iOS, and Android provides an environment where instead of going to separate app stores, we stay in PowerApps and have access to all the apps that we have created and that others have shared with us.

Admin center

The PowerApps admin center is the centralized place to administer PowerApps for an organization. This is where we define different environments, data connections, and other elements. The admin center is also where we create Common Data Service databases, and manage permissions and data policies.

Lets create our first App

OK, now it’s time to create our first PowerApp… so let’s fire up PowerApps desktop studio. For this demo, we will use SharePoint as the data source and the phone layout.

  • Click New -> SharePoint -> Phone layout

PowerAppStudioDesktopStart.png

Connect to a data source

On the next screen we need to specify the connection for SharePoint: enter the SharePoint URL and click Go.

PowerAppSourceConnection.png

On the next screen, we can select a list on the specified SharePoint site; in our example we choose the list “Roster” and click Connect, after which PowerApps starts generating the app.

powerappsourceurl2.png

The generated app is always based on a single list, and we can add more data to the app later. An app with default screens is built for us, which we can see in action by clicking the play icon (Start app preview).

powerappsharepointbaseapp1.png

Our app in action

MyFirstPowerApp.gif

Our three screen roster app opens in PowerApps Studio. All apps generated from data have the same set of screens:

  • The browse screen: On this, we can browse, sort, filter, and refresh the data pulled in from the list, as well as add items by clicking the (+) icon.
  • The details screen: On this, we view detail about an item, and can choose to delete or edit the item.
  • The edit/create screen: On this, we edit an existing item or create a new one.

Conclusion

So, in this post, we explored PowerApps and the development tools available, like PowerApps desktop studio and PowerApps web studio. Without writing a single line of code, we created our first basic three-screen PowerApp and were able to perform CRUD (create, read, update, delete) operations on our SharePoint data source.

In the next post we will try creating a PowerApp from a SharePoint list.

Resolving “User not found” issue while assigning permissions using SharePoint CSOM

I was recently working on a SharePoint Online project where we were trying to automate library creation and provide required permissions on those libraries. We had an issue while modifying permissions with CSOM code on SharePoint libraries when the Created By user had left the company.

In this post I will outline the cause and the resolution as there was no online reference for resolving this error.

Issue: The CSOM code was throwing a “User not found” error even when creating a User object via the web.EnsureUser() method.

Cause: The User object returned by the web.EnsureUser() method was empty but not null, and hence failed when being added to role assignments after breaking permission inheritance.

Resolution: The resolution to this issue was to explicitly load the user object, catch the exception thrown while loading, and set a flag to false which can later be checked to prevent the add method from erroring out. Yes, this is a roundabout way of overcoming the issue, but it works. Hopefully it will save you some hours.

Below is the code that could be used to do that.
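The original snippet isn’t reproduced here, but here is a minimal sketch of the approach using the SharePoint Online CSOM assemblies from PowerShell. The DLL paths, login name, list title and the Contributor role are placeholders for illustration, and $ctx is assumed to be an already-authenticated ClientContext for the target web.

Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll'
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll'

$userFound = $true
$user = $ctx.Web.EnsureUser('i:0#.f|membership|departed.user@contoso.com')   # placeholder login
$ctx.Load($user)
try {
    # The "User not found" exception only surfaces when the batch executes
    $ctx.ExecuteQuery()
}
catch {
    $userFound = $false
    Write-Warning "User could not be resolved: $($_.Exception.Message)"
}

if ($userFound) {
    # Only attempt to grant permissions when the user resolved successfully
    $roleDef  = $ctx.Web.RoleDefinitions.GetByType([Microsoft.SharePoint.Client.RoleType]::Contributor)
    $bindings = New-Object Microsoft.SharePoint.Client.RoleDefinitionBindingCollection($ctx)
    $bindings.Add($roleDef)

    $list = $ctx.Web.Lists.GetByTitle('Documents')   # placeholder library title
    $list.BreakRoleInheritance($true, $false)        # break inheritance, keep existing assignments
    $list.RoleAssignments.Add($user, $bindings) | Out-Null
    $ctx.ExecuteQuery()
}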