Options to consider for SharePoint Framework solutions deployment

There are various options to package and deploy a SharePoint Framework solution, and as part of the packaging and deployment process developers have to identify the best approach for their team. Planning the right approach for your solution can become a nightmare if you haven't weighed the options properly.

Having worked on multiple SPFx solution implementations for some time now, I have formed a view of the various options and the approach for each. Hence in this blog, we will walk through these options and look at the merits and limitations of each.

At a high level, below are the main components that are deployed as part of SPFx package deployment:

  1. The minified JavaScript bundle for all code
  2. The web part manifest file
  3. The compiled web part file
  4. The package (.sppkg) file with all package information

Deployment Options

Please check this blog for an overview of the steps for building and packaging an SPFx solution. The packaged solution (.sppkg) file can then be deployed to an App Catalog site. The assets of the package (items 1-3 above) can be deployed using any of the four options below. We will look at the merits and limitations of each.

1. Deploy to Azure CDN

The assets can be deployed to an Azure CDN. The deployment script is already part of the SPFx solution and can be run from within it (via the gulp deploy-azure-storage task). More detailed steps for setting this up are here.

Note: Please remember to enable CORS on the storage account before deploying the package.

If CORS is not enabled before the CDN profile is used, you might have to delete and recreate the storage account.

Merits:

  • Easy deployment using gulp
  • Faster access to assets and web part resources because of CDN hosting
  • Add geographical restrictions (if needed)

Limitations:

  • Dependency on Azure Subscription
  • Proper set up steps are required for the Azure CDN. In some cases, if the CDN is not set up properly, the deployment has to be done again.
  • Since the assets are deployed to a CDN endpoint, this may not be recommended if the assets need restricted access

2. Deploy to Office 365 Public CDN

For this option, you will need to enable and set up the Office 365 CDN in your tenancy before deployment. For more details on setting this up, check the link here.

Merits:

  • Faster access to assets and web part resources because of CDN hosting
  • No Azure subscription requirement
  • Content is accessible from SharePoint Interface

Limitations:

  • Manual copy of asset files to the CDN-enabled SharePoint library
  • Office 365 CDN is a tenant setting and has to be enabled for the whole tenancy
  • Since the assets are deployed to a CDN endpoint, this may not be recommended if the assets need restricted access
  • Accidental deletion could cause issues

3. Deploy to SharePoint document library

Another option is to copy the compiled assets to a SharePoint document library anywhere in the tenancy. Setting this up is quite simple: first set "includeClientSideAssets": false in the package-solution.json file, and then set the CDN details in the write-manifests.json file.
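As a minimal illustration of those two settings (the CDN path is a placeholder for your own library URL):

package-solution.json (fragment):

{
  "solution": {
    "includeClientSideAssets": false
  }
}

write-manifests.json:

{
  "cdnBasePath": "https://yourtenant.sharepoint.com/sites/cdn/SPFxAssets"
}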

Merits:

  • No need for additional Azure hosting or enabling the Office 365 CDN
  • Access to asset files from the SharePoint interface

Limitations:

  • Manual copy of asset files to the SharePoint library
  • Accidental deletion could cause issues

4. Deploy to ClientAssets in App Catalog

From SPFx version 1.4, it is possible to include the assets as part of the package file and deploy them to the hidden ClientAssets library residing in the App Catalog. This is set in the package-solution.json file with "includeClientSideAssets": true.

Merits:

  • No extra steps needed to package and deploy assets

Limitations:

  • Increases the payload of the package file
  • Risk for tenant admins of deploying script files to the tenant App Catalog.

Conclusion

In this blog, we saw the various options for SPFx deployment, and the merits and limitations of each approach.

Happy Coding !!!

 

Set up Accounts and secure passwords to run automation workloads in Azure Functions

In some of my previous blogs here, we have seen how we can use Azure Functions to automate processes and SharePoint workloads.

Most of these jobs run using elevated or stored privileged accounts, as the Azure Function runs in a different context than the user context. There are various ways we could set up these accounts. Some of these approaches are below:

  1. Azure AD Service Accounts
    • Suitable for all operations
    • Need access to resource
    • Reusable across multiple workloads
  2. Azure AD Apps
    • Suitable for Graph Access
    • Need exact permissions set up
    • Might need Tenant Admin authentication
  3. SharePoint App Accounts
    • Suitable for SharePoint workloads.
    • Need Site and App specific privileges

The details of these accounts can be stored in the Azure Function App Settings (for dev and production) or in the local.settings.json file during local development.
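As a small illustration, in a PowerShell-based Azure Function the app settings surface as environment variables, so a stored account name can be read like this (the setting name is a hypothetical example):

# App setting "ServiceAccountUserName" defined in the Function App (or local.settings.json locally)
$serviceAccountUser = $env:ServiceAccountUserName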

The most important consideration is to prevent the password details in the Azure Function from being exposed in case of unauthorized access. There are two ways we could achieve this:

1. Encrypting the password and store in the Azure Function (PowerShell)
2. Using Azure Key Vault to store and access password details (C#)

Encrypting Passwords in Azure Functions

To do this, first let's create an encrypted password using the PowerShell script below.
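A minimal sketch of that script (file paths are placeholders). It generates a random AES key so the encrypted password can later be decrypted on a different machine, such as the Azure Function host:

# Generate a 256-bit AES key and save it for later decryption
$key = New-Object Byte[] 32
$rng = New-Object System.Security.Cryptography.RNGCryptoServiceProvider
$rng.GetBytes($key)
$key | Out-File "C:\Temp\passphrase.key"

# Encrypt the password with the key and save it to a file
$securePassword = Read-Host "Enter the service account password" -AsSecureString
$securePassword | ConvertFrom-SecureString -Key $key | Out-File "C:\Temp\password.txt"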

Next, copy the files to a bin folder in the Azure Function using Azure File Explorer (Application Settings -> App Service Editor) and decrypt them using the code below.
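A matching decryption sketch for use inside the function (file names follow the encryption example above; the bin path is illustrative):

# Read the AES key and the encrypted password that were copied to the function's bin folder
$key = [byte[]](Get-Content "D:\home\site\wwwroot\MyFunction\bin\passphrase.key")
$encryptedPassword = Get-Content "D:\home\site\wwwroot\MyFunction\bin\password.txt"

# Rebuild the secure string and a credential object for the service account
$securePassword = $encryptedPassword | ConvertTo-SecureString -Key $key
$credential = New-Object System.Management.Automation.PSCredential($env:ServiceAccountUserName, $securePassword)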

Using Azure Key Vault

For using Azure Key Vault, the steps are as below

1. Create an Azure AD App and get the Client ID and Client Secret

2. Create an Azure Key Vault and give the above Azure AD app Get access to the Key Vault. The below permissions will suffice to read the secret.
Azure Key Vault Secret Permissions

3. Create a Secret in the Key Vault, store the password in it, and note the Secret URI

4. Store the Secret URI, Client ID and Client Secret in the Azure Function App Settings

5. Use the below code to retrieve the secure password.
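The post retrieves the secret in C#; purely as an equivalent illustration in PowerShell (the app setting names, including the tenant ID, follow the steps above and are assumptions), the secret can also be read through the Key Vault REST API using the AD app's client credentials:

# Get an access token for Key Vault using the Azure AD app (client credentials grant)
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $env:ClientId
    client_secret = $env:ClientSecret
    resource      = "https://vault.azure.net"
}
$token = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$($env:TenantId)/oauth2/token" -Body $tokenBody

# Call the Secret URI stored in App Settings to retrieve the secret value
$secret = Invoke-RestMethod -Method Get -Uri "$($env:SecretUri)?api-version=7.0" -Headers @{ Authorization = "Bearer $($token.access_token)" }
$securePassword = ConvertTo-SecureString $secret.value -AsPlainText -Force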

Conclusion

Hence, above we saw how we can set up accounts in an Azure Function for elevated access to SharePoint and other resources.

Deploy and Add SPFx webparts to Modern Pages using OfficeDevPnP CSOM library

In the previous blog here, we looked at how to install apps on a SharePoint site. With SharePoint and the Office Dev PnP CSOM library, we can also add web parts to Modern Pages, both out-of-the-box (OOB) web parts and custom web parts. For out-of-the-box web parts, refer to Chris O'Brien's article here, where he has provided the steps and also the web part IDs for the OOB web parts, which is really helpful.

In this blog, we will look at the steps to add a custom web part and set its properties.

For the below code, we will use the OfficeDevPnP CSOM library. The high-level steps for the implementation are below:

1. Find the page where the web part is to be added. For creating a modern page programmatically, refer to the link here.

2. Check if the web part is already added to the page; if not, add it using the web part ID. We will read the web part ID from a JSON file where we have kept the details of the custom web part.

3. After the web part is added, set the web part properties using the JSON schema of the properties, with Newtonsoft.Json.

Note: If the web part app has just been installed, it might take some time for
the web part to be ready for use.
Hence, put a wait state in place until the web part is loaded and available.

The below code has the steps for adding the web part programmatically using CSOM
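The post's snippet uses CSOM; as a rough equivalent for illustration only (site URL, page name, component name and properties are placeholders), the same steps can be performed with PnP PowerShell:

# Connect to the site that hosts the modern page
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/targetsite" -Credentials (Get-Credential)

# Add the custom client-side web part to the page and set its properties
Add-PnPClientSideWebPart -Page "Home" -Component "MyCustomWebPart" `
    -WebPartProperties @{ title = "My Custom Web Part"; listName = "Documents" }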

Conclusion

The above process can be used to add a web part programmatically onto a page and set its properties.

Happy Coding !!

Programmatically deploy and add SharePoint Framework Extensions using SharePoint CSOM and PowerShell

In the previous blog here, we looked at how to deploy and install SharePoint Apps. Now let's look at installing SharePoint Framework extensions – in particular, ListView Command Sets – programmatically.

SharePoint CSOM

SharePoint Framework has three types of extensions that can be created – Application Customizers, ListView Command Sets and Field Customizers. In this blog, we will look at adding ListView Command Sets programmatically.

ListView Command Set extensions are actually custom actions installed on a library or list. Hence, to activate one we go to the library/list, look for the installed custom actions, and if it is not installed we add the new custom action. Below is the code for that.

PowerShell

We could also use PnP PowerShell to add the library extension onto a library or list using the code below.
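A minimal PnP PowerShell sketch (URL, names and the component ID are placeholders; the ClientSideComponentId must match the ID in the extension's manifest):

# Connect to the site containing the target library
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/targetsite" -Credentials (Get-Credential)

# Register the ListView Command Set as a custom action on document libraries in the web
Add-PnPCustomAction -Name "MyCommandSet" -Title "My Command Set" `
    -Location "ClientSideExtension.ListViewCommandSet.CommandBar" `
    -ClientSideComponentId "11111111-2222-3333-4444-555555555555" `
    -ClientSideComponentProperties '{"sampleTextOne":"Hello"}' `
    -RegistrationId "101" -RegistrationType List -Scope Web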

Hence, above we saw how we can add extensions onto a library or list using CSOM or PowerShell.

Happy Coding!!

Deploy and Install SharePoint Apps using SharePoint CSOM and PnP PowerShell

In this blog, we will look at the steps to install and deploy SharePoint apps to Modern Sites using the SharePoint ALM CSOM APIs and PnP PowerShell. Using the below steps, it is possible to programmatically deploy and install custom SharePoint Framework apps using an Azure Function or a local PowerShell script.

Installing SharePoint Apps

SharePoint Apps can be deployed using the ALM (Application Lifecycle Management) APIs. After the app is added to the App Catalog, we can install it on a SharePoint site.

SharePoint CSOM

The steps are simple. The below snippet has the code for deploying and installing apps.

1. Get the App ID
2. Create an AppManager object
3. Deploy the app
4. After deploying the app, install it
5. Use try-catch to handle the case where the installation has already been done

PnP PowerShell

First, lets’ get a list of apps in the App catalog.

Note: There are two values supported by the Scope parameter for apps – Tenant and Site.

Get-PnPApp -Scope Tenant
or
Get-PnPApp -Scope Site

If the app is not already in the App Catalog, then we will add it

Add-PnPApp -Path "<.sppkg file location>" -Scope Site

Then, publish the App

Publish-PnPApp -Identity  -Scope Tenant -SkipFeatureDeployment

-SkipFeatureDeployment is helpful for deploying apps across the SharePoint tenancy if the app is developed for tenant-wide deployment

After the above, we will install the App

Install-PnPApp -Identity  -Scope Site

After the app is installed, it is ready to be added or used at the site.

In the upcoming blog, we will see how to add SharePoint Framework extensions and web parts programmatically.

Happy Coding!!

Office 365 URLs and IP address updates for firewall and proxy configuration, using Flow and Azure Automation

tl;dr

To use Microsoft Office 365, an organisation must allow traffic to [and sometimes from] the respective cloud services via the internet on specific ports and protocols to various URLs and/or IP addresses, or if you meet the requirements via Azure ExpressRoute. Oh duh?!
To further expand on that, connections to trusted networks (a category we assume Office 365 falls into) that are also high in volume (since most communication and collaboration infrastructure resides there) should be via a low-latency egress that is as close to the end user as possible.
As more and more customers use the service, and as more and more services and functionality are added, so too will the URLs and IP addresses need to change over time. Firewalls and proxies need to be kept up to date with the destination details of Office 365 services. This is an evergreen solution, let's not forget. So, it's important to put processes in place to correctly optimise connectivity to Office 365. It's also very important to note that if these change management processes are ignored, the result will be blocked services or inconsistent experiences for end users.

Change is afoot

Come October 2nd 2018, Microsoft will change the way customers keep up to date with changes to these URLs and IP addresses. A new web service is coming online that publishes Office 365 endpoints, making it easier for you to evaluate, configure, and stay up to date with changes.

Furthermore, the holistic overview of these URLs and IP addresses is being broken down into three new key categories: OPTIMISE, ALLOW and DEFAULT.

You can get more details on these 3x categories from the following blog post on TechNet: https://blogs.technet.microsoft.com/onthewire/2018/04/06/new-office-365-url-categories-to-help-you-optimize-the-traffic-which-really-matters/
 
It's not all doom and gloom now that your RSS feed no longer works. The new web service (still in public preview at the time of writing this blog) is rather zippy and allows for some great automation. So, that's the target state: automation.
Microsoft wants to make it nice and easy for firewall, proxy or other edge security appliance vendors and service providers to programmatically interact with the web service and offer dynamic updates for Office 365 URL and IP address information. In practice, change management and governance processes will evidently still be followed. In most circumstances, organisations follow whatever ITIL or ITIL-like methodologies they have in place for those sorts of things.
The dream Microsoft has, though, is actually one that is quite compelling.
Before we get to this streamlined utopia where my customers' edge devices update automatically, I've needed to come up with a process for the interim tactical state. This process runs as follows:

  • Check daily for changes in Office 365 URLs and IP addresses
  • Download changes in a user readable format (So, ideally, no XML or JSON. Perhaps CSV for easy data manipulation or even ingestion into production systems)
  • Email intended parties that there has been a change in the global version number of the current Office 365 URLs and IP addresses
  • Allow intended parties to download the output

NOTE – for my use case here, the output details are purely IP addresses. That's because the infrastructure the teams I'll be sending this information to work with only allows for that data type. If you were to tweak the web service request (details further down), you can grab both URLs and IP addresses, or one or the other.

 

Leveraging Microsoft Flow and Azure Automation

My first instinct here was to use Azure Automation and run a very long PowerShell script with If's and Then's and so on. However, when going through the script, 1) my PowerShell skills are not at the level needed to bang this out, and 2) Flow is an amazing tool for running through some of the tricky bits in a more effortless way.
So, leveraging the goodness of Flow, here’s a high level rundown of what the solution looks like:

 
The workflow runs as follows:

  1. Microsoft Flow
  2. On a daily schedule, the flow is triggered at 6am
  3. Runbook #1
    1. Runbook is initiated
    2. Runbook imports CSV from Azure Blob
    3. PowerShell runs a command to query the web service and saves the output to CSV
    4. CSV is copied to Azure Blob
  4. Runbook #2 imports a CSV
    1. Runbook is initiated
    2. Runbook imports CSV from Azure Blob
    3. The last cell in the version column is compared to the previous one
    4. An output, "NEW-VERSION-FOUND", is saved to the Azure Automation job output if a newer version is found
  5. The output is taken from the previous Azure Automation Runbook run
  6. A Flow condition is triggered – YES if the output is found, NO if nothing is found

Output = YES

  • 7y1 = Runbook #3 is run
    • Runbook queries the web service for all 3 categories: optimise, allow and default
    • Each query for that day's IP address information is saved into 3 separate CSV files
  • 7y2 = CSV files are copied to Azure Blob
  • 7y3 = Microsoft Flow queries Azure Blob for the three files
  • 7y4 = An email template is used to email the respective interested parties about the change to the IP address information
    • The 3x files are added as attachments

Output = Nothing or NO

  • 7n1 = Send an email to the service account mailbox to say there were no changes to the IP address information for that day

 

The process

Assuming, dear reader, that you have some background with Azure and Flow, here's a detailed outline of the process I went through (and one that you can replicate) to automate checking for updates to the Office 365 URLs and IP address data and providing them to the relevant parties.
Let's begin!

Step 1 – Azure AD
  • I created a service account in Azure AD that has been given an Office 365 license for Exchange Online and Flow
  • The user details don’t really matter here as you can follow your own naming convention
  • My example username is as follows: svc-as-aa-01@[mytenant].onmicrosoft.com
    • Naming convention being: “Service account – Australia South East – Azure Automation – Sequence number”
Step 2 – Azure setup – Resource Group
  • I logged onto my Azure environment and created a new Resource Group
  • My solution has a couple of components (Azure Automation account and a Storage account), so I put them all in the same RG. Nice and easy
  • My Resource Group details
    • Name = [ASPRODSVCAA01RG]
    • Region = Australia South East as that’s the local Azure Automation region
    • That’s a basic naming convention of: “Australia South East – Production environment – Purpose, being for the SVC account and Azure Automation – Sequence number – Resource Group”
  • Once the group was created, I added my service account as a Contributor to the group
    • This allows the account downstream permissions to the Azure Automation and Storage accounts I’ll add to the resource group
Step 3 – Azure Setup – Storage account
  • I created a storage account and stored that in my resource group
  • The storage account details are as follows
    • Name = [asprodsvcaa01] = Again, follow your own naming convention
    • Deployment model = Resource manager
    • Storage General Purpose v2
    • Local redundant storage only
    • Standard performance
    • Hot access tier
  • Within the storage account, I’ve used Blob storage
    • There’s two containers that I used:
      • Container #1 = “daily”
      • Container #2 = “ipaddresses”
    • This is where the output CSV files will be stored
  • Again, we don’t need to assign any permissions as we assigned Contributor permissions to the resource group
Step 4 – Azure Setup – Azure Automation
  • I created a new Azure Automation account with the following parameters
    • Name = [SVCASPROD01AA] = Again, follow your own naming convention
    • Default parameters, matching my resource group settings
    • Yes, I also had a Run As account created (default option)
  • I created three Runbooks, as per below

 

  • Step1-GetGlobalVersion = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion
#region CONNECT
$pass = ConvertTo-SecureString "[pass phrase here]" -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList "[credential account]", $pass
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId "[tenant id]"
#endregion
#region IMPORT CSV FILE FROM BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey '[account key here]'
Get-AzureStorageBlob -Context $storageContext -Container "[name here]" | Get-AzureStorageBlobContent -Destination . -Context $storageContext -Force
#endregion
#region GET CURRENT VERSION
$DATE = $(((get-date).ToUniversalTime()).ToString("yyyy-MM-dd"))
Invoke-RestMethod -Uri https://endpoints.office.com/version/Worldwide?ClientRequestId=b10c5ed1-bad1-445f-b386-b919946339a7 | Select-Object @{Label="VERSION";Expression={($_.Latest)}},@{Label="DATE";Expression={($Date)}} | Export-Csv [daily-export.csv] -NoTypeInformation -Append
# SAVE TO BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey '[account key here]'
Set-AzureStorageBlobContent -File [.\daily-export.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
#endregion
#region OUTPUT
Write-Output "SCRIPT-COMPLETE"
#endregion

 

  • Step2-CheckGlobalVersion = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion
#region CONNECT
$pass = ConvertTo-SecureString "[pass phrase here]" -AsPlainText -Force
$cred = New-Object -TypeName pscredential -ArgumentList "[credential account]", $pass
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId "[tenant id]"
#endregion
#region IMPORT CSV FILE FROM BLOB
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey '[key here]'
Get-AzureStorageBlob -Context $storageContext -Container [name here] | Get-AzureStorageBlobContent -Destination . -Context $storageContext -Force
#endregion
#region CHECK IF THERE IS A DIFFERENCE IN THE VERSION
$ExportedCsv = import-csv [.\daily-export.csv]
$Last = $ExportedCsv | Select-Object -Last 1 -ExpandProperty Version # Last value in Version column
$SecondLast = $ExportedCsv | Select-Object -Last 1 -Skip 1 -ExpandProperty Version #Second last value in version column
If ($Last -gt $SecondLast) {
Write-Output '[NEW-VERSION-FOUND]'
}

 

  • Step3-GetURLsAndIPAddresses = Again, follow your own naming convention
  • This is a Powershell runbook
  • Here’s the example script I put together:
#region SETUP
Import-Module AzureRM.Profile
Import-Module AzureRM.Resources
Import-Module AzureRM.Storage
#endregion
#region EXECUTE PROCESS TO DOWNLOAD NEW VERSION
$endpoints = Invoke-RestMethod -Uri https://endpoints.office.com/endpoints/Worldwide?ClientRequestId=b10c5ed1-bad1-445f-b386-b919946339a7
$endpoints | Foreach {if ($_.category -in ('Optimize')) {$_.IPs}} | Sort-Object -unique | Out-File [.\OptimizeFile.csv]
$endpoints | Foreach {if ($_.category -in ('Allow')) {$_.IPs}} | Sort-Object -unique | Out-File [.\AllowFile.csv]
$endpoints | Foreach {if ($_.category -in ('Default')) {$_.IPs}} | Sort-Object -unique | Out-File [.\DefaultFile.csv]
$acctKey = (Get-AzureRmStorageAccountKey -Name [name here] -ResourceGroupName [name here]).Value[0]
$storageContext = New-AzureStorageContext -StorageAccountName "[name here]" -StorageAccountKey '[key here]'
Set-AzureStorageBlobContent -File [.\OptimizeFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
Set-AzureStorageBlobContent -File [.\AllowFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
Set-AzureStorageBlobContent -File [.\DefaultFile.csv] -Container "[name here]" -BlobType "Block" -Context $storageContext -Force
#endregion
#region OUTPUT
Write-Output "SCRIPT COMPLETE"
#endregion
  • Note that we don't need to import the complete set of AzureRM PowerShell modules
  • You'll find that if you do something "lazy" like that, there are a whole lot of dependencies in Azure Automation
    • You'll need to manually add in all the sub-modules, which is very time consuming
Step 5 – Microsoft Flow
  • With my service account having a Flow license, I created my Flow there
  • This means that I can pass this on to Managed Services to run with and maintain going forward
  • I started with a blank Flow
  • I added a schedule
    • The schedule runs at 6am every day

  • Step 1 is to add in an Azure Automation Create Job task
    • This is to execute the Runbook “Step1-GetGlobalVersion”
    • Flow will try and connect to Azure with our Service account
    • Because we added all the relevant permissions earlier in Azure, the Resource Group and downstream resources will come up automatically
    • Enter in the relevant details

  • Step 2 is to add in another Azure Automation Create Job task
    • This is to execute the Runbook “Step2-CheckGlobalVersion”
    • Again, Flow will connect and allow you to select resources that the service account has Contributor permissions to

  • Step 3 is to add in an Azure Automation Get Job Output
    • This is to grab the Output data from the previous Azure Automation runbook
    • The details are pretty simple
    • I selected the “JobID” from the Step 2 Azure Automation runbook job

  • Step 4 is where things get interesting
  • This is a Flow Condition
  • This is where we specify that if a value of "NEW-VERSION-FOUND" is found in the content of the output from the Step 2 job, Flow should do something, or else do nothing

  • Step 5 is where I added in all the “IF YES” flow to Do something because we have an output of “NEW-VERSION-FOUND”
  • The first sub-process is another Azure Automation Create Job task
  • This is to execute the Runbook "Step3-GetURLsAndIPAddresses"
  • Again, Flow will connect and allow you to select resources that the service account has Contributor permissions to

  • Step 6 is to create 3 x Get Blob Content actions
  • This will allow us to connect to Azure Blob storage and grab the 3x CSV files that the previous step wrote to Blob storage
  • We're doing this so we can attach them to an email we're going to send to the relevant parties in need of this information

  • Step 7 is to create an email template
  • As we added an Exchange Online license to our service account earlier, we'll have the ability to send email from the service account's mailbox
  • The details are pretty straight forward here:
    • Enter in the recipient address
    • The sender
    • The subject
    • The email body
      • This can be a little tricky, but I've found that if you enable HTML (the last option in the Send An Email action), you can use <br> or a line break to space out your email nicely
    • Lastly, we’ll attach the 3x Blobs that we picked up in the previous step
    • We just need to manually set the name of the email attachment file
    • Then select the Content via the Dynamic Content option
      • Note: if you see the error "We can't find any outputs to match this input format. Select to see all outputs from previous actions." – simply hit the "See more" button
      • The See more button will show you the content option in the previous step (step 6 above)

  • Step 8 is to go over to the If No condition
  • This is probably optional because, as the old saying goes, "no news is good news"
  • However, for the purposes of easily tracking how often changes happen, I thought I'd email the service account and store a daily email if no action was taken
    • I'll probably add myself as a recipient as well here to keep an eye on Flow and make sure it's running
    • I can use inbox rules to move the emails out of my inbox and into a folder to streamline it further and keep my inbox clean
  • The details are pretty much the same as in the previous Step 7
    • However, there are no attachments required here
    • This is a simple email notification where I entered the following in the body: “### NO CHANGE IN O365 URLs and IP ADDRESSES TODAY ###”


 

Final words

Having done many Office 365 email migrations, I've come to use PowerShell and CSVs quite a lot to make my life easier when there are thousands of records to work with. This process uses that experience and that speed of working on a solution using CSV files. I'm sure there are better ways to streamline that component, for example by using Azure Table Storage.
I'm also sure there are better ways of storing credential information which, for the time being, isn't a problem while I work out this new process. The overall governance will get ironed out and I'll likely leverage the Azure Automation credential store, or even Azure Key Vault.
If you, dear reader, have found a more streamlined and novel way to achieve this that requires even less effort to setup, please share!
Best,
Lucian
#WorkSmarterNotHarder
 

Provisioning complex Modern Sites with Azure Functions and Flow – Part 2 – Create and Apply Template

In the previous blog here, we got an overview of the high-level architecture of a complex Modern team site provisioning process. In this blog, we will look at step 1 of the process – the Create and Apply Template process – in detail.
Before that, below are a few links to earlier blogs, as a refresher on the prerequisites for this blog.

  1. Set up a Graph App to call Graph Service using App ID and Secret – link
  2. Sequencing HTTP Trigger Azure Functions for simultaneous calls – link
  3. Adding and Updating owners using Microsoft Graph Async calls – link

Overview
The Create and Apply Template process aims to do the following:
1. Create a blank modern team site using the Groups template (Group#0 site template)
2. Apply the provisioning template to the created site.
Step 1: Create a blank Modern team site
For creating a modern team site using CSOM, we will use the TeamSiteCollectionCreationInformation class of OfficeDevPnP. Before we create the site, we will make sure the site doesn't already exist.

Note: There is an issue with the Site Assets library not getting initialized
when the site is created using the below code.
Hence, calling the EnsureSiteAssetsLibrary method is necessary.
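The post's own code uses the CSOM classes mentioned above; purely as an illustration of the same step (title and alias are placeholders), a group-connected team site can also be created with PnP PowerShell, with the existence check done first as the post describes:

# Create a blank modern (group-connected) team site
New-PnPSite -Type TeamSite -Title "Project X" -Alias "projectx"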

Step 2:  Apply the Provisioning Template

Note: The Apply Template process is a long running process and takes 60-90 min to complete
for a complex provisioning template with many site columns, content types and libraries.
In order to prevent the Azure Function from timing out, it needs to be hosted
on an App Service plan instead of a Consumption plan, so that the function
is not affected by the 10 min timeout.

For the Apply Provisioning Template process, use the below steps.
1. Reading the Template
It is important to note that the XMLPnPSchemaFormatter version (in the code below) must match the PnP version used to generate the PnP template. If the version is older, then set the XMLPnPSchemaFormatter to read from the older version. In order to find the version of the PnP template, open the XML and look at the start of the file.
[Image: PnPTemplateVersion]

2. Apply the Template
For applying the template, we will use the ProvisioningTemplateApplyingInformation class of the OfficeDevPnP module. ProvisioningTemplateApplyingInformation also has a property called HandlersToProcess which can be used to invoke particular handlers in the provisioning template process. Below is the code for the same.
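The code referred to above is CSOM; for illustration, the equivalent step in PnP PowerShell looks roughly like this (the path and handler list are placeholders):

# Apply the previously generated template, optionally limiting execution to specific handlers
Apply-PnPProvisioningTemplate -Path ".\ProvisioningTemplate.xml" -Handlers Fields,ContentTypes,Lists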

After the apply template process is complete, since the flow will have timed out, we will invoke another flow to do the post-processing by updating a list item in the SharePoint list.
Conclusion
In this blog, we saw how we can create a modern team site and apply the template to it. In the next blog we will finalize the process by making site-specific changes after applying the template.

Provisioning complex Modern Sites with Azure Functions and Microsoft Flow – Part 1 – Architecture

In one of my previous blogs here, I discussed creating Office 365 Groups using Azure Functions and Flow. The same process can also be used to provision Modern Team sites in SharePoint Online, because Modern Team sites are Office 365 Groups too. However, if you are creating a complex Modern Team site with lots of libraries, content types, term store associated columns etc., it will be challenging to do it with a single Azure Function.
Thus, in this blog (part 1), we will look at the architecture of a solution to provision a complex Modern Team site using multiple Azure Functions and Flows. This is an approach that went through four months of validation and testing. There might be other options, but this one worked for the complex team site, which takes around 45-90 mins to provision.
Solution Design
To start with, let's look at the solution design. The solution consists of two major components:
1. Template Creation – create a SharePoint Modern Team site to be used as a template and generate a provisioning template from it
2. Provisioning Process – create a SharePoint inventory list to drive the Flows and Azure Functions. There will be three Azure Functions that will run three separate parts of the provisioning lifecycle. More details about the Azure Functions will be in an upcoming blog.
Get the Provisioning Template
The first step in the process is to create a clean site that will be used as a reference template site for the provisioning template. In this site, create all the lists, libraries, site columns and content types, and set any other necessary site settings.
In order to make sure that the generated template doesn't have any elements which are not needed for provisioning, use the following PnP PowerShell cmdlet. The below cmdlet excludes any content type hub association, the ApplicationLifecycleManagement handler and site security from the generated template.

Get-PnPProvisioningTemplate -Out "" -ExcludeHandlers ApplicationLifecycleManagement, SiteSecurity -ExcludeContentTypesFromSyndication

The output of the above cmdlet is a ProvisioningTemplate.xml file which can be applied to new sites to set up the same SharePoint elements. To know more about the provisioning template file, schema and allowed tags, check the link here.
[Image: ModernSitesProvisioningFlow_GetTemplate]
Team Site Provisioning Process
The second step in the process is to create and apply the template to a Modern SharePoint Team site using Flow and Azure Functions. The detailed steps are as follows:
1. Create an Inventory list to capture all the requirements for Site Creation
2. Create two flows
a) Create and Apply Template flow, and
b) Post Provisioning Flow
3. Create three Azure Functions –
a) Create a blank Modern Team Site
b) Apply Provisioning Template on the above site. This is a long running process and can take about 45-90 min for applying a complex template with about 20 libraries, 20-30 site columns and 10-15 content types
Note: Azure Functions on a Consumption plan have a timeout of 10 min. Host the Azure Function on an App Service plan for the above to work without issues
c) Post Provisioning to apply changes that are not supported by the provisioning template, such as creating default folders etc.
Below is the process flow for the provisioning process. It has steps from 1-11, going from creating the site to applying the template. A brief list of the steps is as follows:

  1. Call the Create Site flow to start the Provisioning Process
  2. Call the Create Site Azure Function
  3. Create the Modern Team Site in the Azure Function, set any dependencies required for the Apply Template step such as navigation items, pages etc., and then return to the flow
  4. Call the Apply Template Azure Function.
  5. Get the previously generated ProvisioningTemplate.xml file from a shared location
  6. Apply the Template onto the newly created Modern site. Note: The flow call times out because it cannot wait for such a long running process
  7. Update the status column in the Site Directory for the post provisioning flow to start
  8. Call the Post provisioning flow to run the Post provisioning azure function
  9. The Post provisioning Azure Function will complete the remaining SharePoint changes which were not handled by the apply template step, such as setting field default values, creating folders in libraries, associating default values to taxonomy fields etc.

[Image: ModernSitesProvisioningFlow_ProvisioningProcess]
Conclusion:
Hence in the above blog, we saw how to create a provisioning process to handle complex modern team site creation at a high architectural level. We will deep dive into the Azure Functions for creating the site, applying the template and post processing in upcoming blogs.
Happy Coding!!!

Planning Site structure and Navigation in SharePoint Modern Experience Communication and Team sites

If you are planning to implement, or are implementing, Modern Team sites or Communication sites, there is a change in the best practices for planning and managing site structure, site hierarchy and navigation. This is a very common question during my presentations – how do we manage site structures, navigation and content in the Modern experience?
So, in this blog, we will look at a few strategies for planning site structure and navigation in Modern Experience sites.
1. First and foremost, get rid of nested subsites and site hierarchy navigation. Recently Microsoft has been pushing for a flat site collection structure with Modern Team and Communication sites, which adds a lot of benefit for managing isolation and content. So, the new approach is flat site collections and no subsites. (The various advantages of flat site collections will be listed in another upcoming blog.)
2. Secondly, to achieve a hierarchy relationship among sites for navigation, news items, search etc., use Hub sites. Hub sites are the new way of connecting SharePoint site collections together. Besides, they have the added advantage of aggregating information such as news and search results from the associated sites. So, create a Hub site for navigation across related sites.
3. The best candidates for Hub sites, in my opinion, are Communication sites. Communication sites have a top navigation that can be easily extended for Hub sites. They are perfect for publishing information and showing aggregated content. However, it also depends on whether the Hub site is meant for a team or business unit, or for the company as a whole. So, use a Communication site as the Hub site if targeting the whole company or a major group.
4. There is one navigation structure – the Quick Launch (left hand navigation) is the top navigation for Communication sites – so there is no need to maintain two navigations. If you ask me, this is a big win and removes a lot of confusion for end users.
5. The Quick Launch for Modern Team and Communication sites allows three levels of sub-hierarchy, which lets you define a nested custom navigation hierarchy that can be different from the content structure and site structure.
6. Focus on content, not on navigation or the location of content, through new Modern web parts such as Highlighted Content and Quick Links, which allow you to find content anywhere easily.
7. Finally, here are a few limitations of Modern site structure and navigation (as of June 2018) for reference. Hopefully, these will be addressed soon.

    • Permissions management still needs to be handled at each site collection; there is no nested structure there yet. However, it is possible to use AD groups for a consistent permissions set up
    • Office 365 Unified (Security) Groups cannot have AD or other Office 365 groups nested for Modern Team sites. But SharePoint site permissions can be managed through AD groups
    • Contextual navigation bolding is missing in Hub sites, i.e. if you click on a link to move to a child site then the navigation is not automatically bolded, but this might be coming soon
    • Navigation headers in Modern sites cannot be blank and need to be set to a link

Conclusion:
Hence in this blog, we looked at an approach for Modern site structures, hierarchy and navigation.

Notes From The Field – Enabling GAL Segmentation in Exchange Online

First published at https://nivleshc.wordpress.com

Introduction

A few weeks back, I was tasked with configuring Global Address List (GAL) Segmentation for one of my clients. GAL Segmentation is not a new concept, and if you were to Google it (as you would do in this day and age), you will find numerous posts on it.
However, during my research, I didn't find any ONE article that helped me. Instead, I had to rely on multiple articles/blogposts to guide me to the result.
For those that are new to GAL Segmentation, this can be a daunting task. That is actually the inspiration for this blog: to provide the steps from an implementer's view, so that you get the full picture about the preparation, the steps involved and the gotchas, and feel confident about carrying out this simple yet scary change.
This blog will focus on GAL Segmentation for an Exchange Online hybrid setup.

So what is GAL Segmentation?

I am glad you asked 😉
By default, in Exchange Online (and in an on-premises Exchange environment as well), a global address list is present. This GAL contains all mail-enabled objects in the Exchange organisation: mailboxes, contacts, rooms, etc.
This is all well and good, however at times a company might not want everyone to see all the objects in the Exchange environment. This might be for various reasons: for instance, the company has too many employees and it won't make sense to have a GAL a mile long. Or the company might have different divisions which do not need to correspond with each other. Or the company might be trying to sell off one of its divisions and, to start the process, is trying to separate the division from the rest of the company.
For this blog, we will use the last reason stated above. A "filter" will be applied to all users who are in the division to be sold off, so that when they open their GAL, they only see objects from their own division and not everyone in the company. In similar fashion, the rest of the company will see all objects except those of the division that will be sold off. Users will still be able to send/receive emails with that particular division, however the GAL will not show them.
I would like to make it extremely clear that GAL Segmentation DOES NOT DELETE any mail-enabled objects. It just creates a filtered version of the GAL for each user.

Introducing the stars

Let's assume there was once a company called TailSpin Toys. They owned the email namespace tailspintoys.com and had their own Exchange Online tenant.
One day, the board of TailSpin Toys decided to acquire a similar company called WingTip Toys. WingTip Toys had their own Exchange Online tenant and used the email namespace wingtiptoys.com. After the acquisition, WingTip Toys' email resources were merged into the TailSpin Toys Exchange Online tenant, however WingTip Toys still used their wingtiptoys.com email namespace.
After a few years, the board of TailSpin Toys decided it was time to sell off WingTip Toys. As a first step, they decided to implement GAL Segmentation between TailSpin Toys and WingTip Toys users.
Listed below is what was decided:

  • TailSpin Toys users should only see email objects in their GAL corresponding to their own email namespace (any object with the primary smtp address of @tailspintoys.com). They should not be able to see any WingTip Toys email objects.
  • Only TailSpin Toys users will be able to see Public Folders in their GAL
  • WingTip Toys users should only see email objects in their GAL corresponding to their own email namespace (any object with the primary smtp address of @wingtiptoys.com). They should not be able to see any TailSpin Toys email objects.
  • The All Contacts in the GAL will be accessible to both WingTip Toys and TailSpin Toys users.

The Steps

Performing a GAL Segmentation is a very low risk change. The steps that will be carried out are as follows

  • Create new Global Address Lists, Address Lists, Offline Address Book and Address Book Policy for TailSpin Toys and WingTip Toys users.
  • Assign the respective policy to TailSpin Toys users and WingTip Toys users

The only issue is that, by default, no users are assigned an Address Book Policy (ABP) in Exchange Online (ABPs are the "filter" that specifies what a user sees in the GAL).
Due to this, when we are creating the new address lists, users might see them in their GAL as well and get confused as to which one to use. If you wish to carry out this change within business hours, the simple remedy to the above issue is to provide clear communications to the users about what they can expect during the change window and what they should do (in this case, keep using the GAL that they always use). Having said that, it is always good practice to carry out changes out of business hours.
Ok, let's begin.
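All of the commands below assume an Exchange Online remote PowerShell session has been established. A minimal sketch of that connection, as it was commonly done at the time of writing (credentials are prompted for):

# Connect to Exchange Online remote PowerShell
$cred = Get-Credential
$session = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri "https://outlook.office365.com/powershell-liveid/" `
    -Credential $cred -Authentication Basic -AllowRedirection
Import-PSSession $session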

  • By default, the Address Lists Management role is not assigned in Exchange Online. The easiest way to assign it is to log in to the Exchange Online portal using a Global Administrator account and add this role to the Organization Management role group. This will then make all the Address List commands available to the Global Administrators.
  • Next, connect to Exchange Online using PowerShell (as per the sketch above)
  • For TailSpin Toys
    • Create a default Global Address List called Default TST Global Address List
    • New-GlobalAddressList -Name "Default TST Global Address List" -RecipientFilter {((Alias -ne $null) -and (((ObjectClass -eq 'user') -or (ObjectClass -eq 'contact') -or (ObjectClass -eq 'msExchSystemMailbox') -or (ObjectClass -eq 'msExchDynamicDistributionList') -or (ObjectClass -eq 'group') -or (ObjectClass -eq 'publicFolder'))) -and (WindowsEmailAddress -like "*@tailspintoys.com") )}
    • Create the following Address Lists
      • All TST Distribution Lists
      • New-AddressList -Name "All TST Distribution Lists" -RecipientFilter {((Alias -ne $null) -and (ObjectCategory -like 'group') -and (WindowsEmailAddress -like "*@tailspintoys.com"))}
      • All TST Rooms
      • New-AddressList -Name "All TST Rooms" -RecipientFilter {((Alias -ne $null) -and (((RecipientDisplayType -eq 'ConferenceRoomMailbox') -or (RecipientDisplayType -eq 'SyncedConferenceRoomMailbox'))) -and (WindowsEmailAddress -like "*@tailspintoys.com"))}
      • All TST Users
      • New-AddressList -Name "All TST Users" -RecipientFilter {((Alias -ne $null) -and (((((((ObjectCategory -like 'person') -and (ObjectClass -eq 'user') -and (-not(Database -ne $null)) -and (-not(ServerLegacyDN -ne $null)))) -or (((ObjectCategory -like 'person') -and (ObjectClass -eq 'user') -and (((Database -ne $null) -or (ServerLegacyDN -ne $null))))))) -and (-not(RecipientTypeDetailsValue -eq 'GroupMailbox')))) -and (WindowsEmailAddress -like "*@tailspintoys.com"))}
    • Create an Offline Address Book called TST Offline Address Book (this uses the Default Global Address List that we had just created)
    • New-OfflineAddressBook -Name "TST Offline Address Book" -AddressLists "Default TST Global Address List"
    • Create an Address Book Policy called TST ABP
    • New-AddressBookPolicy -Name "TST ABP" -AddressLists "All Contacts", "All TST Distribution Lists", "All TST Users", "Public Folders" -RoomList "All TST Rooms" -OfflineAddressBook "TST Offline Address Book" -GlobalAddressList "Default TST Global Address List"
  • For WingTip Toys
    • Create a default Global Address List called Default WTT Global Address List
    • New-GlobalAddressList -Name "Default WTT Global Address List" -RecipientFilter {((Alias -ne $null) -and (((ObjectClass -eq 'user') -or (ObjectClass -eq 'contact') -or (ObjectClass -eq 'msExchSystemMailbox') -or (ObjectClass -eq 'msExchDynamicDistributionList') -or (ObjectClass -eq 'group') -or (ObjectClass -eq 'publicFolder'))) -and (WindowsEmailAddress -like "*@wingtiptoys.com") )}
    • Create the following Address Lists
      • All WTT Distribution Lists
      • New-AddressList -Name "All WTT Distribution Lists" -RecipientFilter {((Alias -ne $null) -and (ObjectCategory -like 'group') -and (WindowsEmailAddress -like "*@wingtiptoys.com"))}
      • All WTT Rooms
      • New-AddressList -Name "All WTT Rooms" -RecipientFilter {((Alias -ne $null) -and (((RecipientDisplayType -eq 'ConferenceRoomMailbox') -or (RecipientDisplayType -eq 'SyncedConferenceRoomMailbox'))) -and (WindowsEmailAddress -like "*@wingtiptoys.com"))}
      • All WTT Users
      • New-AddressList -Name "All WTT Users" -RecipientFilter {((Alias -ne $null) -and (((((((ObjectCategory -like 'person') -and (ObjectClass -eq 'user') -and (-not(Database -ne $null)) -and (-not(ServerLegacyDN -ne $null)))) -or (((ObjectCategory -like 'person') -and (ObjectClass -eq 'user') -and (((Database -ne $null) -or (ServerLegacyDN -ne $null))))))) -and (-not(RecipientTypeDetailsValue -eq 'GroupMailbox')))) -and (WindowsEmailAddress -like "*@wingtiptoys.com"))}
    • Create an Offline Address Book called WTT Offline Address Book (this uses the Default Global Address List that we had just created)
    • New-OfflineAddressBook -Name "WTT Offline Address Book" -AddressLists "Default WTT Global Address List"
    • Create an Address Book Policy called WTT ABP
    • New-AddressBookPolicy -Name "WTT ABP" -AddressLists "All Contacts", "All WTT Distribution Lists", "All WTT Users" -RoomList "All WTT Rooms" -OfflineAddressBook "WTT Offline Address Book" -GlobalAddressList "Default WTT Global Address List"
  • Once you create all the Address Lists, after a few minutes you will be able to see them using the Outlook client or Outlook Web Access. One of the obvious things you will notice is that they are all empty! If you are wondering whether the recipient filter is correct or not, you can use the following to confirm the membership
  • Get-Recipient -RecipientPreviewFilter (Get-AddressList -Identity {your address list name here}).RecipientFilter

    Aha, you might say at this stage. I will just run the Update-AddressList cmdlet. Unfortunately, this won't work since that cmdlet is only available for on-premises Exchange servers. There is none for Exchange Online. Hmm. How do I update my Address Lists? It's not too difficult. All you have to do is change some attribute of the members and they will start popping into the Address Lists! For a hybrid setup, this means we have to change the setting using the on-premises Exchange server and use the Azure Active Directory Connect server to replicate the changes to Azure Active Directory, which in turn will update the Exchange Online objects, thereby updating the newly created Address Lists. Simple? Yes. Lengthy? Yes indeed

  • I normally use a CustomAttribute for such occasions. Before using any CustomAttribute, ensure it is not used by anything else. You might be able to ascertain this by checking whether, for all objects, that CustomAttribute currently holds any value. Let's assume CustomAttribute10 can be used.
    #Get all On-Premise Mailboxes
    $OnPrem_MBXs = Get-Mailbox -Resultsize unlimited
    #Get all Exchange Online Mailboxes
    $EXO_MBXs = Get-RemoteMailbox -Resultsize Unlimited
    #Get all the Distribution Groups
    $All_DL = Get-DistributionGroup -Resultsize unlimited
    #Update the CustomAttribute10 Value
    #Since Room mailboxes are a special type of Mailbox, the following update will
    #address Room Mailboxes as well
    $OnPrem_MBXs | Set-Mailbox -CustomAttribute10 "GAL"
    $EXO_MBXs | Set-RemoteMailbox -CustomAttribute10 "GAL"
    $All_DL | Set-DistributionGroup -CustomAttribute10 "GAL"
  • Using your Azure Active Directory Connect server run a synchronization cycle so that the updates are synchronized to Azure Active Directory and subsequently to Exchange Online
  • One gotcha here is if you have any Distribution Groups that are not synchronised from on-premises. You will have to find these and update their settings as well. One simple way to find them is to use the property isDirSynced. Connect to Exchange Online using PowerShell and then use the following command
  • $All_NonDirsyncedDL = Get-DistributionGroup -Resultsize unlimited| ?{$_.isdirsynced -eq $FALSE}   
    #Now, we will update CustomAttribute10 (please check to ensure this customAttribute doesn't have any values)
    $All_NonDirSyncedDL | Set-DistributionGroup -CustomAttribute10 "GAL"
  • Check using Outlook Client or Outlook Address Book to see that the new Address Lists are now populated
  • Once confirmed that the new Address Lists have been populated, let's go and assign the new Address Book Policies to TailSpin Toys and WingTip Toys users. It can take anywhere from 30 min – 1 hr for the Address Book Policy to take effect
  • $allUserMbx = Get-Mailbox -RecipientTypeDetails UserMailbox -Resultsize unlimited
    #assign "TST ABP" Address Book Policy to TailSpin Toys users
    $allUserMbx | ?{($_.primarysmtpaddress -like "*@tailspintoys.com")} | Set-Mailbox -AddressBooksPolicy “TST ABP”
    #assign "WTT ABP" Address Book Policy to WingTip Toys users
    $allUserMbx | ?{($_.primarysmtpaddress -like "*@wingtiptoys.com")} | Set-Mailbox -AddressBooksPolicy “WTT ABP”
  • While waiting, remove the CustomAttribute10 values you had populated. Using PowerShell on the on-premises Exchange server, run the following
  • #Get all On-Premise Mailboxes
    $OnPrem_MBXs = Get-Mailbox -Resultsize unlimited
    #Get all Exchange Online Mailboxes
    $EXO_MBXs = Get-RemoteMailbox -Resultsize Unlimited
    #Get all the Distribution Groups
    $All_DL = Get-DistributionGroup -Resultsize unlimited
    #Set the CustomAttribute10 Value to null
    #Since Room mailboxes are a special type of Mailbox, the following update will
    #address Room Mailboxes as well
    $OnPrem_MBXs | Set-Mailbox -CustomAttribute10 $null
    $EXO_MBXs | Set-RemoteMailbox -CustomAttribute10 $null
    $All_DL | Set-DistributionGroup -CustomAttribute10 $null
  • Connect to Exchange Online using PowerShell and remove the value that was set for CustomAttribute10 for nonDirSynced Distribution Groups
  • $All_NonDirsyncedDL = Get-DistributionGroup -Resultsize unlimited| ?{$_.isdirsynced -eq $FALSE}   
    #Change CustomAttribute10 to $null
    $All_NonDirSyncedDL | Set-DistributionGroup -CustomAttribute10 $null

     
    That's it folks! Your GAL Segmentation is now complete! Users from TailSpin Toys will only see TailSpin Toys mail-enabled objects and WingTip Toys users will only see WingTip Toys mail-enabled objects

A few words of wisdom

In the above steps, I would advise that once the new Address Lists have been populated

  • apply the Address Book Policy to a few test mailboxes
  • wait between 30min – 1 hour, then confirm that the Address Book Policy has been successfully applied to the test mailboxes and has the desired result
  • once you have confirmed that the test mailboxes show the desired result for the ABP, then and ONLY then continue with the rest of the mailboxes

This will give you confidence that the change will be successful. Also, if you find that there are issues, the rollback is not too difficult or time consuming.
Another thing to note is that when users have their Outlook client configured to use cached mode, they might notice that their new GAL is not fully populated. This is because their Outlook client uses the Offline Address Book to show the GAL, and at that time the Offline Address Book would not yet have regenerated to include all the new members. Unfortunately, in Exchange Online the Offline Address Book cannot be regenerated on demand and we have to wait for the Exchange Online servers to do this for us. I have noticed the regeneration happens twice in 24 hours, around 4am and 4pm AEST (your times might vary). So if users are complaining that their Outlook client GAL doesn't show all the users, confirm using Outlook Web Access that the members are there (or you can run Outlook in non-cached mode) and then advise the users that the issue will be resolved when the Offline Address Book gets regenerated (in approximately 12 hours). Also, once the Offline Address Book has regenerated, it is best for users to manually download the latest Offline Address Book, otherwise the Outlook client will download it at a random time in the next 24 hours.
The next gotcha is around which Address Lists are available in Offline mode (refer to the screenshot below)
[Screenshot: GAL01]
When in Offline mode, the only list available is the Offline Global Address List. This is the one pointed to by the green arrow. Note that the red arrow is pointing to an Offline Global Address List as well, however that is an "Address List" which has been named Offline Global Address List by Microsoft to confuse people! To repeat, the Offline Global Address List pointed to by the green arrow is available in Offline mode, however the one pointed to by the red arrow is not!
In our case, the Offline Global Address List is named Default TST Global Address List (and Default WTT Global Address List).
If you try to access any of the others in the drop down list when in Offline mode, you will get the following error
[Screenshot: AddressListError]
This has always been the case; unfortunately hardly anyone tries to access all the Address Lists in Offline mode. However, after GAL Segmentation, if users receive the above error, it is very easy to blame the GAL Segmentation implementation 🙁 Rest assured, this is not the case and this "feature" has always been present.
Lastly, the user on-boarding steps will have to be modified to ensure that when a mailbox is created, the appropriate Address Book Policy is applied. This will ensure users only see the address lists that they are supposed to (on the flip side, if no address book policy is applied, they will see all address lists, which will cause a lot of confusion!)
With these words, I will now stop. I hope this blog comes in handy to anyone trying to implement GAL Segmentation.
If you have any more gotchas or things you can think of regarding GAL Segmentation, please leave them in the comments below.
Till the next time, Enjoy 😉
 
 
 
 
 
 
