Options to consider for SharePoint Framework solutions deployment

There are various options for packaging and deploying a SharePoint Framework solution, and as part of the packaging and deployment process developers have to identify the best approach for their team. Planning the right approach for your solution can become a nightmare if you haven't weighed the options properly.

Having worked on multiple SPFx implementations for some time now, I have formed a view of the various options and the approach for each. Hence, in this blog we will walk through these options and look at the merits and limitations of each.

At a high level, below are the main components that are deployed as part of SPFx package deployment:

  1. The minified JavaScript bundle for all the code
  2. The web part manifest file
  3. The compiled web part files
  4. The package (.sppkg) file with all the package information

Deployment Options

Please check this blog for an overview of the steps for building and packaging an SPFx solution. The packaged solution (.sppkg) file can then be deployed to an App Catalog site. The assets of the package (items 1-3 above) can be deployed using any of the four options below. We will look at the merits and limitations of each.

1. Deploy to Azure CDN

The assets can be deployed to an Azure CDN backed by an Azure Storage account. The deployment task is already part of the SPFx toolchain and can be run from within the solution; a typical sequence is shown after the note below. More detailed steps for setting this up are here.

Note: Please remember to enable CORS on the storage account before deployment of the package.

If CORS is not enabled before the CDN profile is used, you might have to delete and recreate the storage account.
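As an illustration, a typical sequence with the standard SPFx tooling looks like the following; the storage account, container and access key values are placeholders for your own environment. First, point config/deploy-azure-storage.json at the storage account:

{
  "workingDir": "./temp/deploy/",
  "account": "<storage account name>",
  "container": "<container name>",
  "accessKey": "<storage access key>"
}

Then bundle the solution, upload the assets and produce the package:

gulp bundle --ship
gulp deploy-azure-storage
gulp package-solution --ship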

Merits:

  • Easy deployment using gulp
  • Faster access to assets and web part resources because of CDN hosting
  • Add geographical restrictions (if needed)

Limitations:

  • Dependency on Azure Subscription
  • Proper set-up steps are required for configuring the Azure CDN. In some cases, if the CDN is not set up properly, the deployment has to be done again.
  • Since the assets are deployed to a public CDN endpoint, this may not be recommended if the assets need restricted access

2. Deploy to Office 365 Public CDN

For this option, you will need to enable and set up Office 365 CDN in your tenancy before deployment. For more details of setting this up, check the link here.
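For reference, enabling the Office 365 Public CDN and registering a library as an origin is typically done with the SharePoint Online Management Shell; the tenant and origin URLs below are placeholders.

Connect-SPOService -Url "https://<tenant>-admin.sharepoint.com"
Set-SPOTenantCdnEnabled -CdnType Public -Enable $true
Add-SPOTenantCdnOrigin -CdnType Public -OriginUrl "sites/cdn/cdnassets"
Get-SPOTenantCdnOrigins -CdnType Public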

Merits:

  • Faster access to assets and web part resources because of CDN hosting
  • No Azure subscription requirement
  • Content is accessible from SharePoint Interface

Limitations:

  • Manual copy of the asset files to the CDN-enabled SharePoint library
  • Office 365 CDN is a tenant setting and has to be enabled for the whole tenancy
  • Since the assets are deployed to a CDN endpoint, this may not be recommended if the assets need restricted access
  • Accidental deletion could cause issues

3. Deploy to SharePoint document library

There is also an option to copy the compiled assets to a SharePoint document library anywhere in the tenancy. Setting this up is quite simple: first set "includeClientSideAssets": false in the package-solution.json file, and then set the CDN details in the write-manifests.json file, as shown below.
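As an illustration, the relevant settings look like the following; the solution values and the library URL are placeholders for your own environment.

package-solution.json:

{
  "solution": {
    "name": "my-webparts-client-side-solution",
    "id": "<solution guid>",
    "version": "1.0.0.0",
    "includeClientSideAssets": false
  },
  "paths": {
    "zippedPackage": "solution/my-webparts.sppkg"
  }
}

write-manifests.json:

{
  "$schema": "https://developer.microsoft.com/json-schemas/spfx-build/write-manifests.schema.json",
  "cdnBasePath": "https://<tenant>.sharepoint.com/sites/cdn/SPFxAssets"
}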

Merits:

  • No need of additional Azure hosting or enabling Office 365 CDN
  • Access to Assets files from SharePoint interface

Limitations:

  • Manual copy of the asset files to the SharePoint library
  • Accidental deletion could cause issues

4. Deploy to the ClientSideAssets library in the App Catalog

From SPFx version 1.4 onwards, it is possible to include the assets as part of the package file and deploy them to the hidden ClientSideAssets library residing in the App Catalog. This is set in the package-solution.json file with "includeClientSideAssets": true, as shown below.
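The relevant fragment of package-solution.json would look something like this (other values omitted):

{
  "solution": {
    "name": "my-webparts-client-side-solution",
    "includeClientSideAssets": true
  }
}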

Merits:

  • No extra steps needed to package and deploy assets

Limitations:

  • Increases the payload of the package file
  • Perceived risk for tenant admins in deploying script files to the tenant App Catalog

Conclusion

In this blog, we looked at the various options for deploying SPFx solution assets, and the merits and limitations of each approach.

Happy Coding !!!

 

User Psychology and Experience

Oftentimes, when designing a product or solution for a customer, during planning and concept development we might consider the user experience to be one (or both) of two things:

  1. User feedback regarding their interaction with their technological environment/platforms
  2. The experience the user is likely to have with given technology based on various factors that contribute to delivering that technology to them; presentation, training, accessibility, necessity, intuitiveness, just to name a few.

These factors are not solely focused on the user and their role in the human–technology interaction process, but also on their experience of dealing with us as solution providers. That is to say, the way in which we engage the experience and behaviour of the user is just as important to the delivery of new technology to them as is developing our own understanding of broader human–technology interaction behaviour. UX is a colourful – pun intended – profession/skill to have within this industry. Sales pitches, demos and generally 'wowing the crowd' are a few of the ways in which UX-ers can deploy their unique set of skills to steer user behaviour and responsiveness in a positive direction, for the supplier especially.

Not all behavioural considerations with regards to technology are underpinned by the needs or requirements of a user, however. There are more general patterns of behaviour and characteristics within people, particularly in a working environment, that can be observed to indicate how a user experiences [new] technology, including the functionality and valued content that, at a base level, captures a user's attention. The psychology of this attention can be broken down into a simplified model: the working mechanisms of perception as a reaction to stimulus, and how consistent the behaviour is that develops out of it. The stimuli in question are usually the most common ones relating to technology: visual and auditory.

You've likely heard of, or experienced first-hand, the common types of attention in everyday life. The main three are identified as selective, divided and undivided. Through consistency of behavioural outcomes, or by observing in a use case a consistent reaction to stimuli, we look to observe a 'sustainability of attention or interest' over an extended period of time, even if repetition of an activity or a set of activities is involved. This means that the solution, or at the very least the awareness and training developed to sell a solution, should serve the goal of achieving sustainable attention.

How Can We Derive a Positive User Experience through Psychology?

Too much information equals a lack of cognitive intake. From observation and general experience, a person's attention, especially when captured within a session, a day or a week, is a finite resource. Many other factors of an individual's life can create a cocktail of emotions which makes people, in general, unpredictable in a professional environment. The right amount of information, training and direct experience should be segmented based on a gauge of the audience's attention. Including reflection exercises or on-the-spot feedback, especially in user training, can give you a good measure of this. The mistake of cognitively overloading the user is best seen when a series of options are presented as viable routes to the desired solution or outcome. Too many options can, at worst, create confusion, an aversion to the solution and new technologies in general, and an overall messy user experience.

Psychology doesn't have to be a full submersion into the human psyche, especially when it comes to understanding the user experience. Simple empathy can be a powerful tool to mitigate some of the aforementioned issues of attention and to prevent the cultivation of repeated adverse behaviour from users. When it boils down to the users, most scenarios in the way of behaviour and reaction have been seen and experienced before, irrespective of the technology being provided. Fundamentally, it is a human experience that [still] comes first, before we look at bridging the user and the technology. For UX practitioners, tools are already in place to achieve this, such as user journey maps and story capturing.

There are new ideas still emerging around the discipline of user experience, ‘UX’. From my experience with it thus far, it presents a case that it could integrate very well with modern business analysis methodologies. It’s more than designing the solution, it’s solutions based on how we, the human element, are designed.

Set up Accounts and secure passwords to run automation workloads in Azure Functions

In some of my previous blogs here, we have seen how we could use Azure Functions to automate processes and SharePoint workloads.

Most of these jobs run using elevated or stored privileged accounts, as the Azure Function runs in a different context than the user. There are various ways we could set up these accounts. Some of these approaches are below:

  1. Azure AD Service Accounts
    • Suitable for all operations
    • Need access to the resources
    • Reusable across multiple workloads
  2. Azure AD Apps
    • Suitable for Microsoft Graph access
    • Need the exact permissions set up
    • Might need Tenant Admin consent
  3. SharePoint App Accounts
    • Suitable for SharePoint workloads
    • Need site- and app-specific privileges

The details of these accounts can be stored in the Azure Function App Settings (for dev and production) or in the local.settings.json file during local development.
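For local development, the settings file might look something like the following; the setting names are placeholders of my own choosing, and as discussed below the password should not be kept as plain text.

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "ServiceAccountUser": "svc-automation@<tenant>.onmicrosoft.com",
    "ServiceAccountPassword": "<do not store plain-text secrets in source control>"
  }
}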

The most important consideration is to avoid exposing password details in the Azure Function in case of unauthorized access. There are two ways we could achieve this:

1. Encrypting the password and store in the Azure Function (PowerShell)
2. Using Azure Key Vault to store and access password details (C#)

Encrypting Passwords in Azure Functions

To do this, first let's create an encrypted password using the PowerShell script below.
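A minimal sketch of such a script is below; the key and output file paths are placeholders, and the generated key file must later be available to the Azure Function for decryption.

# Generate a 256-bit key and save it - the same key is needed later to decrypt the password
$keyFile = "C:\Temp\PassEncryptKey.key"
$key = New-Object byte[] 32
(New-Object System.Security.Cryptography.RNGCryptoServiceProvider).GetBytes($key)
$key | Out-File $keyFile

# Prompt for the account password and store it encrypted with the key
$passwordFile = "C:\Temp\AccountPassword.txt"
Read-Host "Enter the service account password" -AsSecureString |
    ConvertFrom-SecureString -Key (Get-Content $keyFile) |
    Out-File $passwordFile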

Next, copy the key and password files to a bin folder of the Azure Function using the App Service Editor (Application Settings -> App Service Editor), and decrypt them using the code below.
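A corresponding decryption sketch inside a PowerShell Azure Function could look like this; the function name in the path and the account name are placeholders.

# Files copied to the function's bin folder (D:\home\site\wwwroot\<FunctionName>\bin)
$functionDir = "D:\home\site\wwwroot\MyTimerFunction\bin"
$key = Get-Content "$functionDir\PassEncryptKey.key"
$encryptedPassword = Get-Content "$functionDir\AccountPassword.txt"

# Rebuild the secure string and the credential used to connect to SharePoint
$securePassword = $encryptedPassword | ConvertTo-SecureString -Key $key
$credential = New-Object System.Management.Automation.PSCredential("svc-automation@<tenant>.onmicrosoft.com", $securePassword)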

Using Azure Key Vault

For using Azure Key Vault, the steps are as below

1. Create an Azure AD App and get the Client ID and Client Secret

2. Create an Azure Key Vault and grant the above Azure AD app access to it. The permissions below (Get on secrets) will suffice to read the secret.
(screenshot: Azure Key Vault secret permissions)

3. Create a secret in the Key Vault to store the password, and note the secret URI

4. Store the secret URI, Client ID and Client Secret in the Azure Function App Settings

5. Use the code below to retrieve the secret (the password) at runtime.
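A minimal C# sketch is below, assuming the Microsoft.Azure.KeyVault and Microsoft.IdentityModel.Clients.ActiveDirectory (ADAL) NuGet packages; the app setting names are placeholders matching step 4.

using System;
using System.Threading.Tasks;
using Microsoft.Azure.KeyVault;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

public static class KeyVaultHelper
{
    public static async Task<string> GetPasswordAsync()
    {
        var secretUri = Environment.GetEnvironmentVariable("KeyVaultSecretUri");

        // The callback authenticates against Azure AD with the app's Client ID and Secret
        var keyVaultClient = new KeyVaultClient(async (authority, resource, scope) =>
        {
            var authContext = new AuthenticationContext(authority);
            var credential = new ClientCredential(
                Environment.GetEnvironmentVariable("ClientId"),
                Environment.GetEnvironmentVariable("ClientSecret"));
            var token = await authContext.AcquireTokenAsync(resource, credential);
            return token.AccessToken;
        });

        // Read the secret (the stored password) from the Key Vault
        var secret = await keyVaultClient.GetSecretAsync(secretUri);
        return secret.Value;
    }
}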

Conclusion

Above, we saw how we could set up accounts in an Azure Function for elevated access to SharePoint and other resources, and how to keep their passwords secure.

Provisioning complex Modern Sites with Azure Functions and Flow – Part 3 – Post Provisioning Site Configuration

In the previous two blogs, part 1 and part 2, we looked at the steps to create a Modern team site and apply a custom provisioning template to it. In this blog, we will look at the steps of the post-provisioning process that implements site-specific requirements. Some of these could be:

1. Apply default values to list fields
2. Create a set of default folders
3. Manage security groups (SharePoint level) and permission levels
4. Navigation changes
5. Add/enable web parts or custom actions (SPFx extensions)

Most of the above steps have been part of SharePoint provisioning processes for a long time; they are just less complex now, with provisioning templates doing a lot of the heavy lifting. I will be writing soon about PnP templates and their dos and don'ts.

Prerequisites

One key point to add is that, with Modern Team Sites and Office 365 Groups, we cannot add AD security groups into an Office 365 Unified Group. For more information, please see this link.

The apply-template process (link) takes about 45-90 minutes for complex templates (it is a long-running process), so it wouldn't be possible to start the post-provisioning process from a Flow wait state. Hence, we could trigger the post-provisioning process on the update of an inventory item, or poll an endpoint for the status. In our case, we triggered the process by updating the inventory list with a status indicating that the apply-template process is complete.

Post Provisioning Process

1. The first step of the post-provisioning process is to make sure that scripting is allowed on the team site (link) and that all dependencies are ready, such as term stores, navigation items, site pages, content types, site columns etc. For a complex site template, this step checks for failure conditions to make sure that all artefacts are in place.

Note: The below sequence of steps could vary based on the solution 
and site structure in place but this was the faster way to 
isolate issues and ensure dependencies.

After the failure checks are done, we will start with the Site structure changes and navigation changes. For implementing navigation changes, check the blogs here and here.

2. Next, we will update any site-specific site columns and site content types, and make permission-level changes. An upcoming blog will have more details on this.

3. After the list structure changes, we will move on to list-specific updates such as setting default values, modifying list properties, etc. (a sketch for default values is shown below).
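As an illustration, setting a default value on a list field with CSOM could look like the following; the list and field names are placeholders, and ctx is assumed to be an authenticated ClientContext for the site.

using Microsoft.SharePoint.Client;

var list = ctx.Web.Lists.GetByTitle("Project Documents");
var field = list.Fields.GetByInternalNameOrTitle("DocumentCategory");
ctx.Load(field);
ctx.ExecuteQuery();

// Default value used for new items created in this list
field.DefaultValue = "General";
field.Update();
ctx.ExecuteQuery();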

4. Next, let's move on to the apps and site pages updates. These changes take time because of the SharePoint ALM lifecycle and possible duplications, so error handling is key. Please check the blog here, and for steps to deploy apps and web parts, here.

5. Before we finalise the site, let's provision the folders and metadata. This is not a simple process if you have to set metadata values for a large number of folders (in our case about 800 nested folders, children within parents), so we will use the metadata defaults file override. All the values in the defaults file have to be hardcoded before the override.

Note: The metadata file override is not generally a good approach 
because of possible corruption that renders the library unusable 
so do error handling for all cases. For the CSOM approach check here.

6. Finally, we will set property bag values on the site to add tags that make the site searchable. Here is the blog for the same; a short sketch is below.
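A minimal CSOM sketch using the OfficeDevPnP web extensions is below; the key and value are placeholders.

using Microsoft.SharePoint.Client;   // OfficeDevPnP.Core adds the property bag extension methods

// Store a custom tag in the property bag and index it so search can pick it up
// (these helpers execute the query internally)
ctx.Web.SetPropertyBagValue("SiteClassificationTag", "ProjectSite");
ctx.Web.AddIndexedPropertyBagKey("SiteClassificationTag");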

Conclusion

Above, we saw the final part of the site provisioning process: setting up site properties and attributes to prepare the site before handing it over to the business for use.

Deploy and Add SPFx webparts to Modern Pages using OfficeDevPnP CSOM library

In the previous blog here, we looked at how to install apps on a SharePoint site. With the SharePoint and OfficeDevPnP CSOM libraries, we can also add web parts to Modern pages, both out-of-the-box (OOB) web parts and custom web parts. For out-of-the-box web parts, refer to Chris O'Brien's article here, where he has provided the steps as well as the web part IDs for the OOB web parts, which is really helpful.

In this blog, we will look at the steps to add a custom web part and set its properties.

For the code below, we will use the OfficeDevPnP CSOM library. The high-level steps for the implementation are below:

1. Find the page where the web part is to be added. For creating a modern page programmatically, refer to this link here.

2. Check if the web part is already added to the page; if not, add it using the web part ID. We will read the web part ID from a JSON configuration where we have kept the details of the custom web part.

3. After the web part is added, set the web part properties using the JSON schema of the properties, serialised with Newtonsoft.Json.

Note: If the web part app is just installed, then it might take time for 
the web part to be ready for use. 
Hence put a wait state till the web part is loaded for use.

The code below shows the steps for adding the web part programmatically using CSOM.
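A minimal C# sketch using the OfficeDevPnP.Core client-side page APIs is below; the page name, web part ID and property values are placeholders, and ctx is assumed to be an authenticated ClientContext for the target site.

using System;
using System.Linq;
using Microsoft.SharePoint.Client;
using Newtonsoft.Json;
using OfficeDevPnP.Core.Pages;

// 1. Load the modern page the web part should be added to
var page = ClientSidePage.Load(ctx, "Home.aspx");

// 2. Check whether the custom web part is already on the page
const string webPartId = "<custom web part guid>";
var existing = page.Controls.OfType<ClientSideWebPart>()
    .FirstOrDefault(wp => wp.WebPartId.Equals(webPartId, StringComparison.OrdinalIgnoreCase));

if (existing == null)
{
    // Resolve the web part definition from the components available on the site
    var component = page.AvailableClientSideComponents()
        .FirstOrDefault(c => c.Id.Equals(webPartId, StringComparison.OrdinalIgnoreCase));

    var webPart = new ClientSideWebPart(component)
    {
        // 3. Properties are passed as a JSON string matching the web part's property schema
        PropertiesJson = JsonConvert.SerializeObject(new { description = "Added programmatically" })
    };

    page.AddControl(webPart);
    page.Save();
}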

Conclusion

The above process can be used to add a web part programmatically onto a page and set its properties.

Happy Coding !!

Programmatically deploy and add SharePoint Framework Extensions using SharePoint CSOM and PowerShell

In the previous blog here, we looked at how to deploy and install SharePoint Apps. Now let’s look at installing SharePoint Framework extensions – Listview command sets programmatically.

SharePoint CSOM

SharePoint Framework has three types of extensions that can be created: Application Customisers, ListView Command Sets and Field Customisers. In this blog, we will look at adding ListView Command Sets programmatically.

ListView Command Set extensions are actually custom actions installed on a library or list. Hence, to activate one we go to the library or list, check the installed custom actions, and if it is not already there, add the new custom action. Below is the code for that.
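A minimal C# CSOM sketch is below; the list title and component ID are placeholders, and ctx is assumed to be an authenticated ClientContext for the site.

using System;
using System.Linq;
using Microsoft.SharePoint.Client;

var list = ctx.Web.Lists.GetByTitle("Documents");
var customActions = list.UserCustomActions;
ctx.Load(customActions);
ctx.ExecuteQuery();

// Check whether the command set is already registered on this list
const string componentId = "<listview command set component guid>";
var exists = customActions.Any(ca =>
    ca.ClientSideComponentId.ToString().Equals(componentId, StringComparison.OrdinalIgnoreCase));

if (!exists)
{
    var action = customActions.Add();
    action.Title = "My ListView Command Set";
    action.Location = "ClientSideExtension.ListViewCommandSet.CommandBar";
    action.ClientSideComponentId = new Guid(componentId);
    action.ClientSideComponentProperties = "{\"sampleText\":\"Added programmatically\"}";
    action.Update();
    ctx.ExecuteQuery();
}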

PowerShell

We could also use PnP PowerShell to add the ListView Command Set extension onto a library or list using the code below.
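A minimal PnP PowerShell sketch is below; the site URL, component ID and properties are placeholders. RegistrationId 100 targets generic lists (use 101 for document libraries).

Connect-PnPOnline -Url "https://<tenant>.sharepoint.com/sites/teamsite" -Credentials (Get-Credential)

Add-PnPCustomAction -Name "MyCommandSet" -Title "My ListView Command Set" `
    -Location "ClientSideExtension.ListViewCommandSet.CommandBar" `
    -ClientSideComponentId "<listview command set component guid>" `
    -ClientSideComponentProperties '{"sampleText":"Added programmatically"}' `
    -RegistrationId 100 -RegistrationType List -Scope Web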

Hence, above we saw how we could add extensions onto a library or list using CSOM or PowerShell.

Happy Coding!!

Deploy and Install SharePoint Apps using SharePoint CSOM and PnP PowerShell

In this blog, we will look at the steps to install and deploy SharePoint apps to Modern Sites using the SharePoint ALM CSOM APIs and PnP PowerShell. Using the steps below, it is possible to programmatically deploy and install custom SharePoint Framework apps from an Azure Function or a local PowerShell script.

Installing SharePoint Apps

SharePoint apps can be deployed to a site using the ALM (Application Lifecycle Management) APIs. After the app package is added to the App Catalog, we can deploy and install it on a SharePoint site.

SharePoint CSOM

The steps are simple. The snippet after the list below has the code for deploying and installing apps.

1. Get the app ID
2. Create an AppManager object
3. Deploy the app
4. After deploying the app, install it
5. Use try-catch to handle the case where the app is already installed
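A minimal C# sketch using the OfficeDevPnP ALM APIs is below; the app title is a placeholder, and ctx is assumed to be an authenticated ClientContext for the target site.

using System;
using System.Linq;
using OfficeDevPnP.Core.ALM;

var appManager = new AppManager(ctx);

// 1. Find the app in the App Catalog (by title here; the app id could be used instead)
var app = appManager.GetAvailable().FirstOrDefault(a => a.Title == "My SPFx Solution");

if (app != null)
{
    // 2-3. Deploy the app; skip feature deployment for tenant-scoped SPFx packages if required
    appManager.Deploy(app, skipFeatureDeployment: true);

    // 4-5. Install the app on the current site, handling the case where it is already installed
    try
    {
        appManager.Install(app);
    }
    catch (Exception ex)
    {
        Console.WriteLine($"App install skipped: {ex.Message}");
    }
}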

PnP PowerShell

First, let's get the list of apps in the App Catalog.

Note: There are two values supported by the Scope parameter for apps – Tenant and Site.

Get-PnPApp -Scope Tenant
or
Get-PnPApp -Scope Site

If the app is not already in the App Catalog, we will add it:

Add-PnPApp -Path "<.sppkg file location>" -Scope Site

Then, publish the App

Publish-PnPApp -Identity <app id> -Scope Tenant -SkipFeatureDeployment

-SkipFeatureDeployment is helpful to make the app available across the SharePoint tenancy if it was developed for tenant-wide deployment.

After the above, we will install the App

Install-PnPApp -Identity <app id> -Scope Site

After the app is installed, it is ready to be added or used at the site.

In the upcoming blog, we will see how to add SharePoint Framework extensions and web parts programmatically.

Happy Coding!!

SharePoint Integration for Health Care eLearning – Moving LMS to the Cloud

Health care systems often face challenges in the way of being unkept and unmaintained, or managed by too many hands without consistency in content, and harbouring outdated resources. A lot of these legacy training and development systems also wear the pain of constant record churning without a supportable record management system. With the accrual of these records over time forming a 'Big Data' concern, modernising these eLearning platforms may be the right call to action for medical professionals and researchers. Gone should be the days of manually updating Web Vista on a regular basis.
Cloud solutions for Health Care and Research should be well on their way, but the better utilisation of these new technologies will play a key factor in how much confidence health professionals invest in IT providing a means for departmental education and career development moving forward.
Why SharePoint Makes Sense (versus further developing Legacy Systems)
Every day, each document, slide image and scan matters when the paying customer’s dollar is placed on your proficiency to solve pressing health care challenges. Compliance and availability of resources aren’t enough – streamlined and collaborative processes, from quality control to customer relationship management, module testing and internal review are all minimum requirements for building a modern, online eLearning centre i.e. a ‘Learning Management System’.
ELearningIndustry.com has broken down ten key components that a Learning Management System (LMS) requires in order to be effective. From previous cases working on the development of an LMS, or OLC (Online Learning Centre) site, using SharePoint, these ten components can indeed be designed within the online platform:

  1. Strong Analytics and Report Generation – for the purposes of eLearning, e.g. dashboards which contain progress reports, exam scores and other learner data, SharePoint workflows allow for progress tracking of training and users' engagement with content and course materials, while versioning ensures that learning managers, content builders (subject matter experts) and the learners themselves are on the same page (literally).
  2. Course Authoring Capability – SharePoint access and user permissions are directly linked to your Active Directory. Access to content can be managed both from a hierarchical standpoint and role-based if we're talking content authors. Furthermore, learners can have access to specific 'modules' allocated to them based on department, vocation, etc.
  3. Scalable Content Hosting – flexibility of content, workflows or plug-ins (using 'app parts') to adapt functionality and welcome new learners where learning requirements may shift to align with organisational needs.
  4. Certifications – due to the availability and popularity of SharePoint Online in large/global enterprises, certifications for anywhere from smart to super users are available from Microsoft-affiliated authorities or verified third parties.
  5. Integrations (with other SaaS software, communication tools, etc.) – allow for exchange of information through APIs for content feeds and record management, e.g. with virtual classrooms, HR systems, Google Analytics.
  6. Community and Collaboration – added benefit of integrated and packaged Microsoft apps to create channels for live group study or learner feedback, for instance (Skype for Business, Yammer, Microsoft Teams).
  7. White Labelling vs. Branding – UI friendly, fully customisable appearance. The modern layout is design-flexible to allow the institute's branding to be proliferated throughout the tenant's SharePoint sites.
  8. Mobile Capability – SharePoint has a mobile app and can be designed to be responsive to multiple mobile device types.
  9. Customer Support and Success – as it is a common enterprise tool, support by local IT should be feasible, with any critical product support inquiries routed to Microsoft.
  10. Support of the Institute's Mission and Culture – in Health Care Services, where the churn of information and data pushes for an innovative, rapid response, SharePoint can be designed to meet these needs where, as an LMS, it can adapt to continuously represent the expertise and knowledge of Health Professionals.

Outside of the above, the major advantage for health services to make the transition to the cloud is the improved information security experience. There are still plenty of cases today where patients are at risk of medical and financial identity fraud due to inadequate information security and manual (very implicitly hands-on) records management processes. Single platform databasing, as well as the from-anywhere accessibility of SharePoint as a Cloud platform meets the challenge of maintaining networks, PCs, servers and databases, which can be fairly extensive due to many health care institutions existing beyond hospitals, branching off into neighbourhood clinics, home health providers and off-site services.

Provisioning complex Modern Sites with Azure Functions and Flow – Part 2 – Create and Apply Template

In the previous blog here, we got an overview of the high-level architecture of a complex Modern team site provisioning process. In this blog, we will look at step 1 of the process, the Create and Apply Template process, in detail.
Before that, below are a few links to earlier blogs, as a refresher, on the prerequisites for this blog.

  1. Set up a Graph App to call Graph Service using App ID and Secret – link
  2. Sequencing HTTP Trigger Azure Functions for simultaneous calls – link
  3. Adding and Updating owners using Microsoft Graph Async calls – link

Overview
The Create and Apply Template process aims at the following
1. Create a blank modern team site using Groups Template (Group#0 Site template)
2. Apply the provisioning template on the created site.
Step 1: Create a blank Modern team site
For creating a modern team site using CSOM, we will use the TeamSiteCollectionCreationInformation class of OfficeDevPnP. Before we create the site, we will make sure it doesn't already exist; a sketch is included after the note below.

Note: There is an issue with the Site Assets library not getting initialised
when the site is created using the code below.
Hence, calling EnsureSiteAssetsLibrary is necessary.
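A minimal C# sketch is below; the tenant URL, alias and display name are placeholders, and adminCtx is assumed to be an authenticated ClientContext.

using System.Threading.Tasks;
using Microsoft.SharePoint.Client;      // OfficeDevPnP.Core extension methods
using OfficeDevPnP.Core.Sites;

public static async Task CreateTeamSiteAsync(ClientContext adminCtx)
{
    var siteUrl = "https://<tenant>.sharepoint.com/sites/projectx";

    // Make sure the site doesn't already exist
    if (adminCtx.Web.WebExistsFullUrl(siteUrl))
    {
        return;
    }

    var creationInfo = new TeamSiteCollectionCreationInformation
    {
        Alias = "projectx",              // mail nickname of the underlying Office 365 Group
        DisplayName = "Project X",
        Description = "Project X team site",
        IsPublic = false
    };

    using (var newSiteCtx = await adminCtx.CreateSiteAsync(creationInfo))
    {
        // Work around the Site Assets library not being initialised on the new site
        newSiteCtx.Web.Lists.EnsureSiteAssetsLibrary();
        newSiteCtx.ExecuteQueryRetry();
    }
}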

Step 2:  Apply the Provisioning Template

Note: The apply-template process is a long-running process and takes 60-90 minutes to complete
for a complex provisioning template with many site columns, content types and libraries.
In order to prevent the Azure Function from timing out, it needs to be hosted
on an App Service Plan instead of a Consumption plan, so that the Azure Function
is not affected by the 10-minute timeout.

For the apply provisioning template process, use the steps below.
1. Reading the template
It is important to note that the XMLPnPSchemaFormatter version (in the code below) must match the schema version used to generate the PnP template. If the template was generated with an older version, set the XMLPnPSchemaFormatter to read from that older version. In order to find the version of the PnP template, open the XML and look at the start of the file.
(image: PnP template schema version shown at the top of the template XML)
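A minimal C# sketch for reading the template from the file system is below; the folder path, file name and schema version are placeholders to match your own template.

using OfficeDevPnP.Core.Framework.Provisioning.Connectors;
using OfficeDevPnP.Core.Framework.Provisioning.Providers.Xml;

var templateFolder = @"D:\home\site\wwwroot\Templates";   // e.g. a folder inside the Function App
var provider = new XMLFileSystemTemplateProvider(templateFolder, "");

// Match the formatter to the schema version found at the top of the template XML
var formatter = XMLPnPSchemaFormatter.GetSpecificFormatter(XMLPnPSchemaVersion.V201801);
var template = provider.GetTemplate("ProvisioningTemplate.xml", formatter);

// Keep a connector so file-based artefacts referenced by the template can be resolved
template.Connector = new FileSystemConnector(templateFolder, "");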

2. Apply the Template
For applying the template, we will use the ProvisioningTemplateApplyingInformation class of the OfficeDevPnP module. ProvisioningTemplateApplyingInformation also has a property called HandlersToProcess which can be used to invoke only particular handlers in the provisioning process. Below is the code for the same.
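A minimal C# sketch is below, continuing from the template read above; the progress logging and handler selection are illustrative, and web is assumed to be the Web of the newly created site.

using System;
using OfficeDevPnP.Core.Framework.Provisioning.Model;
using OfficeDevPnP.Core.Framework.Provisioning.ObjectHandlers;

var applyingInfo = new ProvisioningTemplateApplyingInformation
{
    // Log progress so the long-running apply can be traced from the Azure Function logs
    ProgressDelegate = (message, progress, total) =>
    {
        Console.WriteLine($"{progress}/{total} - {message}");
    },
    // Optionally restrict which handlers run, e.g. only fields, content types and lists
    HandlersToProcess = Handlers.Fields | Handlers.ContentTypes | Handlers.Lists
};

web.ApplyProvisioningTemplate(template, applyingInfo);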

After the apply-template process is complete, since the Flow will have timed out, we will invoke another Flow to do the post-processing by updating a list item in the SharePoint inventory list.
Conclusion
In this blog, we saw how we could create a modern team site and apply a provisioning template to it. In the next blog, we will finalise the process with the site-specific changes applied after the template.

Provisioning complex Modern Sites with Azure Functions and Microsoft Flow – Part 1 – Architecture

In one of my previous blogs here, I discussed creating Office 365 Groups using an Azure Function and Flow. The same process could also be used to provision Modern Team sites in SharePoint Online, because Modern Team Sites are Office 365 Groups too. However, if you are creating a complex Modern Team Site with lots of libraries, content types, term store associated columns etc., it will be challenging to do it with a single Azure Function.
Thus, in this blog (part 1), we will look at the architecture of a solution to provision a complex Modern Team Site using multiple Azure Functions and Flows. This is an approach that went through four months of validation and testing. There might be other options, but this one worked for our complex team site, which takes around 45-90 minutes to provision.
Solution Design
To start with, let's look at the solution design. The solution consists of two major components:
1. Template Creation – Create a SharePoint Modern Team site to be used as a template and generate a Provisioning template from it
2. Provisioning Process – create a SharePoint inventory list to drive the Flows and Azure Functions. There will be three Azure Functions that run three separate parts of the provisioning lifecycle. More details about the Azure Functions will be in the upcoming blogs.
Get the Provisioning Template
The first step in the process is to create a clean site that will be used as a reference template site for the provisioning template. In this site, create all the lists, libraries, site columns and content types, and set any other necessary site settings.
In order to make sure that the generated template doesn't have any elements which are not needed for provisioning, use the following PnP PowerShell cmdlet. The cmdlet below excludes the Application Lifecycle Management and Site Security handlers and removes content type hub (syndication) associations from the generated template.

Get-PnPProvisioningTemplate -Out "<path to ProvisioningTemplate.xml>" -ExcludeHandlers ApplicationLifecycleManagement, SiteSecurity -ExcludeContentTypesFromSyndication

The output of the above cmdlet is a ProvisioningTemplate.xml file which can be applied to new sites to set up the same SharePoint elements. To know more about the provisioning template file, schema and allowed tags, check the link here.
(diagram: ModernSitesProvisioningFlow – Get Template)
Team Site Provisioning Process
The second step in the process is to create and apply the template to a Modern SharePoint Team site using Flow and Azure Functions. The detailed steps are as follows:
1. Create an Inventory list to capture all the requirements for Site Creation
2. Create two flows
a) Create and Apply Template flow, and
b) Post Provisioning Flow
3. Create three Azure Functions –
a) Create a blank Modern Team Site
b) Apply the Provisioning Template on the above site. This is a long-running process and can take about 45-90 minutes to apply a complex template with about 20 libraries, 20-30 site columns and 10-15 content types
Note: Azure Functions on a Consumption plan have a timeout of 10 minutes. Host the Azure Function on an App Service Plan for the above to work without issues
c) Post-Provisioning to apply changes that are not supported by the provisioning template, such as creating default folders etc.
Below is the process flow for the provisioning process. It has steps from 1-11 which go from creating the site to applying the template. A brief list of the steps is as follows:

  1. Call the Create Site flow to start the Provisioning Process
  2. Call the Create Site Azure Function
  3. Create the Modern Team Site in the Azure Function and set any dependencies required for the apply-template step, such as navigation items, pages etc., and then return to the Flow
  4. Call the Apply Template Azure Function.
  5. Get the previously generated ProvisioningTemplate.xml file from a shared location
  6. Apply the template onto the newly created Modern site. Note: the Flow call times out because it cannot wait for such a long-running process
  7. Update the status column in the Site Directory for the post provisioning flow to start
  8. Call the post-provisioning Flow to run the post-provisioning Azure Function
  9. The post-provisioning Azure Function completes the remaining SharePoint changes which were not covered by the apply-template step, such as setting field default values, creating folders in libraries, and associating default values to taxonomy fields

(diagram: ModernSitesProvisioningFlow – Provisioning Process)
Conclusion:
Hence, in the above blog we saw, at a high architectural level, how to create a provisioning process to handle complex modern team site creation. In the upcoming blogs, we will deep dive into the Azure Functions that create the site, apply the template and run the post-provisioning process.
Happy Coding!!!
