Create Modern Pages and update metadata using SPFx Extensions, SP PnP JS and Azure Functions

Modern Site Pages (the “Site Page” content type) have a constraint when it comes to associating custom metadata with them: the “Site Page” content type cannot have additional site columns added to it, as can be seen below.

SitePageContentTypeMissing

On another note, even though we can create a child content type from the Site Page content type, the new site page creation process (screenshot below) doesn’t associate the child content type when the page is created, so the fields from the child content type cannot be populated.

For example, in the below screenshot we have created a new site page, test.aspx, using the “Intranet Site Page Content Type”, which is a child of the “Site Page” content type. After the page is created, it gets associated with the Site Page content type instead of the Intranet Site Page content type. We can edit it again to associate it with the Intranet Site Page content type, but that adds another step for end users and additional training effort.


Solution Approach:

To overcome the above constraints, we implemented a solution that associates custom metadata during modern site page creation, using a SharePoint Framework (SPFx) List View Command Set extension and an Azure Function. In this blog, I will briefly talk about the approach so it can be useful for anyone trying to do the same.

1. Create a List View Command Item for creating site pages, editing properties of site pages and promoting site pages to news
2. Create an Azure function that will create the Page using SharePoint Online CSOM
3. Call the Azure Function from the SPFx command.

A screenshot of the resulting SPFx extension dialog is below.

NewSitePage

Steps:

To override the process for modern page creation, we will use an Azure Function with the SharePoint Online PnP Core CSOM library. Below is an extract of the code. At a broad level, the Azure Function does the following:

1. Get the values of the site URL and page name from the query parameters
2. Check whether the site page already exists
3. Create the page if it is absent
4. Save the page

Note: The below code also includes the code to check if the page exists.
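The full function isn’t reproduced here, so below is a minimal sketch of what it can look like, assuming an HTTP-triggered Azure Function (v1, C# script style) referencing the SharePoint PnP Core library, with app-only credentials stored in the Function’s application settings; the parameter names siteUrl and pageName and the setting names are illustrative.

using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core;
using OfficeDevPnP.Core.Pages;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    // 1. Read the target site and page name from the query string
    string siteUrl = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Equals(q.Key, "siteUrl", StringComparison.OrdinalIgnoreCase)).Value;
    string pageName = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Equals(q.Key, "pageName", StringComparison.OrdinalIgnoreCase)).Value;

    var authManager = new AuthenticationManager();
    // App id/secret are assumed to live in the Function's app settings
    using (ClientContext ctx = authManager.GetAppOnlyAuthenticatedContext(
        siteUrl,
        Environment.GetEnvironmentVariable("SPClientId"),
        Environment.GetEnvironmentVariable("SPClientSecret")))
    {
        try
        {
            // 2. Load throws if the page doesn't exist yet
            ClientSidePage.Load(ctx, pageName);
        }
        catch (ArgumentException)
        {
            // 3. and 4. The page is absent, so create and save it
            var page = new ClientSidePage(ctx);
            page.Save(pageName);
        }
    }

    return req.CreateResponse(HttpStatusCode.OK, "Page created or already present");
}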

Next, create an SPFx List View Command Set extension and an SP dialog component that allow us to call the Azure Function from the Site Pages library to create pages. The code uses the Fetch API to call the Azure Function, passing the site URL and page name parameters the Azure Function needs to create the page. After the page is created, the Azure Function responds with a success status, which can be used to confirm the page creation.
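A minimal sketch of that call is below; the function URL and query parameter names are placeholders for your own.

function createPage(siteUrl: string, pageName: string): Promise<boolean> {
  // Placeholder Azure Function URL; the site and page go in the query string
  const functionUrl: string =
    `https://yourfunctionapp.azurewebsites.net/api/CreateModernPage` +
    `?siteUrl=${encodeURIComponent(siteUrl)}&pageName=${encodeURIComponent(pageName)}`;

  return fetch(functionUrl, {
    method: 'POST',
    headers: { 'Accept': 'application/json' }
  }).then((response: Response) => {
    // The function replies with a success status once the page is created
    return response.ok;
  });
}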

Note: Make sure the dialog is locked while this operation is in progress; implement code to prevent the form from being closed or resubmitted.

After the page is created, let’s update the properties of the item using the PnP JS library with the below code.
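A sketch of such an update is below, assuming the sp-pnp-js flavour of the library; the item id, page title and custom field name are illustrative and would come from your own dialog.

import pnp from "sp-pnp-js";

// Update the newly created page's list item with our custom metadata
pnp.sp.web.lists.getByTitle("Site Pages").items.getById(itemId).update({
  Title: pageTitle,
  PageCategory: category // illustrative custom site column
}).then(() => {
  console.log("Page properties updated");
});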

Conclusion:

As we can see above, we have overridden the page creation process with our own Azure Function, using an SPFx List View Command Set and PnP JS. I will detail the SP dialog for SPFx extensions in an upcoming blog, so keep an eye out for it.

The approach still has some limitations, listed below, for which you might have to get business approval.

1. The out-of-the-box ‘New Page’ option cannot be hidden from inside the extension.
2. The order of the command control cannot be rearranged; it is displayed after the out-of-the-box SharePoint elements.

Update Managed Metadata and Hyperlink column in a SharePoint list using PnP PowerShell

If you are trying to update a Managed Metadata and Hyperlink column in SharePoint online using PnP PowerShell and it is not getting updated, then this blog might be of help.

I had been working on a quick requirement to bulk-update Managed Metadata columns using a PowerShell script, and it was not working as planned. The cause of the issue was that the value we were setting didn’t follow the correct term store hierarchy format.

Managed Metadata

The value must follow the term store hierarchy format below; the update will not work if any level of the hierarchy or any delimiter is missed.

$val = "Group|TermSet|Term1|Term2..."

For example, if your group is “Test Group” with term set “Test Termset1”, level 1 term “Parent” and level 2 term “Child”, then the value is:

$val = "Test Group|Test Termset1|Parent|Child"

The update command in PnP PowerShell will be:

Set-PnPListItem -List "[Lib Name]" -Identity [ItemID] -Values  @{"Managed Metadata Column" = $val}

Hyperlink

This one was amusing because I didn’t spot the mistake until I realised that the URL and description are separated not just by a comma, but by a comma followed by a space! 🙂

HyperlinkImage

The format for updating a hyperlink column is “URL, Description”. Notice the space after the comma (,); it is very important to get this right. For example:

$val = "https://www.example.com, Example Site"

Set-PnPListItem -List "[Lib Name]" -Identity [ItemID] -Values @{"Hyperlink Column" = $val}

Conclusion

So above we saw how we can update Managed Metadata and Hyperlink columns using PnP PowerShell.

Intro to Site Scripts and Site Designs with a Simple SharePoint Modern Site provisioning

Microsoft announced Site Scripts and Site Designs in late 2017; they became available for Targeted Release in January 2018 and were recently released for general use. They are a quick way to let users create customised modern sites without any scripting hacks. In this blog, we will go through the steps of using Site Scripts and Site Designs for simple SharePoint modern site creation.

Before we get into the detailed steps, let’s get a brief overview of Site Designs and Site Scripts.

Site designs: Site designs are like templates. They can be used each time a new site is created to apply a consistent set of actions. You create site designs and register them in SharePoint against one of the modern site templates: the team site or the communication site. A site design can run multiple scripts; the script IDs are passed in an array, and they run in the order listed.

Site Scripts: A site script is a custom JSON-based script that runs when a site design is selected. Site scripts detail the provisioning actions, such as creating new lists or applying a theme. When the actions in the scripts are completed, SharePoint displays the detailed results of those actions in a progress pane. They can even call a Microsoft Flow trigger where site creation needs more complex provisioning.

Software Prerequisites:

  1. PowerShell 3.0 or above
  2. SharePoint Online Management Shell
  3. Notepad or any notes editor for JSON creation – I prefer Notepad++
  4. Windows System to run PowerShell
  5. And a must – a SharePoint tenant 🙂

Provisioning Process Overview:

The provisioning process is divided into the following steps:

1. Create a Site Script using JSON template to call actions to be run after a Modern Site is created.
2. Add the Script to your tenant
3. Create a Site Design and add the Site Script to the design. Associate the Site Design to Modern Site Templates – Team Site template is 64 and Communication Site Template is 68
4. Create a Modern Site from SharePoint landing page
5. Wait for the Site Script to apply changes and refresh your site

Quick and easy, right? Now let’s get to the “how to”.

Steps

1. JSON Template: We need to create a JSON template containing the declarations of the site resources to be provisioned after the site is created. For more details, here is a link to the Microsoft docs. The brief schema is below.

{
  "$schema": "schema.json",
  "actions": [
    ... [one or more verb actions] ...
  ],
  "bindata": { },
  "version": 1
}

For our blog here, we will use the below schema where we are creating a custom list with few columns.
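The script isn’t reproduced here, but a minimal example in the documented site script format, with an illustrative list name and columns, would look like this:

{
  "$schema": "schema.json",
  "actions": [
    {
      "verb": "createSPList",
      "listName": "Customer Tracking",
      "templateType": 100,
      "subactions": [
        {
          "verb": "setDescription",
          "description": "List of customers"
        },
        {
          "verb": "addSPField",
          "fieldType": "Text",
          "displayName": "Customer Name",
          "isRequired": true,
          "addToDefaultView": true
        },
        {
          "verb": "addSPField",
          "fieldType": "DateTime",
          "displayName": "Next Meeting",
          "isRequired": false,
          "addToDefaultView": true
        }
      ]
    }
  ],
  "bindata": { },
  "version": 1
}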

2. Site Script: Add the above site script to your tenant using PowerShell. The below command will return the Site Script GUID; copy it, as it will be used in the next step.

Get-Content '[ JSON Script location ]' -Raw | Add-SPOSiteScript -Title "[ Title of the script ]"

3. Site Design: After the Site Script is added, create the Site Design from the Site Script to be added to the dropdown menu options for creating the site.

Add-SPOSiteDesign -Title "[ Site design title ]" -WebTemplate "64" -SiteScripts "[ script GUID from above step ]" -Description "[ Description ]"

4. Create a Modern Site: After the Site Design is registered, you can see the design when creating a site, as shown below.

ModernTeamSiteWIthcustom

5. After the site script has finished applying changes, click on the Manual Refresh button as per the screenshot below.

SiteScriptFinish

When provisioning is complete, the final team site will look like the screenshot below.

CustomTeamSiteWithScriptResult

In this blog, we looked at Site Scripts and Site Designs and how to use them to provision modern team sites.

Migrate SharePoint contents using ShareGate

Background

The first step in any migration project is to take an inventory and determine the size of the data you are looking to migrate. For simplicity, this post assumes you have already done this activity and have planned new destination sites for your existing content. This means you have an Excel sheet somewhere identifying where your content will reside in the migrated site, and every existing site in scope has at least a new home pointer if it qualifies for migration.

The project I was working on had six levels of sites and subsites within a site collection. For simplicity, this post considers just one site collection in scope, which, after discussion with the business, was finalised to have a maximum of three levels of sites and subsites initially.

In this blog post, we will be migrating SharePoint 2010 and 2013 contents from on-premises to SharePoint Online (SPO) in our customer’s Office 365 tenant.

Creating Structure

After mapping the existing site collection, sites and subsites, we came up with Excel sheets containing the mapping for our desired structure, as shown below:

Level1.csv

Level2.csv

Level3.csv

The above files will be used as master reference sheets to create the site structure before pulling the trigger on migrating content. We will use the PowerShell script below to create the desired structure within our new site collection.

SiteCreation.ps1

$url = "https://your-client-tenant-name.sharepoint.com/"

function CreateLevelOne(){
    Connect-PnPOnline -Url $url -Credentials 'O365-CLIENT-CREDENTIALS'

    filter Get-LevelOne {
        New-PnPWeb -Title $_.SiteName -Url $_.URL1 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level1.csv | Get-LevelOne
}

function CreateLevelTwo(){

    filter Get-LevelTwo {
        $connectUrl = $url + $_.Parent

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL2 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level2.csv | Get-LevelTwo
}

function CreateLevelThree(){

    filter Get-LevelThree {
        $connectUrl = $url + $_.GrandParent + '/' + $_.Parent

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL3 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level3.csv | Get-LevelThree
}

CreateLevelOne

CreateLevelTwo

CreateLevelThree

Migrating Contents

Once you have successfully created your site structure, it is time to start migrating content to the newly created structure as per the mapping identified earlier. The CSV file format looks like below:

MG-Batch.csv

The final step is to execute the PowerShell script and migrate content using ShareGate commands from your source site to your destination site (as defined in your mapping file above).

Migration-ShareGate.ps1

# folder where files will be produced
$folderPath = "C:\_Kloud\SGReports\"
$folderPathBatches = "C:\_Kloud\MigrationBatches\"

filter Migrate-Content {
    # URLs for source and destination
    $urlDes = $_.DestinationURL
    $urlSrc = $_.SourceURL 

    # file where the migration log will be appended on each run of this script
    $itemErrorFolderPath = $folderPath + 'SG-Migration-Log-Webs.csv'

    # migration settings used by ShareGate commands
    $copysettings = New-CopySettings -OnContentItemExists IncrementalUpdate -VersionOrModerationComment "updated while migration to Office 365" 

    # credentials for the destination site (plain-text password shown for brevity)

    $pwdDest = ConvertTo-SecureString "your-user-password" -AsPlainText -Force
    $siteDest = Connect-Site -Url $urlDes -Username your-user-name -Password $pwdDest

    $listsToCopy = @() 

    $siteSrc = Connect-Site -Url $urlSrc

    $listsInSrc = Get-List -Site $siteSrc 

    foreach ($list in $listsInSrc)
    {
        if ($list.Title -ne "Content and Structure Reports" -and
            $list.Title -ne "Form Templates" -and
            $list.Title -ne "Master Page Gallery" -and
            $list.Title -ne "Web Part Gallery" -and
            $list.Title -ne "Pages" -and
            $list.Title -ne "Style Library" -and
            $list.Title -ne "Workflow Tasks"){

            $listsToCopy += $list
        }
    }

    # building a log entry with details for migration run

    $date = Get-Date -format d
    $rowLog = '"' + $siteSrc.Address + '","' + $siteSrc.Title + '","' + $listsInSrc.Count + '","' +  $siteDest.Address + '","' +  $siteDest.Title + '","' + $listsToCopy.Count + '","' + $date + '"'
    $rowLog | Out-File $itemErrorFolderPath -Append -Encoding ascii

    Write-Host Copying $listsToCopy.Count out of $listsInSrc.Count lists and libraries to '('$siteDest.Address')'
    #Write-Host $rowLog

    $itemLogFolderPath = $folderPath + $siteSrc.Title + '.csv'
    #Write-Host $itemLogFolderPath

    $result = Copy-List -List $listsToCopy -DestinationSite $siteDest -CopySettings $copysettings -InsaneMode -NoCustomPermissions -NoWorkflows
    Export-Report $result -Path $itemLogFolderPath
}

function Start-Migration($batchFileName)
{
    $filePath = $folderPathBatches + $batchFileName

    Import-Csv $filePath | Migrate-Content
}

Start-Migration -batchFileName "MG-Batch.csv"

Conclusion

In this blog post, we used ShareGate in silent mode, which lets us run multiple migration jobs in parallel from different workstations (depending on your ShareGate licensing).

For a complete list of ShareGate PowerShell commands, you can refer to the list at URL.

I hope you have found this post useful for creating a site structure and migrating content (lists/libraries) to its new home inside SharePoint Online.

How to set Property Bag values in SharePoint Modern Sites using SharePoint Online .Net CSOM

If you think setting property bag values programmatically for modern SharePoint sites using CSOM would be as straightforward as it is in classic SharePoint sites, then there is a surprise waiting for you.

In classic sites, the C# CSOM code below would set the property bag values as follows:
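A minimal sketch of that classic pattern is below; the site URL and credential variables are placeholders.

using (var ctx = new ClientContext("https://yourtenant.sharepoint.com/sites/modernsite"))
{
    // userName/securePassword are placeholders for your own credentials
    ctx.Credentials = new SharePointOnlineCredentials(userName, securePassword);

    // Set a key/value pair in the web's property bag and persist it
    Web web = ctx.Web;
    web.AllProperties["PropertyBagValue1"] = "Property";
    web.Update();
    ctx.ExecuteQuery();
}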

The challenge with the above method on a modern site is that the property bag values are not persisted after saving. If you load the context object again, the values are back to their initial state (i.e. without “PropertyBagValue1” = “Property”).

The cause of the issue is that modern sites have the NoScriptSite setting set to true, which prevents us from updating the object model through script. Below is a screenshot of the documentation from MS docs – https://docs.microsoft.com/en-us/sharepoint/dev/solution-guidance/modern-experience-customizations-customize-sites

ModernTeamSitesPropertyBag_Limitation

Resolution:

Using PowerShell

Resolving this is quite easy using PowerShell: set -DenyAddAndCustomizePages to $false.

Set-SPOSite <SiteURL> -DenyAddAndCustomizePages $false

However, when working with the CSOM model we need to take another approach (see below).

Using .NET CSOM Model

When using the CSOM model, we must use the SharePoint PnP Online CSOM package for this. Start by installing it from NuGet or the Package Manager in Visual Studio. Next, add the code below.

Walking through the code, we are first initializing Tenant Administration and then accessing the Site Collection using the SiteUrl property. Then we use the SetSiteProperties method to set noScriptSite to false.  After that, we can use the final block of code to set the property bag values.
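A sketch of that sequence is below, assuming the SharePoint PnP Online CSOM extensions (Tenant.SetSiteProperties and Web.SetPropertyBagValue) and the Tenant object from Microsoft.Online.SharePoint.TenantAdministration; URLs and credential variables are placeholders.

using (var adminCtx = new ClientContext("https://yourtenant-admin.sharepoint.com"))
{
    adminCtx.Credentials = new SharePointOnlineCredentials(adminUser, securePassword);

    // Initialize tenant administration and allow scripting on the target site
    var tenant = new Tenant(adminCtx);
    tenant.SetSiteProperties("https://yourtenant.sharepoint.com/sites/modernsite",
        noScriptSite: false);
    adminCtx.ExecuteQuery();
}

using (var ctx = new ClientContext("https://yourtenant.sharepoint.com/sites/modernsite"))
{
    ctx.Credentials = new SharePointOnlineCredentials(adminUser, securePassword);

    // With scripting allowed, the property bag update now persists
    ctx.Web.SetPropertyBagValue("PropertyBagValue1", "Property");
}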

When done, you may want to set noScriptSite back to true, as there are security implications for modern sites, as specified in this article – https://support.office.com/en-us/article/security-considerations-of-allowing-custom-script-b0420ab0-aff2-4bbc-bf5e-03de9719627c


Quick start guide for PnP PowerShell

It is quite easy and quick to set up PnP PowerShell on a local system and start using it. Considering that PnP PowerShell has been gaining a lot of momentum among Devs and Admins, I thought it would be good to have a post for everyone’s reference.

So why PnP PowerShell? Because it is the recommended and most frequently updated PowerShell module for IT pros working on SharePoint Online, SharePoint on-premises and Office 365 Groups. It allows us to remotely maintain a SharePoint Online and Office 365 tenancy, as we will see below.

Installation:

First, we have to make sure that we have a recent version of PowerShell (v4.0 or above) on the system. We can check it using the command $PSVersionTable.PSVersion, which will give the output below.

PowerShell Version Check

If your version isn’t sufficient, go to this link to upgrade – https://www.microsoft.com/en-us/download/details.aspx?id=40855. Since PowerShell has accompanying OS requirements, please check the System Requirements to make sure your OS matches the specs.

Also, I would recommend installing the SharePoint Online Management Shell, because there are various tenant commands that you may not find in PnP PowerShell. The download can be found here – https://www.microsoft.com/en-au/download/details.aspx?id=35588. Again, check the System Requirements to make sure your OS matches the specs.

After you have the proper PowerShell version, run the following commands to install the PnP PowerShell module, depending on whether you’re using the Online or an on-premises version. You only need to install the version(s) you interact with (no need to install the on-premises modules if you’re only working online).

SharePoint Online: Install-Module SharePointPnPPowerShellOnline
SharePoint 2016:   Install-Module SharePointPnPPowerShell2016
SharePoint 2013:   Install-Module SharePointPnPPowerShell2013

The above command(s) will likely take a few minutes to complete.

Getting Started Guide:

To start using PnP PowerShell, first we will have to understand how PnP PowerShell works.

PnP PowerShell works in the context of the current connection, i.e. the site it is connected to. Assuming it is connected to a site collection, all commands refer by default to that site collection. This is different from SPO management shell commands, which require you to have tenant admin rights. A benefit of the PnP approach is that you can have separate admin personnel manage separate site collections or sites if needed.

Overview of Basic SharePoint Operations with PnP PowerShell

1. Connect to PnP Online

Connect-PnPOnline -Url <SiteURL>

When running the above, if you are not prompted for your credentials, create a credential object and pass it to the Connect command (as seen below).

$user = ""
$secpasswd = ConvertTo-SecureString "" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($user, $secpasswd)
Connect-PnPOnline -Url $siteAdminURL -Credentials $mycreds

2. Get the web object using Get-PnPSite and then get the subsites in it

## For Site Collection (root web)
$web = Get-PnPSite 
## For Sub web
$web = Get-PnPWeb -Identity ""

Remember, PnP works in the context of the current connection.

3. To work on lists or other objects of the site, use the -Includes parameter to request them, or else they will not be initialized.

$web = Get-PnPSite -Includes RootWeb.Lists

PnPPowerShell output

After you get the list object, you can traverse it as needed using the object.

4. To work on list items, you must explicitly request the list items; the same goes for updating items.

$listItem = Get-PnPListItem -List "Site Assets"

DocumentResults

Set-PnPListItem -List "Site Assets" -Identity 2 -Values @{"" = ""}

PnPPowerShell_UpdateItem

So, as we saw above, we can use PnP PowerShell to maintain SharePoint assets remotely.

Please feel free to let me know if there are any specific questions you may have regarding the approach above and I will post about those as well.

Automate SharePoint Document Libraries Management using Flow, Azure Functions and SharePoint CSOM

I’ve been working on a client requirement to automate SharePoint document library management via scripts, implementing a document lifecycle across many document libraries that have custom content types and require regular housekeeping of ownership and permissions.

Solution overview

To provide a seamless user experience, we decided to do the following:

1. Create a document library template (.stp) with all the prerequisite folders and content types applied.

2. Create a list to store the data about entries for said libraries. Add the owner and contributors for the library as columns in that list.

3. Whenever the title, owners or contributors are changed, the destination document library will be updated.

Technical Implementation

The solution has two main elements to automate this process

1. Microsoft Flow – Trigger when an item is created or modified

2. Two Azure Functions – Create the library and update permissions

The broad steps and code are as follows:

1. When the flow is triggered, we check the status field to determine whether the item is a new entry or a change.

Note: Since Microsoft Flow doesn’t have conditional triggers to differentiate between item created and item modified events, use a text status column in the SharePoint list, set to values such as start, in progress and completed, to identify create and update events.

2. The flow will call an Azure Function via an HTTP POST action. Below is the configuration of this.

AzureFunctionFromFlow

3. For the “Create Library” Azure function, create a HTTP C# Function in your Azure subscription.

4. In the Azure Function, open Properties -> App Service Editor. Then add a folder called bin and copy two files into it:

  • Microsoft.SharePoint.Client.dll
  • Microsoft.SharePoint.Client.Runtime.dll

KuduTools_AzureFunction

Create Lib App Service Editor

Please make sure to get the latest copy of the SharePoint PnP Online CSOM NuGet package (SharePointPnPCoreOnline). To do that, you can set up a VS solution and copy the files from there, or download the NuGet package directly and extract the files from it.

5. After copying the files, reference them in the Azure function using the below code

#r "Microsoft.SharePoint.Client.dll"
#r "Microsoft.SharePoint.Client.Runtime.dll"
#r "System.Configuration"
#r "System.Web"

6. Then create the SharePoint client context and create a connection to the source list.

7. After that, use the ListCreationInformation class to create the Document library from the library template using the code below.
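A minimal sketch of steps 6 and 7 is below, assuming the client context (ctx) has already been created; the template name and library title variables are illustrative.

// Load the custom list templates (includes uploaded .stp templates)
Web web = ctx.Web;
ListTemplateCollection templates = ctx.Site.GetCustomListTemplates(web);
ctx.Load(templates);
ctx.ExecuteQuery();

// "DocLibTemplate" is a placeholder for your uploaded template's name
ListTemplate template = templates.First(t => t.Name == "DocLibTemplate");

var creationInfo = new ListCreationInformation
{
    Title = libraryTitle, // comes from the triggering list item
    TemplateFeatureId = template.FeatureId,
    TemplateType = template.ListTemplateTypeKind
};

List newLibrary = web.Lists.Add(creationInfo);
ctx.ExecuteQuery();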

8. After the library is created, break the role inheritance for the library as per the requirement

9. Update the library permissions using the role assignment object

10. To differentiate between People, SharePoint Groups and AD Groups, find the unique ID and add the group as per the script below.
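The original script isn’t reproduced here; continuing from the sketch above, a combined sketch of steps 8 to 10 could look like the following, with the login name and role level as placeholders.

// Step 8: stop inheriting permissions, keeping no existing assignments
newLibrary.BreakRoleInheritance(false, false);

// Step 10: EnsureUser resolves both people and AD groups by login name;
// for a SharePoint group, use web.SiteGroups.GetByName(groupName) instead
User owner = web.EnsureUser(ownerLoginName);
ctx.Load(owner);

// Step 9: grant the resolved principal a role on the library
RoleDefinition fullControl = web.RoleDefinitions.GetByType(RoleType.Administrator);
var bindings = new RoleDefinitionBindingCollection(ctx) { fullControl };
newLibrary.RoleAssignments.Add(owner, bindings);

ctx.ExecuteQuery();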

Note: In case you have people objects that are not in AD anymore because they have left the organisation, please refer to this blog for validating them before updating – Resolving “User not found” issue while assigning permissions using SharePoint CSOM

Note: Avoid calling item.Update() from the Azure Function, as that will trigger another flow run and cause an iterative loop; use item.SystemUpdate() instead.

11. After the update is done, return to the Flow with the success value from the Azure Function which will complete the loop.

As shown above, we can automate document library creation from a template and manage permissions using Flow and Azure Functions.

Building my first PowerApp, a basic roster, pulling data from SharePoint Online

What is PowerApps?

PowerApps is a set of services and apps that enable power users to build line-of-business applications rapidly. It can connect to the cloud services and data sources that we may already be using.

Why PowerApps?

PowerApps gives power users the ability to quickly build apps that suit specific needs. They can share apps instantly with their team across the web, tablets, and mobile devices. A few of the advantages of PowerApps are:

  • Simple and fast – capable of producing an app in minutes that pulls in data from Excel or a cloud service.

  • Can be integrated with Microsoft Flow, making it possible to trigger workflows from within apps.

  • Robust and enterprise-grade, so can be used for complex requirements.

What you need to get started?

We can choose from two options:

  • PowerApps Studio for web

  1. Go to the URL: https://web.powerapps.com

  2. Sign in with your tenant account, which would be something like xxx@yyy.onmicrosoft.com

  3. Choose any of the start options

  • PowerApps Studio for Windows

  1. Go to the URL: https://powerapps.microsoft.com/en-us/downloads/

  2. Download the app for Windows.

  3. Log in using your tenant account, which would be something like xxx@yyy.onmicrosoft.com

We would need to use web.powerapps.com to configure and manage data connections and on-premises gateways, and to work with the Common Data Service.

Few points to note:

After we create an app, we can administer it in the admin center.

We run apps in a browser from Microsoft Dynamics 365 or by using PowerApps Mobile, which is available for Windows, iOS, and Android devices.

PowerApps components:

  • web.powerapps.com – manage and share the apps we build
  • PowerApps Studio – build powerful apps with easy to use visual tools
  • PowerApps Mobile – run apps on Windows, iOS, and Android devices
  • PowerApps admin center – administer PowerApps environments and other components

PowerApps Studio

PowerApps Studio has three panes and a ribbon, giving a view similar to creating a PowerPoint presentation:

  1. Left navigation bar: shows thumbnail
  2. Middle pane: shows the screen that you’re working on
  3. Right-hand pane: shows options such as layout and data sources
  4. Property drop-down list: where you select the properties that formulas apply to
  5. Formula bar: where you add formulas
  6. Ribbon: where you add controls and customize design elements

PowerAppStudio.png

PowerApps Mobile

PowerApps Mobile for Windows, iOS, and Android provides an environment where instead of going to separate app stores, we stay in PowerApps and have access to all the apps that we have created and that others have shared with us.

Admin center

The PowerApps admin center is the centralized place to administer PowerApps for an organization. This is where we define different environments, data connections, and other elements. The admin center is also where we create Common Data Service databases, and manage permissions and data policies.

Let’s create our first App

OK, now it’s time to create our first PowerApp, so let’s fire up the PowerApps desktop studio. For this demo, we will use SharePoint as the data source and the phone layout.

  • Click New -> SharePoint -> Phone layout

PowerAppStudioDesktopStart.png

Connect to a data source

On the next screen, we need to specify the connection for SharePoint: enter the SharePoint URL and click Go.

PowerAppSourceConnection.png

On the next screen, we can see the lists on the specified SharePoint site and select the respective list; in our example we will choose the list “Roster” and click Connect, after which PowerApps starts generating the app.

powerappsourceurl2.png

The generated app is always based on a single list; we can add more data to the app later. An app with default screens is built for us, which we can see in action by clicking the play (Start app preview) icon.

powerappsharepointbaseapp1.png

Our app in action

MyFirstPowerApp.gif

Our three screen roster app opens in PowerApps Studio. All apps generated from data have the same set of screens:

  • The browse screen: here we can browse, sort, filter, and refresh the data pulled in from the list, as well as add items by clicking the (+) icon.
  • The details screen: here we view the details of an item, and can choose to delete or edit the item.
  • The edit/create screen: here we edit an existing item or create a new one.

Conclusion

So, in this post, we explored PowerApps and the development tools available, like the PowerApps desktop studio and PowerApps web studio. Without writing a single line of code, we created our first basic three-screen PowerApp and were able to perform CRUD (create, read, update, delete) operations on our SharePoint data source.

In the next post, we will try creating a PowerApp from a SharePoint list.

Integrating Yammer data within SharePoint web-part using REST API

Background

We were developing a SharePoint application for one of our clients and had some web parts that had to retrieve data from Yammer. As we were developing on SharePoint Online (SPO) using the popular SharePoint Framework (SPFx), for most of our engagement we were developing with the client-side library React to deliver what was required of us.

To integrate the client’s Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.

Scenario

We had around 7-8 different calls to the Yammer API in different web parts, extracting data from Yammer on behalf of the logged-in user. For each API call, the user has to be authenticated before the call is made, and this has to happen without the user being redirected to Yammer for login or presented with a popup or a button to log in first.

Following Yammer’s JavaScript SDK instructions as-is would not meet our client’s requirement: the user must not be sent to Yammer first (as this would change their user flow) or shown a pop-up with a login/sign-in dialog.

Approach

After searching the internet for a way to fulfil the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app before. In our case, this isn’t guaranteed, as many users will be accessing the SharePoint home page for the first time and have never accessed Yammer before.

What we did was split our API login calls into two groups. One of the calls is chosen to log the user in to Yammer, get the access token in the background and cache it with the Yammer API, while the other login calls wait for that first login to complete and then use the Yammer API to log in.

Step-1

This function uses the standard Yammer API to check the login status. If successful, it proceeds with issuing the API data-retrieval calls; if the first check fails, it waits and checks again every 2 seconds until it times out after 30 seconds.

  public static loginToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

        yam.getLoginStatus((FirstloginStatusResponse) => {
        if (FirstloginStatusResponse.authResponse) {
          callback(yam);
        }
        else {
          let timerId = setInterval(()=>{
              yam.getLoginStatus((SecondloginStatusResponse) => {
                if (SecondloginStatusResponse.authResponse) {
                  clearInterval(timerId);
                  callback(yam);
                }
              });
          }, 2000);

          setTimeout(() => {
              yam.getLoginStatus((TimeOutloginStatusResponse) => {
                // stop polling once the timeout is reached, whatever the outcome
                clearInterval(timerId);
                if (!TimeOutloginStatusResponse.authResponse) {
                  console.error("iFrame - user could not log in to Yammer even after waiting");
                }
              });
          }, 30000);
        }
      });
    });
  }

Step-2

This method again uses the standard Yammer API to check the login status, then tries to log the user in in the background using the iframe approach called out in the PnP sample. If that approach doesn’t work either, it redirects the user, in the same window, to the smart URL to get user consent for the Yammer app, with a redirect URI set to the home page of your SharePoint site where the Yammer-integrated web parts are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        }
        else if (requestLogin) {
          this._iframeAuthentication()
              .then((res) => {
                callback(yam);
              })
              .catch((e) => {
                window.location.href="https://www.yammer.com/[your-yammer-network-name]/oauth2/authorize?client_id=[your-yammer-app-client-id]&response_type=token&redirect_uri=[your-sharepoint-home-page-url]";
                console.error("iFrame - user could not log in to Yammer due to error. " + e);
              });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }

The function _iframeAuthentication is copied from the PnP sample, with some modifications to fit the client requirements we were developing against.


  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");
        document.body.appendChild(element);

        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.platform.setAuthToken(token);
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(res);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

       let url: string = `https://www.yammer.com/${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
      });
    }

Conclusion

This resulted in authenticating the Office 365 tenant user within the same window as the SharePoint home page, either with the help of an iframe (case: the user had consented to the Yammer app before) or by getting Yammer app consent from the user without redirecting them to Yammer for OAuth-based authentication (case: the user is accessing the Yammer-integrated web parts for the first time).

We do hope future releases of the Yammer API will cater for seamless integration among O365 products without the hassle of obtaining access tokens in the way described in this post.

Moving SharePoint Online workflow task metadata into the data warehouse using Nintex Flows and custom Web API

This post suggests an approach for automatically copying SharePoint Online (SPO) workflow task metadata into an external data warehouse. In this scenario, workflow tasks become the subject of another workflow that automatically copies each task’s data into the external database, using a custom Web API endpoint as the interface to that database. Commonly, the requirement to move workflow task data elsewhere arises from limitations of SPO. In particular, SPO throttles requests for workflow data, making it virtually impossible to build a meaningful workflow reporting system over large numbers of workflow tasks. The easiest approach to the problem is to use a Nintex workflow to “listen” to changes in the workflow tasks, request the task data via the SPO REST API and, finally, send the data to the data warehouse’s Web API endpoint.

Some SPO solutions require a reporting system that includes workflow task metadata; for example, a report on documents with the statuses of the workflows linked to those documents. Using a conventional approach (e.g. the SPO REST API) to obtain the data is unfeasible, as SPO throttles requests for workflow data. In fact, the throttling is so tight that generating reports with more than a hundred records is unrealistic. In addition, many companies would like to create Business Intelligence (BI) systems analysing workflow task data. Having a data warehouse with all the workflow task metadata assists in this job very well.

To implement the solution, a few prerequisites must be met. You must know the basics of Nintex workflow creation and be able to create a backend solution with the database of your choice and a custom Web API endpoint that writes the data model to that database. In this post, we have used Visual Studio 2015 and created an ordinary REST Web API 2.0 project with an Azure SQL Database.

The solution will involve following steps:

  1. Get sample of your workflow task metadata and create your data model.
  2. Create a Web API capable of writing data model to the database.
  3. Expose one POST endpoint method of the Web REST API that accepts JSON model of the workflow task metadata.
  4. Create Nintex workflow in the SPO list storing your workflow tasks.
  5. Design Nintex workflow: call SPO REST API to get JSON metadata and pass this JSON object to your Web API hook.

Below is detailed description of each step.

We are looking to export the metadata of a workflow task. Find the SPO list that holds all your workflow tasks and navigate there; you will need the name of the list to start calling the SPO REST API. It is easiest to use a REST tool such as Fiddler or Postman (Chrome extension) to perform the Web API requests. Query the SPO REST API to get a sample of the JSON data that you want to put into your database. The request will look similar to this example:

Picture 1

The key element in this request is getbytitle(“list name”), where “list name” is the SPO list name of your workflow tasks. Remember to add the “Accept” header with the value “application/json”; it tells SPO to return JSON instead of HTML. As a result, you will get one JSON object that contains the metadata of Task 1. This JSON object is an example of the data that you will need to put into your database. Not all fields are required in the data warehouse, so we create a data model containing only the fields of our choice. For example, it can look like the one below in C#, where all properties are based on the model returned earlier:
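The model isn’t reproduced here; an illustrative sketch, with property names mirroring the fields you selected from the REST response, could look like this (the class and property names are placeholders):

public class WorkflowTaskModel
{
    // Property names mirror the SPO REST fields chosen for the warehouse
    public int Id { get; set; }
    public string Title { get; set; }
    public string Status { get; set; }
    public DateTime? DueDate { get; set; }
    public DateTime Modified { get; set; }
}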

The next step is to create a Web API that exposes a single method accepting our model as a parameter in the body of the request. You can choose any REST Web API design. We created a simple Web API 2.0 project in Visual Studio 2015 using the general MVC/Web API 2.0 wizard. Then we added an empty controller and filled it with code that uses Entity Framework to write the data model to the database. We also created a code-first EF database context that works with just the one entity described above.

The code of the controller:
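The controller code isn’t reproduced here; a minimal Web API 2.0 sketch using the illustrative model above and the context class shown next could look like this (a production version would also handle updates to existing rows):

public class EmployeeFileReviewTasksWebHookController : ApiController
{
    // POST api/EmployeeFileReviewTasksWebHook
    public async Task<IHttpActionResult> Post([FromBody] WorkflowTaskModel model)
    {
        if (model == null)
        {
            return BadRequest("Request body did not contain a valid task model.");
        }

        using (var db = new WorkflowTasksContext())
        {
            db.WorkflowTasks.Add(model);
            await db.SaveChangesAsync();
        }

        return Ok();
    }
}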

The code of the database context for Entity Framework
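A matching code-first context sketch, with an assumed connection string name:

public class WorkflowTasksContext : DbContext
{
    // "WorkflowTasksDb" is a placeholder connection string name in Web.config
    public WorkflowTasksContext() : base("name=WorkflowTasksDb") { }

    public DbSet<WorkflowTaskModel> WorkflowTasks { get; set; }
}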

Once you have created the Web API, you should be able to call the Web API method like this:
https://yoursite.azurewebsites.net/api/EmployeeFileReviewTasksWebHook

You will need to put your model data in the request body as a JSON object. Also, don’t forget to include the proper authentication headers and the “Accept” header with “application/json”, and set the request type to POST. Once you’ve tested the method, you can move on to the next steps. Below is how we tested it in our project.

Picture 4

Next, we will create a new Nintex Workflow in the SPO list with our workflow tasks. It is all straightforward. Click Nintex Workflows, then create a new workflow and start designing it.

Picture 5

Picture 6

Once you’ve created a new workflow, click the Workflow Settings button. In the displayed form, set the parameters as shown in the screenshot below. We set “Start when items are created” and “Start when items are modified”. With these settings, any modification of a workflow task will start this workflow automatically, including cases where the workflow task has been modified by other workflows.

Picture 7.1

Create 5 steps in this workflow, as shown in the following screenshots labelled 1 to 5. Keep in mind that blocks 3 and 5 are there to assist in debugging only and are not required in production use.

Picture 7

Step 1. Create a Dictionary variable that contains the SPO REST API request headers. You can add any required headers, including authentication headers. It is essential to include the Accept header with “application/json” to tell SPO that we want JSON in the responses. We set the output variable to SPListRequestHeaders so we can use it later.

Picture 8

Step 2. Call HTTP Web Service. We call the SPO REST API here. It is important to make sure the getbytitle parameter is correctly set to your workflow tasks list, as discussed before. The fields we want returned are defined in the “$select=…” parameter of the OData request; we need only the fields included in our data model. Other settings are straightforward: we supply the request headers created in Step 1 and create two more variables for the response. SPListResponseContent will hold the resulting JSON object that we need in Step 4.

Picture 9

Step 3 is optional; we added it to debug our workflow. It sends an email with the contents of the JSON response from the previous step, showing us what was returned by the SPO REST API.

Picture 10

Step 4. Here we call our custom API endpoint, passing the JSON object model we got from the SPO REST API. We supply the full URL of our web hook, set the method to POST and inject SPListResponseContent from Step 2 into the request body. We also capture the response code to display later in the workflow history.

Picture 11

Step 5 is also optional. It writes a log message with the response code that we have received from our API Endpoint.

Picture 12

Once all five steps are completed, we publish this Nintex workflow. Now we are ready for testing.

To test the system, open the list of workflow tasks, click any task, modify any of its properties and save. This will initiate our workflow automatically. You can monitor workflow execution in the workflow history. Once the workflow completes, you should see messages as displayed below; notice that our workflow has also written the Web API response code at the end.

Picture 13

To make sure that everything went well, open your database and check the records updated by your Web API. After every Workflow Task modification you will see corresponding changes in the database. For example:

Picture 14


In this post we have shown that automatically copying workflow task metadata into your data warehouse can be done with a simple Nintex workflow setup and only two REST Web API requests. The solution is quite flexible, as you can select the required properties from the SPO list and export them into the data warehouse. We can easily add more tables if there is more than one workflow tasks list. This solution enables the creation of a powerful reporting system using the data warehouse and also lets you employ the BI data analytics tool of your choice.