Creating SharePoint Modern Team sites using Site Scripts, Flow and Azure Function

With Site Scripts and Site Designs, it is possible to invoke custom PnP provisioning for modern team sites from a Site Script. In the previous blog, we saw how we can provision simple modern sites using Site Script JSON. However, there are some scenarios where we need a custom provisioning template or process, such as the ones listed below:

  • Auto-deploying custom web components such as SPFx extensions and apps
  • Complex site templates that cannot be configured through the out-of-the-box JSON schema
  • Complex document libraries and content types that are not covered by the JSON schema. For an idea of the items supported by the OOB schema, please check here.

Hence, in this blog, we will see how we can use Flow and Azure Functions to apply more complex templates and customization on SharePoint Modern Sites.

Software Prerequisites:

  • Azure Subscription
  • Office 365 subscription or MS Flow subscription
  • PowerShell 3.0 or above
  • SharePoint Online Management Shell
  • PnP PowerShell
  • Azure Storage Emulator*
  • Postman*

* Optional, helpful for Dev and Testing

High Level Overview Steps:

1. Create an Azure Queue Storage Container
2. Create a Microsoft Flow with Request Trigger
3. Put an item into Azure Queue from Flow
4. Create an Azure Function to trigger from the Queue
5. Use the Azure Function to apply the PnP Provisioning template

Detailed Steps:

This can get quite elaborate, so hold on!!


1. Create an Azure Queue Storage container.

Note: For dev and testing, you can use the Azure Storage Emulator to emulate the queue. For more details on configuring the Azure Storage Emulator on your system, please check here.

Microsoft Flow

2. Create a Microsoft Flow with a Request trigger and then add the JSON schema below.
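For reference, a request body schema along the lines of the one in the Microsoft site design documentation (a webUrl string plus a parameters object for the values the site script sends) looks like this:

{
    "type": "object",
    "properties": {
        "webUrl": {
            "type": "string"
        },
        "parameters": {
            "type": "object",
            "properties": {
                "event": {
                    "type": "string"
                },
                "product": {
                    "type": "string"
                }
            }
        }
    }
}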

Note: If you have an Office 365 Enterprise E3 license, you already have a Flow subscription included; otherwise you can register for a trial here.

3. Enter a message into the Queue in the Flow using the “Add message to Azure Queue” action.


Note: The flow trigger URL has an access key which allows it to be called from any tenant. For security reasons, please don’t share it with any third parties unless needed.

Custom SharePoint Site Template (PowerShell)

4. Next, create a template site for provisioning and make all the configuration changes you will need for the initial implementation. Then extract the template using PnP PowerShell with the provisioning command shown below.

Get-PnPProvisioningTemplate -Out .\TestCustomTeamTemplate.xml -ExcludeHandlers Navigation, ApplicationLifecycleManagement -IncludeNativePublishingFiles

Note: The ExcludeHandlers option depends on your requirements, but the configuration in the above command will save you from a lot of issues you could otherwise encounter while applying the template later. So, use the above as a starting template.

Note: Another quick tip, if you have any custom theme applied on the template site, then the provisioning template doesn’t carry it over. You might have to apply the theme again!

5. Export and save the PowerShell PnP Module to a local drive location. We will use it later in the Azure Function.

Save-Module -Name SharePointPnPPowerShellOnline -Path "[Location on your system or shared drive]"

6. Register an app ID and app secret (via the appregnew.aspx page on your tenant) and provide the settings below.
7. Copy the app ID and secret; we will use them later in the Azure Function. Below is a screenshot of the App registration page.
8. Trust the app (via the appinv.aspx page on your tenant) by providing the permission XML below. Fill in the app ID to look up the details of the app.
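For a tenant-wide provisioning scenario like this one, the permission XML is typically an app-only, tenant-scoped FullControl request like the one below; adjust the scope and rights to your own requirements.

<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/tenant" Right="FullControl" />
</AppPermissionRequests>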

Azure Function

9. Create a Queue Trigger PowerShell Azure function
10. After the function is created, go to the Advanced Editor (Kudu) and create a subfolder "SharePointPnPPowerShellOnline" under site -> wwwroot -> [function_name] -> modules. Upload all the files from the PowerShell module folder saved in the step above into this folder.
11. Add the below PowerShell to the Azure Function
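As a minimal sketch of what run.ps1 could contain (assuming the default $triggerInput queue binding, that the Flow forwards just the site URL as the queue message, and that the app ID and secret from the earlier steps are stored in the Function App settings as AppId and AppSecret):

# Read the site URL from the queue message (default input binding name for the queue trigger)
$siteUrl = Get-Content $triggerInput -Raw
Write-Output "Applying provisioning template to '$siteUrl'"

# The SharePointPnPPowerShellOnline module uploaded to the function's "modules" folder is loaded automatically.
# AppId / AppSecret are Function App settings holding the app ID and secret registered earlier.
Connect-PnPOnline -Url $siteUrl -AppId $env:AppId -AppSecret $env:AppSecret

# Apply the template extracted earlier (path assumes the XML was uploaded next to the function)
Apply-PnPProvisioningTemplate -Path "D:\home\site\wwwroot\[function_name]\TestCustomTeamTemplate.xml"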

12. Test the function with the below input from PowerShell.

$uri = "[the URI you copied in step 14 when creating the flow]"
$body = "{webUrl:'somesiteurl'}"
Invoke-RestMethod -Uri $uri -Method Post -ContentType "application/json" -Body $body

PowerShell and JSON

13. Create a Site Script with the below JSON and add it to a Site Design. For more detailed steps, please check the link here.
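For reference, a minimal site script using the documented triggerFlow verb would look like the following (the name and parameter values are illustrative, and the url is the Flow trigger URL from step 2):

{
    "$schema": "https://developer.microsoft.com/json-schemas/sp/site-design-script-actions.schema.json",
    "actions": [
        {
            "verb": "triggerFlow",
            "url": "[paste the Flow trigger URL copied in step 2 here]",
            "name": "Apply PnP provisioning template",
            "parameters": {
                "event": "site creation",
                "product": "SharePoint Online"
            }
        }
    ],
    "bindata": {},
    "version": 1
}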

14. After the above, you are finally ready to run the provisioning process. Yay!!

But before we finish off, one quick tip is that when you click manual refresh, the changes are not immediately updated on the site. It may take a while, but it will apply.


In the above blog, we saw how we can create site templates with a custom provisioning template applied by Flow and an Azure Function, triggered from SharePoint Site Scripts and Site Designs.

For more blogs, click here

Create Modern Pages and update metadata using SPFx Extensions, SP PnP JS and Azure Functions

Modern site pages (the "Site Page" content type) have a constraint when it comes to associating custom metadata with them: the "Site Page" content type cannot have other site columns added to it, as can be seen below.


On another note, even though we can create child content types from the Site Page content type, the new site page creation process (screenshot below) doesn't associate the new content type when the page is created. So, the fields from the child content type cannot be associated.

For example, in the screenshot below we have created a new site page – test.aspx – using the "Intranet Site Page" content type, which is a child of the "Site Page" content type. After the page is created, it gets associated with the Site Page content type instead of the Intranet Site Page content type. We can edit it again to associate it with the Intranet Site Page content type, but that adds another step for end users and additional training effort.




Solution Approach:

To overcome the above constraints, we implemented a solution to associate custom metadata into Modern Site Pages creation using SharePoint Framework (SPFx) List View Command Set extension and Azure Function. In this blog, I am going to briefly talk about the approach so it could be useful for anyone trying to do the same.

1. Create a List View Command Item for creating site pages, editing properties of site pages and promoting site pages to news
2. Create an Azure function that will create the Page using SharePoint Online CSOM
3. Call the Azure Function from the SPFx command.

A brief screenshot of the resulting SPFx extension dialog is below.



To override the process for modern page creation, we will use an Azure Function with SharePoint Online PnP core CSOM. Below is an extract of the code for the same. On a broad level, the Azure Function basically does the following

1. Get the values of the site URL and page name from the query parameters
2. Check whether the site page already exists
3. Create the page if it doesn't exist
4. Save the page

Note: The below code also includes the code to check if the page exists.

Next, create an SPFx List View Command Set extension and an SP dialog component that will allow us to call the Azure Function from the Site Pages library to create pages. The code uses the Fetch API to call the Azure Function and passes the site URL and page name parameters that the Azure Function needs to create the page. After the page is created, the Azure Function responds with a success status, which can be used to confirm the page creation.

Note: Make sure the dialog is locked while this operation is running, i.e. implement code to prevent the form from being closed or resubmitted.

After the page is created, let's update the properties of the item using the PnP JS library with the code below.


As we can see above, we have overridden the page creation process with our own Azure Function, using an SPFx List View command and PnP JS. I will be detailing the SP dialog for SPFx extensions in an upcoming blog, so keep an eye out for it.

There are still some limitations to the above approach, listed below; you might have to get business approval for them.

1. We cannot hide the out-of-the-box 'New Page' option from the extension.
2. We cannot rearrange the order of the command control; it is displayed after the out-of-the-box SharePoint elements.

For more blogs, click here

Update Managed Metadata and Hyperlink column in a SharePoint list using PnP PowerShell

If you are trying to update a Managed Metadata and Hyperlink column in SharePoint online using PnP PowerShell and it is not getting updated, then this blog might be of help.

I was working on a quick requirement to bulk update managed metadata columns using a PowerShell script, and it was not working as planned. The managed metadata value we were setting didn't use the right term store hierarchy format, and this was the cause of the issue.

Managed Metadata

The managed metadata term store value format is shown below; the update will not work if anything in the hierarchy or the separators is missed.

$val = "Group|TermSet|Term1|Term2..."

So, for example, if your group is "Test Group" with term set "Test Termset1", a level 1 term "Parent" and a level 2 term "Child", then the value is:

$val = "Test Group|Test Termset1|Parent|Child"

The update command in PnP PowerShell will be:

Set-PnPListItem -List "[Lib Name]" -Identity [ItemID] -Values  @{"Managed Metadata Column" = $val}


Hyperlink

This one was amusing because I didn't spot the mistake until I realised that the URL and description in a hyperlink value are not just comma delimited – the space after the comma matters too! 🙂


The format for updating a hyperlink column is below. Notice the space after the comma (,) – it is very important to get this right.
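For example (the URL and description are illustrative):

$val = "https://www.contoso.com, Contoso Home Page"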

Set-PnPListItem -List "[Lib Name]" -Identity [ItemID] -Values  @{"Hyperlink Column" = $val}


So above we saw how we can update Managed Metadata and Hyperlink columns using PnP PowerShell.

For more blogs, click here

Intro to Site Scripts and Site Designs with a Simple SharePoint Modern Site provisioning

Microsoft announced Site Scripts and Site Designs in late 2017; they became available for Targeted Release in January 2018 and were recently rolled out to general availability. They are a quick way to allow users to create custom modern sites, without using any scripting hacks. Hence, in this blog we will go through the steps of using Site Scripts and Site Designs for a simple SharePoint modern site creation.

Before we get into the detailed steps, let's get a brief overview of Site Designs and Site Scripts.

Site Designs: Site designs are like a template. They can be used each time a new site is created to apply a consistent set of actions. You create site designs and register them in SharePoint for one of the modern site templates: the team site or the communication site. A site design can run multiple scripts; the script IDs are passed in an array, and they run in the order listed.

Site Scripts: A site script is a custom JSON-based script that runs when a site design is selected. The site scripts describe the provisioning actions, such as creating new lists or applying a theme. When the actions in the scripts are completed, SharePoint displays detailed results of those actions in a progress pane. They can even call a Flow trigger, which is essential for custom provisioning during site creation.

Software Prerequisites:

  1. PowerShell 3.0 or above
  2. SharePoint Online Management Shell
  3. Notepad or any notes editor for JSON creation – I prefer Notepad++
  4. Windows System to run PowerShell
  5. And a must – a SharePoint tenant 🙂

 Provisioning Process Overview:

The provisioning process is divided into 5 steps:

1. Create a Site Script using JSON template to call actions to be run after a Modern Site is created.
2. Add the Script to your tenant
3. Create a Site Design and add the Site Script to the design. Associate the Site Design to Modern Site Templates – Team Site template is 64 and Communication Site Template is 68
4. Create a Modern Site from SharePoint landing page
5. Wait for the Site Script to apply changes and refresh your site

Quick and easy, right!? Now let's get to the "how to".


1. JSON Template: We will need to create a JSON template with the declarations of the site resources that will be provisioned after the site is created. For more details, here is a link to the Microsoft docs. The brief schema is below.

"$schema": "schema.json",
"actions": [
... [one or more verb   actions]  ...
"bindata": { },
"version": 1

For our blog here, we will use the below schema, where we are creating a custom list with a few columns.
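As an illustration, a site script that creates a custom list with a few columns using the documented createSPList and addSPField verbs could look like this (the list and column names are just examples):

{
    "$schema": "https://developer.microsoft.com/json-schemas/sp/site-design-script-actions.schema.json",
    "actions": [
        {
            "verb": "createSPList",
            "listName": "Customer Tracking",
            "templateType": 100,
            "subactions": [
                {
                    "verb": "setDescription",
                    "description": "List of customers and orders"
                },
                {
                    "verb": "addSPField",
                    "fieldType": "Text",
                    "displayName": "Customer Name",
                    "isRequired": false,
                    "addToDefaultView": true
                },
                {
                    "verb": "addSPField",
                    "fieldType": "Number",
                    "displayName": "Requisition Total",
                    "isRequired": true,
                    "addToDefaultView": true
                }
            ]
        }
    ],
    "bindata": {},
    "version": 1
}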

2. Site Script: Add the above site script to your tenant using PowerShell. The below code will return the Site Script GUID; copy that GUID, as it will be used later.

Get-Content '[ JSON Script location ]' -Raw | Add-SPOSiteScript -Title "[ Title of the script ]"
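If you prefer to capture the ID in a variable instead of copying it from the console output (after connecting with Connect-SPOService), a small variation is:

$script = Get-Content '[ JSON Script location ]' -Raw | Add-SPOSiteScript -Title "[ Title of the script ]"
$script.Id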

3. Site Design: After the Site Script is added, create a Site Design from it; the Site Design is what appears in the dropdown menu options when creating a site.

Add-SPOSiteDesign -Title "[ Site design title ]" -WebTemplate "64" -SiteScripts "[ script GUID from above step ]" -Description "[ Description ]"

4. Create a Modern Site: After the Site Design is registered, you will see the design while creating a site, as shown below.


5. Click on the Manual Refresh button as per screenshot after the site upgrade process is complete.


When ready, the final Team site will look like the screenshot below after provisioning is complete.


In this blog, we learned about Site Scripts and Site Designs and how to use them to provision modern team sites.

For more blogs, click here

Migrate SharePoint contents using ShareGate


The first step in any migration project is to do an inventory and see the size of the data you are looking to migrate. For simplicity, this post assumes you have already done this activity and have already planned new destination sites for your existing content. This means you have a spreadsheet somewhere that identifies where your content is going to reside in the migrated site, and every existing site (in scope) has at least a new home pointer if it qualifies for migration.

The project I was working on had six levels of sites and subsites within a site collection. For simplicity, we consider just one site collection in scope for this post, and after discussing it with the business we had finalised that it would initially have a maximum of three levels of sites and subsites.

In this blog post, we will be migrating SharePoint 2010 and 2013 contents from on-premise to SharePoint Online (SPO) in our customer’s Office 365 tenant.

Creating Structure

After doing the magic of mapping the existing site collection, sites and subsites, we came up with an Excel sheet holding the mapping for our desired structure, as shown below:




The above files will be used as the reference master sheet to create the site structure before pulling the trigger on migrating content. We will use the PowerShell script below to create the structure for our desired state within the new site collection.


$url = ""

function CreateLevelOne(){
    Connect-PnPOnline -Url $url -Credentials 'O365-CLIENT-CREDENTIALS'

    filter Get-LevelOne {
        New-PnPWeb -Title $_.SiteName -Url $_.URL1 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"

    Import-Csv C:\_temp\Level1.csv | Get-LevelOne

function CreateLevelTwo(){

    filter Get-LevelTwo {
        $connectUrl = $url + $_.Parent

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL2 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"

    Import-Csv C:\_temp\Level2.csv | Get-LevelTwo

function CreateLevelThree(){

    filter Get-LevelThree {
        $connectUrl = $url +  $_.GrandPrent + '/' + $_.Parent 

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL3 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"

    Import-Csv C:\_temp\Level3.csv | Get-LevelThree




Migrating Contents

Once you have successfully created your site structure, it is time to start migrating content to the newly created structure as per the mapping you identified earlier. The CSV file format looks like below:


The final step is to execute the PowerShell script below and migrate content using ShareGate commands from your source sites to your destination sites (as defined in your mapping file above).


<span id="mce_SELREST_start" style="overflow:hidden;line-height:0;"></span># folder where files will be produced
$folderPath = "C:\_Kloud\SGReports\"
$folderPathBatches = "C:\_Kloud\MigrationBatches\"

filter Migrate-Content {
    # URLs for source and destination
    $urlDes = $_.DesintationURL
    $urlSrc = $_.SourceURL 

    # file where migration log will be added again each run of this script
    $itemErrorFolderPath = $folderPath + 'SG-Migration-Log-Webs.csv'

    # migration settings used by ShareGate commands
    $copysettings = New-CopySettings -OnContentItemExists IncrementalUpdate -VersionOrModerationComment "updated while migration to Office 365" 


    $pwdDest = ConvertTo-SecureString "your-user-password" -AsPlainText -Force
    $siteDest = Connect-Site -Url $urlDes -Username your-user-name -Password $pwdDest

    $listsToCopy = @() 

    $siteSrc = Connect-Site -Url $urlSrc

    $listsInSrc = Get-List -Site $siteSrc 

    foreach ($list in $listsInSrc)
        if ($list.Title -ne "Content and Structure Reports" -and
            $list.Title -ne "Form Templates" -and
            $list.Title -ne "Master Page Gallery" -and
            $list.Title -ne "Web Part Gallery" -and
            $list.Title -ne "Pages" -and
            $list.Title -ne "Style Library" -and
            $list.Title -ne "Workflow Tasks"){

            $listsToCopy += $list

    # building a log entry with details for migration run

    $date = Get-Date -format d
    $rowLog = '"' + $siteSrc.Address + '","' + $siteSrc.Title + '","' + $listsInSrc.Count + '","' +  $siteDest.Address + '","' +  $siteDest.Title + '","' + $listsToCopy.Count + '","' + $date + '"'
    $rowLog | Out-File $itemErrorFolderPath -Append -Encoding ascii

    Write-Host Copying $listsToCopy.Count out of $listsInSrc.Count lists and libraries to '('$siteDest.Address')'
    #Write-Host $rowLog

    $itemLogFolderPath = $folderPath + $siteSrc.Title + '.csv'
    #Write-Host $itemLogFolderPath

    $result = Copy-List -List $listsToCopy -DestinationSite $siteDest -CopySettings $copysettings -InsaneMode -NoCustomPermissions -NoWorkflows
    Export-Report $result -Path $itemLogFolderPath

function Start-Migration($batchFileName)
    $filePath = $folderPathBatches + $batchFileName

    Import-Csv $filePath | Migrate-Content

Start-Migration -batchFileName "MG-Batch.csv"
<span id="mce_SELREST_start" style="overflow:hidden;line-height:0;"></span>


In this blog post, we used ShareGate's silent mode (PowerShell), which lets us run multiple migration jobs in parallel from different workstations (depending on your ShareGate licensing).

For a complete list of ShareGate PowerShell commands, you can refer to the ShareGate documentation.

I hope you have found this post useful for creating a site structure and migrating content (lists/libraries) to its new home inside SharePoint Online.

How to set Property Bag values in SharePoint Modern Sites using SharePoint Online .Net CSOM

If you think setting property bag values programmatically for modern SharePoint sites using CSOM is as straightforward as it was in classic SharePoint sites, then there is a surprise waiting for you.

As you can see below, in the old C# CSOM this code would set the Property Bag values as follows:

The challenge with the above method is that the PropertyBag values are not persisted after saving. So, if you load the context object again, the values are back to the initial values (i.e. without the “PropertyBagValue1” = “Property”).

The cause of the issue is that modern sites are "NoScript" sites (IsNoScriptSite = true), which prevents us from updating the object model through script. Below is a screenshot of the documentation from MS docs –



Using PowerShell

Resolving this is quite easy using PowerShell, by setting -DenyAddAndCustomizePages to $false.

Set-SPOSite <SiteURL> -DenyAddAndCustomizePages $false
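Once custom script is allowed on the site, the property bag value itself can also be set with PnP PowerShell; a quick sketch (the site URL, key and value are illustrative):

Connect-PnPOnline -Url https://yourtenant.sharepoint.com/sites/ModernTeam -Credentials (Get-Credential)
Set-PnPPropertyBagValue -Key "PropertyBagValue1" -Value "Property"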

However, when working with the CSOM model we need to take another approach (see below).

Using .NET CSOM Model

When using the CSOM model, we must use the SharePoint PnP Online CSOM library for this. Start by downloading it from NuGet or via the Package Manager in Visual Studio. Next, add the code below.

Walking through the code: we first initialize Tenant Administration and then access the site collection using the SiteUrl property. We then use the SetSiteProperties method to set noScriptSite to false. After that, we can use the final block of code to set the property bag values.

When done, you may want to set the noScriptSite back to true, as there are implications on modern sites as specified in this article here –

For more blogs, click here


Quick start guide for PnP PowerShell

It is quite easy and quick to set up PnP PowerShell on a local system and start using it. Considering that PnP PowerShell has been gaining a lot of momentum among Devs and Admins, I thought it would be good to have a post for everyone’s reference.

So why PnP PowerShell? Because it is the recommended and most frequently updated PowerShell module for IT pros working with SharePoint Online, SharePoint on-premises and Office 365 Groups. It allows us to remotely maintain a SharePoint Online and Office 365 tenancy, as we will see below.


First, we have to make sure that we have a recent version of PowerShell (at least v4.0) on the system. We can check it using the command $PSVersionTable.PSVersion, which will give the output below.

PowerShell Version Check

If your version isn’t sufficient, then go to this link to upgrade – Since there are OS requirements accompanied with PowerShell, please make sure to check the System Requirements in order to make sure the OS matches the specs.

Also, I would recommend installing the SharePoint Online Management Shell, because there are various tenant commands that you may not find in PnP PowerShell. The link can be found here – again, make sure to check the system requirements so the OS matches the specs.

After you have the proper PowerShell version, run the following command to install the PnP PowerShell module, depending on whether you're using the Online or an on-premises version. You only need to install the version(s) you need to interact with (no need to install on-premises versions if you're only working online).

  • SharePoint Online: Install-Module SharePointPnPPowerShellOnline
  • SharePoint 2016: Install-Module SharePointPnPPowerShell2016
  • SharePoint 2013: Install-Module SharePointPnPPowerShell2013

It will likely take a few minutes for the above command(s) to run and complete.

Getting Started Guide:

To start using PnP PowerShell, first we will have to understand how PnP PowerShell works.

PnP PowerShell works in the context of the current connection, i.e. the site it is connected to. Assuming it is connected to a site collection, all commands refer by default to that site collection. This is different from the SPO commands, which require Tenant Admin rights. A benefit of the PnP approach is that you can have separate admin personnel manage separate site collections or sites if needed.
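For example (the site URL is illustrative), connecting to a specific site collection scopes every subsequent command to that site:

Connect-PnPOnline -Url https://yourtenant.sharepoint.com/sites/TeamA
Get-PnPList   # returns the lists of the TeamA site only, because that is the current connection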

Overview of Basic SharePoint Operations with PnP PowerShell

1. Connect to PnP Online

Connect-PnPOnline -Url "[site url]"

In the above case, if you are not prompted to enter your credentials, then create a credential object and pass it to the Connect command (as seen below).

$user = ""
$secpasswd = ConvertTo-SecureString "" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($user, $secpasswd)
Connect-SPOnline -url $siteAdminURL -Credentials $mycreds

2. Get the site collection object using Get-PnPSite, or a subsite (web) using Get-PnPWeb

## For Site Collection (root web)
$web = Get-PnPSite 
## For Sub web
$web = Get-PnPWeb -Identity ""

Remember, PnP works in the context of the current connection.

3. To work with lists or other objects of the site, use the -Includes parameter to request them, or else they will not be initialized.

$web = Get-PnPSite -Includes RootWeb.Lists

PnPPowerShell output

After you get the list object, you can traverse it as needed using the object.
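For example, you can pipe the loaded list collection to Select-Object to see what came back:

$web.RootWeb.Lists | Select-Object Title, ItemCount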

4. To work on list items, you must request the list items explicitly; the same goes for updating items.

$listItem = Get-PnPListItem -List "Site Assets"


Set-PnPListItem -List "Site Assets" -Identity 2 -Values @{"" = ""}


So, as we saw above, we can use PnP PowerShell to maintain SharePoint assets remotely.

Please feel free to let me know if there are any specific questions you may have regarding the approach above and I will post about those as well.

For more blogs, click here

Automate SharePoint Document Libraries Management using Flow, Azure Functions and SharePoint CSOM

I’ve been working on a client requirement to automate SharePoint library management via scripts to implement a document lifecycle with many document libraries that have custom content types and requires regular housekeeping for ownership and permissions.

Solution overview

To provide a seamless user experience, we decided to do the following:

1. Create a document library template (.stp) with all the prerequisite folders and content types applied.

2. Create a list to store the metadata entries for these libraries. Add the owner and contributors for each library as columns in that list.

3. Whenever the title, owners or contributors are changed in that list, the corresponding document library will be updated.

Technical Implementation

The solution has two main elements to automate this process

1. Microsoft Flow – Trigger when an item is created or modified

2. Two Azure Functions – Create the library and update permissions

The broad steps and code are as follows:

1. When the flow is triggered, we check the status field to find out whether the item is a new entry or a change.

Note: Since Microsoft flow doesn’t have conditional triggers to differentiate between create and modified list item events, use a text column in the SharePoint list which is set to start, in progress and completed values to identify create and update events.

2. The flow calls an Azure Function via an HTTP POST action. Below is the configuration for this.


3. For the "Create Library" Azure Function, create an HTTP-triggered C# function in your Azure subscription.

4. In the Azure Function, open Properties -> App Service Editor. Then add a folder called bin and copy the following two files into it:

  • Microsoft.SharePoint.Client.dll
  • Microsoft.SharePoint.Client.Runtime.dll


Create Lib App Service Editor

Please make sure to get the latest copy of the NuGet package for the SharePoint PnP Online CSOM. To do that, you can set up a Visual Studio solution and copy the files from there, or download the NuGet package directly and extract the files from it.

5. After copying the files, reference them in the Azure function using the below code

#r "Microsoft.SharePoint.Client.dll"
#r "Microsoft.SharePoint.Client.Runtime.dll"
#r "System.Configuration"
#r "System.Web"

6. Then create the SharePoint client context and create a connection to the source list.

7. After that, use the ListCreationInformation class to create the Document library from the library template using the code below.

8. After the library is created, break the role inheritance for the library as per the requirement

9. Update the library permissions using the role assignment object

10. To differentiate between People, SharePoint Groups and AD Groups, find the unique ID and add the group as per the script below.

Note: In case you have people objects that are not in AD anymore because they have left the organisation, please refer to this blog for validating them before updating – Resolving “User not found” issue while assigning permissions using SharePoint CSOM

Note: Try to avoid item.Update() from the Azure Function, as that will trigger a second Flow run and cause an iterative loop; use item.SystemUpdate() instead.

11. After the update is done, return the success value from the Azure Function to the Flow, which completes the loop.

As shown above, we can automate document library creation from a template and permissions management using Flow and Azure Functions.

For more blogs, click here

Building my first PowerApp, a basic roster, pulling data from SharePoint Online

What is PowerApps?

PowerApps is a set of services and apps that enable power users to build line-of-business applications rapidly. It can connect to the cloud services and data sources that we may already be using.

Why PowerApps?

PowerApps gives power users the ability to quickly build apps that suit specific needs. They can share apps instantly with their team across the web, tablets, and mobile devices. To list a few of the advantages of PowerApps:

  • Simple and fast – capable of producing an app in minutes that pulls in data from Excel or a cloud service.

  • Can be integrated with Microsoft Flow, making it possible to trigger workflows from within apps.

  • Robust and enterprise-grade, so can be used for complex requirements.

What you need to get started?

We can choose from two options:

  • PowerApps Studio for web

  1. Go to the url:

  2. You need to use your tenant account which would be something like

  3. Choose from any of the start options
  • PowerApps Studio for Windows

  1. Go to the url:

  2. Download the App for windows.

  3. You need to login using your tenant account which would be something like

We would need to use the PowerApps web portal to configure and manage data connections and on-premises gateways, and to work with the Common Data Service.

Few points to note:

After we create an app, we can administer it in the admin center.

We run apps in a browser from Microsoft Dynamics 365 or by using PowerApps Mobile, which is available for Windows, iOS, and Android devices.

PowerApps components:

  • PowerApps web portal – manage and share the apps we build
  • PowerApps Studio – build powerful apps with easy to use visual tools
  • PowerApps Mobile – run apps on Windows, iOS, and Android devices
  • PowerApps admin center – administer PowerApps environments and other components

PowerApps Studio

PowerApps Studio has three panes and a ribbon, giving a view similar to creating a PowerPoint presentation:

  1. Left navigation bar: shows a thumbnail for each screen in the app
  2. Middle pane: shows the screen that you’re working on
  3. Right-hand pane: shows options such as layout and data sources
  4. Property drop-down list: where you select the properties that formulas apply to
  5. Formula bar: where you add formulas
  6. Ribbon: where you add controls and customize design elements


PowerApps Mobile

PowerApps Mobile for Windows, iOS, and Android provides an environment where instead of going to separate app stores, we stay in PowerApps and have access to all the apps that we have created and that others have shared with us.

Admin center

The PowerApps admin center is the centralized place to administer PowerApps for an organization. This is where we define different environments, data connections, and other elements. The admin center is also where we create Common Data Service databases, and manage permissions and data policies.

Let's create our first app

OK, now it's time to create our first PowerApp, so let's fire up the PowerApps desktop studio. For this demo, we will use SharePoint as the data source and the phone layout.

  • Click New -> SharePoint -> Phone layout


Connect to a data source

On the next screen we need to specify the connection for SharePoint, where we will enter the SharePoint URL and click Go.


On the next screen, we can see the lists on the specified SharePoint site and select the respective one. In our example we will choose the list "Roster" and click Connect, after which PowerApps starts generating the app.


The generated app is always based on a single list, and we can add more data to the app later. An app with default screens is built for us, which we can see in action by clicking the play (Start app preview) icon.


Our app in action


Our three screen roster app opens in PowerApps Studio. All apps generated from data have the same set of screens:

  • The browse screen: On this, we can browse, sort, filter, and refresh the data pulled in from the list, as well as add items by clicking the (+) icon.
  • The details screen: On this, we view detail about an item, and can choose to delete or edit the item.
  • The edit/create screen: On this, we edit an existing item or create a new one.


So, in this post, we explored PowerApps and the development tools available, like the PowerApps desktop studio and the PowerApps web studio. Without writing a single line of code we created our first basic three-screen PowerApp and were able to perform CRUD (create, read, update, delete) operations on our SharePoint data source.

In the next post, we will try creating a PowerApp from a SharePoint list.

Integrating Yammer data within SharePoint web-part using REST API


We were developing a SharePoint application for one of our clients and had some web parts that needed to retrieve data from Yammer. We were developing on SharePoint Online (SPO) using the SharePoint Framework (SPFx), so for most of our engagement we were using a client-side library, React, to deliver what was required from us.

In order to integrate the client's Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.


We had around 7-8 different calls to the Yammer API in different web parts to extract data from Yammer on behalf of the logged-in user. For each API call, the user has to be authenticated before the call to the Yammer API is made, and this has to happen without the user being redirected to Yammer for login or being presented with a popup or a button to log in first.

If you follow Yammer’s JavaScript SDK instructions, we will not be meeting our client’s requirement of not asking the user to go Yammer first (as this will change their user flow) or a pop-up with login/sign-in dialog.


After looking on the internet to fulfil the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app before. In our case, this isn't possible, as many users will be accessing the SharePoint home page for the first time and have never accessed Yammer before.

What we did was split our API login calls into two groups. One of the calls was chosen at random to log the user in to Yammer, get an access token in the background and cache it with the Yammer API, while the other API login calls wait for that first login and then use the Yammer API to log in.


This function uses the standard Yammer API to check the login status; if successful, it proceeds with issuing the API data retrieval calls, but if the user could not be logged in the first time, it waits and checks again every 2 seconds until it times out after 30 seconds.

  // SPComponentLoader comes from '@microsoft/sp-loader'
  public static loginToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('', { globalExportsName: "yam" }).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((FirstloginStatusResponse) => {
        if (FirstloginStatusResponse.authResponse) {
          // user is already logged in - proceed with the data retrieval calls
          callback(yam);
        }
        else {
          // poll every 2 seconds until the first (randomly chosen) call has logged the user in
          let timerId = setInterval(() => {
            yam.getLoginStatus((SecondloginStatusResponse) => {
              if (SecondloginStatusResponse.authResponse) {
                clearInterval(timerId);
                callback(yam);
              }
            });
          }, 2000);

          setTimeout(() => {
            clearInterval(timerId);
            yam.getLoginStatus((TimeOutloginStatusResponse) => {
              if (TimeOutloginStatusResponse.authResponse) {
                callback(yam);
              }
              else {
                console.error("iFrame - user could not log in to Yammer even after waiting");
              }
            });
          }, 30000);
        }
      });
    });
  }

This method again uses the standard Yammer API to check the login status; it then tries to log the user in in the background using the iframe approach called out in the PnP sample. If that approach doesn't work either, it redirects the user to the Smart URL in the same window to get the user's consent for the Yammer app, with a redirect URI set to the home page of your SharePoint site where the web parts using the Yammer API are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('', { globalExportsName: "yam" }).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        }
        else if (requestLogin) {
          // try a hidden-iframe login first; the original post redirects the user to the Smart URL if this fails
          this._iframeAuthentication()
              .then((res) => {
                callback(yam);
              })
              .catch((e) => {
                console.error("iFrame - user could not log in to Yammer due to error. " + e);
              });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }
The function _iframeAuthentication is copied from the PnP sample with some modifications to fit the client requirements we were developing against.

  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");

        // once the hidden iframe has loaded, read the access token from the URL hash and confirm the login
        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(res);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

        let url: string = `${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
        document.body.appendChild(element);
      });
  }
This resulted in authenticating the Office 365 tenant user within the same window as the SharePoint home page with the help of an iframe [case: the user had consented to the Yammer app before], or getting Yammer app consent from the Office 365 tenant user without being redirected to Yammer for OAuth-based authentication [case: the user is accessing the Yammer-integrated web parts for the first time].

We do hope future releases of the Yammer API will cater for seamless integration among Office 365 products without having to go through the hassle of getting access tokens in the way described in this post.