Building my first PowerApp, a basic roster, pulling data from SharePoint Online

What is PowerApps?

PowerApps is a set of services and apps that enables power users to build line-of-business applications rapidly. It can connect to the cloud services and data sources we may already be using.

Why PowerApps?

PowerApps gives power users the ability to quickly build apps that suit specific needs. They can share apps instantly with their team across the web, tablets, and mobile devices. A few of the advantages of PowerApps are:

  • Simple and fast – capable of producing an app in minutes that pulls in data from Excel or a cloud service.

  • Can be integrated with Microsoft Flow, making it possible to trigger workflows from within apps.

  • Robust and enterprise-grade, so can be used for complex requirements.

What do you need to get started?

We can choose from two options:

  • PowerApps Studio for web

  1. Go to the URL: https://web.powerapps.com

  2. Sign in with your tenant account, which will be something like xxx@yyy.onmicrosoft.com

  3. Choose any of the start options

  • PowerApps Studio for Windows

  1. Go to the URL: https://powerapps.microsoft.com/en-us/downloads/

  2. Download the app for Windows.

  3. Log in using your tenant account, which will be something like xxx@yyy.onmicrosoft.com

We would need to use web.powerapps.com to configure and manage data connections and on-premises gateways, and to work with the Common Data Service.

Few points to note:

After we create an app, we can administer it in the admin center.

We run apps in a browser from Microsoft Dynamics 365 or by using PowerApps Mobile, which is available for Windows, iOS, and Android devices.

PowerApps components:

  • web.powerapps.com – manage and share the apps we build
  • PowerApps Studio – build powerful apps with easy to use visual tools
  • PowerApps Mobile – run apps on Windows, iOS, and Android devices
  • PowerApps admin center – administer PowerApps environments and other components

PowerApps Studio

PowerApps Studio has three panes and a ribbon, which give it a feel similar to creating a PowerPoint presentation:

  1. Left navigation bar: shows a thumbnail for each screen in the app
  2. Middle pane: shows the screen that you’re working on
  3. Right-hand pane: shows options such as layout and data sources
  4. Property drop-down list: where you select the properties that formulas apply to
  5. Formula bar: where you add formulas
  6. Ribbon: where you add controls and customize design elements

PowerAppStudio.png

PowerApps Mobile

PowerApps Mobile for Windows, iOS, and Android provides an environment where instead of going to separate app stores, we stay in PowerApps and have access to all the apps that we have created and that others have shared with us.

Admin center

The PowerApps admin center is the centralized place to administer PowerApps for an organization. This is where we define different environments, data connections, and other elements. The admin center is also where we create Common Data Service databases, and manage permissions and data policies.

Let's create our first app

OK, now it's time to create our first PowerApp, so let's fire up PowerApps Studio for Windows. For this demo, we will use SharePoint as the data source and the phone layout.

  • Click New -> SharePoint -> Phone layout

PowerAppStudioDesktopStart.png

Connect to a data source

On the next screen we need to specify the connection for SharePoint: enter the SharePoint site URL and click Go.

PowerAppSourceConnection.png

On the next screen, we can select a list from the specified SharePoint site. In our example we will choose the list “Roster” and click Connect, after which PowerApps starts generating the app.

powerappsourceurl2.png

The generated app is always based on a single list; we can add more data to the app later. An app with default screens is built for us, which we can see in action by clicking the play (Start app preview) icon.

powerappsharepointbaseapp1.png

Our app in action

MyFirstPowerApp.gif

Our three-screen roster app opens in PowerApps Studio. All apps generated from data have the same set of screens:

  • The browse screen: On this, we can browse, sort, filter, and refresh the data pulled in from the list, as well as add items by clicking the (+) icon.
  • The details screen: On this, we view details about an item, and can choose to delete or edit the item.
  • The edit/create screen: On this, we edit an existing item or create a new one.

Conclusion

So, in this post, we explored PowerApps and the development tools available, namely PowerApps Studio for Windows and PowerApps Studio for web. Without writing a single line of code we created our first basic three-screen PowerApp and were able to perform CRUD (create, read, update, delete) operations on our SharePoint data source.

In the next post we will try creating a PowerApp from a SharePoint list.

Integrating Yammer data within SharePoint web-part using REST API

Background

We were developing a SharePoint application for one of our clients and had some web parts that needed to retrieve data from Yammer. As we were developing on SharePoint Online (SPO) using the SharePoint Framework (SPFx), for most of our engagement we were using the client-side library React to deliver what was required of us.

To integrate the client’s Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.

Scenario

We had around 7-8 different calls to the Yammer API across different web parts, extracting data from Yammer on behalf of the logged-in user. For each API call, the user has to be authenticated before the call to the Yammer API is made, and this has to happen without the user being redirected to Yammer for login or being presented with a popup or a button to log in first.

If you follow Yammer’s JavaScript SDK instructions as-is, you will not meet our client’s requirement: the user must not be sent to Yammer first (as this would change their user flow) and must not be shown a pop-up login/sign-in dialog.

Approach

After searching the internet for something that fulfils the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app before. In our case this wasn’t guaranteed, as many users would be accessing the SharePoint home page for the first time and had never accessed Yammer before.

What we did was split our API login calls into two groups. One of the calls is chosen at random to log the user in to Yammer, obtain an access token in the background and cache it with the Yammer API; the other API login calls wait for that first login to complete and then use the Yammer API to log in.

Step-1

This function uses the standard Yammer API to check login status. If the check succeeds it proceeds with issuing the API data retrieval calls; if the user could not be logged in the first time, it waits and checks again every 2 seconds until it times out after 30 seconds.

  public static loginToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((FirstloginStatusResponse) => {
        if (FirstloginStatusResponse.authResponse) {
          callback(yam);
        }
        else {
          let timerId = setInterval(()=>{
              yam.getLoginStatus((SecondloginStatusResponse) => {
                if (SecondloginStatusResponse.authResponse) {
                  clearInterval(timerId);
                  callback(yam);
                }
              });
          }, 2000);

          setTimeout(() => {
              yam.getLoginStatus((TimeOutloginStatusResponse) => {
                if (TimeOutloginStatusResponse.authResponse) {
                  clearInterval(timerId);
                }
                else {
                  // Stop polling once we give up, otherwise the interval keeps firing indefinitely
                  clearInterval(timerId);
                  console.error("iFrame - user could not log in to Yammer even after waiting");
                }
              });
          }, 30000);
        }
      });
    });
  }

Step-2

This method again uses the standard Yammer API to check login status. It then tries to log the user in in the background using the iframe approach called out in the PnP sample; if that approach doesn’t work either, it redirects the user to the Smart URL in the same window to get the user’s consent for the Yammer app, with the redirect URI set to the home page of your SharePoint site where the web parts using the Yammer API are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://assets.yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam"}).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        }
        else if (requestLogin) {
          this._iframeAuthentication()
              .then((res) => {
                callback(yam);
              })
              .catch((e) => {
                window.location.href="https://www.yammer.com/[your-yammer-network-name]/oauth2/authorize?client_id=[your-yammer-app-client-id]&response_type=token&redirect_uri=[your-sharepoint-home-page-url]";
                console.error("iFrame - user could not log in to Yammer due to error. " + e);
              });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }

The function _iframeAuthentication is copied from the PnP sample with some modifications to fit the client requirements we were developing against.


  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");
        document.body.appendChild(element);

        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.platform.setAuthToken(token);
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(res);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

        let url: string = `https://www.yammer.com/${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
      });
    }

Conclusion

This resulted in authenticating the Office 365 tenant user within the same window as the SharePoint home page, either with the help of an iframe (case: the user had consented to the Yammer app before) or by getting Yammer app consent from the Office 365 tenant user without making them go to Yammer and log in separately for OAuth-based authentication (case: the user is accessing the Yammer-integrated web parts for the first time).

We do hope future releases of the Yammer API will cater for seamless integration among O365 products without having to go through the hassle of getting access tokens in the way described in this post.

Moving SharePoint Online workflow task metadata into the data warehouse using Nintex Flows and custom Web API

This post suggests an approach for automatically copying SharePoint Online (SPO) workflow task metadata into an external data warehouse. In this scenario, workflow tasks become the subject of another workflow that copies each task’s data into the external database, using a custom Web API endpoint as the interface to that database. Commonly, the requirement to move workflow task data elsewhere arises from limitations of SPO. In particular, SPO throttles requests for access to workflow data, making it virtually impossible to create a meaningful workflow reporting system with large numbers of workflow tasks. The easiest way to solve the problem is to use a Nintex workflow to “listen” for changes to the workflow tasks, then request the task data via the SPO REST API and, finally, send the data to the external data warehouse’s Web API endpoint.

Some SPO solutions require a reporting system that includes workflow task metadata; for example, a report about documents together with the statuses of the workflows linked to those documents. Using a conventional approach (e.g. the SPO REST API) to obtain the data is unfeasible as SPO throttles requests for workflow data. In fact, the throttling is so tight that generating reports with more than a hundred records is unrealistic. In addition, many companies would like to create Business Intelligence (BI) systems analysing workflow task data. Having a data warehouse with all the workflow task metadata assists this job very well.

To implement the solution a few prerequisites must be met. You must know the basics of Nintex workflow creation and be able to create a backend solution with the database of your choice and a custom Web API endpoint that allows you to write the data model to that database. In this post we have used Visual Studio 2015 and created an ordinary REST Web API 2.0 project with an Azure SQL Database.

The solution will involve following steps:

  1. Get sample of your workflow task metadata and create your data model.
  2. Create a Web API capable of writing data model to the database.
  3. Expose one POST endpoint method of the Web REST API that accepts JSON model of the workflow task metadata.
  4. Create Nintex workflow in the SPO list storing your workflow tasks.
  5. Design Nintex workflow: call SPO REST API to get JSON metadata and pass this JSON object to your Web API hook.

Below is a detailed description of each step.

We are looking to export the metadata of a workflow task. First, find the SPO list that holds all your workflow tasks and navigate there; you will need the name of the list to start calling the SPO REST API. It is easier to use a REST tool to perform the Web API requests; many people use Fiddler or Postman (a Chrome extension) for this job. Request the SPO REST API to get a sample of the JSON data that you want to put into your database. The request will look similar to this example:

Picture 1
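For reference, such a request is a simple GET against the list's items endpoint. The tenant, site and list names below are placeholders for your own, and $top=1 just limits the response to a single task as a sample:

    GET https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Workflow Tasks')/items?$top=1
    Accept: application/json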

The key element in this request is getbytitle(“list name”), where “list name” is the name of the SPO list holding your workflow tasks. Please remember to add the header “Accept” with the value “application/json”; it tells SPO to return JSON instead of HTML. As a result, you will get one JSON object that contains the metadata of Task 1. This JSON object is an example of the data that you will need to put into your database. Not all fields are required in the data warehouse, so we create a data model containing only the fields of our choice. For example, it can look like this one in C#, with all properties based on the model returned earlier:
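The class below is a hedged sketch (the original post showed the actual code as an image); the class name and properties are hypothetical and should be adjusted to the fields you selected from your workflow tasks list:

    using System;

    // Hypothetical data model for a workflow task record in the data warehouse.
    public class WorkflowTaskModel
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public string Status { get; set; }
        public string AssignedTo { get; set; }
        public DateTime? Modified { get; set; }
    }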

The next step is to create a Web API that exposes a single method accepting our model as a parameter from the body of the request. You can choose any REST Web API design. We created a simple Web API 2.0 project in Visual Studio 2015 using the general MVC / Web API 2.0 wizard. Then we added an empty controller and filled it with code that uses Entity Framework to write the data model to the database. We also created a code-first EF database context that works with just the one entity described above.

The code of the controller:
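What follows is a hedged reconstruction rather than the original code: it assumes the hypothetical WorkflowTaskModel above and a WorkflowTasksContext EF context (sketched in the next snippet), and the controller name is chosen to match the EmployeeFileReviewTasksWebHook endpoint shown later in this post:

    using System.Threading.Tasks;
    using System.Web.Http;

    public class EmployeeFileReviewTasksWebHookController : ApiController
    {
        private readonly WorkflowTasksContext _db = new WorkflowTasksContext();

        // POST api/EmployeeFileReviewTasksWebHook
        // Accepts the workflow task metadata as JSON and writes it to the database.
        [HttpPost]
        public async Task<IHttpActionResult> Post([FromBody] WorkflowTaskModel task)
        {
            if (task == null)
            {
                return BadRequest("Request body did not contain a workflow task model.");
            }

            _db.WorkflowTasks.Add(task);
            await _db.SaveChangesAsync();

            return Ok();
        }

        protected override void Dispose(bool disposing)
        {
            if (disposing)
            {
                _db.Dispose();
            }
            base.Dispose(disposing);
        }
    }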

The code of the database context for Entity Framework:
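Again, a minimal hedged sketch; the context class and connection string name are assumptions:

    using System.Data.Entity;

    public class WorkflowTasksContext : DbContext
    {
        // "WorkflowTasksDb" is a hypothetical connection string name pointing at the Azure SQL Database.
        public WorkflowTasksContext() : base("name=WorkflowTasksDb")
        {
        }

        public DbSet<WorkflowTaskModel> WorkflowTasks { get; set; }
    }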

Once you have created the Web API, you should be able to call the Web API method like this:
https://yoursite.azurewebsites.net/api/EmployeeFileReviewTasksWebHook

You will need to put your model data in the request body as a JSON object. Don’t forget to include the proper headers for your authentication and the header “Accept” with “application/json”, and to set the request type to POST. Once you’ve tested the method, you can move on to the next steps. For example, below is how we tested it in our project.

Picture 4
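For illustration, a request body matching the hypothetical model sketched earlier would look something like this (the values are made up):

{
   "Id": 12,
   "Title": "Employee file review",
   "Status": "In Progress",
   "AssignedTo": "Jane Citizen",
   "Modified": "2016-05-01T10:30:00Z"
}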

Next, we will create a new Nintex Workflow in the SPO list with our workflow tasks. It is all straightforward. Click Nintex Workflows, then create a new workflow and start designing it.

Picture 5

Picture 6

Once you’ve created a new workflow, click on Workflow Settings button. In the displayed form please set parameters as it shown on screenshot below. We set “Start when items are created” and “Start when items are modified”. In this scenario, any modifications of our Workflow task will start this workflow automatically. It also includes cases when Workflow task have been modified by other workflows.

Picture 7.1

Create five steps in this workflow as shown on the following screenshots, labelled 1 to 5. Keep in mind that blocks 3 and 5 are there only to assist with debugging and are not required in production use.

Picture 7

Step 1. Create a Dictionary variable that contains the SPO REST API request headers. You can add any required headers, including authentication headers. It is essential to include the Accept header with “application/json” to tell SPO that we want JSON in the responses. We set the Output variable to SPListRequestHeaders so we can use it later.

Picture 8

Step 2. Call HTTP Web Service. We call the SPO REST API here. It is important to make sure that the getbytitle parameter is correctly set to your workflow tasks list, as discussed before. The list of fields we want returned is defined in the “$select=…” parameter of the OData request; we need only the fields included in our data model. The other settings are straightforward: we supply the request headers created in Step 1 and create two more variables for the response. SPListResponseContent will receive the resulting JSON object that we are going to need at Step 4.
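As an illustration, the request URL configured in this action will look roughly like the one below; the site URL, list name and field names are placeholders, and in practice you would also restrict the query to the current task item (for example with an item ID or a $filter clause):

    https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Workflow Tasks')/items?$select=Id,Title,Status,AssignedTo,Modified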

Picture 9

Step 3 is optional. We’ve added it to debug our workflow. It will send an email with the contents of our JSON response from the previous step. It will show us what was returned by SPO REST API.

Picture 10

Step 4. Here we call our custom API endpoint, passing the JSON object model that we got from the SPO REST API. We supply the full URL of our web hook, set the method to POST, and in the request body we inject SPListResponseContent from Step 2. We also capture the response code to display later in the workflow history.

Picture 11

Step 5 is also optional. It writes a log message with the response code that we have received from our API Endpoint.

Picture 12

Once all five steps are completed, we publish this Nintex workflow. Now we are ready for testing.

To test the system, open the list of workflow tasks, click on any task, modify any of its properties and save the task. This will initiate our workflow automatically. You can monitor workflow execution in the workflow history. Once the workflow has completed, you should be able to see messages as displayed below. Notice that our workflow has also written the Web API response code at the end.

Picture 13

To make sure that everything went well, open your database and check the records updated by your Web API. After every Workflow Task modification you will see corresponding changes in the database. For example:

Picture 14


In this post we have shown that automatically copying workflow task metadata into your data warehouse can be done with a simple Nintex workflow setup and only two REST Web API requests. The solution is quite flexible, as you can select the required properties from the SPO list and export them into the data warehouse. We can easily add more tables if there is more than one workflow tasks list. This solution enables the creation of a powerful reporting system using the data warehouse and also allows you to employ the BI data analytics tool of your choice.

Managing SharePoint Online (SPO) User Profiles with FIM/MIM 2016 and the Granfeldt PowerShell MA

Forefront / Microsoft Identity Manager does not come with an out-of-the-box management agent for managing SharePoint Online.

Whilst the DirSync/AADConnect solution will allow you to synchronise attributes from your on-premises Active Directory to Azure AD, SharePoint only leverages a handful of them. It then has its own set of attributes that it leverages. Many are similarly named to the standard Azure AD attributes but with the SPS- prefix.

For example, here is a list of SPO attributes, a couple of which reference associated Azure AD attributes:

  • UserProfile_GUID
  • SID
  • SPS-PhoneticFirstName
  • SPS-PhoneticLastName
  • SPS-PhoneticDisplayName
  • SPS-JobTitle
  • SPS-Department
  • AboutMe
  • PersonalSpace
  • PictureURL
  • UserName
  • QuickLinks
  • WebSite
  • PublicSiteRedirect
  • SPS-Dotted-line
  • SPS-Peers
  • SPS-Responsibility
  • SPS-SipAddress
  • SPS-MySiteUpgrade
  • SPS-ProxyAddresses
  • SPS-HireDate
  • SPS-DisplayOrder
  • SPS-ClaimID
  • SPS-ClaimProviderID
  • SPS-ClaimProviderType
  • SPS-SavedAccountName
  • SPS-SavedSID
  • SPS-ResourceSID
  • SPS-ResourceAccountName
  • SPS-ObjectExists
  • SPS-MasterAccountName
  • SPS-PersonalSiteCapabilities
  • SPS-UserPrincipalName
  • SPS-O15FirstRunExperience
  • SPS-PersonalSiteInstantiationState
  • SPS-PersonalSiteFirstCreationTime
  • SPS-PersonalSiteLastCreationTime
  • SPS-PersonalSiteNumberOfRetries
  • SPS-PersonalSiteFirstCreationError
  • SPS-DistinguishedName
  • SPS-SourceObjectDN
  • SPS-FeedIdentifier
  • SPS-Location
  • Certifications
  • SPS-Skills
  • SPS-PastProjects
  • SPS-School
  • SPS-Birthday
  • SPS-Interests
  • SPS-StatusNotes
  • SPS-HashTags
  • SPS-PictureTimestamp
  • SPS-PicturePlaceholderState
  • SPS-PrivacyPeople
  • SPS-PrivacyActivity
  • SPS-PictureExchangeSyncState
  • SPS-TimeZone
  • SPS-EmailOptin
  • OfficeGraphEnabled
  • SPS-UserType
  • SPS-HideFromAddressLists
  • SPS-RecipientTypeDetails
  • DelveFlags
  • msOnline-ObjectId
  • SPS-PointPublishingUrl
  • SPS-TenantInstanceId

My customer has AADConnect in place that is synchronising their On Premise AD to Office 365. They also have a MIM 2016 instance that is managing user provisioning and lifecycle management. I’ll be using that MIM 2016 instance to manage SPO User Profile Attributes.

The remainder of this blog post describes the PS MA I’ve developed to manage the SPO attributes to allow their SPO Online Forms etc to leverage business and organisation user metadata.

Using the Granfeldt PowerShell Management Agent to manage SharePoint Online User Profiles

In this blog post I detail how you can synchronise user attributes from your on-premises Active Directory to an associated user’s SharePoint Online user profile utilising Søren Granfeldt’s extremely versatile PowerShell Management Agent. Provisioning and licensing of users for SPO is performed in parallel by the DirSync/AADConnect solution. This solution just provides attribute synchronisation to SPO User Profile attributes.

Overview

In this solution I’m managing the attributes that are pertinent to the customer. If you need an additional attribute or you have created custom attributes it is easy enough to extent.

Getting Started with the Granfeldt PowerShell Management Agent

First up, you can get it from here. Søren’s documentation is pretty good but does assume you have a working knowledge of FIM/MIM and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it; it just needs to be present.
  • The credentials you configure for the MA to run with are the credentials for the account that has permissions to manage SharePoint Online User Profiles. More detail on that further below.
  • The path to the scripts in the PS MA Config must not contain spaces and must be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~4\2010\SYNCHR~1\EXTENS~2\SPO

Managing SPO User Profiles

In order to use this working example there are a couple of items to note;

  • At the top of the Import and Export scripts you’ll need to enter your SPO Tenant Admin URL. If your tenant URL is ‘CORP.sharepoint.com’ then at the top of the scripts enter ‘CORP-admin.sharepoint.com’. The Import script will work with corp.sharepoint.com but the export won’t.
  • Give the account you’re using to connect to SPO via your MIM permissions to manage/update SPO User Profiles

Schema.ps1

As mentioned above I’m only syncing attributes pertinent to my customers’ requirements. That said I’ve selected a number of attributes that are potentials for future requirements.

Password Script (password.ps1)

Empty, as described above.

Import.ps1

A key part of the import script is connecting to SPO and accessing the full User Profile. In order to do this, you will need to install the SharePoint Online Client Components SDK. It’s available for download here https://www.microsoft.com/en-us/download/details.aspx?id=42038

The import script then imports two libraries that give us access to the SPO User Profiles.

Import-Module 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.UserProfiles.dll'

Import-Module 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll'

Import values for attributes defined in the schema.

Export.ps1

The business part of the MA. Basically enough to take attribute value changes from the MV to the SPO MA and export them to SPO. In the example script below I’m only exporting three attributes. Add as many as you need.

Wiring it all together

In order to wire the functionality together, I’m doing it just using the Sync Engine MA configuration as we’re relying on AADConnect to create the users in Office365, and we’re just flowing through attribute values.

Basically, create the PS MA, create your MA Run Profiles, import users and attributes from the PS MA, validate your joins and Export to update SPO attributes as per your flow rules.

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using the Office 365 account we gave permissions to manage user profiles in SPO earlier.

Password script must be specified but as we’re not doing password management it’s empty as detailed above.

If your schema.ps1 file is formatted correctly you can select your attributes.

I have a few join rules. In the pre-prod environment though I’m joining on WorkEmail => mail.

My import flow just brings back in users’ mobile numbers, which users are able to modify in SPO. I’m exporting Title, Location and Department to SPO.

Summary

Using the Granfeldt PowerShell MA it was very easy to manage user SharePoint Online User Profile attributes.

Follow Darren on Twitter @darrenjrobinson

How to parse JSON data in Nintex Workflow for Office 365

A workflow is usually described as a series of tasks that produce an outcome. In the context of Microsoft SharePoint Products and Technologies, a workflow is defined more precisely as the automated movement of documents or items through a specific sequence of actions or tasks that are related to a business process. SharePoint workflows can be used to consistently manage common business processes within an organisation by allowing the attachment of business logic, that is, a set of instructions, to documents or items in a SharePoint list or library.

Nintex Workflow is one of the most popular third-party workflow products; it adds a drag-and-drop workflow designer, advanced connectivity, and rich workflow features to give customers more power and flexibility. The Nintex Workflow products are Nintex Workflow for SharePoint and Nintex Workflow for Office 365. Nintex Workflow for SharePoint targets SharePoint on premises, while Nintex Workflow for Office 365 integrates seamlessly with Office 365 so you can use a browser-based drag-and-drop workflow designer, but it has a limited set of workflow activities compared to the on-premises product.

Prior to SharePoint 2013, many organisations streamlined business processes by building an InfoPath form and a SharePoint workflow. With InfoPath being deprecated by Microsoft, developers need to move away from that technology for their basic forms needs. A great InfoPath alternative is an HTML form that stores its fields as a JSON object in a single field in the SharePoint list.

Nintex Workflow for Office 365 hasn’t released an activity in the workflow that can parse the JSON format, and there is no easy solution to get the JSON value out except using Regular Expression activity which is hard to maintain or update if JSON object structure changes. In July 2015 Product Release, Nintex Workflow contains a new feature – Collection Variable, it is extremely powerful and can speed up workflow design dramatically.

Below are the actions that are available in the cloud:

  • Add Item to Collection
  • Check if Item Exists in Collection
  • Clear Collection
  • Count Items in Collection
  • Get Item from Collection
  • Join Items in Collection
  • Remove Duplicates from Collection
  • Remove Item from Collection
  • Remove Last Item from Collection
  • Remove Value from Collection
  • Sort Items in Collection

I will walk you through a scenario showing how we can read JSON data from a Nintex workflow in the cloud. Below is a simple HTML form that captures basic employee information and saves the form data as JSON to a field of a SharePoint list.

Employee details form

This is how the JSON data stored in the SharePoint list field “Form Data“ looks:

{
   "employeeDetails":
   {
      "dateChangeEffectiveFrom":"2015-09-22T21:49:15.866Z",
      "dateChangeEffectiveTo":"2015-12-25T00:00:00.000Z",
      "employee":
      {
         "employeeId":"2318",
         "employeeFullName":"Ken Zheng",
         "positionTitle":"Developer",
         "departmentDescription":"IT"
       }
   }
}

If you put it into a JSON parser it will give you a better understanding of the structure of the data: it contains an “employeeDetails“ property which has “dateChangeEffectiveFrom“, “dateChangeEffectiveTo“ and “employee“, and “employee“ contains the “employeeId“, “employeeFullName“, “positionTitle“ and “departmentDescription“ properties. In this example, we are going to get the value of “employeeFullName“ from the JSON data.

JSON with colourised layout

Step 1: Create two Collection Variables: ‘FormData’ and ’employee’, a Dictionary variable for ’employeeDetails’ and a text variable named ‘employeeFullName’ in the Nintex Workflow Designer

Step 1

Step 2: Set “FormData” variable to the value of “Form Data” field of the list:

Step 2

Step 3: Use the “Get an Item From A Dictionary” activity to set the “employeeDetails” variable value; the item name or path must be the child element name

Step 3

Step 4: Then add another “Get an Item From A Dictionary” activity to get the employee collection; because “employee“ is a child of “employeeDetails“, you need to select “employeeDetails” as the source dictionary:

Step 4

Step 5: Add another “Get an Item From A Dictionary” activity to get the employee full name (text variable)

Step 5

The completed workflow diagram will look like this

Completed workflow

Now if you log the variable “employeeFullName“, you should get value “Ken Zheng”.

With this Nintex Workflow Collection Variable feature we can now parse and extract information from a JSON data field in a SharePoint list back into a workflow without creating separate list fields for each property. It can easily handle repeating properties as well.

The Secrets of SharePoint Site Mailbox

A Site Mailbox serves as a central filing cabinet, providing a place to file project emails and documents that can only be accessed and edited by SharePoint site members.

It can be used from a SharePoint team site to store and organise team email or via Outlook 2013 for team email, and as a way to quickly store attachments and retrieve documents from the team site.

Users will not see a Site Mailbox in their Outlook client unless they are an owner or member of that Site in SharePoint.

Secret 1: Remove Site Mailbox from Outlook client

Outlook 2013 has been enhanced to support Site Mailboxes, with a really nice integration point being the automated rollout of Site Mailboxes directly to a user’s Outlook profile based on the user’s permissions on the SharePoint site.

Anyone in the default owners or members groups for the site (anyone with Contribute permissions) can use the Site Mailbox. People can be listed in the owners or members group as individuals or as part of a security group. Once the Site Mailbox is provisioned, the Site Mailbox will automatically be added to Outlook, with the reverse true if the user’s permission is removed.

There are two ways to manage visibility of the Site Mailbox in Outlook:

A. Managing from Outlook manually

Users can simply right-click their personal mailbox and select ‘Manage All Site Mailboxes’; they will be directed to a list of all Site Mailboxes they have access to and can easily pin and unpin them from there.

Manage all Site Mailboxes in Outlook

B. Group Membership in SharePoint

If you don’t want users to see the Site Mailbox in Outlook at all here is the tip – do not add users to the default members or owners group in SharePoint. Instead, create a separate SharePoint group with Edit permission and add users to the new group so that they can still access the site mailbox through the web but the mailbox will not be available in Outlook.

Secret 2: Rename an existing Site Mailbox

Here is another tip: what happens if users are not happy with the Site Mailbox name that shows in the Global Address List (GAL)? It is a lot of effort to delete the Site Mailbox and reassign the permissions if all we want to do is rename it.

I have great news: here is the workaround:

  1. In the Office 365 Admin Center > Active users, find the Site Mailbox
  2. Select the Site Mailbox and edit its display name
  3. Then go to the Site with the mailbox, go to Site Settings > Title, description and logo
  4. Update the Title with the new Site Mailbox name too.

Note: if you do not update the Site name (the last step), Exchange will revert the Site Mailbox name back to the Site name. Smart enough, isn’t it? :)

So there we have a couple of useful tips – hopefully they’ve helped you out!

Connecting Salesforce and SharePoint Online with Azure App Services

Back in November I wrote a post that demonstrated how we can integrate Salesforce and SharePoint Online using the MuleSoft platform and the MuleSoft .NET Connector. In this post I hope to achieve the same thing using the Azure App Services offering, recently released into preview.

Azure App Services

Azure App Services rebrands a number of familiar service types (Azure Websites, Mobile Services, and BizTalk Services) as well as adding a few new ones to the platform.

azure_app_services

  • Web Apps – Essentially a rebranding of Azure websites.
  • Mobile Apps – Built on the existing Azure Mobile Services with some additional features such as better deployment and scalability options
  • Logic Apps – A new service to the platform that allows you to visually compose process flows using a suite of API Apps from both the Marketplace and custom built.
  • API Apps – A special type of Web App that allows you to host and manage APIs to connect SaaS applications, on-premise applications or implement custom business logic. The Azure Marketplace provides a number of API Apps ready built that you can deploy as APIs in your solution.

Microsoft have also published a number of Apps to the Azure Marketplace to provide some ready-to-use functionality within each of these service types.  A new Azure SDK has also been released that we can use to build & deploy our own custom App Services. Further details on the Azure App Service can be found on the Azure Documentation site here.

Scenario Walkthrough

In this post we will see how we can create a Logic App that composes a collection of API Apps to implement the same SaaS integration solution as we did in the earlier post. To recap, we had the following integration scenario:

  • Customers (Accounts) are entered into Salesforce.com by the Sales team.
  • The team use O365 and SharePoint Online to manage customer and partner related documents.
  • When new customers are entered into Salesforce, corresponding document library folders need to be created in SharePoint.
  • Our interface needs to poll Salesforce for changes and create a new document library folder in SharePoint for this customer according to some business rules.
  • The business logic required to determine the target document library is based on the Salesforce Account type (Customer or Partner)

Azure Marketplace

As a first step, we should search the Azure Marketplace for available connectors that suit our requirements. A quick search yields some promising candidates…

Salesforce Connector – Published by Microsoft and supports Account entities and executing custom queries. Supported as an action within Logic Apps. Looking good.

salesforce_connector

SharePoint Online Connector – Published by Microsoft and supports being used as an action or trigger in Logic Apps. Promising, but upon further inspection we find that it doesn’t support creating folders within a document library. Looks like we’ll need to create our own custom API App to perform this.

sharepoint_online_connector

Business Rules API – Again published by Microsoft and based on the BizTalk Business Rules Engine. Supports being used as an action in Logic Apps; however, it only supports XML-based facts, which, as we’ll see, doesn’t play well with the default messaging format used in Logic Apps (JSON). It looks like we’ll either need to introduce additional Apps to perform the conversion (JSON > XML and XML > JSON) or create a custom API App to perform our business rules as well.

biztalk_rules_api_app

So it appears we can only utilize one of the out-of-the-box connectors. We will need to roll up our sleeves and create at least two custom API Apps to implement our integration flow. As the offering matures and community contributions to the Marketplace are supported, hopefully we will be spending less time developing services and more time composing them. But for now let’s move on and set up the Azure App Services we will need.

Azure API Apps

As we are creating our first Azure App Service we need to first create a Resource Group and an Azure App Service Plan. Service plans allow us to apply and manage resource tiers for each of our apps. We can then modify this service plan to scale resources up or down consistently across all the apps that share it.

We start by adding a new Logic App and creating a new Resource Group and Service Plan as follows:

create_logic_app

Navigate to the newly created Resource Group. You should see two new resources in your group, your Logic App and an API Gateway that was automatically created for the resource group.

new_resource_group

Tip: Pin the Resource Group to your Home screen (start board) for easy access as we switch back and forth between blades.

Next, add the Salesforce Connector API App from the Marketplace …

create_salesforce_connector

… and add it to our Resource Group using the same Service Plan as our Logic App. Ensure that in the package settings we have the Account entity configured. This is the entity in Salesforce we want to query.

sf_connector_package_config

Now we need to provision two API App Services to host our custom APIs. Let’s add an API App Service for our custom BusinessRulesService API first, ensuring we select our existing Resource Group and Service Plan.

create_rules_api

Then repeat for our custom SharePointOnlineConnector API App Service, again selecting our Resource Group and Service Plan. We should now see three API Apps added to our resource group

resource_group_summary

Developing Custom API Apps

Currently, only the Salesforce Connector API has been deployed (as we created this from the Marketplace). We now need to develop our custom APIs and deploy them to the API App services we provisioned above.

You will need Visual Studio 2013 and the latest Azure SDK for .NET (2.5.1 or above) installed.

Business Rules Service

In Visual Studio, create a new ASP.NET Web Application for the custom BusinessRulesService and choose Azure API App (Preview)

vs_-_create_azure_api_app

Add a model to represent the SharePoint document library details we need our business rules to spit out

    public class DocumentLibraryFolder
    {
        public string DocumentLibrary { get; set; }
        public string FolderName { get; set; }
    }

Add an Api Controller that implements our business rules and returns an instance of our DocumentLibraryFolder class.

    public class BusinessRulesController : ApiController
    {

        [HttpGet]
        public DocumentLibraryFolder Get(string accountType, string accountName)
        {
            System.Diagnostics.Trace.TraceInformation("Enter: Get");

            DocumentLibraryFolder docLib = new DocumentLibraryFolder();

            try
            {
                // Check for customer accounts
                if (accountType.Contains("Customer"))
                    docLib.DocumentLibrary = "Customers";

                // Check for partner accounts
                if (accountType.Contains("Partner"))
                    docLib.DocumentLibrary = "Partners";

                // Set folder name
                docLib.FolderName = accountName;
            }
            catch (Exception ex)
            {
                System.Diagnostics.Trace.TraceError(ex.ToString());
            }

            return docLib;
        }
    }

With the implementation done, we should test that it works locally (how else can we claim “it works on my machine”, right?). The easiest way to test an API App is to enable the Swagger UI and use its built-in test harness. Navigate to App_Start\SwaggerConfig.cs and uncomment the lines shown below.

enable_swagger_ui
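For reference, after uncommenting, the relevant part of SwaggerConfig.cs looks roughly like the snippet below; the exact options depend on the Swashbuckle version bundled with the API App template:

    using System.Web.Http;
    using Swashbuckle.Application;

    public class SwaggerConfig
    {
        public static void Register()
        {
            GlobalConfiguration.Configuration
                .EnableSwagger(c => c.SingleApiVersion("v1", "BusinessRulesService"))
                .EnableSwaggerUi(); // exposes the interactive test harness at /swagger
        }
    }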

Run your API App and navigate to /swagger

business_rules_-_test_with_swagger_locally

Once we have confirmed it works, we need to deploy the API to the Azure API App service we provisioned above. Right-click the BusinessRulesService project in Solution Explorer and select Publish. Sign in using your Azure Service Administration credentials and select the target API App service from the drop-down list.

vs_api_publish

Click Publish to deploy the BusinessRulesService to Azure

Tip: Once deployed it is good practice to test your API works in Azure. You could enable public access and test using the swagger UI test harness as we did locally, or you could generate a test client app in Visual Studio. Using swagger UI is quicker as long as you remember to revoke access once testing has completed as we don’t want to grant access to this API outside our resource group.

Grant public (anonymous) access in the Application Settings section of our API App and test the deployed version using the URL found on the Summary blade of the API App.

api_access_level

business_rules_-_test_with_swagger_in_the_cloud

Custom SharePoint Online Connector

Since the out-of-the-box connector in the Marketplace didn’t support creating folders in document libraries, we need to create our own custom API App to implement this functionality. Using the same steps as above, create a new ASP.NET Web Application named SharePointOnlineConnector and choose the Azure API App (Preview) project template.

Add the same DocumentLibraryFolder model we used in our BusinessRulesService and an Api Controller to implement the connection to SharePoint and the creation of the folder in the specified document library:

    public class DocumentLibraryController : ApiController
    {
        #region Connection Details
        string url = "url to your sharepoint site";
        string username = "username";
        string password = "password";
        #endregion

        [HttpPost]
        public void Post([FromBody] DocumentLibraryFolder folder)
        {
            using (var context = new Microsoft.SharePoint.Client.ClientContext(url))
            {
                try
                {
                    // Provide client credentials
                    System.Security.SecureString securePassword = new System.Security.SecureString();
                    foreach (char c in password.ToCharArray()) securePassword.AppendChar(c);
                    context.Credentials = new Microsoft.SharePoint.Client.SharePointOnlineCredentials(username, securePassword);

                    // Get library
                    var web = context.Web;
                    var list = web.Lists.GetByTitle(folder.DocumentLibrary);
                    var root = list.RootFolder;
                    context.Load(root);
                    context.ExecuteQuery();

                    // Create folder
                    root.Folders.Add(folder.FolderName);
                    //root.Folders.Add(HttpUtility.HtmlEncode(folder.FolderName));
                    context.ExecuteQuery();

                }
                catch (Exception ex)
                {
                    System.Diagnostics.Debug.WriteLine(ex.ToString());
                }
            }
        }

    }

Deploy to our Resource Group selecting our SharePointOnlineConnector API App Service.

vs_api_publish

Grant public access and test the API is working in Azure using swagger UI once again.

spo_connector_-_test_with_swagger_in_the_cloud

Note: I did have some issues with the Microsoft.SharePoint.Client libraries. Be sure to use v16.0.0.0 of these libraries to avoid the System.IO.FileNotFoundException: msoidcliL.dll issue (thanks Alexey Shcherbak for the fix).

Azure Logic App

With all our App Services deployed, let’s now focus on composing them into our logic flow within an Azure Logic App. Open our Logic App and navigate to the Triggers and Actions blade. From the toolbox on the right, drag the following Apps onto the designer:

  • Recurrence Trigger
  • Salesforce Connector
  • BusinessRulesService
  • SharePointOnlineConnector

logic_app_config

Note: Only API Apps and Connectors in your Resource Group will show up in the toolbox on the right-hand side, along with the Recurrence Trigger and HTTP Connector.

Configure Recurrence trigger

  • Frequency: Minutes
  • Interval: 1

recurrence

Configure Salesforce Connector API

First we must authorise our Logic App to access our SFDC service domain. Click on Authorize and sign in using your SFDC developer credentials. Configure the Execute Query action to perform a select using the following SOQL statement:

SELECT Id, Name, Type, LastModifiedDate FROM Account WHERE LastModifiedDate > YESTERDAY LIMIT 10

salesforce

The output of the Salesforce Connector API will be in JSON, the default messaging format used in Logic Apps. The structure of the JSON data will look something like this:

{
	"totalSize": 10,
	"done": true,
	"records": [{
		"attributes": {
			"type": "Account",
			"url": "/services/data/v32.0/sobjects/Account/00128000002l9m6AAA"
		},
		"Id": "00128000002l9m6AAA",
		"Name": "GenePoint",
		"Type": "Customer - Channel",
		"LastModifiedDate": "2015-03-20T22:45:13+00:00"
	},
	{
		"attributes": {
			"type": "Account",
			"url": "/services/data/v32.0/sobjects/Account/00128000002l9m7AAA"
		},
		"Id": "00128000002l9m7AAA",
		"Name": "United Oil &amp; Gas, UK",
		"Type": "Customer - Direct",
		"LastModifiedDate": "2015-03-20T22:45:13+00:00"
	},
	... repeats ...
    ]
}

Notice the repeating “records” section. We’ll need to let downstream APIs be aware of these repeating items so they can get invoked once for every repeating item.

Configure Business Rules API

Set up a repeating item so that our Business Rules API gets called once for every account the Salesforce Connector outputs in the response body.

  • Click on the Settings icon and select Repeat over a list
  • Set Repeat to @body('salesforceconnector').result.records

Note: Here @body('salesforceconnector') references the body of the response (or output) of the API call. "result.records" references the elements within the JSON response structure, where "records" is the repeating collection we want to pass to the next API in the flow.

Configure call to the BusinessRules_Get action passing the Type and Name fields of the repeated item

  • Set accountType to @repeatItem().Type
  • Set accountName to @repeatItem().Name

business_rules

The output of the BusinessRulesService will be a repeating collection of both inputs and outputs (discovered after much trial and error; exception details are pretty thin, as with most preview releases):

{
	"repeatItems": [{
		"inputs": {
			"host": {
				"gateway": "https://blogdemoresgroupxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.azurewebsites.net/",
				"id": "/subscriptions/72608e17-c89f-4822-8726-d15540e3b89c/resourcegroups/blogdemoresgroup/providers/Microsoft.AppService/apiapps/businessrulesservice"
			},
			"operation": "BusinessRules_Get",
			"parameters": {
				"accountType": "Customer - Channel",
				"accountName": "GenePoint"
			},
			"apiVersion": "2015-01-14",
			"authentication": {
				"scheme": "Zumo",
				"type": "Raw"
			}
		},
		"outputs": {
			"headers": {
				"pragma": "no-cache,no-cache",
				"x-ms-proxy-outgoing-newurl": "https://microsoft-apiapp7816bc6c4ee7452687f2fa9f58cce316.azurewebsites.net/api/BusinessRules?accountType=Customer+-+Channel&amp;accountName=GenePoint",
				"cache-Control": "no-cache",
				"set-Cookie": "ARRAffinity=451155c6c25a46b4af4ca2b73a70e702860aefb1d0efa48497d93db09e8a6ca1;Path=/;Domain=blogdemoresgroupxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.azurewebsites.net,ARRAffinity=451155c6c25a46b4af4ca2b73a70e702860aefb1d0efa48497d93db09e8a6ca1;Path=/;Domain=blogdemoresgroupxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.azurewebsites.net",
				"server": "Microsoft-IIS/8.0",
				"x-AspNet-Version": "4.0.30319",
				"x-Powered-By": "ASP.NET,ASP.NET",
				"date": "Sun, 19 Apr 2015 12:42:18 GMT"
			},
			"body": {
				"DocumentLibrary": "Customers",
				"FolderName": "GenePoint"
			}
		},
		"startTime": "2015-04-19T12:42:18.9797299Z",
		"endTime": "2015-04-19T12:42:20.0306243Z",
		"trackingId": "9c767bc2-150d-463a-9bae-26990c48835a",
		"code": "OK",
		"status": "Succeeded"
	}]
}

We will need to again define the appropriate repeating collection to present to the next API. In this case it will need to be the “outputs.body” element of the repeatItems collection.

Configure SharePointOnline Connector API

Set up a repeating item so that our SharePointOnline API gets called once for every item in the repeatItems collection.

  • Click on the Settings icon and select Repeat over a list
  • Set Repeat to @actions('businessrulesservice').outputs.repeatItems

Configure the call to the DocumentLibrary_POST action, setting the following parameters:

  • Set DocumentLibrary to @repeatItem().outputs.body.DocumentLibrary
  • Set FolderName to @repeatItem().outputs.body.FolderName

spo_connector

Save the Logic App and verify no errors are displayed. Close the Triggers and Actions blade so we return to our Logic App Summary blade.

Testing Our Solution

Ensure our Logic App is enabled and verify it is being invoked every 1 minute by the Recurrence trigger.

logic_app_operations

Open a browser and navigate to your Salesforce Developer Account. Modify a number of Accounts ensuring we have a mix of Customer and Partner Account types.

sfdc_accounts

Open a browser and navigate to your SharePoint Online Developer Account. Verify that folders for those modified accounts appear in the correct document libraries.

spo_doclibs_updated

Conclusion

In this post we have seen how we can compose logical flows using a suite of API Apps pulled together from a mix of the Azure Marketplace and custom APIs within a single integrated solution to connect disparate SaaS applications.

However, it is early days for Azure App Services and I struggled with its v1.0 limitations and poor IDE experience within the Azure Preview Portal. I would like to see a Logic App designer in Visual Studio, the addition of flow control, and an expanded expression language with support for more complex data types (perhaps even custom .NET classes). I’m sure as the offering matures and community contributions to the Marketplace are enabled, we will be spending less time developing services and more time composing them, hopefully with a much better user experience.

Getting Started with Office 365 Video

Starting Tuesday, November 18, Microsoft began rolling out Office 365 Video to customers who have opted in to the First Release programme (if you haven’t, you will need to wait a little longer!).

Kloud has built video solutions on Office 365 in the past so it’s great to see Microsoft deliver this as a native feature of SharePoint Online – and one that leverages the underlying power of Azure Media Services capabilities for video cross-encoding and dynamic packaging.

In this blog post we’ll take a quick tour of the new offering and show a simple usage scenario.

Basic Restrictions

In order to have access to Office 365 Video the following must be true for your Office 365 tenant:

  • SharePoint Online must be part of your subscription and users must have been granted access to it.
  • Users must have E1, E2, E3, E4, A2, A3 or A4 licenses.
  • There is no external sharing capability – you aren’t able to serve video to users who are not licensed as per the above.

There may be some change in the licenses required in future, but at launch these are the only ones supported.

Note that you don’t need to have an Azure subscription to make use of this Office 365 feature.

Getting Started

When Video is made available in your tenant it will show in either the App Launcher or Office 365 Ribbon.

Video on App Launcher

Video on Office 365 Ribbon

Like any well-managed intranet, it’s important to get the structure of your Channels right. At this stage there is no functionality to create sub-channels, so how you create your Channels will depend primarily on who the target audience will be, as a Channel is a logical container that can be access controlled like any standard SharePoint item.

There are two default Channels out-of-the-box but let’s go ahead and create a new one for our own use.

Options when creating a Channel

Once completed we will be dropped at the Channel landing page and have the ability to upload content or manage settings. I’m going to modify the Channel I just created and restrict who can manage the content by adding one of my Kloud Colleagues to the Editors group (shown below).

Setting Permissions

Now we have our Channel configured, let’s add some content.

I click on the Upload option on the Channel home page and select an appropriate video (I’ve chosen to use an MP4 created on my trusty Lumia 920) and drag and drop it onto the upload form. The file size limits supported match the standard SharePoint Online ones (hint: your files can be pretty large!)

When you see the page below, make sure you scroll down and set the video title and description (note: these are really important as they’ll be used by SharePoint Search and Delve to index the video).

Upload Process

Then you need to wait… the time to complete the cross-encoding depends on the length of the video you’ve uploaded.

Once it’s completed you can play the video back via the embedded player and, if you want you can cross-post it to Yammer using the Yammer sidebar (assuming you have Yammer and an active session). You also get preview in search results and can play video from right in the preview (see below).

Video Preview

These are very early days for Office 365 Video; expect to see much richer functionality over time based on end-user feedback.

The Office 365 Video team is listening to feedback and you can provide yours via their Uservoice site.