Integrating Yammer data within SharePoint web-part using REST API


We were developing a SharePoint application for one of our clients and had some web parts that needed to retrieve data from Yammer. We were developing on SharePoint Online (SPO) using the SharePoint Framework (SPFx), so for most of our engagement we used the client-side library React to deliver what was required of us.

In order to integrate the client’s Yammer data into our web parts, we used the JavaScript SDK provided by Yammer.


We had around 7-8 different calls to the Yammer API across different web parts, extracting data from Yammer on behalf of the logged-in user. For each API call, the user has to be authenticated before the call is made, and this has to happen without the user being redirected to Yammer for login, or being presented with a popup or a button to log in first.

If you follow Yammer’s JavaScript SDK instructions as-is, you will not meet our client’s requirement that users must not be sent to Yammer first (as this would change their user flow) or shown a login/sign-in popup.


After searching the internet for a way to fulfil the above requirements, I could not find anything that served us. The closest match I found was a PnP sample, but it only works if the user has already consented to your Yammer app. In our case this wasn’t possible, as many users would be accessing the SharePoint home page for the first time and had never accessed Yammer before.

What we did was split our API login calls into two groups. One of the calls was chosen at random to log the user in to Yammer in the background, obtain an access token and cache it with the Yammer API, while the other API login calls waited for that first login to complete and then used the Yammer API to log in.
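A minimal sketch of that coordination (the ensureYammerLogin helper and the doLogin shape below are our own illustration, not part of the Yammer SDK): the first caller kicks off the real login, and every other call simply reuses the same cached promise.

```typescript
// Share one login promise across all web parts: only the first caller
// triggers the actual Yammer login; the rest await the same promise.
let loginPromise: Promise<string> | null = null;

function ensureYammerLogin(doLogin: () => Promise<string>): Promise<string> {
  if (!loginPromise) {
    // First caller: start the real background login and cache the promise
    loginPromise = doLogin();
  }
  // Subsequent callers reuse the in-flight (or already resolved) login
  return loginPromise;
}
```

Each web part’s data call would then chain off ensureYammerLogin(…) before hitting the Yammer API.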


This function uses the standard Yammer API to check login status. If the check succeeds, it proceeds with the API data-retrieval calls; if the user could not be logged in the first time, it waits and checks again every 2 seconds until it times out after 30 seconds.

  public static loginToYammer(callback: Function, requestLogin = true) {
    // Load the Yammer JavaScript SDK (URL as per Yammer's SDK documentation)
    SPComponentLoader.loadScript('https://c64.assets-yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam" }).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((FirstloginStatusResponse) => {
        if (FirstloginStatusResponse.authResponse) {
          // Already logged in - proceed with the API data retrieval calls
          callback(yam);
        } else {
          // Not logged in yet - re-check the login status every 2 seconds
          const timerId = setInterval(() => {
            yam.getLoginStatus((SecondloginStatusResponse) => {
              if (SecondloginStatusResponse.authResponse) {
                clearInterval(timerId);
                callback(yam);
              }
            });
          }, 2000);

          // Give up after 30 seconds
          setTimeout(() => {
            clearInterval(timerId);
            yam.getLoginStatus((TimeOutloginStatusResponse) => {
              if (TimeOutloginStatusResponse.authResponse) {
                callback(yam);
              } else {
                console.error("iFrame - user could not log in to Yammer even after waiting");
              }
            });
          }, 30000);
        }
      });
    });
  }


This method again uses the standard Yammer API to check login status, then tries to log the user in in the background using the iframe approach called out in the PnP sample. If that approach doesn’t work either, it redirects the user to the Smart URL in the same window to obtain their consent for the Yammer app, with a redirect URI set to the home page of your SharePoint site where the Yammer-integrated web parts are hosted.

  public static logonToYammer(callback: Function, requestLogin = true) {
    SPComponentLoader.loadScript('https://c64.assets-yammer.com/assets/platform_js_sdk.js', { globalExportsName: "yam" }).then(() => {
      const yam = window["yam"];

      yam.getLoginStatus((loginStatusResponse) => {
        if (loginStatusResponse.authResponse) {
          callback(yam);
        } else if (requestLogin) {
          // Try to log the user in silently via a hidden iframe
          this._iframeAuthentication()
            .then((res) => {
              callback(yam);
            })
            .catch((e) => {
              console.error("iFrame - user could not log in to Yammer due to error. " + e);
              // Fall back to redirecting the user to the Smart URL for app consent
              window.location.href = "[your-yammer-smart-url]";
            });
        } else {
          console.error("iFrame - it was not called and user could not log in to Yammer");
        }
      });
    });
  }

The function _iframeAuthentication is copied from the PnP sample, with some modifications to fit the client requirements we were developing against.

  private static _iframeAuthentication(): Promise<any> {
      let yam = window["yam"];
      let clientId: string = "[your-yammer-app-client-id]";
      let redirectUri: string = "[your-sharepoint-home-page-url]";
      let domainName: string = "[your-yammer-network-name]";

      return new Promise((resolve, reject) => {
        let iframeId: string = "authIframe";
        let element: HTMLIFrameElement = document.createElement("iframe");

        element.setAttribute("id", iframeId);
        element.setAttribute("style", "display:none");
        document.body.appendChild(element);

        element.addEventListener("load", _ => {
            try {
                let elem: HTMLIFrameElement = document.getElementById(iframeId) as HTMLIFrameElement;
                // The access token comes back in the URL fragment of the redirect
                let token: string = elem.contentWindow.location.hash.split("=")[1];
                yam.getLoginStatus((res: any) => {
                    if (res.authResponse) {
                        resolve(token);
                    } else {
                        reject(res);
                    }
                });
            } catch (ex) {
                reject(ex);
            }
        });

        let queryString: string = `client_id=${clientId}&response_type=token&redirect_uri=${redirectUri}`;

        let url: string = `${domainName}/oauth2/authorize?${queryString}`;

        element.src = url;
      });
  }


This resulted in authenticating the Office 365 tenant user within the same SharePoint home page window with the help of an iframe [case: the user had consented to the Yammer app before], or obtaining Yammer app consent from the user without redirecting them to Yammer for OAuth-based authentication [case: the user is accessing the Yammer-integrated web parts for the first time].

We do hope future releases of the Yammer API will cater for seamless integration among O365 products, without the hassle of obtaining access tokens in the way described in this post.

Angular Bag of Tricks for SharePoint


I’ve been using Angular 1.x to build custom UI components and SPAs for SharePoint for years now. There’s a lot of rinse and repeat here, and over time my “stack” of open-source custom directives and services – the things that make Angular such a powerful framework – has settled down to a few key ones that I wanted to highlight; hence this post.

Some may be wondering why I’m still working almost exclusively with Angular 1.x. Why not Angular 2, React, or Aurelia? A few reasons:

  • It’s often already in use. Quite often a customer is already leveraging Angular 1.x in a custom masterpage, so it makes sense not to add another framework to the mix.
  • Performance improvements are not a high priority. I work almost entirely with SharePoint Online. The classic ASP.NET pages served up there aren’t exactly blindingly fast to load, so Angular 1 (used carefully) doesn’t slow things down measurably. Will this change when SPFx finally GA’s? Of course! But in the meantime, Angular 1.x is very comfortable, which leads to…
  • Familiarity = Productivity. Ramping up a custom application in SharePoint with Angular is now very quick to do. This is the whole “go with what you know well and can iterate fast on” approach to framework selection. Spend your time building out the logic of your app rather than fighting an unfamiliar framework.
  • The absolute smorgasbord of community-produced libraries that enhance Angular. A lot of the major ones have Angular 2 versions, but there are some notable exceptions (highlighted below).

So here, in order of frequency of use, are the plugins that I go to time and time again. Check them out and star those github repos!



UI-Router

An awesome state-based routing service for Angular (there are Angular 2 and React versions as well). It is more widely used than the default Angular 1 router because it has a fantastic API which lets you resolve asynchronous data calls (via promises) before you transition to the state that needs them. This keeps your controllers/components light and clean. I use this in every custom webpart/SPA I build that has more than one view (which is almost all of them).

If you need modals in your app, you can add in the uib-modal extension that allows UI-Bootstrap modals to be declared as state in your UI-Router state config. Great for deep linking through to modal windows!
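As a rough illustration of the resolve feature (the state, service and template names below are placeholders, not from any specific project), a ui-router state definition might look like this:

```typescript
// Illustrative ui-router state definition (Angular 1 syntax). The service
// name "itemsService" and the component markup are placeholders. The view
// is only rendered once the promise returned in "resolve" has settled.
const listState = {
  name: "list",
  url: "/list",
  template: "<item-list items='$resolve.items'></item-list>",
  resolve: {
    // Annotated-array injection: itemsService.getAll() returns a promise
    items: ["itemsService", (itemsService: any) => itemsService.getAll()]
  }
};
```

You would register this with `$stateProvider.state(listState)` in a config block.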

Angular Formly


Sick of labouring over large form templates? They are time consuming to wire up and maintain – that’s a lot of markup! Formly allows you to declare your form fields in JavaScript instead. This allows for a lot more control and being able to generate the UI on the fly at run time is a killer feature (that I haven’t done enough with to date!). I hope to have another post on this topic very soon…

Formly makes using custom controls / directives in forms really easy and gives you uniform validation rules over all of them. It’s also wonderfully extensible – there’s so much you can do with it, once you learn the basics. I put off trying it for AGES and now I wouldn’t be without it – if you have any user input in your Angular app, do yourself a favour and use Formly!



ag-Grid

The best JavaScript grid component. Period. Like all the libraries I’ve mentioned so far, this one is not just for Angular (it supports nearly all of the major frameworks, including Aurelia and Vue). There’s a Free version and an Enterprise version with a lot of extra bells and whistles. I haven’t had to shell out for Enterprise yet – Free is very fully featured as it is. If you have a table of data in your app, give this a try for all but the most simple scenarios.



angular-gantt

Here’s the first Angular 1-only library in my toolbox. It makes creating complex Gantt-chart interfaces, if not dead easy, at least feasible! I shudder to think what a nightmare it would be to write this kind of functionality from scratch…

There are loads of features here, just like the other libraries listed.



angular-wizard

Another Angular 1-only library (although similar Angular 2 projects exist). A great little wizard form directive that allows you to declare the steps of your wizard declaratively in your template or, when teamed up with Formly, in your controller. The latter allows you to create dynamic wizard steps by filtering the questions in the next step based on the responses to the previous one (once again – something to document in another post in future).

A few extra tricks for SharePoint…

A few other more generic practices when slinging Angular in SharePoint:

  • Don’t be afraid to use $q to wrap your own promises – yes, it is overused in a lot of example code on Stack Overflow (hint: if you are calling the $http service you don’t need $q, just return the result of the $http call), but it’s great if you want or need to use CSOM. Just call the promise’s resolve method in the success callback of executeQueryAsync and you’ve got a far cleaner implementation (no callbacks when you utilise it), so it’s easily packaged up in a service.
  • Create a reusable service layer – lots of people don’t bother to use Angular services, as most example code just keeps the $http calls in the controller for simplicity. Keep all your REST and CSOM calls to interact with SharePoint in a service module and you’ll get a lot more reuse of your code from application to application. Ideally, use ui-router to resolve the promises from your service before the controller is even initialised (as mentioned above).
  • Use Widget Wrangler for hosting your Angular apps in SharePoint pages – this handles all your script dependencies cleanly and lets you host in a Script Editor webpart (easily deployed with PnP PowerShell).
  • Think about caching your data or real-time sync – the excellent Angular-Cache is great for client-side caching of data and if your application’s data is frequently updated, you may want to consider a real-time data option to enhance the solution and prevent the need for page refreshes (another post on this coming soon too), such as Firebase or GunJS.
  • Azure Functions-All-The-Things! No more PowerShell running in a scheduled task on a VM for background processing. There is a better (and even cheaper) way.
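The $q tip above can be sketched roughly as follows, using native Promises for illustration (with $q in Angular 1 the pattern is identical); the QueryContext interface below just mimics the CSOM executeQueryAsync call signature:

```typescript
// Minimal shape of the CSOM client context we care about
interface QueryContext {
  executeQueryAsync(onSuccess: () => void, onFail: (sender: any, args: any) => void): void;
}

// Wrap the callback-style CSOM call in a promise: resolve in the success
// callback, reject with the error args in the failure callback.
function executeQuery(ctx: QueryContext): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    ctx.executeQueryAsync(
      () => resolve(),
      (_sender, args) => reject(args)
    );
  });
}
```

Packaged into a service, callers can then simply chain `.then()` off the query instead of juggling success/failure callbacks.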

I hope some people find this useful. Please leave a comment if you’ve got some other Angular-goodness you’d like to share!

Moving SharePoint Online workflow task metadata into the data warehouse using Nintex Flows and custom Web API

This post suggests an approach for automatically copying SharePoint Online (SPO) workflow task metadata into an external data warehouse. In this scenario, the workflow tasks become the subject of another workflow that automatically copies each task’s data into the external database, using a custom Web API endpoint as the interface to that database. Commonly, the requirement to move workflow task data elsewhere arises from limitations of SPO. In particular, SPO throttles requests for access to workflow data, making it virtually impossible to create a meaningful workflow reporting system over large numbers of workflow tasks. The easiest approach to solving the problem is to use a Nintex workflow to “listen” for changes in the workflow tasks, then request the task data via the SPO REST API and, finally, send the data to the external data warehouse’s Web API endpoint.

Some SPO solutions require a reporting system that includes workflow task metadata – for example, a report about documents and the statuses of the workflows linked to those documents. Using the conventional approach (e.g. the SPO REST API) to obtain the data is unfeasible, as SPO throttles requests for workflow data. In fact, the throttling is so tight that generating reports with more than a hundred records is unrealistic. In addition, many companies would like to create Business Intelligence (BI) systems analysing workflow task data, and having a data warehouse with all the workflow task metadata can assist in this job very well.

To implement the solution, a few prerequisites must be met. You need to know the basics of Nintex workflow creation, and be able to create a backend solution with the database of your choice and a custom Web API endpoint that writes the data model to that database. In this post we used Visual Studio 2015 and created an ordinary REST Web API 2.0 project with an Azure SQL Database.

The solution will involve following steps:

  1. Get sample of your workflow task metadata and create your data model.
  2. Create a Web API capable of writing data model to the database.
  3. Expose one POST endpoint method of the Web REST API that accepts JSON model of the workflow task metadata.
  4. Create Nintex workflow in the SPO list storing your workflow tasks.
  5. Design Nintex workflow: call SPO REST API to get JSON metadata and pass this JSON object to your Web API hook.

Below is a detailed description of each step.

We are looking here to export the metadata of a workflow task. First, find the SPO list that holds all your workflow tasks and navigate to it; you will need the name of the list to start calling the SPO REST API. It is best to use a REST tool to perform the Web API requests – many people use Fiddler or Postman (a Chrome extension) for this job. Request the SPO REST API to get a sample of the JSON data that you want to put into your database. The request will look similar to this example:

Picture 1

The key element in this request is getbytitle(“list name”), where “list name” is the name of the SPO list holding your workflow tasks. Please remember to add the header “Accept” with the value “application/json”; it tells SPO to return JSON instead of XML. As a result, you will get one JSON object that contains the metadata of Task 1. This JSON object is an example of the data that you will need to put into your database. Not all fields are required in the data warehouse, so we create a data model containing only the fields of our choice. For example, it can look like this one in C#, with all properties based on the model returned earlier:
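As an illustrative sketch of such a trimmed-down model (shown here as a TypeScript shape; the field names are assumptions based on typical SPO workflow task fields, so substitute your own):

```typescript
// Only the fields we actually want in the data warehouse - the names
// below are illustrative assumptions, matched against the JSON sample.
interface WorkflowTaskModel {
  Id: number;
  Title: string;
  Status: string;
  PercentComplete: number;
  Modified: string;   // ISO date string as returned by SPO
}

// A sample instance shaped like the JSON metadata of "Task 1"
const sample: WorkflowTaskModel = {
  Id: 1,
  Title: "Task 1",
  Status: "Completed",
  PercentComplete: 1,
  Modified: "2017-01-01T00:00:00Z"
};
```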

The next step is to create a Web API that exposes a single method accepting our model as a parameter from the body of the request. You can choose any REST Web API design. We created a simple Web API 2.0 in Visual Studio 2015 using the general wizard for an MVC, Web API 2.0 project. Then we added an empty controller and filled it with code that uses Entity Framework to write the data model to the database. We also created a code-first EF database context that works with just the one entity described above.

The code of the controller:

The code of the database context for Entity Framework

Once you have created the Web API, you should be able to call Web API method like this:

You will need to put your model data in the request body as a JSON object. Don’t forget to include the proper headers for your authentication, the header “Accept” with “application/json”, and to set the request type to POST. Once you’ve tested the method, you can move on to the next steps. For example, below is how we tested it in our project.

Picture 4
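As a sketch of assembling such a POST request (the buildTaskPost helper and the endpoint URL are our own illustration; the headers mirror those described above):

```typescript
// Build a plain description of the POST request to the custom Web API
// endpoint. The endpoint URL and task model are supplied by the caller;
// an auth header would be added alongside the two below.
function buildTaskPost(endpoint: string, task: object) {
  return {
    url: endpoint,
    method: "POST",
    headers: {
      "Accept": "application/json",
      "Content-Type": "application/json"
    },
    body: JSON.stringify(task)   // the model travels in the body as JSON
  };
}
```

The resulting description maps directly onto whatever HTTP client you use for testing.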

Next, we will create a new Nintex Workflow in the SPO list with our workflow tasks. It is all straightforward. Click Nintex Workflows, then create a new workflow and start designing it.

Picture 5

Picture 6

Once you’ve created a new workflow, click on the Workflow Settings button. In the displayed form, set the parameters as shown on the screenshot below. We set “Start when items are created” and “Start when items are modified”. In this scenario, any modification of our workflow task will start this workflow automatically. This also includes cases when the workflow task has been modified by other workflows.

Picture 7.1

Create 5 steps in this workflow as shown on the following screenshots, labelled 1 to 5. Please keep in mind that blocks 3 and 5 are there only to assist in debugging and are not required in production use.

Picture 7

Step 1. Create a Dictionary variable that contains the SPO REST API request headers. You can add any required headers, including authentication headers. It is essential to include the Accept header with “application/json” to tell SPO that we want JSON in the responses. We set the output variable to SPListRequestHeaders so we can use it later.

Picture 8

Step 2. Call HTTP Web Service. We call the SPO REST API here. It is important to make sure that the getbytitle parameter is correctly set to your workflow tasks list, as discussed before. The list of fields that we want returned is defined in the “$select=…” parameter of the OData request; we need only the fields included in our data model. Other settings are straightforward: we supply the request headers created in Step 1 and create two more variables for the response. SPListResponseContent will receive the resulting JSON object that we are going to need in Step 4.

Picture 9
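The request URL configured in Step 2 can be sketched as follows (the web URL, list title and field names are placeholders):

```typescript
// Build the SPO REST URL used in Step 2: getbytitle('...') selects the
// workflow tasks list, and $select trims the payload to the fields in
// our data model.
function buildTaskQueryUrl(webUrl: string, listTitle: string, fields: string[]): string {
  return `${webUrl}/_api/web/lists/getbytitle('${listTitle}')/items?$select=${fields.join(",")}`;
}
```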

Step 3 is optional. We added it to debug our workflow. It sends an email with the contents of the JSON response from the previous step, showing us what was returned by the SPO REST API.

Picture 10

Step 4. Here we call our custom API endpoint, passing the JSON object model that we got from the SPO REST API. We supply the full URL of our web hook, set the method to POST and, in the request body, inject SPListResponseContent from Step 2. We also capture the response code to display later in the workflow history.

Picture 11

Step 5 is also optional. It writes a log message with the response code that we received from our API endpoint.

Picture 12

Once all five steps are completed, we publish this Nintex workflow. Now we are ready for testing.

To test the system, open the list of workflow tasks. Click on any task, modify any of its properties and save the task. This will initiate our workflow automatically. You can monitor workflow execution in the workflow history. Once the workflow has completed, you should be able to see messages as displayed below. Notice that our workflow has also written the Web API response code at the end.

Picture 13

To make sure that everything went well, open your database and check the records updated by your Web API. After every Workflow Task modification you will see corresponding changes in the database. For example:

Picture 14


In this post we have shown that automatic copying of workflow task metadata into your data warehouse can be achieved with a simple Nintex workflow setup and only two REST Web API requests. The solution is quite flexible, as you can select the required properties from the SPO list and export them into the data warehouse. We can easily add more tables if there is more than one workflow tasks list. This solution enables the creation of a powerful reporting system using the data warehouse, and also allows you to employ the BI data analytics tool of your choice.

Implement a SharePoint Timer job using Azure WebJob

The SharePoint Timer service runs in the background to do long-running tasks. The Timer service performs some important SharePoint clean-up tasks in the background, but can also be used to provide useful functional tasks. For instance, you may want to send newsletters to your users on a regular basis, or keep your customers up to date with some regularly timed information.

I will be using a SharePoint timer job to send an email to newly registered customers/users for this demo. The newly registered customers/users are stored in a SharePoint list, with a status field capturing whether an email has been sent or not.
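The selection logic implied by that status field boils down to a simple filter, sketched here in TypeScript for brevity (the field names are assumptions; the actual solution reads these items via CSOM):

```typescript
// Illustrative shape of a registered customer item from the SharePoint
// list - "EmailSent" stands in for the status field described above.
interface Customer {
  Email: string;
  EmailSent: boolean;
}

// Pick only the customers who have not yet received the welcome email.
function customersToEmail(items: Customer[]): Customer[] {
  return items.filter(c => !c.EmailSent);
}
```

After a successful send, the job flips the status field back in the list so the customer is not emailed twice.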

There are some implementation choices when developing a SharePoint Timer service:

  1. Azure Web Job
  2. Azure Worker Role
  3. Windows Service (can be hosted on premise or vm on Cloud)
  4. Task Scheduler (hosted on premise)

I am choosing an Azure WebJob as it is free of cost and I can leverage my console application as a WebJob. Please check out why you would choose a WebJob.

An Azure WebJob does not live on its own; it sits under an Azure Web App. For this purpose I am going to create a dummy web app to host my Azure WebJob. I will be hosting all my CSOM code in this WebJob.

There are two types of web job:

  • Continuous – best fit for queueing applications where the job keeps receiving messages from a queue.
  • On Demand – can be scheduled to run hourly, weekly, monthly and so on.

The WebJob is used to host and execute the CSOM code that gets information about the users/customers from SharePoint in order to send the email. The following code snippets show what the WebJob is doing:

Querying SharePoint using CSOM and CAML Query:

Sends Email using Office365 Web Exchange:

Composing email using Razor Engine templating engine:

And finally update SharePoint list item using CSOM:

You can download the full source code from CodePlex.

When writing a Web Job, the following points should be considered to make your web job diagnosable and reusable:

  1. Do not absorb exceptions. Handle them first, then rethrow to let the WebJob know something went wrong.
  2. Try to use interfaces so that dependencies can be mocked for unit testing.
  3. Always log major steps and errors, e.g. using Console.WriteLine.
  4. Write your code so that it can run as a console application, allowing it to be reused in Task Scheduler.
  5. Try to avoid hardcoding. Maximise the use of configuration; values can be plugged in from the Azure portal as well.

It is time to publish this WebJob. There are lots of articles out there on how to create a schedule for a WebJob. I will simply use Visual Studio to create the schedule before publishing it. In Visual Studio, right-click the project and click “Publish as Azure WebJob…”, and it will launch a UI to specify your schedule as shown below:

Schedule settings

That’s it. Happy SharePointing 🙂

The SharePoint Governance Puzzle

If you have been using or managing SharePoint, then you know how quickly content and customisations, if not properly governed, can grow out of control. Today most organisations deploying SharePoint have realised this and have SharePoint Governance Plans or Policies defined for their deployments. A governance plan can be a brief document focusing on what users can and cannot do, or it may be a detailed framework governing every part of SharePoint, from custom development to usage and maintenance.

However, a large number of organisations are still struggling with effective governance despite having a well-defined plan. Often an organisation’s focus is on defining a detailed and comprehensive governance plan, while overlooking the process of actually implementing the plan – it is assumed that it is now up to IT support to ensure the processes and policies defined in the plan are enforced and implemented. A good number of them take a step further and create a SharePoint Governance Group; however, the group itself often struggles to enforce the standards and policies, especially in the area of day-to-day operations.

So let’s first take a look at the key implementation challenges and then what can be done to address them.

Implementation Challenges

A. The implementation deserves respect, treat it just like another project!

One of the reasons why the implementation doesn’t get due attention is the absence of tangible outcomes. Implementing a plan and its associated policies is not like delivering a system; furthermore, it is not straightforward to measure the outcomes and see the associated benefits immediately. As a result, interest in rolling out the governance plan wanes soon after its creation, and the governance documents are filed and forgotten somewhere on the same SharePoint site that is supposed to be being governed!

B. Yes, it does require (a little bit) of effort and resources!

Another challenge is the availability and allocation of the people and time required to enforce the plan with consistency. This is further complicated when we recall the fundamental principle behind the existence of SharePoint: user empowerment. This can give rise to uncontrolled SharePoint growth where the balance of end-user power and IT control is not right for the environment. In other words, SharePoint governance is far more demanding than managing, say, your Active Directory. This makes it difficult for IT to enforce the policies without hindering user productivity (and annoying users).

C. If they don’t know it then they won’t do it!

Finally, enforcing governance does not merely require flicking some switches. The first step in implementing any set of governance policies is communicating them to users and support staff. Users and staff often do not have the time or interest to read lengthy governance documents, so you can’t just send them the documents and expect them to be consumed, understood and followed! Being human (and SharePoint users), they do not like tightly controlled environments either, and need to be convinced that the controls you are putting in place will eventually work for them. An unhappy and ill-informed user may instead see the governance policies as a roadblock to being productive.

The secret behind a successful governance plan implementation

Well, there is no secret here really. In summary, it requires the IT management taking the implementation seriously (just like any other project) and seeing the benefits; doing effective user training and communication; and investing in automation to assist the IT teams in enforcing the governance policies.

User Communication and Training

We know very well that no one loves to read lengthy documents. Furthermore, reading text is one thing and actually understanding its meaning is another. So, post-creation, the next challenge is to communicate the plan to users (end users and support staff) while ensuring it is not too difficult for them to understand and remember all the key points.

One technique that can help here is creating summarised posters or cheat-sheets for specific groups of users, such as SharePoint site members, owners and administrators. You can combine the information with any visualisation techniques that you feel may be effective, like charts, tables and even comics. See a couple of examples below.

SharePoint Site Owners Poster

It is also quite helpful (I would rather say necessary) to integrate the governance plan ‘knowhow’ into HR and IT processes like employees’ induction, role change and termination procedures. This will ensure things happen when they are required to.

The next big thing is … Automation

To help with enforcing governance policies without creating a lot of overhead for IT support and end-users, you can also get assistance from some SharePoint governance tools. There are quite a few third party tools available for this purpose. These tools can provide:

  • finer control in managing the SharePoint environment;
  • powerful growth forecasting and management;
  • comprehensive and centralized security control and permission monitoring;
  • provisioning workflows;
  • critical alerts and notifications etc.

We can divide the tools into two categories – strategic and tactical. The basis of this subdivision is the product feature set, associated costs and resources required to implement them.

Tactical Tools

Tactical tools are low cost options that enable the support teams to analyse and understand their SharePoint implementation and assess its health with reference to the governance plan and policies.

1. SharePoint Documentation Kit (SPDocKit)

A Windows application that can run standalone or be installed on the server. SPDocKit:

  • generates detailed farm documentation on the target SharePoint environment
  • compares SharePoint farm configuration against best practices
  • can take a snapshot of the environment, allowing support staff to compare it with other SharePoint farms, or with the same farm at a different point in time

2. The SharePoint Diagram Tool

This is a quite handy tool that lets you check whether your SharePoint sites are experiencing some kind of mushroom growth or flourishing like a beautiful, well-maintained garden. It reverse engineers your site structure and lets you visualise it in tree form. The tool can generate its output in multiple formats, and you can then use Excel, a browser or Visual Studio to render the site structure. Regular generation and analysis of site-structure diagrams can be added to IT support processes for detecting any major violations of site-structure related governance policies.

The following screenshot shows a real world example: a complex site tree containing 600+ sites.

SharePoint Site Tree

Advanced Tools

1. ControlPoint by Metalogix:

Metalogix offers a suite of tools for migrating and managing SharePoint content and that includes ControlPoint. ControlPoint is a well-refined product with an extensive set of features. It includes all the key components such as provisioning workflows, content growth monitoring, user action reporting and site stats; governance policy and permission management etc.

2. AvePoint Docave Governance Automation:

A solid product for SharePoint content migration and automating governance workflows. It can be seen as a strategic investment i.e. having AvePoint as the provider of some key SharePoint add-ons. They offer a range of SharePoint products, both for SharePoint Online and on-premises. However the governance module mainly focuses on SharePoint sites and site objects provisioning and deletion workflows. You will need to buy additional modules for end-to-end monitoring and reporting purposes.

3. Sharegate – SharePoint migration and Management Tool:

An easy to use product offering both content migration and governance tools in one product. It allows SharePoint admins to manage security settings, monitor environment growth, get rid of unused and obsolete content, ensuring that SharePoint meets the organisational standards. However, if you do not have SharePoint content migration needs then you cannot buy the governance module as a standalone product.

4. Acceleratio Governance Kit for Office 365:

This is a low cost cloud based Office 365 governance tool (they do not have a version for SharePoint on-premise). They also have another product SharePoint Documentation Kit (described in the ‘Tactical Tools’ section previously) that can help with reporting and site structure analysis. With this governance kit administrators can setup and configure SharePoint Online rules. The rules then can be applied to a specific site, list or library. It can generate detailed reports.


I conclude here by emphasizing that first and foremost the implementation of a SharePoint governance plan should be treated like a project. This approach ensures that the SharePoint governance policies do not just stay within the boundaries of a document but are put into action and reward everyone. Finally, as a picture is worth a thousand words, the following diagram concludes this article.


The SharePoint Puzzle

This article was originally published on my own blog at The SharePoint Governance Puzzle

Kloud develops online learning portal for leading education organisation

Customer Overview

Catholic Education South Australia (CESA) is made up of the South Australian Commission for Catholic Schools (SACCS), Catholic Education Office of South Australia (CEO) and 103 Catholic schools across South Australia. The organisation comprises 6,000 staff who care for more than 48,000 students.

Business Situation

Catholic Education South Australia recently made the decision to offer the capabilities of Office 365 to its 103 schools across the state (including Exchange email, Lync, SharePoint and Office on Demand). As part of this offering, CESA sought to leverage Office 365 to provide each school with a portal for students and teachers to collaborate.


Kloud worked with CESA to ensure a comprehensive understanding of the requirements and delivered a solution design document based on the needs of the organisation. Following acceptance of the design document, Kloud commenced configuration of the tenant (in particular SharePoint Online) in readiness for deployment of the templates to be created. Kloud worked closely with the Learning and Technology Team to create conceptual designs for the following types of templates to be used within each school portal:

  • School
  • Class
  • Community
  • Professional Learning.

From these designs Kloud developed prototypes in SharePoint and iteratively refined them with regular reviews from the Learning and Technologies Team. The final solution included a custom application which created the school sites in Office 365 and a remote provisioning application in Azure for self-service site creation. The latter provided teachers with a mechanism to create their own class, community and professional learning sites based on the predefined template which they could then fine-tune to suit their needs.


The school portal empowers students and teachers to collaborate in a safe and well-monitored environment. They can now easily share documents, images and videos as well as create blogs or post questions in a single place.

Through the class sites, students will be able to spend their time collaborating with others in their class as well as with teachers, who will provide additional resources and oversight. The community sites allow students to join groups of interest, either social or academic, and are a great way for like-minded students to expand their learning or be more informed about what is happening. Likewise, the professional learning sites allow teachers to share ideas and resources about a subject or stream, which will translate to better learning outcomes for students.

“CESA’s Learning and Technologies team worked with consultants from Kloud on the functionality and design of the Office 365 and SharePoint Online templates for schools. We were impressed by the communication and analytical skills used to meet our organisation’s needs and to inform its direction. High levels of expertise supported the project, as well as knowledge of the product, solid task prioritisation, budget management and timely reporting” – Karen Sloan, Learning and Technologies Senior Education Advisor, CESA.

I Keep Getting Prompted to Authenticate in SharePoint Online!

Every once in a while I come across a problem which is vexing and irritating and equally, an absolute joy when it is finally resolved.

The Scene

This particular issue had been on-going for some time with one of my customers with reports coming in from various people saying that they were being prompted to authenticate when opening an Office document in SharePoint Online. “I’ve already logged in so why do I keep getting these prompts?!” – It’s a fair question, especially since the organisation had implemented AD FS to reduce the frequency of authentication dialogue boxes. It seemed that Microsoft was not only being overprotective but overbearing as well. Here’s the dialogue I’m talking about by the way:

Figure 1 – Overprotective

Along with this issue, the customer had started to map network drives to SharePoint. Let’s face it, it’s easier to manage files in a folder structure than in the SPO web interface, and with improvements to SharePoint over the last few years, mapping network drives actually works, or at least it should work. While it was working some of the time, there were reports of access issues to these mapped drives, and a different dialogue box which, while uglier, on first glance appeared more promising from a technical perspective: a recommendation for a fix and a reference to a KB article!

Figure 2 – Ugly but useful

There are a plethora of articles, wikis, forum posts, etc out there which all talk about adding your SharePoint Online sites to your browser’s Trusted Sites list. The referenced KB article covers it off nicely, and I’ve added the link in case you’ve not come across it, here.

In my particular case the SharePoint Online domains were all happily existing in the IE Trusted Sites list (I checked about 30 times during my investigations).

I’ve mentioned that AD FS is in play; further, a redirection URL is used with a Smart Link in this particular environment to provide a more seamless login experience. Until very recently you were not able to customise the Office 365 Portal page. Check out this article for more info on how to brand the Sign-In Page with company branding.

While you can now brand the Sign-In page to your heart’s content (if you have an Azure subscription along with your O365 subscription), you’ll need to type in your UPN at least once (normally not an experience IT wants to establish for a new Intranet rollout), and so smart links are not yet dead.

There’s a couple of good resources for creating SmartLinks, including one written by a Kloudy here and a good Office365 community article here which discusses some of the components of the link.

The Solution

I’ve set the scene, and most of you have probably bypassed everything above and come straight to this point. For those of you who took the slow route of reading from the top: thanks for sticking around!

The problem I was seeing was really due to a very simple issue. You need to check the box which says “Keep me signed in”. It even says so in the ugly but helpful dialogue box above. The thing is, if you’re using AD FS and SmartLinks you don’t actually hit the Office 365 login page, and so you never get the opportunity to check that box. We need to pass the “Keep me signed in” option to the AD FS server somehow, and the only way to do that is to encode it in the URL.

How do we do that? As it turns out there’s a way!

At the end of the SmartLink is the LoginOptions attribute.

LoginOptions=3 will pass the “Keep me signed in” option as off (not checked)

LoginOptions=1 will pass the “Keep me signed in” option as on (checked)

Don’t keep me signed in. Boo!!!

Keep me signed in. Yay!!!

I only made this change on the internal SmartLinks, by the way. I’ve left the SmartLinks published to the Internet with LoginOptions=3; we don’t really want people on untrusted PCs and devices to remain logged into Office 365 services.
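Since the change is just a matter of rewriting one query-string parameter, it can be scripted when you have a batch of smart links to update. Below is a minimal Python sketch of that rewrite; the hostname in the example is purely hypothetical, and the second replacement covers the case where the parameter sits percent-encoded inside another query value:

```python
def set_keep_me_signed_in(smart_link: str) -> str:
    """Rewrite LoginOptions=3 to LoginOptions=1 in a smart link, so the
    "Keep me signed in" option is passed to AD FS as checked."""
    # The parameter may appear raw in the query string, or percent-encoded
    # if it is nested inside another parameter's value.
    return (smart_link
            .replace("LoginOptions=3", "LoginOptions=1")
            .replace("LoginOptions%3D3", "LoginOptions%3D1"))
```

For example, `set_keep_me_signed_in("https://sts.contoso.com/adfs/ls/?wa=wsignin1.0&LoginOptions=3")` returns the same URL with `LoginOptions=1`.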


I’d like to say a big thank you to Carolyn at Microsoft PSS for helping me resolve this issue.



Map SharePoint Libraries with local file drive – A step-by-step guide

There are a number of articles talking about how to map SharePoint libraries to your local drive, but few of them are in depth, especially on how to set up the prerequisites. I hope this post provides detailed information for users who need to do this. I will also give an overview and talk about some advantages of mapping SharePoint libraries to a local file drive.


You need to make sure the “WebClient” service is up and running on your server. If this service is disabled or missing, you will get an error saying “The folder you entered does not appear to be valid. Please choose another”.


To check whether the “WebClient” service is in your services list, just open the Services console and look. If it is missing, follow these steps to make it available:

  1. Start the Windows Server Manager.
  2. In the tree view, highlight the Features node.
  3. In the details pane, click Add Features.
  4. In the Add Features Wizard, check the Desktop Experience box, and then click Next.
  5. Click Install.
  6. When the Add Features Wizard has finished, click Close.
  7. Click Yes when prompted to restart the computer.

After that you should be able to see that the “WebClient” service is available in your services list. Start the “WebClient” service and you are ready!
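If you manage several servers, the check above can be scripted. The following Python sketch shells out to the standard Windows `sc query` command; the optional `query_output` parameter is only there so the parsing can be exercised with captured text on a non-Windows machine:

```python
import subprocess
from typing import Optional

def webclient_state(query_output: Optional[str] = None) -> str:
    """Return the state reported by 'sc query WebClient', e.g. 'RUNNING'
    or 'STOPPED'. Pass query_output to parse captured text instead of
    running sc.exe (which only exists on Windows)."""
    if query_output is None:
        query_output = subprocess.run(
            ["sc", "query", "WebClient"],
            capture_output=True, text=True).stdout
    for line in query_output.splitlines():
        if "STATE" in line:
            # A state line looks like: "  STATE    : 4  RUNNING"
            return line.split()[-1]
    return "NOT_FOUND"
```

If this returns anything other than `RUNNING`, start the service before attempting to map the drive.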

Map SharePoint with file drive

  1. Open Windows Explorer on your server
  2. Right-click the “Computer” icon and select “Map network drive…”
  3. Click “Connect to a web site that you can use to store your documents and pictures”
  4. In “Specify the location of your website”, enter the library URL where you want to store files (e.g. http://yourwebapplication/Shared%20Documents). Note that in lower versions of SharePoint the URL can’t contain %20 characters, so in that case simply change the URL to http://yourwebapplication/Shared Documents.
  5. Give it a name
  6. Follow the instructions and finish!

After that you will be able to see the folder under Computer. If you just want to quickly open a SharePoint library in Windows Explorer, simply reformat your library URL and paste it into the address bar. For example, if the URL of your library is http://MySPSite:1234/DocumentLibrary, then you should change it to file://MySPSite@1234/DocumentLibrary
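The URL rewrite described above is mechanical, so it can be sketched in a few lines of Python. The hostnames below are hypothetical, and the transformation follows exactly the rules this post describes: swap the scheme to file://, turn the port separator into @, and decode %20 for older SharePoint versions:

```python
def to_file_drive_url(library_url: str) -> str:
    """Convert a SharePoint library URL such as
    http://MySPSite:1234/DocumentLibrary into the file:// form
    file://MySPSite@1234/DocumentLibrary described above."""
    scheme, rest = library_url.split("://", 1)
    host, _, path = rest.partition("/")
    host = host.replace(":", "@")    # port separator becomes '@'
    path = path.replace("%20", " ")  # older versions reject %20
    return f"file://{host}/{path}"
```

For example, `to_file_drive_url("http://MySPSite:1234/DocumentLibrary")` returns `file://MySPSite@1234/DocumentLibrary`.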


Benefits of using this mapping

  1. You can drag and drop files/folders into the mapped drive and they will be synced to the SharePoint library (especially useful on SharePoint versions before SharePoint 2013)
  2. Manage (add/update/change/delete) files from either SharePoint or the file drive
  3. Leverage SharePoint version control
  4. Leverage SharePoint search.
  5. Security controls

Office 365 SharePoint – Search Results web part returning incorrect results for sites

I have been using the Search Results Web Part (SRWP) in previous versions of SharePoint, and it is a convenient way to display results filtered by content type, URL, etc. without worrying about user permissions, as these are taken care of by SharePoint Search.

In SharePoint 2013 Online I came across a problem where the results returned from the SRWP are incorrect when filtered on sites:

The Results Preview displays the correct results – a total of 12:

preview results

The results displayed in the SRWP – a total of 10:

webpart results

This issue is due to the “trim duplicates” setting in the web part. Even though the sites have unique names and their contents might not be the same, the search considers them duplicates and collapses them in the result set. Thus, correct results are actually being trimmed/excluded.

This is a known issue with Microsoft and a fix is expected sometime in the future. For now, there is a workaround:

  1. Export the Search Results Web Part
  2. Open the .webpart file with a text editor
  3. Search for the text “TrimDuplicates”:true
  4. Modify it to “TrimDuplicates”:false
  5. Save the file
  6. Upload and add the web part to the page for use