"Cannot complete this action" error in SharePoint team site

Lately I was assigned a long-standing issue at a well-known organisation running SharePoint 2013 on-premises, in which some users had been getting a weird "Cannot complete this action" error screen whenever they deleted a document from a library or modified a list view in their team sites.

Lots of testing was done over a few days, and I came up with the following analysis summary:

  • The issue exists in some sub-sites of a team site under a team site collection (SharePoint 2013 on-premises)
  • The error occurred consistently for team site users (including the site collection admin), although the changes did get actioned/saved
  • Users had to click Back to return to the previous screen and get back to the site
  • The error didn't occur on some other team sites
  • No specific correlation ID for the error, nor anything suspicious in the ULS logs

Luckily I was able to find an answer from Microsoft. The cause was a load balancer running without HTTP compression enabled, combined with these sub-sites having the Minimal Download Strategy (MDS) site feature enabled.
The Minimal Download Strategy feature is supposed to optimise loading speed, as it allows the browser to retrieve only the changes required for a web page rather than the whole page. However, when you have a load balancer in front of SharePoint, HTTP compression needs to be enabled for MDS to work properly. Otherwise, simply turn off the site feature and the error will disappear.
See: https://support.microsoft.com/en-au/help/2934590/modifying-a-list-view-returns-cannot-complete-this-action-when-mds-is
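If you choose to turn the feature off rather than enable HTTP compression, this can be scripted. The sketch below assumes an on-premises farm where the SharePoint snap-in is available; "MDSFeature" is the internal name of the web-scoped MDS feature, and the site URL is a placeholder for the affected sub-site:

[code language="powershell"]
# Run in the SharePoint 2013 Management Shell on a farm server
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Disable the Minimal Download Strategy (MDS) feature on the affected sub-site
Disable-SPFeature -Identity "MDSFeature" -Url "http://your-web-app/sites/team/subsite" -Confirm:$false
[/code]

Repeat for each affected sub-site, or pipe a list of site URLs into the cmdlet for a larger clean-up.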

Migrate SharePoint contents using ShareGate


The first step in any migration project is to take an inventory and determine the size of the data you are looking to migrate. For simplicity, this post assumes you have already done this activity and have planned the new destination sites for your existing content. This means you have an Excel sheet somewhere that identifies where your content will reside in the migrated site, and every existing site in scope has at least a new home pointer if it qualifies for migration.
The project I was working on had six levels of sites and subsites within a site collection. For simplicity we will consider just one site collection in scope for this post; after discussing it with the business, we had finalised that it would initially have at most three levels of site and subsite.
In this blog post, we will migrate SharePoint 2010 and 2013 content from on-premises to SharePoint Online (SPO) in our customer's Office 365 tenant.

Creating Structure

After doing the magic of mapping the existing site collection, sites and subsites, we came up with an Excel sheet containing the mapping for our desired structure, as shown below:
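The original mapping screenshots are not reproduced here, but the scripts below expect one CSV file per level. The column names (SiteName, URL1/URL2/URL3, Parent, GrandParent) are the ones the script reads; the site names themselves are illustrative:

[code]
Level1.csv  -  sites created directly under the site collection
SiteName,URL1
Corporate,corporate
Projects,projects

Level2.csv  -  sites created under a level-one parent
SiteName,URL2,Parent
Finance,finance,corporate

Level3.csv  -  sites created under a level-two parent
SiteName,URL3,Parent,GrandParent
Invoices,invoices,finance,corporate
[/code]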




The above files will be used as the reference master sheets to create the site structure before pulling the trigger on migrating content. We will use the PowerShell script below to create the structure for our desired state within the new site collection.


[code language="powershell"]
$url = "https://your-client-tenant-name.sharepoint.com/"

function CreateLevelOne() {
    Connect-PnPOnline -Url $url -Credentials 'O365-CLIENT-CREDENTIALS'
    filter Get-LevelOne {
        New-PnPWeb -Title $_.SiteName -Url $_.URL1 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }
    Import-Csv C:\_temp\Level1.csv | Get-LevelOne
}

function CreateLevelTwo() {
    filter Get-LevelTwo {
        $connectUrl = $url + $_.Parent
        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL2 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }
    Import-Csv C:\_temp\Level2.csv | Get-LevelTwo
}

function CreateLevelThree() {
    filter Get-LevelThree {
        $connectUrl = $url + $_.GrandParent + '/' + $_.Parent
        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL3 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }
    Import-Csv C:\_temp\Level3.csv | Get-LevelThree
}

# Create the structure one level at a time, top down
CreateLevelOne
CreateLevelTwo
CreateLevelThree
[/code]

Migrating Contents

Once you have successfully created your site structure, it is time to start migrating content to the newly created structure as per the mapping identified earlier. The batch CSV file format looks like below:
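The screenshot of the batch file is not reproduced here; based on the two URL columns the migration script reads, a batch file would look something like this (the URLs are illustrative):

[code]
SourceURL,DestinationURL
http://sp2010/sites/finance,https://your-client-tenant-name.sharepoint.com/corporate/finance
http://sp2013/sites/projects,https://your-client-tenant-name.sharepoint.com/projects
[/code]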

The final step is to execute the PowerShell script and migrate content using ShareGate commands from your source site to your destination site (as defined in the mapping file above).


[code language="powershell"]
# folders where report files and migration batches live
$folderPath = "C:\_Kloud\SGReports\"
$folderPathBatches = "C:\_Kloud\MigrationBatches\"

filter Migrate-Content {
    # URLs for source and destination
    $urlDes = $_.DestinationURL
    $urlSrc = $_.SourceURL

    # file where the migration log is appended on each run of this script
    $itemErrorFolderPath = $folderPath + 'SG-Migration-Log-Webs.csv'

    # migration settings used by ShareGate commands
    $copysettings = New-CopySettings -OnContentItemExists IncrementalUpdate -VersionOrModerationComment "updated during migration to Office 365"

    $pwdDest = ConvertTo-SecureString "your-user-password" -AsPlainText -Force
    $siteDest = Connect-Site -Url $urlDes -Username your-user-name -Password $pwdDest

    $listsToCopy = @()
    $siteSrc = Connect-Site -Url $urlSrc
    $listsInSrc = Get-List -Site $siteSrc

    # exclude system lists and libraries from the copy
    foreach ($list in $listsInSrc) {
        if ($list.Title -ne "Content and Structure Reports" -and
            $list.Title -ne "Form Templates" -and
            $list.Title -ne "Master Page Gallery" -and
            $list.Title -ne "Web Part Gallery" -and
            $list.Title -ne "Pages" -and
            $list.Title -ne "Style Library" -and
            $list.Title -ne "Workflow Tasks") {
            $listsToCopy += $list
        }
    }

    # build a log entry with details for this migration run
    $date = Get-Date -Format d
    $rowLog = '"' + $siteSrc.Address + '","' + $siteSrc.Title + '","' + $listsInSrc.Count + '","' + $siteDest.Address + '","' + $siteDest.Title + '","' + $listsToCopy.Count + '","' + $date + '"'
    $rowLog | Out-File $itemErrorFolderPath -Append -Encoding ascii

    Write-Host "Copying $($listsToCopy.Count) out of $($listsInSrc.Count) lists and libraries to ($($siteDest.Address))"

    # per-site item report produced by ShareGate
    $itemLogFolderPath = $folderPath + $siteSrc.Title + '.csv'
    $result = Copy-List -List $listsToCopy -DestinationSite $siteDest -CopySettings $copysettings -InsaneMode -NoCustomPermissions -NoWorkflows
    Export-Report $result -Path $itemLogFolderPath
}

function Start-Migration($batchFileName) {
    $filePath = $folderPathBatches + $batchFileName
    Import-Csv $filePath | Migrate-Content
}

Start-Migration -batchFileName "MG-Batch.csv"
[/code]


In this blog post we used ShareGate's silent mode, with which we can run multiple migration jobs in parallel from different workstations (depending on your ShareGate licensing).
For a complete list of ShareGate PowerShell commands, you can refer to the list at URL.
I hope you have found this post useful for creating a site structure and migrating content (lists/libraries) to its new home inside SharePoint Online.

Connecting Salesforce and SharePoint Online with MuleSoft – Nothing but NET

Often enterprises will choose their integration platform based on the development platform required to build integration solutions. That is, Java shops typically choose Oracle ESB, JBoss, IBM WebSphere or MuleSoft, to name but a few. Microsoft shops have less choice and typically choose to build custom .NET solutions or use Microsoft BizTalk Server. Choosing an integration platform based on the development platform should not be a driving factor, and doing so may limit your options.

Your integration platform should be focused on interoperability. It should support common messaging standards, transport protocols, integration patterns and adapters that allow you to connect to a wide range of line of business systems and SaaS applications. Your integration platform should provide frameworks and pattern based templates to reduce implementation costs and improve the quality and robustness of your interfaces.

Your integration platform should allow your developers to use their development platform of choice…no…wait…what!?!

In this post I will walk through integrating Salesforce.com and SharePoint Online using the Java-based Mule ESB platform while writing nothing but .NET code.

MuleSoft .NET Connector

The .NET connector allows developers to use .NET code in their flows enabling you to call existing .NET Framework assemblies, 3rd party .NET assemblies or custom .NET code. Java Native Interface (JNI) is used to communicate between the Java Virtual Machine (JVM), in which your MuleSoft flow is executing, and the Microsoft Common Language Runtime (CLR) where your .NET code is executed.


To demonstrate how we can leverage the MuleSoft .NET Connector in our flows, I have put together a typical SaaS cloud integration scenario.

Our Scenario

  • Customers (Accounts) are entered into Salesforce.com by the Sales team
  • The team use O365 and SharePoint Online to manage customer and partner related documents.
  • When new customers are entered into Salesforce, corresponding document library folders need to be created in SharePoint.
  • Our interface polls Salesforce for changes and needs to create a new document library folder in SharePoint for this customer according to some business rules
  • Our developers prefer to use .NET and Visual Studio to implement the business logic required to determine the target document library based on Account type (Customer or Partner)

Our MuleSoft flow looks something like this:


  • Poll Salesforce for changes based on a watermark timestamp
  • For each change detected:
    • Log and update the last time we sync’d
    • Call our .NET business rules to determine target document library
    • Call our .NET helper class to create the folder in SharePoint

Salesforce Connector

Using the MuleSoft Salesforce Cloud Connector we configure it to point to our Salesforce environment and query for changes to the Accounts entity. Using DataSense, we configure which data items to pull back into our flow to form the payload message we wish to process.


Business Rules

Our business rules are implemented using a bog-standard .NET class library that checks the account type and assigns either "Customers" or "Partners" as the target document library. We then enrich the message payload with this value and return it to the flow.

public object GetDocumentLibrary(SF_Account account)
{
    var docLib = "Unknown";     // default

    // Check for customer accounts
    if (account.Type.Contains("Customer"))
        docLib = "Customers";

    // Check for partner accounts
    if (account.Type.Contains("Partner"))
        docLib = "Partners";

    // Return the enriched payload as an anonymous type
    return new
    {
        Name = account.Name,
        Id = account.Id,
        LastModifiedDate = account.LastModifiedDate,
        Type = account.Type,
        DocLib = docLib
    };
}

Note: JSON is used to pass non-primitive types between our flow and our .NET class.

So our message payload looks like
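The screenshot of the payload is not reproduced here; given the fields on our SF_Account class, the JSON handed to the .NET connector would look something like this (the values are illustrative):

[code language="javascript"]
{
  "Name": "Acme Corporation",
  "Id": "0019000000AbCdE",
  "LastModifiedDate": "2014-09-01T10:30:00.000Z",
  "Type": "Customer - Direct"
}
[/code]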


and is de-serialised by the .NET connector into our .NET SF_Account object that looks like

public class SF_Account
{
    public DateTime LastModifiedDate;
    public string Type;
    public string Id;
    public string type;
    public string Name;
    public string DocLib;
}

Calling our .NET business rules assembly is a simple matter of configuration.


SharePoint Online Helper

Now that we have enriched our message payload with the target document library


we can pass this to our .NET SharePoint client library to connect to SharePoint using our O365 credentials and create the folder in the target document library

public object CreateDocLibFolder(SF_Account account)
{
    using (var context = new Microsoft.SharePoint.Client.ClientContext(url))
    {
        try
        {
            // Provide client credentials
            System.Security.SecureString securePassword = new System.Security.SecureString();
            foreach (char c in password.ToCharArray()) securePassword.AppendChar(c);
            context.Credentials = new Microsoft.SharePoint.Client.SharePointOnlineCredentials(username, securePassword);

            // Get library
            var web = context.Web;
            var list = web.Lists.GetByTitle(account.DocLib);
            var folder = list.RootFolder;

            // Create folder and send the pending request to SharePoint
            folder = folder.Folders.Add(account.Name);
            context.ExecuteQuery();
        }
        catch (Exception ex)
        {
            // Swallowing the exception keeps the flow running; log ex in production code
        }
    }

    // Return payload to the flow
    return new { Name = account.Name, Id = account.Id, LastModifiedDate = account.LastModifiedDate, Type = account.Type, DocLib = account.DocLib, Site = string.Format("{0}/{1}", url, account.DocLib) };
}

Calling our helper is the same as for our business rules



Configuring MuleSoft to know where to load our .NET assemblies from is best done using global configuration references.


We have three options to reference our assembly:

  1. Local file path – suitable for development scenarios.
  2. Global Assembly Cache (GAC) – suitable for shared or .NET framework assemblies that are known to exist on the deployment server.
  3. Packaged – suitable for custom and 3rd party .NET assemblies that get packaged with your MuleSoft project and deployed together.



With our flow completed, coding in nothing but .NET, we are good to test and deploy our package to our Mule ESB integration server. At the time of writing, CloudHub does not support the .NET Connector, but this should be available in the not too distant future. To test my flow I simply spin up an instance on the development server and watch the magic happen.

We enter an Account in Salesforce with Type of “Customer – Direct”…


and we see a new folder in our “Customers” document library for that Account name in a matter of seconds


SLAM DUNK!… Nothing but NET.


Integration is all about interoperability and not just at runtime. It should be a core capability of our Integration framework. In this post we saw how we can increase our Integration capability without the need to sacrifice our development platform of choice by using MuleSoft and the MuleSoft .NET Connector.

SharePoint Online Storage Improvements in Office 365

The following new features are now available in SharePoint Online as part of Office 365:

◦ A 1 TB site collection limit for OneDrive for Business and team sites
◦ Infinite tenant storage scale

1 TB site collection limit increase for OneDrive for Business and team sites

A 1 TB limit lets you create large team document centres that teams can use without needing to introduce multiple websites and unneeded complexity. All SharePoint Online site collections can now have up to 1 TB of storage allocated. You simply assign the desired amount to the selected site collection in the SharePoint Online admin console.
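Assigning the allocation can also be scripted. The sketch below assumes the SharePoint Online Management Shell is installed; the tenant and site URLs are placeholders, and the quota is specified in megabytes (1 TB = 1,048,576 MB):

[code language="powershell"]
# Connect to the SharePoint Online admin center
Connect-SPOService -Url "https://your-tenant-admin.sharepoint.com"

# Allocate 1 TB (1048576 MB) to a team site collection
Set-SPOSite -Identity "https://your-tenant.sharepoint.com/sites/teamsite" -StorageQuota 1048576
[/code]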

OneDrive for Business provides a default 25 GB per-user storage allocation. You can now allocate more storage, up to 1 TB, to users in the following increments: 50 GB, 100 GB, 250 GB, 500 GB, and 1024 GB.

Infinite tenant storage scale

Office 365 and SharePoint Online storage has been architected to enable infinite scale, for both current storage needs and unlimited future needs.

Here is the default storage allocation for an Office 365 tenant:
◦ 25 GB of OneDrive for Business storage per user
◦ 50 GB of email storage per user
◦ 5 GB for each site mailbox you create
◦ 10 GB of tenant storage, plus 500 MB times the number of users (for example, a 100-user tenant gets 10 GB + 100 × 500 MB = 60 GB of pooled storage)

If you need more storage than the default allocation, you can now purchase unlimited additional storage right from the SharePoint Online admin center at a cost per gigabyte (GB) per month.

If you are looking to move a portion of your storage from an on-prem environment to Office 365, please contact Kloud Solutions using the URL below:


An Overview of Server Name Indication (SNI) and Creating an IIS SNI Web SSL Binding Using PowerShell in Windows Server 2012

One of the frustrating limitations in supporting secure websites has been the inability to share IP addresses among SSL websites. Back in the day, there were a few ways to work around this limitation. First, you could use multiple IP addresses, binding an SSL certificate to each combination of an IP address and the standard SSL port. This has been the best method to date, but it is administratively heavy and not necessarily a good use of valuable IP addresses. Another approach was to use additional non-standard ports for SSL. While this saved IP addresses, you would potentially run up against strict firewall or proxy limitations, making this method undesirable. Finally, in the IIS 7 and 7.5 world you could use host headers to share a certificate among websites, but you were limited to a single certificate that each website would have to share.

The reason behind these limitations rests in the handshake that takes place between the browser and the web server. When an SSL client request is initiated, the HTTP header data is not available to the web server. Only after a successful handshake are the headers encrypted and sent to the web server, which is too late to route the request to the desired website.

Solving this limitation required an extension to the Transport Layer Security (TLS) protocol that includes the addition of what hostname a client is connecting to when a handshake is initiated with a web server. The name of the extension is Server Name Indication (SNI). Of course, extending the definition of a protocol is never as easy as updating an RFC. Both client and server compatibility are required to make use of these extensions. On the client side, roughly 95% of browsers support SNI. Specifically those are:

  • Internet Explorer 7 or later
  • Mozilla Firefox 2.0 or later
  • Opera 8 or later
  • Google Chrome 6 or later
  • Safari 3 or later

In the Microsoft world, support for the SNI extension to TLS was introduced with Windows Server 2012 and IIS 8. Through the Internet Information Services (IIS) Manager and a web site's bindings UI, SNI can be specified for an HTTPS site along with a host header:


There are many resources on the Internet that deal with setting up and configuring a site using SSL bindings, as well as utilising SNI from within the IIS Manager. Where I'd like to focus the second part of this blog is on creating SNI web bindings using PowerShell. As a driver for implementing SNI is the scalability it provides, this scalability might be for naught if it is not coupled with the ability to deploy a solution without the use of a GUI.

There are three parts to successfully assigning and associating any SSL binding with a website through PowerShell:

  1. A SSL binding needs to be created for the web site
  2. A certificate needs to exist in the local machine certificate store
  3. A SSL binding relationship needs to be created to associate a certificate with a web site

Creating the Web Site Binding

Creating the web site binding is a straightforward process.  The following PowerShell sequence would be used to create the binding and assign the correct port, host header and specification for use of SNI:

[code language=”powershell” gutter=”true” light=”true”]
# Import IIS Management PowerShell Module
Import-Module WebAdministration

$hostHeader = "test.com"

New-WebBinding -Name "Test Website" -Protocol "https" -Port 443 -HostHeader $hostHeader -SslFlags 1
[/code]

The name specified would be the name of the web site you’d like to add the binding to.  The protocol and port are standard for SSL bindings.  The host header is the URL you’d like the web site to respond to.  Finally, SslFlags with a value of 1 enables SNI for this binding.

Retrieving the Certificate from the Certificate Store

While I won’t cover the process to request a certificate or import the certificate into the local machine store, there are two factors we need to address before using the certificate in the third and final step.

In order to use the certificate in IIS it is critical that the certificate is imported allowing the private key to be exported.  If a certificate is used without an exportable private key, IIS will be unable to bind that certificate.
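As a sketch, importing the certificate with an exportable private key can be done with the PKI module that ships with Windows Server 2012; the PFX path, password and store location below are placeholders:

[code language="powershell"]
# Password protecting the PFX file
$pfxPassword = ConvertTo-SecureString "PfxPassword" -AsPlainText -Force

# Import into the local machine store with the private key marked exportable
Import-PfxCertificate -FilePath "C:\certs\test-cert.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword -Exportable
[/code]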

Creating the SSL association in the third step requires some reference to the certificate we'd like to associate with the web site. There are two values that can be used: the thumbprint of the certificate, or a reference to the certificate object itself.

In order to retrieve the thumbprint of a certificate the following PowerShell command is used:

[code language=”powershell” gutter=”true” light=”true”]
$thumbprint = (Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.FriendlyName -eq "Test Cert"}).Thumbprint
[/code]

In the above example the friendly name of the certificate is used as the matching context.  One could also use the subject instead.

In order to get a reference to the certificate itself the following syntax can be used:

[code language=”powershell” gutter=”true” light=”true”]
$certificate = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.FriendlyName -eq "Test Cert"}
[/code]

After this step you will have either a direct reference to the certificate or the value of the certificate's thumbprint.

Creating the SSL Association

The final step in the puzzle is tying together the binding and the certificate.  This can be the trickiest part to get right, as PowerShell doesn't provide a native cmdlet to do it directly. Instead, one needs to use the IIS: drive exposed by the WebAdministration module to create an SslBinding object and associate that object with the certificate.

The PowerShell sequence for that task is as follows, if you’re using the certificate object:

[code language="powershell" gutter="true" light="true"]
New-Item -Path "IIS:\SslBindings\!443!test.com" -Value $certificate -SslFlags 1
[/code]

If you’re using the thumbprint your command would be:

[code language="powershell" gutter="true" light="true"]
New-Item -Path "IIS:\SslBindings\!443!test.com" -Thumbprint $thumbprint -SslFlags 1
[/code]

If successful, you should receive confirmation displaying the host name along with the site the host name is bound to.  To confirm that SNI is in use, run netsh http show sslcert from the command line and inspect the output.

In the output, notice the SSL binding uses the Hostname:port syntax, which confirms SNI is in use.

Following the above steps will allow you to take advantage of the new Server Name Indication (SNI) implementation in Windows Server 2012 and IIS 8.

Adding Additional Nodes to a Forefront Identity Manager 2010 R2 Service Pack 1 SharePoint 2013 Farm

Microsoft recently released Service Pack 1 for Forefront Identity Manager 2010 R2. With the release of Service Pack 1 came some really good support for the latest elements that form the foundation of the FIM Portal, namely Windows Server 2012 and SharePoint Foundation 2013.

While basing the FIM 2010 R2 SP1 Portal on a SharePoint 2013 Foundation doesn’t offer any feature advantages over SharePoint 2010, it does provide compatibility with Windows Server 2012 which SharePoint 2010 won’t do until the release of Service Pack 2.

Many small-scale deployments of FIM are based on single-instance deployments of WSS or SharePoint Foundation. Where redundancy is required, SharePoint Farms are still not often deployed; instead, local WSS or SharePoint Foundation installations are deployed and then load balanced. While these deployment scenarios make the setup of the FIM Portal less technically challenging, as a centralised SharePoint Farm does not need to be considered, they sacrifice some of the advantages of deploying a Farm: namely, that only a single database is required and, most importantly, that scaling out the FIM Portal becomes as easy as adding a new SharePoint Foundation server.

Installing FIM 2010 R2 SP1 on SharePoint Foundation 2013 is well documented in TechNet, as is installing and configuring a SharePoint Farm. As long as you're installing FIM 2010 R2 SP1 on an existing SharePoint Foundation Farm you'll experience a fairly smooth ride. Where you might experience issues is in adding additional nodes to a SharePoint 2013 Farm, as the process requires a few additional steps that, unless completed, will render your new nodes non-functional.

This article will run through the process of installing the new node and executing the additional steps to get FIM 2010 R2 SP1 working.

Installing the New Node

There are two primary mechanisms for installing the additional SharePoint 2013 node. One can either install via the SharePoint Products Configuration wizard or through a PowerShell interface.

Installing through the UI is as easy as entering the database server, configuration database name and the pass phrase used when creating the farm:

Once started the configuration wizard will add the new server to the farm and deploy any solution and/or feature packs to the new node. Alternatively, one can use PowerShell to complete the same task as below:

[code language=”powershell” gutter=”true” light=”true”]
$passphrase = (ConvertTo-SecureString "Farm Passphrase" -AsPlainText -force)

Connect-SPConfigurationDatabase -DatabaseServer $databaseServer -DatabaseName $configDatabase -Passphrase $passphrase

Start-Service SPTimerV4

Install-SPHelpCollection -All
Install-SPFeature -AllExistingFeatures
[/code]

Getting the FIM Portal Working

By now you should have a SharePoint Farm and an additional farm node configured and working. To test the scenario you can browse to your root site from either server and should receive a functioning site. If, however, you were to try to browse to the FIM Portal on the new node at '/IdentityManagement', you would be greeted with the following:


The error above is due to an incomplete deployment of the FIM 2010 R2 SP1 solution pack and features to the new node. When adding a new server to a SharePoint Farm, existing features are deployed to the new server to match the other servers in the farm. Likewise, the FIM Portal is ultimately a combination of a solution pack and features that get deployed to the SharePoint Farm servers.

One of these features, 'MSILM2Configuration', is configured to deploy a resource file to nodes of the farm. When FIM is installed on an existing farm, the feature deploys correctly. However, when scaling out the farm and adding servers after the FIM Portal has already been installed, this feature will not deploy the resource file or update the configuration of the web.config for the web application.

To complete the deployment of the FIM Portal solution pack and features the following PowerShell command needs to be run.

[code language=”powershell” gutter=”true” light=”true”]
Enable-SPFeature MSILM2Configuration -Url "http://[FQDN of FIM Portal]/IdentityManagement" -Force
[/code]

The above command will force the missing components of the feature to deploy, resulting in a complete configuration of the FIM Portal on the new SharePoint 2013 Farm server.
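To verify the result, a quick sketch (assuming the SharePoint snap-in is loaded and using the same placeholder URL as above) is to check that the feature is now active on the web:

[code language="powershell"]
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Lists the feature only if it is active on the FIM Portal web
Get-SPFeature -Web "http://[FQDN of FIM Portal]/IdentityManagement" | Where-Object { $_.DisplayName -eq "MSILM2Configuration" }
[/code]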

I hope this post was helpful and saves others a bit of time configuring additional SharePoint Farm members.
