SharePoint site template error: IsProduction field is not valid or does not exist

Introduction

In this post I will be talking about the exception "IsProduction field not accessible or does not exist". In our case we had saved an existing site as a site template in the solution gallery and created a new site collection from the saved template, but provisioning was breaking with the exception message below.

Error message:

“The field specified with the name IsProduction is not accessible or does not exist”.


Background

The site templates feature in SharePoint on-premises lets you save a site as a template and reuse it to pre-provision the standard site elements in a new site collection, such as lists, libraries, views, workflows, logos, branding and other elements for different departments. A site template is a blueprint for the site which can be used when we create new site collections.

Here the requirement was to save an existing site collection as a site template with all the custom lists, libraries, pages, content types and Nintex workflows. When the site collection is saved as a site template it is stored in the solution gallery and then becomes available under the custom templates section in the new site collection wizard.

The issue was that when a new site was created using the saved custom template, provisioning terminated with the error "The field specified with the name IsProduction is not accessible or does not exist". The error was not very descriptive, and checking the SharePoint logs did not provide much information either.

To find the root cause, I first checked for a field reference in the site columns and content types but could not find any. The next step was to download the site template cab file from the solution gallery and look for the reference in the site artifacts' schema definition files, which pointed me to the Nintex list definition.

Nintex maintains an internal list to manage the site workflow definitions, and this list had a reference to the column "IsProduction".

According to the Nintex documentation and forums, the "IsProduction" field was introduced in version 3.1.7.0 for subscription-based Nintex and was later removed due to a few critical bugs.

Resolution:

To resolve the issue, the reference to the "IsProduction" column had to be removed from the site template, the package rebuilt, and the solution redeployed to SharePoint.

I have briefly put together the steps to remove the field reference and redeploy the WSP to SharePoint.

Steps

  1. Download the solution package for the site template from the solution gallery in the SharePoint site
  2. Change the extension of the WSP package to .cab. To unzip the cab file we can use a tool or the command prompt. I used the command prompt:

    Expand -R “Filename.cab” “Destination Folder” -F:*

  3. Once the cab has been unzipped, go to the files folder.
  4. Under Files => List folder => NintexWorkflows => Schema.xml

  5. In the schema definition file remove the reference to the IsProduction field and save the file.

  6. The last step is to rebuild the WSP from the command prompt. After the WSP is rebuilt it has to be uploaded to the solution gallery and activated again (a rough sketch of this step follows the list).
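For reference, a rough sketch of that rebuild and redeployment is below. It assumes a makecab directive (.ddf) file listing the extracted files, and placeholder file names and site URLs, so adjust everything to your environment.

# Repack the extracted files into a WSP (SiteTemplate.ddf must list every file with its relative path)
makecab /F SiteTemplate.ddf

# Upload the rebuilt package to the site collection's solution gallery and activate it (SharePoint Management Shell)
Add-SPUserSolution -LiteralPath "C:\Package\SiteTemplate.wsp" -Site "https://sharepoint/sites/newsite"
Install-SPUserSolution -Identity "SiteTemplate.wsp" -Site "https://sharepoint/sites/newsite"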

With the new custom template, I was able to create the site collection without any issues. I hope this will help solve the issue. Happy Coding!!

Web Application ADFS integration error: Invalid Cryptographic Algorithm

Introduction

In this post I will be talking about an invalid cryptographic algorithm exception in a web application. We have a multi-tenant single sign-on ASP.NET application which connects with different identity providers to enable a single sign-on experience.

Background

Single sign-on across multiple applications has been a sought-after feature lately to make the user experience seamless. In this case the web application (service provider) was integrating with ADFS 2.0 hosted on Windows Server 2012 R2 to implement single sign-on for end users on their network.

The application code, written in C#, uses the ComponentSpace helper facade to build the HTTP request using the following service provider configuration parameters:

  1. Service provider name
  2. Assertion service endpoint URL
  3. Service provider sign-on certificate and certificate password

The certificate previously used in the application for the assertion request had expired. When the newly issued certificate was added to the ADFS server (identity provider) and the web application (service provider), the application started throwing the exception "Cryptographic Exception: Invalid Algorithm specified".

On debugging the code I noticed the exception "SAMLSignatureException: Failed to generate signature" being thrown while stepping through the code segment that reads the certificate.


Resolution:

The certificate used by the assertion service is expected to have its Microsoft Cryptographic Service Provider (CSP) attribute set to "Microsoft Enhanced RSA and AES Cryptographic Provider".

In this case the certificate had the CSP set to "Microsoft RSA SChannel Cryptographic Provider".

The difference is in the list of supported algorithms, key operations and key sizes: the Microsoft RSA SChannel Cryptographic Provider does not support SHA-256 signatures.

To check the certificate's CSP we can use the command below; OpenSSL needs to be installed on your system.

Command Prompt

\bin\openssl pkcs12 -in WebAppSelfSignedSSO.pfx

Make sure you point to the correct path for OpenSSL.

After the command is executed, look for the Microsoft CSP Name attribute to confirm whether the CSP supports SHA-256 signatures or not.

In this case we need to change the attribute to “Microsoft Enhanced RSA and AES Cryptographic Provider” to support SHA-256.

Then, to update the CSP attribute so the assertion request can be signed with SHA-256, run the commands below (sketched after the list) to update the CSP.

  1. Convert the .pfx file to .pem from the command prompt

    Once the command executes successfully it will generate a .pem file.

  2. Next, convert the .pem back to .pfx and update the CSP attribute property

  3. We can verify that the CSP property has been changed to "Microsoft Enhanced RSA and AES Cryptographic Provider"
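The corresponding OpenSSL commands might look something like the following; the file names are placeholders based on the example above, so substitute your own.

:: 1. Export the existing .pfx to .pem (prompts for the import password)
openssl pkcs12 -in WebAppSelfSignedSSO.pfx -out WebAppSelfSignedSSO.pem -nodes

:: 2. Re-export the .pem to a new .pfx, setting the CSP attribute to the AES-capable provider
openssl pkcs12 -export -in WebAppSelfSignedSSO.pem -out WebAppSelfSignedSSO_SHA256.pfx -CSP "Microsoft Enhanced RSA and AES Cryptographic Provider"

:: 3. Dump the new .pfx and check the Microsoft CSP Name attribute
openssl pkcs12 -in WebAppSelfSignedSSO_SHA256.pfx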

I hope this will help solve the issue. Happy Coding!!

Handle Throttling in SharePoint Online

Introduction

In this post I will be talking about handling throttling in SharePoint Online custom code. I had recently faced a throttling issue in custom CSOM code running from a console application; it had been working fine with the exponential delay pattern suggested by the community to handle throttling, but lately started facing some issues.


Background

For one of our clients we had a console application to migrate documents from a network file share to SharePoint and update the corresponding managed metadata using the client-side object model (CSOM). The application was designed to read the list of files to be migrated from a CSV, pull each file from the shared file location managed by another external tool, upload the new files to SharePoint and then update the corresponding metadata. All these back-and-forth calls to SharePoint Online breached the threshold limits set on the SharePoint Online tenant.

Microsoft has published a few guidelines to help custom code avoid throttling the tenant.

In this case, to handle throttling in SharePoint Online we added an exponential delay between subsequent requests. There is a GitHub sample which can be used to implement the exponential back-off technique.

The GitHub sample helps with the following (a condensed sketch of the pattern follows the list):

  • Implementing an incremental back-off and retry pattern
  • Handling HTTP status code 429 (Too Many Requests)
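For illustration, here is a condensed sketch of that incremental back-off and retry pattern as a CSOM extension method. It is not the GitHub sample verbatim; the method name, parameters and defaults below are my own.

using System;
using System.Net;
using System.Threading;
using Microsoft.SharePoint.Client;

public static class ClientContextExtensions
{
    // Executes the pending CSOM batch, backing off and retrying when the service throttles the call
    public static void ExecuteQueryWithIncrementalRetry(this ClientContext context, int retryCount = 5, int delayMs = 500)
    {
        int attempts = 0;
        int backoff = delayMs;

        while (attempts < retryCount)
        {
            try
            {
                context.ExecuteQuery();
                return;
            }
            catch (WebException wex)
            {
                var response = wex.Response as HttpWebResponse;

                // 429 (Too Many Requests) and 503 (Server Too Busy) indicate throttling
                if (response != null && ((int)response.StatusCode == 429 || (int)response.StatusCode == 503))
                {
                    Thread.Sleep(backoff);   // wait before retrying
                    attempts++;
                    backoff *= 2;            // double the delay for the next attempt
                }
                else
                {
                    throw;
                }
            }
        }

        throw new Exception($"Maximum retry attempts ({retryCount}) exceeded.");
    }
}

The console application then calls clientContext.ExecuteQueryWithIncrementalRetry() wherever it previously called ExecuteQuery().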

Problem

After including the exponential back-off as suggested in the GitHub sample above, the application was working fine, but following some recent changes in the Microsoft Online tenant the connection to the site lately started getting terminated with the exception "Addition to this website has blocked. Please contact your administrator to resolve this problem".


Resolution

The exception was not very helpful. Once I started looking at the HTTP traffic logs I realized that the CSOM requests sent to SharePoint Online were undecorated.

Undecorated traffic means there is no AppID/AppTitle or User-Agent string on the CSOM or REST calls made to SharePoint Online.

Decorated traffic gets prioritized over traffic which is not properly decorated, so to give priority to the requests it is recommended to decorate traffic with an AppID/AppTitle or User-Agent string in CSOM or REST API calls to SharePoint Online.

The SPO request was updated to include the user agent string as below.
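A minimal sketch of that decoration, following the documented NONISV|Company|AppName/Version convention (the company and application names here are placeholders):

// Decorate every CSOM request issued by this client context with a descriptive user agent
clientContext.ExecutingWebRequest += (sender, e) =>
{
    e.WebRequestExecutor.WebRequest.UserAgent = "NONISV|Contoso|FileShareMigration/1.0";
};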

Another change which helped was to increase the client context request timeout property, setting it to infinite.
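That is a one-line change on the same client context; RequestTimeout is in milliseconds and Timeout.Infinite (-1) disables it altogether:

// Disable the default CSOM request timeout
clientContext.RequestTimeout = System.Threading.Timeout.Infinite;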

Hope this helps with handling throttling in custom code in SharePoint Online application. Happy Coding!!

Automate deployment pipeline tasks using Gulpjs APIs


Introduction

In this post I will be talking about the Gulpjs APIs and how gulp can be useful in automating deployment tasks. In a greenfield project there are a lot of post-development tasks that a developer has to focus on besides development, and with CI/CD in focus now, post-deployment tasks are expected to be automated to make the deployment pipeline more consistent and repeatable. These repetitive and common tasks not only add to the project time and effort for the developer but also take the focus away from the primary task.

Overview

In the JavaScript ecosystem there are many tool libraries which help the developer with various coding tasks. A common one is the linting library, which is helpful during the development phase: a good linting tool will catch unhandled errors early and can also help make sure a project adheres to a coding standard. Another useful one out of the toolkit repository is the task runner.

Task runner tools are used to automate those repetitive and time-consuming tasks.

GulpJs

Gulp is another tool out of the JavaScript toolbox, used as a streaming build system: it is a JavaScript-based task runner that lets you automate common tasks in a very simple manner.

Gulp has become very popular over the years and comes with a huge library of plugins. Gulp automates tasks and helps with handling the build pipeline, watching for changes, minifying and concatenating CSS or JavaScript files, handling vendor prefixes, and preparing JavaScript/HTML/CSS files for production or testing.

Gulp passes files through a stream: it starts with a set of source files, processes those files in the stream and delivers the processed output at the end of the stream. The Gulpjs APIs are used to handle input, alter files and deliver output during the different phases. In this post I will focus on gulp.task, gulp.src, gulp.dest and gulp.watch.


In most project deployment scenarios, constant values or configuration variables have to be managed per environment, files need to be updated, or certain HTML has to be injected at build time. This is where the gulp APIs can be handy to automate those repetitive tasks in a consistent way. Gulp helps improve quality, deliver faster and provides better transparency and control over the process.

Gulp APIs

Gulp.task: gulp.task(name[, deps], fn): As you can see in the task signature, gulp.task takes three parameters. The first parameter is the name of the task (for example "customTask"); the next one, the dependencies array, is optional and only needed if there are dependent tasks to run first; the last parameter is the callback function.

In the code snippet below, after the task definition, gulp.src takes in the path of the files to be parsed. Once the source has the files, they are piped to the next step to be altered; in this example a minify action is performed on the JS files, and in the last step they are piped to the destination path defined in gulp.dest.

More steps can be added to the pipe here to alter the files or perform further actions on them before passing them to gulp.dest.
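A minimal sketch of such a task might look like this (gulp 3.x style; the paths are illustrative and gulp-uglify stands in for the minification step):

// gulpfile.js
const gulp = require('gulp');
const uglify = require('gulp-uglify');

gulp.task('customTask', function () {
  return gulp.src('src/scripts/*.js')    // pick up the source JS files
    .pipe(uglify())                      // minify them
    .pipe(gulp.dest('dist/scripts'));    // write the processed files to the destination folder
});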

Gulp.src: gulp.src(glob[, options]): gulp.src looks for files matching the source pattern that we want to pass into the stream and picks them up.

Gulp.dest: gulp.dest(folder[, options]): this API defines where the files are sent after they have been altered in the gulp pipeline.

Gulp.watch: gulp.watch(glob[, options, callback]): watches for an event or change on the file system matching the glob string or array parameter. If the watcher notices a change in files matching the glob, it executes a task or series of tasks from the task array, in this case task1 and task2. If required, a callback function can be added instead of passing in the tasks array.

Gulp watch is useful to run tests on changed files, look for code changes and perform tasks.

In the general syntax definition below, a gulp task is defined with two parameters: the task name and the callback function. Inside it, gulp.watch takes two parameters: the first is the path of the files to watch, the second is the single task or array of tasks it will execute. We can also add a callback function to the watch task.
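For example, a watch task along those lines might look like this (task names and paths are illustrative):

gulp.task('watchFiles', function () {
  // Run task1 and task2 whenever any JavaScript file under src changes
  gulp.watch('src/**/*.js', ['task1', 'task2']);
});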

Example

I will stick to the earlier example of updating configuration variables during deployment, using an SPFx project to illustrate it. Web parts usually have constant values that are used across different methods, and the values of those constants change with the environment they are deployed to. To make sure they stay consistent with the build, and to make the process more robust, we can define a gulp build task for it.

//Command

gulp setConstants --env=uat

//The gulp.src call in the task below looks for the *.base.json files under the config/env folder
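A rough sketch of such a task is shown below. The helper plugins, folder layout and file names are assumptions for illustration; the real task would transform the matched file into whatever constants file the web part consumes.

// gulpfile.js
const gulp = require('gulp');
const rename = require('gulp-rename');
const yargs = require('yargs');

gulp.task('setConstants', function () {
  const env = yargs.argv.env || 'dev';            // read the --env parameter from the command line
  return gulp.src(`config/env/${env}.base.json`)  // pick up the environment-specific constants file
    .pipe(rename('constants.json'))               // rename (or otherwise transform) it
    .pipe(gulp.dest('src/webparts/common'));      // drop it where the web part expects it
});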


In the code snippet above a custom task named "setConstants" is added to the pipeline and an "env" parameter is passed to the callback. When the task is run it reads the files in the defined source path, i.e. the config/env folder, and pipes them to a function or another task to alter them. Once the files are altered they are sent to the destination using the gulp.dest API.

This helps automate those repetitive tasks and keeps the deployment pipeline consistent.

Hope this helped in understanding how the Gulpjs APIs can be used in a project to automate tasks during deployment. Happy Coding!!


SharePoint Online forecast storage requirements

Background

Office 365 users get quite a bit of storage on SharePoint Online for content, be it files, metadata, etc. But managing that storage and forecasting when additional storage has to be added is still a challenge, with very limited analytics available in SharePoint Online. Since adding more storage costs money, buying it before you actually need it, or a bit too late, is not ideal.

SharePoint Online provides two ways to track storage: one from the admin center, and the other from within the site collection using storage metrics.

The SharePoint Online admin center shows a storage bar with details about the total storage, used storage and storage available. Here customers can manage the total amount of space allocated to each site collection, and see the amount of space utilized and available.

To check storage metrics for a site collection, under Site Settings we can find the Storage Metrics link; storage metrics provide a space utilization breakdown by each subsite, library and list in that site collection.

As shown in the image below, subsites, libraries, lists etc. are listed with their size and % of parent. This can help in finding the items which are consuming the most storage.

There are some third party storage monitoring utilities available to connect to SharePoint online to determine usage patterns and trends.

But if we need to forecast storage usage and find out which site collections are growing the most over a period of time, in a tenant with hundreds of site collections it becomes a challenge to keep track.

Solution

We can use a PowerShell script to get the storage information from the tenant and store it in a SharePoint list on a periodic basis. Once this data is available, Power BI can be used to build a report which shows the growth % for each site collection over a period and gives some insight into the storage.

Using PowerShell we can get heaps of information from the SharePoint tenant with the Get-SPOSite cmdlet, but here we will use it to get the list of site collections and their storage information.

Get-SPOSite -Detailed -Limit All | select *

Once the information is retrieved from SPO, use PowerShell to iterate over the data and save the relevant fields to a SharePoint list, as sketched below.

SharePoint list Columns: Site Name/Title, Site Url, Storage, Report Run Date.
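Here is a rough sketch of that collection script. It assumes the PnP PowerShell module for writing to the list; the list name, internal column names and URLs are placeholders matching the columns above.

# Connect to the tenant admin site (SPO Management Shell) and to the site hosting the tracking list (PnP PowerShell)
Connect-SPOService -Url "https://tenant-admin.sharepoint.com"
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/StorageReports" -Interactive

$runDate = Get-Date
Get-SPOSite -Detailed -Limit All | ForEach-Object {
    Add-PnPListItem -List "Storage Tracking" -Values @{
        "Title"         = $_.Title                  # Site Name/Title
        "SiteUrl"       = $_.Url                    # Site Url
        "Storage"       = $_.StorageUsageCurrent    # storage currently used, in MB
        "ReportRunDate" = $runDate                  # Report Run Date
    }
}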

Data in the SharePoint list can be filtered, grouped and sorted in list views to make it more usable for the business to analyze and predict storage growth and requirements.

To get more detailed analytics on the data, Power BI reports can help put the data together and predict the growth percentage for each site collection. Power BI provides many connectors, one of which is for SharePoint Online to easily connect with lists. To set up the connector in Power BI, check this Kloud Blog.

Below is the Power BI report that was built on the SharePoint list data pulled in using PowerShell. Data in the report is sorted by the sites with the maximum storage increase within the selected interval.

Using Power BI, more smarts such as filters and graphs can be added to make the data more usable.

Report details

  • x-axis: list of site collections
  • y-axis: storage values in MB

As we hover the cursor over the graph it shows the max and min values for the storage over the selected period of time. The graph captures the sites in decreasing order of storage growth.

Tabular view for the report.

The PowerShell script can be scheduled to run periodically and gather the data weekly or fortnightly; from that we can calculate the average weekly or fortnightly storage requirement for the tenant.

Secondly, the captured data is used to forecast the weekly storage requirement. From the SPO tenant we get details about total storage, used storage and storage available.

The average weekly storage requirement can be taken from a list view grouped by week with an aggregate of the storage, as shown in the image below.

Once we have the above information, the average requirement per week works out to 24,349 MB, i.e. about 24 GB (the difference in storage growth between two weeks) in this case. This can be used to predict the weekly storage requirement, which helps forecast when the tenant storage is going to run out and additional storage will be required.

SharePoint Online external user access error “User Not in directory”

Background

Organizations want to share their SharePoint Online site collections and documents and collaborate with external partners, vendors or customers. By default, site collections are shared with internal users only, but this can be extended to authenticated external users, or with limited sharing to anonymous users. External users do not have a license to the Office 365 subscription; they are limited to basic collaboration tasks.

I had recently enabled external access for a site collection on the SPO tenant, restricted to selected domains and authenticated external users. External sharing in SharePoint Online works well in most scenarios, but a few issues pop up while enabling access for external users, and with limited error details it becomes a bit challenging to understand the cause.

Problem

Error: “User Not in directory”

The error message which users get as they try to log in to the external SharePoint site is quite generic: "User not in directory". It is not descriptive and does not point to the cause of the issue.

Solution

To troubleshoot access for the user, clear the browser cache or open an incognito/private session, then try the steps below.

First, check that the account used to accept the email invitation to the site is the same account being used to log in later.

On the Office 365 login screen, if you are prompted with "Which account do you want to use?" when you sign in, it means that two different accounts have been configured with Microsoft using the same email address:

A “Work or school” account, which probably was created by your IT department

A "personal" account, which was probably created later on by the user.

A personal account can be renamed, which means using a different email address to sign in to it. To do that, follow this KB article (https://support.microsoft.com/en-us/help/11545/microsoft-account-rename-your-personal-account).

If the external user accepted the invite using the personal account and later tries to connect by selecting the work account, they will get the error "User not in directory". This is the most common cause of the error. Make sure the user is using the same account to accept the invite and to log on to the site.

Secondly, if the account used to accept the invite and the account used to log in are the same and the error still pops up, then the user account has to be set up again. Before that we need to clean up the existing references to the user profile, remove the user from SharePoint and then send a fresh invite. To remove the user and all references, follow the steps below.

External users are managed on a site-collection-by-site-collection basis: an external user account must be removed from each site collection that the user was granted access to.

Browse to each site collection that the user previously had access to, and then follow the steps below (a consolidated script follows the list):

  • In the site collection, edit the URL in the browser by appending the following string to the site address:
    _layouts/15/people.aspx?MembershipGroupId=0

  • Select the user from the list and click Delete. Once the user is removed, continue with the steps below:

  • Start the SharePoint Online Management Shell.
  • Type the following cmdlet:
    $cred = Get-Credential
    In the Windows PowerShell Credential Required dialog box, type your SharePoint Online admin account and password, and then click OK.
  • Connect to SharePoint Online, and then type the following cmdlet:
    Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred
  • Remove the user from each site collection by using the following cmdlet:
    $ExtUser = Get-SPOExternalUser -filter someone@example.com
  • Type the following cmdlet:
    Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)
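Put together, the PowerShell part of the clean-up looks roughly like this (the tenant URL and email address are placeholders):

$cred = Get-Credential
Connect-SPOService -Url https://tenant-admin.sharepoint.com -Credential $cred

# Look up the external user's entry and remove it from the tenant
$ExtUser = Get-SPOExternalUser -Filter someone@example.com
Remove-SPOExternalUser -UniqueIDs @($ExtUser.UniqueId)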

Then we can add the user back and resend the invite. That should fix the issue.

The last thing to check is that the user has a proper role assigned to their account under the user profile in the Office 365 portal.

  • To check the role assigned to the user, go to the Office 365 admin center.
  • Sign in with a global administrator account.
  • Find the external user under Users > Active users, then check the roles assigned to the external user and change the role to User (no admin access).