Connecting Salesforce and SharePoint Online with MuleSoft – Nothing but NET

Often enterprises choose their integration platform based on the development platform required to build integration solutions. That is, Java shops typically choose Oracle ESB, JBoss, IBM WebSphere or MuleSoft, to name but a few, while Microsoft shops have less choice and typically build custom .NET solutions or use Microsoft BizTalk Server. The development platform should not be the driving factor in choosing an integration platform, and letting it drive the decision may limit your options.

Your integration platform should be focused on interoperability. It should support common messaging standards, transport protocols, integration patterns and adapters that allow you to connect to a wide range of line-of-business systems and SaaS applications. Your integration platform should provide frameworks and pattern-based templates to reduce implementation costs and improve the quality and robustness of your interfaces.

Your integration platform should allow your developers to use their development platform of choice…no…wait…what!?!

In this post I will walk through integrating Salesforce.com and SharePoint Online using the Java-based Mule ESB platform while writing nothing but .NET code.

MuleSoft .NET Connector

The .NET connector allows developers to use .NET code in their flows, enabling calls to existing .NET Framework assemblies, 3rd party .NET assemblies or custom .NET code. Java Native Interface (JNI) is used to communicate between the Java Virtual Machine (JVM), in which your MuleSoft flow is executing, and the Microsoft Common Language Runtime (CLR), where your .NET code is executed.

[Image: JNI bridge between the JVM and the CLR]

To demonstrate how we can leverage the MuleSoft .NET Connector in our flows, I have put together a typical SaaS cloud integration scenario.

Our Scenario

  • Customers (Accounts) are entered into Salesforce.com by the Sales team.
  • The team uses O365 and SharePoint Online to manage customer and partner related documents.
  • When new customers are entered into Salesforce, corresponding document library folders need to be created in SharePoint.
  • Our interface polls Salesforce for changes and needs to create a new document library folder in SharePoint for each new customer according to some business rules.
  • Our developers prefer to use .NET and Visual Studio to implement the business logic required to determine the target document library based on Account type (Customer or Partner).

Our MuleSoft flow looks something like this:

[Image: Mule flow]

  • Poll Salesforce for changes based on a watermark timestamp
  • For each change detected:
    • Log and update the last time we sync’d
    • Call our .NET business rules to determine target document library
    • Call our .NET helper class to create the folder in SharePoint

Salesforce Connector

Using the MuleSoft Salesforce Cloud Connector, we configure the connection to our Salesforce environment and query for changes to the Account entity. Using DataSense, we configure which data items to pull back into our flow to form the payload message we wish to process.
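
Under the covers this is a watermark-driven poll; the generated SOQL will be along the lines of SELECT Id, Name, Type, LastModifiedDate FROM Account WHERE LastModifiedDate > [last sync time], though the exact query comes from the connector configuration shown below, so treat this as illustrative only.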

[Image: Salesforce connector configuration]

Business Rules

Our business rules are implemented in a bog-standard .NET class library that checks the account type and assigns either “Customers” or “Partners” as the target document library. We then enrich the message payload with this value and return it to our flow.

public object GetDocumentLibrary(SF_Account account)
{
    var docLib = "Unknown";     // default

    // Check for customer accounts
    if (account.Type.Contains("Customer"))
        docLib = "Customers";

    // Check for partner accounts
    if (account.Type.Contains("Partner"))
        docLib = "Partners";

    return new 
        { 
            Name = account.Name, 
            Id = account.Id, 
            LastModifiedDate = account.LastModifiedDate, 
            Type = account.Type,
            DocLib = docLib 
        };
}

Note: JSON is used to pass non-primitive types between our flow and our .NET class.

So our message payload looks like this:

{
	"LastModifiedDate":"2014-11-02T11:11:07.000Z",
	"Type":"TechnologyPartner",
	"Id":"00190000016hEhxAAE",
	"type":"Account",
	"Name":"Kloud"
}

and is de-serialised by the .NET connector into our .NET SF_Account object, which looks like this:

public class SF_Account
{
    public DateTime LastModifiedDate;
    public string Type;
    public string Id;
    public string type;
    public string Name;
    public string DocLib;
}
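
Note that the class simply mirrors the payload: it carries both the Type picklist value and the lower-case type attribute from the JSON, plus the DocLib property that our business rules will populate.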

Calling our .NET business rules assembly is a simple matter of configuration.

[Image: .NET connector configuration for the business rules assembly]

SharePoint Online Helper

Now that we have enriched our message payload with the target document library

{
	"LastModifiedDate":"2014-11-02T11:11:07.000Z",
	"Type":"TechnologyPartner",
	"Id":"00190000016hEhxAAE",
	"type":"Account",
	"Name":"Kloud",
	"DocLib":"Partners"
}

we can pass this to our .NET SharePoint client library, which connects to SharePoint using our O365 credentials and creates the folder in the target document library:

public object CreateDocLibFolder(SF_Account account)
{
    using (var context = new Microsoft.SharePoint.Client.ClientContext(url))
    {
        try
        {
            // Provide client credentials
            System.Security.SecureString securePassword = new System.Security.SecureString();
            foreach (char c in password.ToCharArray()) securePassword.AppendChar(c);
            context.Credentials = new Microsoft.SharePoint.Client.SharePointOnlineCredentials(username, securePassword);

            // Get library
            var web = context.Web;
            var list = web.Lists.GetByTitle(account.DocLib);
            var folder = list.RootFolder;
            context.Load(folder);
            context.ExecuteQuery();

            // Create folder
            folder = folder.Folders.Add(account.Name);
            context.ExecuteQuery();
        }
        catch (Exception ex)
        {
            System.Diagnostics.Debug.WriteLine(ex.ToString());
        }
    }

    // Return payload to the flow
    return new
        {
            Name = account.Name,
            Id = account.Id,
            LastModifiedDate = account.LastModifiedDate,
            Type = account.Type,
            DocLib = account.DocLib,
            Site = string.Format("{0}/{1}", url, account.DocLib)
        };
}
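
A couple of notes on this helper: the Microsoft.SharePoint.Client types come from the SharePoint client-side object model (CSOM) assemblies, which need to be referenced by the class library and deployed alongside it, and the url, username and password values are assumed to be fields initialised elsewhere in the class (they are not shown in the snippet).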

Calling our helper is the same as for our business rules:

[Image: .NET connector configuration for the SharePoint helper]

Configuration

Configuring MuleSoft to know where to load our .NET assemblies from is best done using global configuration references.

[Image: Global element configuration]

We have three options to reference our assembly:

  1. Local file path – suitable for development scenarios.
  2. Global Assembly Cache (GAC) – suitable for shared or .NET framework assemblies that are known to exist on the deployment server.
  3. Packaged – suitable for custom and 3rd party .NET assemblies that get packaged with your MuleSoft project and deployed together.

[Image: Assembly reference options]

Deployment

With our flow completed, coding in nothing but .NET, we are good to test and deploy our package to our Mule ESB integration server. At the time of writing, CloudHub does not support the .NET Connector, but this should be available in the not too distant future. To test my flow I simply spin up an instance on the development server and watch the magic happen.

We enter an Account in Salesforce with Type of “Customer – Direct”…

[Image: New Account entered in Salesforce]

and we see a new folder in our “Customers” document library for that Account name in a matter of seconds.

[Image: New folder created in the Customers document library]

SLAM DUNK!… Nothing but NET.

Conclusion

Integration is all about interoperability, and not just at runtime; it should be a core capability of our integration framework. In this post we saw how we can increase our integration capability without sacrificing our development platform of choice by using MuleSoft and the MuleSoft .NET Connector.

SharePoint Online Storage Improvements in Office 365

The following new features are now available in SharePoint Online as part of Office 365:

• A 1 TB site collection limit for OneDrive for Business and team sites
• Infinite tenant storage scale

1 TB site collection limit increase for OneDrive for Business and team sites

A 1 TB limit lets you create large team document centers that teams can use without introducing multiple websites and unneeded complexity. All SharePoint Online site collections can now have up to 1 TB of storage allocated. You simply assign the desired amount to the selected site collection in the SharePoint Online admin console.
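
If you prefer scripting to the admin console, the same allocation can be made from the SharePoint Online Management Shell. A minimal sketch, assuming hypothetical tenant and site collection URLs (quotas are specified in megabytes, so 1 TB is 1048576 MB):

# Connect to the SharePoint Online admin center (hypothetical tenant URL)
Connect-SPOService -Url https://contoso-admin.sharepoint.com

# Allocate 1 TB (1048576 MB) to the selected site collection (hypothetical URL)
Set-SPOSite -Identity https://contoso.sharepoint.com/sites/teamdocs -StorageQuota 1048576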

OneDrive for Business provides a default 25 GB per-user storage allocation. You can now allocate more storage, up to 1 TB, to users in the following increments: 50 GB, 100 GB, 250 GB, 500 GB, and 1024 GB.

Infinite tenant storage scale

The storage behind Office 365 and SharePoint Online has been architected to scale without practical limit, covering both current storage needs and future growth.

Here is the default storage allocation for an Office 365 tenant:

• 25 GB of OneDrive for Business storage per user
• 50 GB of email storage per user
• 5 GB for each site mailbox you create
• 10 GB of pooled team site storage, plus 500 MB per user
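
As a worked example, a 100-user tenant would receive 10 GB + (100 × 500 MB) = 60 GB of pooled team site storage, in addition to each user's individual OneDrive for Business and mailbox allocations.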

If you need more storage than the default allocation, you can now purchase unlimited additional storage right from the SharePoint Online admin center at a cost per gigabyte (GB) per month.

If you are looking to move a portion of your storage from an on-prem environment to Office 365, please contact Kloud Solutions using the URL below:

http://www.kloud.com.au/contact-us/

An Overview of Server Name Indication (SNI) and Creating an IIS SNI Web SSL Binding Using PowerShell in Windows Server 2012

One of the frustrating limitations in supporting secure websites has been the inability to share IP addresses among SSL websites. In the past, there were a few ways to work around this limitation. First, you could use multiple IP addresses, binding an SSL certificate to each combination of an IP address and the standard SSL port. This has been the best method to date, but it is administratively heavy and not necessarily a good use of valuable IP addresses. Another approach was to use additional non-standard ports for SSL. While this saved IP addresses, you would potentially run up against strict firewall or proxy limitations, making this method undesirable. Finally, in the IIS 7 and 7.5 world you could use host headers to share a certificate among websites, but you were limited to a single certificate that every web site would have to share.

The reason behind these limitations rests in the handshake that takes place between the browser and the web server. When an SSL client request is initiated, the HTTP header data is not available to the web server. Only after a successful handshake are the headers encrypted and sent to the web server, which is too late to allow routing to the desired web site.

Solving this limitation required an extension to the Transport Layer Security (TLS) protocol that includes the hostname a client is connecting to when a handshake is initiated with a web server. The name of the extension is Server Name Indication (SNI). Of course, extending the definition of a protocol is never as easy as updating an RFC: both client and server compatibility are required to make use of the extension. On the client side, roughly 95% of browsers support SNI. Specifically, those are:

  • Internet Explorer 7 or later
  • Mozilla Firefox 2.0 or later
  • Opera 8 or later
  • Google Chrome 6 or later
  • Safari 3 or later

In the Microsoft world, support for the SNI extension to TLS was introduced with Windows Server 2012 and IIS 8.  Through the Internet Information Services (IIS) Manager and a web site's bindings UI, SNI can be specified for an HTTPS site along with a host header:

[Image: IIS site binding dialog with SNI enabled]

There are many resources on the Internet that deal with setting up and configuring a site using SSL bindings as well as utilizing SNI from within the IIS Manager.  Where I’d like to focus the second part of this blog is on creating SNI web bindings using PowerShell.  Since a key driver for implementing SNI is the scalability it provides, that scalability might be for naught if not coupled with the ability to deploy a solution without the use of a GUI.

There are three parts to successfully assigning and associating any SSL binding with a website through PowerShell:

  1. An SSL binding needs to be created for the web site
  2. A certificate needs to exist in the local machine certificate store
  3. An SSL binding relationship needs to be created to associate a certificate with a web site

Creating the Web Site Binding

Creating the web site binding is a straightforward process.  The following PowerShell sequence would be used to create the binding and assign the correct port, host header and specification for use of SNI:

# Import IIS Management PowerShell Module
Import-Module WebAdministration

$hostHeader = "test.com"

New-WebBinding -Name "Test Website" -Protocol "https" -Port 443 -HostHeader $hostHeader -SslFlags 1

The name specified would be the name of the web site you’d like to add the binding to.  The protocol and port are standard for SSL bindings.  The host header is the URL you’d like the web site to respond to.  Finally, SslFlags with a value of 1 enables SNI for this binding.
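
If you want to confirm the binding was created, the WebAdministration module also provides Get-WebBinding; for example, against the same site:

Get-WebBinding -Name "Test Website"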

Retrieving the Certificate from the Certificate Store

While I won’t cover the process to request a certificate or import the certificate into the local machine store, there are two factors we need to address before using the certificate in the third and final step.

In order to use the certificate in IIS it is critical that the certificate is imported allowing the private key to be exported.  If a certificate is used without an exportable private key, IIS will be unable to bind that certificate.
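
While the import itself is out of scope here, one way to ensure the private key is exportable is to import the PFX from PowerShell with the -Exportable switch. A minimal sketch, assuming a hypothetical certificate file and password:

# Import the certificate into the local machine store with an exportable private key
# (the file path and password below are hypothetical)
$pfxPassword = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
Import-PfxCertificate -FilePath "C:\certs\test.pfx" -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword -Exportable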

Creating the SSL association in the third step requires we have some reference to the certificate we’d like to associate with the web site.  There are two values that can be used: the thumbprint of the certificate or a reference to the certificate object itself.

In order to retrieve the thumbprint of a certificate the following PowerShell command is used:

$thumbprint = (Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.FriendlyName -eq "Test Cert"}).Thumbprint

In the above example the friendly name of the certificate is used as the matching context.  One could also use the subject instead.

In order to get a reference to the certificate itself the following syntax can be used:

$certificate = Get-ChildItem Cert:\LocalMachine\My | Where-Object {$_.FriendlyName -eq "Test Cert"}

After this step you will have either a direct reference to the certificate or the value of the certificate’s thumbprint.

Creating the SSL Association

The final step in the puzzle is tying together the binding and the certificate.  This can be the trickiest part to get right.  PowerShell doesn’t provide a native cmdlet to do this directly. Instead, one needs to use the IIS drive exposed by the WebAdministration module to create an SslBinding object and associate that object with the certificate.

The PowerShell sequence for that task is as follows, if you’re using the certificate object:

New-Item -Path "IIS:\SslBindings\!443!test.com" -Value $certificate -SSLFlags 1

If you’re using the thumbprint your command would be:

New-Item -Path "IIS:\SslBindings\!443!test.com" -Thumbprint $thumbprint -SSLFlags 1

If successful, you should receive confirmation displaying the host name, along with the site the host name is bound to.  To confirm that SNI is in use, run the following command from the command line:
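
The command in the screenshot below is, in all likelihood, the standard netsh query for SSL certificate bindings:

netsh http show sslcert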

[Image: SSL certificate bindings output]

In the above, notice the SSL binding is using the hostname:port syntax, which confirms SNI is in use.

Following the above steps will allow you to take advantage of the new Server Name Indication (SNI) implementation in Windows Server 2012 and IIS 8.


Adding Additional Nodes to a Forefront Identity Manager 2010 R2 Service Pack 1 SharePoint 2013 Farm

Microsoft recently released Service Pack 1 for Forefront Identity Manager 2010 R2. With the release of Service Pack 1 came some really good support for the latest elements that form the foundation of the FIM Portal, namely Windows Server 2012 and SharePoint Foundation 2013.

While basing the FIM 2010 R2 SP1 Portal on SharePoint Foundation 2013 doesn’t offer any feature advantages over SharePoint 2010, it does provide compatibility with Windows Server 2012, which SharePoint 2010 won’t offer until the release of its Service Pack 2.

Many small-scale deployments of FIM are based on single-instance deployments of WSS or SharePoint Foundation. Where redundancy is required, SharePoint Farms are still not often deployed; instead, local WSS or SharePoint Foundation installations are deployed and then load balanced. While these deployment scenarios make the setup of the FIM Portal less technically challenging, as a centralised SharePoint Farm does not need to be considered, they sacrifice some of the advantages of deploying a Farm: namely, that only a single database is required and, most importantly, that scaling out the FIM Portal becomes as easy as adding a new SharePoint Foundation server.

Installing FIM 2010 R2 SP1 on SharePoint Foundation 2013 is well documented on TechNet, as is installing and configuring a SharePoint Farm. As long as you’re installing FIM 2010 R2 SP1 to an existing SharePoint Foundation Farm you’ll experience a fairly smooth ride. Where you might experience issues is in adding additional nodes to a SharePoint 2013 Farm, as the process requires a few additional steps that, unless completed, will render your new nodes nonfunctional.

This article will run through the process of installing the new node and executing the additional steps to get FIM 2010 R2 SP1 working.

Installing the New Node

There are two primary mechanisms for installing the additional SharePoint 2013 node. One can either install via the SharePoint Products Configuration wizard or through a PowerShell interface.

Installing through the UI is as easy as entering the database server, the configuration database name and the passphrase used when creating the farm:

[Image: SharePoint Products Configuration Wizard]

Once started, the configuration wizard will add the new server to the farm and deploy any solution and/or feature packs to the new node. Alternatively, one can use PowerShell to complete the same task as below:

# Secure the farm passphrase that was used when the farm was created
$passphrase = (ConvertTo-SecureString "Farm Passphrase" -AsPlainText -Force)

# Join this server to the existing farm's configuration database
# ($databaseServer and $configDatabase identify the existing farm)
Connect-SPConfigurationDatabase -DatabaseServer $databaseServer -DatabaseName $configDatabase -Passphrase $passphrase

# Start the SharePoint timer service
Start-Service SPTimerV4

# Install help collections, secure resources, and deploy services and features to the new node
Install-SPHelpCollection -All
Initialize-SPResourceSecurity
Install-SPService
Install-SPFeature -AllExistingFeatures

Getting the FIM Portal Working

By now you should have a SharePoint Farm and an additional farm node configured and working. To test the scenario you can browse to your root site from either server and should receive a functioning site. If, however, you were to try to browse to the FIM Portal on the new node at ‘/IdentityManagement’, you would be greeted with the following:

[Image: FIM Portal error page on the new node]

The error above is due to an incomplete deployment of the FIM 2010 R2 SP1 solution pack and features to the new node. When adding a new server to a SharePoint Farm, existing features are deployed to the new server to match the other servers in the farm. Likewise, the FIM Portal is ultimately a combination of a solution pack and features that get deployed to the SharePoint Farm servers.

One of these features, ‘MSILM2Configuration’, is configured to deploy a resource file to nodes of the farm. When installed on an existing farm, the feature deploys correctly. However, when scaling out the farm and adding servers after the FIM Portal has already been installed, this feature will not deploy the resource file or update the configuration of the web.config for the web application.

[Image: Feature deployment error]

To complete the deployment of the FIM Portal solution pack and features, the following PowerShell command needs to be run:

Enable-SPFeature MSILM2Configuration -Url "http://[FQDN of FIM Portal]/IdentityManagement" -Force

The above command will force the missing components of the feature to deploy, resulting in a complete configuration of the FIM Portal on the new SharePoint 2013 Farm server.

I hope this post was helpful and saves others a bit of time configuring additional SharePoint Farm members.