Community AppStack (Part 1) – Decentralised Realtime Chat Demo With IPFS PubSub and Cycle.js

Introduction

I’ve been trying to come up with a good general-purpose tech stack for building web apps for small, local community user-bases. There are lots of great “share-economy-style” collaborative use cases at this level of organisation and I want a vehicle to test out ideas, to see what works and what doesn’t in my local area, from a social standpoint.

I obviously want to make what I develop freely available, in the hope that others might contribute ideas or code, and people need to be able to trust that it does what it says on the tin before they'll use it, so everything needs to go up on GitHub.

Constraints

I'm a big believer in the power of constraints to focus the design process, and for this endeavour they were nice and clear with regard to the technology to be used.

  1. Cost – needs to be as close to free to operate as possible
  2. Usability – the barriers to someone using the apps built on the stack need to be as low as possible
  3. Administrative simplicity – spinning up an instance of the stack needs to be as straightforward as possible
  4. Open Source tech is to be favoured over proprietary (where feasible)
  5. Flexibility to extend into native mobile versions in future

Serverless Cloud Provider

Being a huge fan of the serverless paradigm and the opportunities it unlocks (not least from a cost perspective), I knew I would be leaning heavily on the serverless capabilities of one of the major cloud providers to build out my solution. Already being well versed in what Azure has to offer (and admittedly knowing precious little about the alternatives), the choice was easy.

Realtime comms on a shoe-string

Getting an effectively free static website up is laughably easy these days. Many great services like now and glitch exist, in addition to the big-name cloud providers who all have free tiers, but the catch for me was that real-time communication (via websockets or similar) was never covered on those platforms (or at least not at the scale I needed to potentially support). The same goes for the many real-time "pub-sub-as-a-service" third-party offerings – their free tiers all stop at 100 simultaneous connections (which is probably way more than I need, but still a constraint that I could well bump up against at some point). Plus, as per point 3 above, I don't particularly want people to have to sign up to multiple cloud providers to get this all working.

The solution: the distributed magic that is IPFS pubsub.

Interplanetary File System (IPFS)

I won't go into any detail about what IPFS is here – the official website is as good an intro as you could want. Suffice it to say, it is a distributed storage platform for web content, including publish-subscribe realtime messaging capabilities (an experimental feature). You don't have to pay to host an IPFS node, and through some voodoo that I have no idea about, you can even host one in a web browser (a decent modern one). So basically, we get ephemeral (don't expect it to stick around in the IPFS network) realtime comms for free.

N.B. In the next installment in this series of posts, I’ll be persisting the realtime messages to Azure Table Storage… stay tuned…

Reactive Web Framework – Cycle.js

I'm a relative newcomer to React (I love React Native, but I'm less enamoured with the original web version). Having worked with RxJS on a large Angular project recently, though, I was keen to see if I could find a framework that does a better job of managing state (yes, React has Redux and various side-effect plugins, but they all seemed a little tacked-on) and harnesses the power of observables – and I found one in Cycle.js.

Cycle has state management as its raison d'être – or at least, the way it is structured relegates state to a by-product of how data flows through your application, as opposed to treating it as an object you need to explicitly maintain.
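As a toy illustration of that idea (not Cycle's actual API, just the underlying pattern), state can be modelled as a fold over a stream of actions:

```javascript
// Toy illustration of "state as a by-product of dataflow":
// state is never kept in a mutable object we manage by hand;
// it emerges by folding a stream of actions into a stream of states.

// A minimal event stream: subscribers receive each emitted value.
function makeStream() {
  const listeners = [];
  return {
    subscribe: (fn) => listeners.push(fn),
    emit: (value) => listeners.forEach((fn) => fn(value)),
  };
}

// fold: turn a stream of actions into a stream of accumulated states.
function fold(stream, reducer, seed) {
  const out = makeStream();
  let acc = seed;
  stream.subscribe((action) => {
    acc = reducer(acc, action);
    out.emit(acc);
  });
  return out;
}

// Chat-flavoured example: the message list is just the fold of inputs.
const actions = makeStream();
const states = [];
fold(actions, (msgs, msg) => msgs.concat(msg), []).subscribe((s) => states.push(s));

actions.emit('hello');
actions.emit('world');
// states: [['hello'], ['hello', 'world']]
```

No `setState`, no store to mutate – the "current state" is just the latest value the fold has emitted.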

IPFS Driver

Cycle uses an abstraction called a "driver" to handle any external effects (incoming or outgoing) to your application. The primary ones are drivers for interacting with the virtual DOM and making HTTP requests, but there are many others, including myriad community efforts. I couldn't find one for IPFS, so I created one to wrap the ipfs-pubsub-room helper library.

Here we are setting up the incoming (listening for new messages being broadcast) and outgoing (broadcasting messages input by the user) observable streams. There are more API methods on ipfs-pubsub-room (e.g. for sending private messages to individual peers), but for this example we'll stick with the basics.
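The shape of such a driver can be sketched as follows – a hypothetical, simplified version with a stand-in room object in place of ipfs-pubsub-room and a toy stream in place of a real observable library:

```javascript
// Sketch of a Cycle-style driver (a hypothetical simplification, not the
// published driver): the sink carries outgoing messages to broadcast,
// and the returned source exposes incoming messages.

// Stand-in for ipfs-pubsub-room: loops broadcasts back to local handlers.
function makeRoom() {
  const handlers = { message: [] };
  return {
    on: (event, fn) => handlers[event].push(fn),
    broadcast: (msg) => handlers.message.forEach((fn) => fn({ data: msg })),
  };
}

// A minimal stream: subscribers receive each emitted value.
function makeStream() {
  const subs = [];
  return { subscribe: (fn) => subs.push(fn), emit: (v) => subs.forEach((fn) => fn(v)) };
}

function makeIpfsPubsubDriver(room) {
  // sink$: stream of outgoing messages from the app
  return function ipfsDriver(sink$) {
    sink$.subscribe((msg) => room.broadcast(msg));
    // source: lets the app listen for incoming messages
    return {
      messages: (listener) => room.on('message', (m) => listener(m.data)),
    };
  };
}

// Wire it up.
const room = makeRoom();
const outgoing$ = makeStream();
const source = makeIpfsPubsubDriver(room)(outgoing$);

const received = [];
source.messages((msg) => received.push(msg));
outgoing$.emit('hi everyone');
// received now contains 'hi everyone'
```

The real driver does the same dance, just with ipfs-pubsub-room's `room.broadcast`/`room.on('message', …)` on one side and proper observables on the other.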

A quick note on importing modules: I wasn't able to get IPFS-JS to bundle cleanly on my Windows machine, so I'm just loading it from a CDN in a script tag. That works just fine for the purposes of this demo.

Show me the demo already!

Alright, you’ve been patient – here you go: https://ipfs-cycle-chat.azurewebsites.net/

N.B. Use Chrome for best results – I haven't seen this work in IE or Edge.

I won't paste the rest of the code here, but you can check out the full sample at https://github.com/balbany/ipfs-cycle-demo. Note that the main logic of the Cycle.js app is lifted pretty much wholesale from CreaturePhil's cyclejs-chat socket.io sample (cheers Phil!) and the styling is pilfered from here.

Just the beginning…

As alluded to before, this is just the first in a planned series of posts as I build out my little project, which I'm calling Community AppStack for now. Please check back for the next installment (or follow me on Twitter @BruceAlbany), and let me know in the comments if you liked this one!

Quickly deploying all the things from Excel with PowerShell

As an ex-Excel developer, I tend to resolve any perceived inefficiency in dealing with tabular data by automating the snot out of it with Excel and VBA. As a SharePoint developer, there is no shortage of stuff that we need to provision into SharePoint to build business solutions, often repeatedly throughout the development cycle or between environments (Dev, Test, UAT, etc.).

PowerShell was always going to be the automation tool of choice, and while PowerShell scripts are commonly fed with CSV files of flat tabular data, editing those has always been painful if you’ve got more than a handful of rows. Excel has always been a far better tool than a text editor for quickly editing tabular data, thanks to features like autofill, but that’s just the tip of the iceberg…

PowerShell Deployment Agent

So, I've been using Excel and VBA to deploy things for a long time (about 10 years at best guess). I've changed companies a handful of times over that period, and each time I've created a new and better Excel model for generating stuff in whatever platform I was working with. Last year, I started to open-source that model, as it had reached a maturity level where I no longer wanted to start over – it's pretty darned solid. The simplest incarnation was called CSV-Exporter, and it did what it said on the tin. I've since extended it into what I now call my PowerShell Deployment Agent (PSDA), which doesn't just export CSVs from Excel tables, but also launches a PowerShell script of your choosing to streamline the deployment process.

Through careful script design, Excel filtering, and of course some VBA, this allows very fine-grained control over exactly what you want to deploy – to the extent that you can just highlight a bunch of rows in a sheet and hit CTRL-SHIFT-D to deploy them to your target:

QuickDeploy

When / Why would you use it?

Excel has always been a good way to store configuration data, but it really comes into its own as a launching pad for PowerShell when you have rows of data that need to be pumped through the same cmdlet or function.

To prove how easy it is to get started, we'll go with the use case that I'm usually working with: SharePoint.

So let’s open the workbook and pick something simple that we want to deploy. How about folders in a document library?

All we need to do once we’ve input our target site is:

  • Find the PnP PowerShell cmdlet we want (Add-PnPFolder in this case)
  • Click the ‘New Blank Sheet’ button
  • Select the name for our sheet (‘Folders’ in this case – a new .ps1 file with the same name will be created from a template in the ‘Functions’ subfolder under the path to your deployment script)
  • Copy and paste the cmdlet signature to let PSDA know which columns to map to the cmdlet parameters
  • Fill in our data
  • Double click the URL of the target site we want to deploy to
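To make the mapping concrete, the exported 'Folders' CSV might look something like this – the column headers line up with Add-PnPFolder's -Name and -Folder parameters (the values are purely illustrative):

```csv
Name,Folder
Invoices,Shared Documents
2017,Shared Documents/Invoices
```

Each row becomes one call to the cmdlet, with the column values bound to the matching parameters.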

AddFolders

This is obviously the most basic scenario and there’s a good chance that you’ll want to customize both the auto-generated script and the table (with some of the advanced features below).

Speaking of… what other benefits of using Excel over raw CSVs are there? I’m glad you asked.

Formulas

You obviously can't have a calculated value in a CSV file, which means your PowerShell script ends up more complex than it needs to be, performing that calculation on each row at run-time. Excel is clearly the superior tool here – you can see whether your calculation is correct right there in the cell.

PSDA Perk – Formulas are exported as values when you are deploying against a target, but are preserved when you want to export your data for versioning, etc.

Comments

You obviously can’t have comments in a CSV, but these are very useful in Excel, particularly in column headers to advise what data/format to put in that column.

PSDA Perk – You can preserve column header comments when exporting from the workbook. They will be reinstated when re-importing.

Data Validation

Want to guard against data entry errors while keeping your script clean? Use Excel to prevent erroneous entries before deployment with data validation. You can restrict based on a formula, or a reference cell range (there’s a reference sheet in the workbook for that).

PSDA Perk – Data validation rules will be exported and reinstated on re-import.

Conditional Formatting

In addition to data validation, conditional formatting is a powerful way to show that some data is incorrect or missing under certain conditions. Obviously, from a data entry standpoint, we don’t need anything too fancy here – usually setting the font or background of the cell to red when a formula evaluates to false is all we need to prompt the user.

PSDA Perk – Basic conditional formatting (as per the above) can be exported and reinstated on re-import.

A Note on Credentials

If you've gone to the trouble of downloading the workbook and poking it with a stick, you'll have noticed that there is a single column in the launch table on the control sheet for the credential to use for each target environment. Nowhere to put a password. That's because you shouldn't be storing passwords in Excel workbooks. Ever.

Instead, you should be using something like CredentialManager, which leverages the Windows Credential Manager (a safe place to store your admin passwords). This means you just refer to the label of the credential in WCM from the workbook. Nice and clean, and anyone getting hold of your workbook doesn't also get the keys to your environments.

If you are using the outstanding PnP PowerShell cmdlets for deploying to SharePoint Online/On Premises, you get this functionality OOTB (no need to use CredentialManager).

Thanks for checking this out – please try the workbook and let me know what you like or what needs improvement.

Cheers!
Bruce

Angular Bag of Tricks for SharePoint

Introduction

I've been using Angular 1.x to build custom UI components and SPAs for SharePoint for years now. There's a lot of rinse-and-repeat in that work, and over time my "stack" of the open-source custom directives and services that make Angular such a powerful framework has settled down to a few key ones that I wanted to highlight; hence this post.

Some may be wondering why I'm still working almost exclusively with Angular 1.x. Why not Angular 2, or React, or Aurelia? A few reasons:

  • It’s often already in use. Quite often a customer is already leveraging Angular 1.x in a custom masterpage, so it makes sense not to add another framework to the mix.
  • Performance improvements are not a high priority. I work almost entirely with SharePoint Online. The classic ASP.NET pages served up there aren’t exactly blindingly fast to load, so Angular 1 (used carefully) doesn’t slow things down measurably. Will this change when SPFx finally GA’s? Of course! But in the meantime, Angular 1.x is very comfortable, which leads to…
  • Familiarity = Productivity. Ramping up a custom application in SharePoint with Angular is now very quick to do. This is the whole “go with what you know well and can iterate fast on” approach to framework selection. Spend your time building out the logic of your app rather than fighting an unfamiliar framework.
  • The absolute smorgasbord of community-produced libraries that enhance Angular. A lot of the major ones have Angular 2 versions, but there are some notable exceptions (highlighted below).

So here, in order of frequency of use, are the plugins that I go to time and time again. Check them out and star those github repos!

UI-Router 

ui-router

An awesome state-based routing service for Angular (there are Angular 2 and React versions as well) – more widely used than the default Angular 1 router because it has a fantastic API that allows you to resolve asynchronous data calls (via promises) before you transition to the state that needs them. This keeps your controllers/components light and clean. I use this in every custom webpart/SPA I build that has more than one view (which is almost all of them).
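A hypothetical state definition showing the resolve pattern (the state name, url and listService here are made up for illustration):

```javascript
// Sketch of a UI-Router state: the resolve block settles the data
// promise before the controller is instantiated, so the controller
// receives ready-to-use data rather than doing its own loading.
const itemsState = {
  name: 'items',
  url: '/items',
  resolve: {
    items: (listService) => listService.getItems(), // returns a promise
  },
  controller: function (items) {
    this.items = items; // already resolved; no loading logic in here
  },
};

// Simulate what the router does: resolve first, construct afterwards.
const fakeListService = { getItems: () => Promise.resolve(['a', 'b']) };
const resolved = itemsState.resolve.items(fakeListService);

resolved.then((items) => {
  const ctrl = {};
  itemsState.controller.call(ctrl, items);
  // ctrl.items is ['a', 'b'] by the time the view renders
});
```

The payoff is that a view never renders against half-loaded data – if the promise rejects, the transition fails instead.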

If you need modals in your app, you can add the uib-modal extension, which allows UI-Bootstrap modals to be declared as states in your UI-Router state config. Great for deep linking through to modal windows!

Angular Formly

formly

Sick of labouring over large form templates? They are time-consuming to wire up and maintain – that's a lot of markup! Formly allows you to declare your form fields in JavaScript instead. This allows a lot more control, and being able to generate the UI on the fly at run time is a killer feature (one that I haven't done enough with to date!). I hope to have another post on this topic very soon…

Formly makes using custom controls / directives in forms really easy and gives you uniform validation rules over all of them. It’s also wonderfully extensible – there’s so much you can do with it, once you learn the basics. I put off trying it for AGES and now I wouldn’t be without it – if you have any user input in your Angular app, do yourself a favour and use Formly!
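For a flavour of what this looks like (an illustrative sketch – the exact options depend on your template setup), a Formly form is just an array of field objects:

```javascript
// Formly-style field configs: the whole form is plain data,
// so it can be generated or filtered at run time.
const fields = [
  {
    key: 'name',
    type: 'input',
    templateOptions: { label: 'Name', required: true },
  },
  {
    key: 'team',
    type: 'select',
    templateOptions: {
      label: 'Team',
      options: [{ name: 'Red', value: 'red' }, { name: 'Blue', value: 'blue' }],
    },
  },
];

// Because fields are data, run-time logic is trivial – e.g. only show
// the 'team' question once a name has been entered (illustrative rule):
const model = { name: 'Bruce', team: null };
const visibleFields = fields.filter((f) => f.key !== 'team' || model.name);
```

Compare that with hand-maintaining the equivalent markup, validation attributes and bindings in a template.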

AG-Grid

ag-grid

The best JavaScript grid component. Period. Like all the libraries I've mentioned so far, this one is not just for Angular (it supports nearly all of the major frameworks, including Aurelia and Vue). There's a Free version and an Enterprise version with a lot of extra bells and whistles. I haven't had to shell out for Enterprise yet – Free is very fully featured as it is. If you have a table of data in your app, you should give this a try for all but the simplest scenarios.

Angular-Gantt

angular-gantt

Here's the first Angular 1-only library in my toolbox. It makes creating complex Gantt-chart interfaces, if not dead easy, at least feasible! I shudder to think what a nightmare it would be to write this kind of functionality from scratch…

There are loads of features here, just like the other libraries listed.

Angular-Wizard

 

angular-wizard

Another Angular 1-only library (although similar Angular 2 projects exist). A great little wizard directive that lets you define the steps of your wizard declaratively in your template or, when teamed up with Formly, in your controller. The latter allows you to create dynamic wizard steps by filtering the questions in the next step based on the responses to the previous one (once again – something to document in a future post).

A few extra tricks for SharePoint…

A few other more generic practices when slinging Angular in SharePoint:

  • Don't be afraid to use $q to wrap your own promises – yes, it is overused in a lot of example code on Stack Overflow (hint: if you are calling the $http service, you don't need $q; just return the result of the $http call), but it's great if you want or need to use CSOM. Wrap the result of executeQueryAsync in a promise's resolve method and you've got a far cleaner implementation (no callbacks at the call site) that's easily packaged up in a service.
  • Create a reusable service layer – lots of people don’t bother to use Angular services, as most example code just keeps the $http calls in the controller for simplicity. Keep all your REST and CSOM calls to interact with SharePoint in a service module and you’ll get a lot more reuse of your code from application to application. Ideally, use ui-router to resolve the promises from your service before the controller is even initialised (as mentioned above).
  • Use Widget Wrangler for hosting your Angular apps in SharePoint pages – it handles all your script dependencies cleanly and lets you host in a Script Editor webpart (easily deployed with PnP PowerShell).
  • Think about caching your data or real-time sync – the excellent Angular-Cache is great for client-side caching, and if your application's data is frequently updated, you may want to consider a real-time data option such as Firebase or GunJS to enhance the solution and remove the need for page refreshes (another post on this coming soon too).
  • Azure Functions-All-The-Things! No more PowerShell running in a scheduled task on a VM for background processing. There is a better (and even cheaper) way.
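The promise-wrapping pattern from the first bullet can be sketched like this (with a stand-in client context for illustration; in Angular 1 you'd use $q rather than the native Promise shown here):

```javascript
// Wrap CSOM's callback-style executeQueryAsync in a promise, so the
// call site reads linearly instead of as nested callbacks.
function executeQueryP(ctx) {
  return new Promise((resolve, reject) => {
    ctx.executeQueryAsync(
      () => resolve(ctx),             // success callback
      (sender, args) => reject(args)  // failure callback
    );
  });
}

// Stand-in context that "succeeds" immediately – illustration only;
// the real one is an SP.ClientContext.
const fakeCtx = {
  executeQueryAsync(onSuccess, onFailure) { onSuccess(); },
};

const pending = executeQueryP(fakeCtx);
pending.then((ctx) => {
  // no callback pyramid here; ctx is ready to read from
});
```

Tucked inside a service, this lets CSOM calls be chained and resolved by ui-router just like $http ones.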

I hope some people find this useful. Please leave a comment if you’ve got some other Angular-goodness you’d like to share!

Monitor SharePoint Changelog in Azure Function

Azure Functions have officially reached ‘hammer’ status

I’ve been enjoying the ease with which we can now respond to events in SharePoint and perform automation tasks, thanks to the magic of Azure Functions. So many nails, so little time!

The seminal blog post that started so many of us down that road was, of course, John Liu's Build your PnP Site Provisioning with PowerShell in Azure Functions and run it from Flow, and that pattern is fantastic for many event-driven scenarios.

One place where it currently (at the time of writing) falls down is list item delete events: MS Flow can't respond to them, and nor can a SharePoint Designer workflow.

Without wanting to get into Remote Event Receivers (errgh…), the other way to deal with this – provided the delete isn't time-sensitive – is after the fact, via the SharePoint change log. In my use case it wasn't; I just needed to clean up some associated items in other lists.

SharePoint Change Logs

SharePoint has had an API for getting a log of changes – of certain types, against certain objects, in a certain time window – since the dawn of time. The best post showing how to query it from the client side is (in my experience) Paul Schaeflin's Reading the SharePoint change log from CSOM, and it was my primary reference for the PowerShell-based Function below.

In my case, I am only interested in items deleted from a single list, but this could easily be scoped to an entire site and capture more/different event types (see Paul’s post for the specifics).

The biggest challenge in getting this working was persisting the change token to Azure Storage. It wasn't that difficult in and of itself – it's just that the PowerShell bindings for Azure Functions are as yet woefully under-documented (TIP: Get-Content and Set-Content are the key to the PowerShell bindings… easy when you know how). In my case, I have an input and an output binding to a single Blob Storage blob (to persist the change token for reference the next time the Function runs) and another output binding to Queue Storage, to trigger a second function that actually does the cleanup of the other list items linked to the one now sitting in the recycle bin. The whole thing is triggered by an hourly timer. If nothing has been deleted, then no action is taken (other than updating the persisted token blob).
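For reference, the bindings described above might be declared along these lines in the Function's function.json (the binding names, blob path and queue name here are illustrative, not the exact ones from my Function):

```json
{
  "bindings": [
    {
      "type": "timerTrigger",
      "name": "myTimer",
      "direction": "in",
      "schedule": "0 0 * * * *"
    },
    {
      "type": "blob",
      "name": "changeTokenIn",
      "direction": "in",
      "path": "changetokens/list-token.txt",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "blob",
      "name": "changeTokenOut",
      "direction": "out",
      "path": "changetokens/list-token.txt",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "queue",
      "name": "cleanupQueueItem",
      "direction": "out",
      "queueName": "deleted-items",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}
```

In the PowerShell script, Get-Content against the input binding's path reads the previous token, and Set-Content against the output paths writes the new token and the queue message.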

A Note on Scaling

Note that if multiple delete events have occurred since the last check, they are all deposited in one queue message. That won't cause a problem in my use case (there will never be more than a handful of items deleted in one pass of the Function), but it obviously doesn't scale well: too many items handled by the receiving Function would threaten to bump up against the 5-minute execution time limit. I wanted to use the OOTB queue message binding for simplicity, but if you needed to push multiple messages, you could simply use the Azure Storage PowerShell cmdlets instead of an output binding.

Code Now Please

Here’s the Function code (following the PnP PowerShell Azure Functions implementation as per John’s article above and liberally stealing from Paul’s guide above).

Going Further

This is obviously a simple example with a single objective, but you could take this pattern and ramp it up as high as you like. By targeting the web instead of a single list, you could push a lot of automation through this single pipeline, perhaps ramping up the execution recurrence to every 5 mins or less if you needed that level of reduced latency. Although watch out for your Functions consumption if you turn up the executions too high!