Report of All Taxonomy Fields containing a term in SharePoint Tenancy

Recently we had a request to find the fields/columns in all lists across the tenancy that use a specific Taxonomy term, because we needed to report on field usage across all site collections. However, getting a report of all Taxonomy fields in a SharePoint tenancy that are linked to a specific Term Set can be quite daunting, because there is no direct SharePoint query to fetch the associations.

The technical challenge is that PnP PowerShell returns Taxonomy fields as the generic SP.Field type, not as SP.TaxonomyField, so Taxonomy field metadata values such as the Group ID and Termset ID are absent. To work around this limitation, we parsed the field's SchemaXml property to find the required values.

Note: Querying the Term store for a specific termset using Get-PnPTerm can add a lot of latency. To cut that additional time, we can instead export the entire term store up front and use that export as the master data for matching. Below is the command to export all Taxonomy values as a CSV file.

Export-PnPTaxonomy -Path "[path]\taxonomyreport.csv" -Delimiter "," -IncludeID

Steps:

The steps to retrieve and check for taxonomy fields can be found below.

1. Get all lists in a web site
2. Get the Taxonomy fields for a List
3. Read the field's SchemaXml and search for the TermSetId and AnchorId (thanks to @Colin Philips (http://itgroove.net/mmman/) for finding the correct XPath for parsing the XML)
4. Match the Group ID, Termset ID and Anchor ID from the schema against the Taxonomy report above to find the Term ID the column is linked to
5. If there is a match, save the values into the report.

The below code uses PnP PowerShell. For a quick set up of PnP PowerShell, please refer to this blog.
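That code was embedded in the original post; a minimal sketch of steps 1 to 3, using PnP PowerShell cmdlets (connection URL and output path are placeholders, and the CSV-matching step is left to you), might look like the following.

# Sketch: enumerate taxonomy fields in each list and pull TermSetId/AnchorId
# from the field's SchemaXml. Extend with the CSV matching from the report above.
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/yoursite" -Credentials (Get-Credential)

$results = @()
foreach ($list in (Get-PnPList)) {
    # Taxonomy fields come back as generic SP.Field, so filter on TypeAsString
    $taxFields = Get-PnPField -List $list | Where-Object { $_.TypeAsString -like "TaxonomyField*" }
    foreach ($field in $taxFields) {
        $schema = [xml]$field.SchemaXml
        # TermSetId and AnchorId live under Field/Customization/ArrayOfProperty
        $properties = $schema.Field.Customization.ArrayOfProperty.Property
        $results += [PSCustomObject]@{
            List      = $list.Title
            Field     = $field.Title
            TermSetId = ($properties | Where-Object { $_.Name -eq "TermSetId" }).Value.InnerText
            AnchorId  = ($properties | Where-Object { $_.Name -eq "AnchorId" }).Value.InnerText
        }
    }
}
$results | Export-Csv "[path]\taxonomyfieldsreport.csv" -NoTypeInformation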

Conclusion:

With the above approach we can retrieve the taxonomy fields for all site collections. Be aware that the process can take half a day or more to run, depending on the number of site collections and taxonomy fields in your tenancy, so be sure to give it enough time before cancelling it prematurely.

Happy Coding!! 😊

Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller

Introduction

This is the third and final post on my recent experiments integrating small micro controllers (ESP8266) running Mongoose OS with Azure IoT Services.

In the first post in this series I detailed creating the Azure IoT Hub and registering a NodeMCU (ESP8266 based) micro controller with it. The post detailing that can be found here. Automating the creation of Azure IoT Hubs and the registration of IoT Devices with PowerShell and VS Code

In the second post I detailed communicating with the micro controller (IoT device) using MQTT and PowerShell. That post can be found here. Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell

Now that we have end to end functionality it’s time to do something with it.

I have two teenagers who’ve been trained well to use headphones. Whilst this is great for not having to hear the popular teen bands of today, and the numerous Facetime, Skype, Snapchat and similar communications, it does come with the downside of them not hearing us when we require their attention and they are at the other end of the house. I figured that to avoid the need to shout to get attention, a simple visual notification could achieve the desired result. Different colours for different requests? Sure, why not. This is that project, and the end device looks like this.

Overview

Quite simply the solution goes like this;

  • With the Microsoft Flow App on our phones we can select the Flow that will send a notification

2018-03-25 18.56.38 500px.png

  • Choose the Notification intent which will drive the color displayed on the Teenager Notifier.

2018-03-25 18.56.54 500px

  • The IoT Device will then display the color in a revolving pattern as shown below.

The Architecture

The end to end architecture of the solution looks like this.

IoT Cloud to Device - NeoPixel - 640px

Using the Microsoft Flow App on a mobile device gives a nice, simple interface that can be used to trigger the notification. Microsoft Flow sends the desired message, and the details of the device to send it to, to an Azure Function. The Function puts a message into the MQTT queue associated with the Mongoose OS driven Azure IoT Device (an ESP8266 based NodeMCU micro controller) connected to an Azure IoT Hub. The IoT Device takes the message and displays the visual notification in the color associated with the notification type chosen in Microsoft Flow at the beginning of the process.

The benefits of this architecture are;

  • the majority of the orchestration happens in Azure, yet thanks to Azure IoT and MQTT no inbound connection is required where the IoT device resides. There is no port forwarding and there are no inbound rules to configure on your home router. The micro controller is registered with our Azure IoT Hub and makes an outbound connection to subscribe to its MQTT topic. As soon as there is a message for the device, it triggers its logic and does what we’ve configured
  • You can initiate a notification from anywhere in the world (most simply using the Flow mobile app as shown above)
  • using Mongoose OS allows the device to be managed remotely via the Mongoose OS Dashboard. This means that if I want to add an additional notification (color), I can update Flow with a new option to select and update the configuration on the Notifier device to display the new color when it receives such a command.

Solution Prerequisites

This post builds on the previous two. As such the prerequisites are;

  • you have an Azure account and have set up an IoT Hub, and registered an IoT Device with it
  • an IoT device (micro controller) that can run Mongoose OS. I’m using a NodeMCU ESP8266 that I purchased from Amazon here.
  • an RGB LED Light Ring (generic Neopixel), which I purchased from Amazon here.
  • a 3D printer, if you want to print an enclosure for the IoT device

With those sorted we can;

  • Install and configure my Mongoose OS Application. It includes all the necessary libraries and sample config to integrate with a Neopixel, Azure IoT, Mongoose Dashboard etc.
  • Create the Azure PowerShell Function App that will publish the MQTT message the IoT Device will consume
  • Create the Microsoft Flow that will kick off the notifications and give us a nice interface to send what we want
  • Build an enclosure for our IoT device

How to build this project

The order I’ve detailed the elements of the architecture here is how I’d recommend approaching this project. I’d also recommend working through the previous two blog posts linked at the beginning of this one as that will get you up to speed with Mongoose OS, Azure IoT Hub, Azure IoT Devices, MQTT etc.

Installing the AzureIoT-Neopixel-js Application

I’ve made the installation of my solution easy by creating a Mongoose OS Application. It includes all the libraries required and sample code for the functionality I detail in this post.

Clone it from Github here and put it into your .mos directory, which should be in the root of your Windows profile directory (e.g. C:\Users\Darren\.mos\apps-1.26). From the MOS Configuration page select Projects, select AzureIoT-Neopixel-JS, then select the Rebuild App spanner icon from the toolbar. When it completes, select the Flash icon from the toolbar. When your micro controller restarts, select Device Setup from the top menu bar and configure it for your WiFi network. Finally, configure your device for Azure MQTT as per the details in my first post in this series (which will also require you to create an Azure IoT Hub if you don’t already have one and register your micro controller with it as an Azure IoT Device). You can then test sending a message to the device using PowerShell or Device Explorer, as shown in post two in this series.

I have the Neopixel connected to D1 (GPIO 5) on the NodeMCU. If you use a different micro controller and a different GPIO then update the init.js configuration accordingly.

Creating the Azure Function App

Now that you have the micro controller configured and working with Azure IoT, let’s abstract the sending of the MQTT messages into an Azure Function. We can’t send MQTT messages from Microsoft Flow, so I’ve created an Azure Function that uses the AzureIoT PowerShell module to do that.

Note: You can send HTTP messages to an Azure IoT device but … 

Under current HTTPS guidelines, each device should poll for messages every 25 minutes or more. MQTT and AMQP support server push when receiving cloud-to-device messages.

… that doesn’t suit my requirements.

I’m using the Managed Service Identity functionality to access the Azure Key Vault where the credentials for the identity that can interact with my Azure IoT Hub are stored. To enable and use that (which I highly recommend), follow the instructions in my blog post here to configure MSI on an Azure Function App. If you don’t already have an Azure Key Vault, then follow my blog post here to quickly set one up using PowerShell.

Azure PowerShell Function App

The Function App is an HTTP Trigger based one using PowerShell. In order to interact with the Azure IoT Hub and integrate with the IoT Device via Azure, I’m using the same modules as in the previous posts, so they need to be located within the Function App.

Specifically they are;

  • AzureIoT v1.0.0.5
  • AzureRM v5.5.0
  • AzureRM.IotHub v3.1.0
  • AzureRM.profile v4.2.0

I’ve put them in a bin directory (which I created) under my Function App. Even though AzureRM.EventHub is shown below, it isn’t required for this project. I uploaded the modules from my development laptop (C:\Program Files\WindowsPowerShell\Modules) using WinSCP, after configuring Deployment Credentials under Platform Features for my Azure Function App. Note the path relative to mine, as you will need to update the Function App script to reflect it so the modules can be loaded.

Azure Function PS Modules.PNG

The configuration in WinSCP to upload to the Function App for me is

WinSCP Configuration

Edit the AzureRM.IotHub.psm1 file

When imported, AzureRM.IotHub.psm1 will find an older version of the AzureRM.Profile PowerShell module already loaded within Azure Functions. As we’ve uploaded the version we need, we have to comment out the following lines in AzureRM.IotHub.psm1 so that it doesn’t do the version check. The lines to remark out (put a # in front of the lines indicated below) are near the start of the module. The AzureRM.IotHub.psm1 file can be edited via WinSCP & Notepad.

#$module = Get-Module AzureRM.Profile
#if ($module -ne $null -and $module.Version.ToString().CompareTo("4.2.0") -lt 0)
#{
# Write-Error "This module requires AzureRM.Profile version 4.2.0. An earlier version of AzureRM.Profile is imported in the current PowerShell session. Please open a new session before importing this module. This error could indicate that multiple incompatible versions of the Azure PowerShell cmdlets are installed on your system. Please see https://aka.ms/azps-version-error for troubleshooting information." -ErrorAction Stop
#}
#elseif ($module -eq $null)
#{
# Import-Module AzureRM.Profile -MinimumVersion 4.2.0 -Scope Global
#}

HTTP Trigger Azure PowerShell Function App

Here is my Function App script. You’ll need to update it for the location of your PowerShell modules (I created a bin directory under my Function App, D:\home\site\wwwroot\myFunctionApp\bin), your Key Vault details and the user account you will be using. The user account will need permissions to your Key Vault, to retrieve the password (credential) for the account you will run the process as, and to your Azure IoT Hub.
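The script itself was embedded in the original post. As a rough sketch only: an HTTP Trigger PowerShell (v1) Function of this shape would read the posted JSON, fetch the credential from Key Vault via MSI, and queue the cloud-to-device message. All names and paths are placeholders, and the AzureIoT service-side cmdlet names are assumptions to verify against the module.

# Sketch of the HTTP Trigger PowerShell Function (v1 model: $req/$res file paths)
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# Load the modules uploaded to the Function App's bin directory
Import-Module "D:\home\site\wwwroot\myFunctionApp\bin\AzureRM.Profile\4.2.0\AzureRM.Profile.psd1"
Import-Module "D:\home\site\wwwroot\myFunctionApp\bin\AzureRM.IotHub\3.1.0\AzureRM.IotHub.psd1"
Import-Module "D:\home\site\wwwroot\myFunctionApp\bin\AzureIoT\1.0.0.5\AzureIoT.psd1"

# Get a Key Vault token from the Managed Service Identity endpoint
$tokenUri = $env:MSI_ENDPOINT + "?resource=https://vault.azure.net&api-version=2017-09-01"
$token = (Invoke-RestMethod -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }).access_token

# Retrieve the service account's password from Key Vault
$secretUri = "https://myKeyVault.vault.azure.net/secrets/myServiceAccount?api-version=2016-10-01"
$password = (Invoke-RestMethod -Uri $secretUri -Headers @{ Authorization = "Bearer $token" }).value
$cred = New-Object System.Management.Automation.PSCredential ("svc-iot@yourtenant.com", (ConvertTo-SecureString $password -AsPlainText -Force))

# Sign in and build the IoT Hub service connection string
Login-AzureRmAccount -Credential $cred | Out-Null
$iotKey = Get-AzureRmIotHubKey -ResourceGroupName $requestBody.ResourceGroupName -Name $requestBody.IoTHub -KeyName $requestBody.IoTKeyName
$connString = "HostName=$($requestBody.IoTHub).azure-devices.net;SharedAccessKeyName=$($iotKey.KeyName);SharedAccessKey=$($iotKey.PrimaryKey)"

# Queue the cloud-to-device message (AzureIoT module cmdlet names assumed)
$serviceClient = Get-IoTServiceClient -iotConnString $connString
Send-IoTServiceMessage -serviceClient $serviceClient -deviceId $requestBody.IoTDevice -messageString $requestBody.Message

Out-File -Encoding Ascii -FilePath $res -InputObject "Message sent to $($requestBody.IoTDevice)"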

You can test the Function App from within the Azure Portal where you created the Function App as shown below. Update for the names of the IoT Hub, IoT Device and the Resource Group in your associated environment.

Testing Function App.PNG

Microsoft Flow Configuration

The Flow is very simple. A manual button and a resulting HTTP Post.

Microsoft Flow Config 1

For the message I have configured a list. This is where you can choose the color of the notification.

Manual Trigger.PNG

The Action is an HTTP POST to the Azure Function URL. The body has the configuration for the IoTHub, IoTDevice, Resource Group Name, IoTKeyName and the Message selected from the manual button above. You will have the details for those settings from your initial testing via the Function App (or PowerShell).
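As a guide, the body is a simple JSON object; the key names below are assumptions that must match whatever your Function App script reads from the request, and you can fire the same call from PowerShell to test.

# Hypothetical test of the Flow's HTTP POST from PowerShell - update all values
$body = @{
    IoTHub            = "myIoTHub"
    IoTDevice         = "esp8266_xxxxxx"
    ResourceGroupName = "myResourceGroup"
    IoTKeyName        = "iothubowner"
    Message           = "Green"   # the notification intent/color
} | ConvertTo-Json

Invoke-RestMethod -Method Post -ContentType "application/json" -Body $body `
    -Uri "https://myFunctionApp.azurewebsites.net/api/myFunction?code=<your function key>"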

The Azure Function URL you get from the top of the Azure Portal screen where you configure your Function App. Look for “Get Function URL”.

HTTP Post

Testing

Now that you have all the elements configured, install the Microsoft Flow App on your mobile if you don’t already have it (Apple iOS App Store and Android Google Play). Log in with the account you created the Flow as, select the Flow, select the message, and done. Depending on your internet connectivity you should see the notification displayed on the Notifier device in < 10 seconds.

Case 3D Printer Files

Lastly, we need to make it look all pretty and make the notification really pop. I’ve created a housing for the neopixel that sits on top of a little case for the NodeMCU.

As you can see from the final unit, I’ve printed the neopixel holder in a white PLA that allows the RGB LED light to be diffused nicely and display prominently even in brightly lit conditions.

Neopixel Enclosure

I’ve printed the base that holds the micro controller in a different color. The top fits snugly through the hole in the micro controller case. The wires from the neopixel to connect it to the micro controller slide through the shaft of the top housing. It also has a backplate that attaches to the back of the enclosure that I secure with a little hot glue.

Here is a link to the Neopixel (WS2812) 16 RGB LED light holder I created on Thingiverse.

NodeMCU Enclosure.PNG

Depending on your micro controller you will also need an appropriately sized case for that. I’ve designed the neopixel light holder top assembly to sit on top of my micro controller case. Also available on Thingiverse here.

Summary

Using a combination of Azure IoT, Azure PaaS Services, Mongoose OS and a cheap micro controller with an RGB LED light ring, we have a very versatile Internet of Things device. The application here is a simple visual notifier. A change of output device, or adding an input device, could change the application entirely, whilst still re-using all the elements of the solution that glue it together (micro controller, Mongoose OS, Azure IoT, Azure PaaS). Did you build one? Did you use this as inspiration to build something else? Let me know.

Deploy VM via ARM template: Purchase eligibility failed

I recently tried to deploy a VM using an ARM template executed via PowerShell, and I encountered the purchase eligibility failed error seen below.

PurchaseEligibilityFailedError
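For context, the deployment was being run with the standard ARM deployment cmdlets, along these lines (resource names and paths are placeholders):

# Deploying the ARM template via PowerShell (placeholder names and paths)
Login-AzureRmAccount
New-AzureRmResourceGroupDeployment -Name "vmDeployment" `
    -ResourceGroupName "myResourceGroup" `
    -TemplateFile "C:\Templates\azuredeploy.json" `
    -TemplateParameterFile "C:\Templates\azuredeploy.parameters.json"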

As I have encountered this before I ensured I accepted marketplace terms for the VM image in question using the PowerShell commands:

Get-AzureRmMarketplaceTerms -Publisher PublisherName -Product ProductName -Name Name | Set-AzureRmMarketplaceTerms -Accept

I then reattempted to deploy my VM using my ARM template and still got the same error. I even waited 24 hours and tried again with no luck.

I then discovered that from the Azure Portal you can create a new resource using the “Template deployment” option and deploy your ARM template that way.

TemplateDeploymentOption

After I uploaded and executed my ARM template using this method it deployed my VM successfully with no purchase eligibility errors.

Any subsequent VM deployments via PowerShell using the exact same ARM template now worked as expected with no purchase eligibility errors.

Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell

Introduction and Recap

In my last post here on IoT I detailed getting started with Azure IoT Hubs, registering an IoT device and sending telemetry from the IoT Device to the Azure IoT Hub, all using PowerShell.

If you haven’t read that post or worked through those steps, stop here, work through it and then come back. This post details configuring Mongoose OS to receive MQTT messages from Azure IoT, which is the last mile in making the IoT Device flexible for integration with anything you can think of.

Prerequisites

The only change to my setup from the previous post is I installed the Mongoose Demo App onto my ESP8266 device. Specifically the demo-js App detailed in the application list here. Install is quick and simple on Windows using the MOS Tool. Details are here. I also enabled the Mongoose Dashboard on my Mongoose IoT Device so that I don’t have to have the IoT Device connected to my laptop when configuring and experimenting with it. Essentially check the checkbox for Dashboard when configuring the IoT Device when connected locally via a USB cable.

The rest of the configuration is using the defaults in Azure IoT with respect to MQTT.

MongooseOS MQTT Subscribe Configuration – Init.js

On your IoT Device in the MongooseOS init.js we need to configure the ability to subscribe to a MQTT topic. In the first post we were publishing to send telemetry. Now we want to receive messages from Azure IoT.

Include the following lines in your init.js configuration file and restart your IoT Device. Subscribing to the devices/{device id}/messages/devicebound/# MQTT topic path allows the IoT device to receive messages from the Azure IoT Hub.

// Receive MQTT Messages from Azure
MQTT.sub('devices/' + Cfg.get('device.id') + '/messages/devicebound/#', function(conn, topic, msg) {
  print('Topic:', topic, 'message:', msg);
}, null);

In order to test the configuration of the IoT Device I initially use the Device Explorer. It is available from GitHub here. The screenshot below shows me successfully sending a message to my IoT Device.

DeviceExplorer to IoT Device.PNG

From the Mongoose OS Dashboard we can inspect the Console Log and see the telemetry we are sending to the IoT Hub, but also the message we have received. Success.

Mongoose Device Log.PNG

Sending MQTT Messages from Azure IoT to MongooseOS using PowerShell

Now that we’ve verified we have everything set up correctly, let’s get to the end goal of sending messages to the IoT Device using PowerShell. Here is a little script that uses the AzureIoT module (which we used previously to assist with configuration automation) to send a message from cloud to device.

Update it for your Resource Group, IoTHub, DeviceID and IoTKeyName. Also the message if you feel the need (line 40).
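The script was embedded in the original post; a minimal sketch of it is below, using AzureRM to build the service connection string and the AzureIoT module's service-side cmdlets (treat those cmdlet names as assumptions) to queue the message.

# Sketch: send a cloud-to-device message - update these four values
$resourceGroup = "myResourceGroup"
$iotHubName    = "myIoTHub"
$deviceId      = "esp8266_xxxxxx"
$iotKeyName    = "iothubowner"

# Build the IoT Hub service connection string from the hub's shared access key
$iotKey = Get-AzureRmIotHubKey -ResourceGroupName $resourceGroup -Name $iotHubName -KeyName $iotKeyName
$iotConnString = "HostName=$iotHubName.azure-devices.net;SharedAccessKeyName=$($iotKey.KeyName);SharedAccessKey=$($iotKey.PrimaryKey)"

# Connect a service client and send the message to the device-bound topic
$serviceClient = Get-IoTServiceClient -iotConnString $iotConnString
Send-IoTServiceMessage -serviceClient $serviceClient -deviceId $deviceId -messageString "Hello from the Cloud via PowerShell"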

And the ‘Hello from the Cloud via PowerShell’ MQTT message is received.

Cloud to Device ConsoleLog.PNG

Summary

Through these two blog posts I’ve detailed the creation of an Azure IoT Hub, registration of an IoT Device, sending telemetry from MongooseOS on the IoT Device to Azure IoT, and now sending messages to the IoT Device from Azure, all via PowerShell. Now that we have bi-directional end to end connectivity, what can we do with it? Stay tuned for future posts.

 

Validating a Yubico YubiKey’s One Time Password (OTP) using Single Factor Authentication and PowerShell

Multi-factor Authentication comes in many different formats. Physical tokens have historically been very common and, moving forward with FIDO v2 standards, will likely continue to be so for many security scenarios where soft tokens (think Authenticator Apps on mobile devices) aren’t possible.

Yubico YubiKeys are physical tokens that have a number of properties that make them desirable. They don’t use a battery (so aren’t limited by battery life), they come in many differing formats (NFC, USB-3, USB-C), they can hold multiple sets of credentials, and they support open standards for multi-factor authentication. You can check out Yubico’s range of tokens here.

YubiKeys ship pre-configured to be validated against YubiCloud. Before we configure them for a user, I wanted a quick way to check that a YubiKey is valid. You can do this using Yubico’s demo webpage here, but for other reasons I needed to write my own. There weren’t any PowerShell examples anywhere, so now that I’ve worked it out, I’m posting it here.

Prerequisites

You will need a YubiKey. You will also need to register and obtain a Yubico API key (using your YubiKey) from here.

Validation Script

Update the following script, changing line 2 to the ClientID that you received after registering for the Yubico API above.
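The script was embedded in the original post; a minimal sketch against YubiCloud's verify API (ClientID on line 2 as mentioned; the nonce generation shown is just one way to do it) looks like this:

# Sketch: validate a YubiKey OTP against YubiCloud
$clientId = "12345"   # line 2: your Yubico API ClientID

# Touching the YubiKey types an OTP into the prompt
$otp = Read-Host "Insert your YubiKey and touch it to generate an OTP"

# A random 16-40 character alphanumeric nonce, unique per request
$nonce = -join ((48..57) + (97..122) | Get-Random -Count 32 | ForEach-Object { [char]$_ })

# Call the YubiCloud verification service
$response = Invoke-RestMethod -Uri "https://api.yubico.com/wsapi/2.0/verify?id=$clientId&otp=$otp&nonce=$nonce"

# The response is plain-text key=value pairs; status=OK means the OTP is valid,
# status=REPLAYED_REQUEST means that OTP has already been used
$status = (($response -split "`n") | Where-Object { $_ -match "^status=" }) -replace "status=", ""
Write-Host "YubiKey validation status: $($status.Trim())"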

Running the script validates that the key is valid.

YubiKey Validation.PNG

Re-running the submission of the same key (i.e. I didn’t generate a new OTP) gets the expected response that the request is replayed.

YubiKey Validation Failed.PNG

Summary

Using PowerShell we can negate the need to leverage any Yubico client libraries and validate a YubiKey against YubiCloud.

 

Commanding your Philips Hue lights with PowerShell

A couple of years ago I bought a number of Philips Hue bulbs and put them in the living areas of my house. Typically we control them via the Hue App on our phones, or via the Google Assistant. This all works very well, but of course I’m a techie and have a bunch of other Internet of Things devices and it would be great to integrate the Hue lights with those.

This post is the first step in doing that integration: setting up access to the Philips Hue Bridge and manipulating the lights. For ease of initial experimentation I’m using PowerShell to perform the orchestration. The API calls can easily be transposed to any other language as they are simple web requests.

Prerequisites

First you will need to have your Philips Hue lights setup with your Philips Hue Bridge. Test the lights are all working via the Philips Hue mobile app.

Locate the IP address of your Philips Hue Bridge. I found mine easily via my Unifi console and you should be able to get it via your home router.

Getting Started

Navigate to your Philips Hue Bridge using a browser and its IP address. You will see a splash screen with a list of the open source modules that it utilises. Now append /debug/clip.html to the IP address. For me that is;

http://192.168.1.124/debug/clip.html

Create an Account

The REST API takes a JSON payload. We can quickly create that in the API Debugger. See my example body below and change the URL to /api. Whilst pressing the button on the top of your Philips Hue Bridge, select the POST button. This will create an account that you can then use to orchestrate your Hue lights.

{"devicetype":"AzureFunction#iphone Darren"}

Create Philips Hue User.PNG

Via the API we’ve just created an account. Copy the username from the response. We’ll need this for the API calls.

Test Connection

Change the URL in the debugger as shown below and clear the Message Body. Select GET and you should get back the light(s) connected to your Philips Hue Bridge.

http://<bridge IP address>/api/<username>/lights

Lights.PNG

Controlling a Light

I have many lights. In our kitchen we have three pendant lights in a row that are all Philips Hue lights. I’m going to start by testing with one of them. Choosing one from the list in the response above, Light 5 should be the middle light. The URL format is:

http://<bridge IP address>/api/<username>/lights/<light number>/state

In the body put {"on": true} and select PUT to turn the light on; {"on": false} turns it off. My light turned on. Updating the message body to false and pressing PUT turned it off.

Turn Light On.PNG

Using PowerShell to Manage a Philips Hue Light

Now that we have an account and know the light we want to manage, let’s manipulate the Hue light using PowerShell.

Update the following test script for the IP address of your Philips Hue Bridge, the light number you wish to control and the username you got when you performed the enablement steps earlier. The script will then get the current status of the light and reverse it (turn OFF if it was ON and ON if it was OFF).
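The test script was embedded in the original post; a minimal sketch of the idea (bridge IP, username and light number are placeholders) is:

# Sketch: flip the state of a single Philips Hue light
$bridgeIP = "192.168.1.124"
$username = "your-long-api-username-from-the-account-step"
$light    = "5"
$baseUri  = "http://$bridgeIP/api/$username/lights/$light"

# Get the light's current on/off state
$status = Invoke-RestMethod -Uri $baseUri -Method Get
$currentState = $status.state.on

# Reverse it: ON if it was OFF, OFF if it was ON
$body = @{ on = (-not $currentState) } | ConvertTo-Json
Invoke-RestMethod -Uri "$baseUri/state" -Method Put -Body $body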

Flipping the state of a light

If you have configured everything correctly, your light will change and you will get a success reply along with the state it transitioned to.

Reverse Light State.PNG

Controlling Multiple Lights

Now let’s do that for multiple lights. In my kitchen we have 3 drop lights over the counter bench. Let’s control all three of those. I created a collection of the light numbers, then iterate through each one and flip its state. NOTE: you can also control multiple lights through the Groups method; I won’t be covering that here though.

I had one set in an inverse state to the other two before I started, to show each is individually updated.

Reverse Multiple Lights State.PNG

Controlling Multiple Lights and Changing Colors

Now let’s change the color. Turn the lights on if they aren’t already, and make the color PINK.

As you can see, iterating through the lights the script turns them on and makes them Pink.

Turn Lights On and make them PINK.PNG

Finally, effects for multiple lights

Now let’s turn on all the lights and set them to use the color loop effect (transition through the color spectrum) for 15 seconds, then make them all pink.
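As a sketch of that script (light numbers are placeholders; pink is roughly hue 56100 at full saturation on the bridge's 0-65535 hue scale, values you may want to tune):

# Sketch: color loop across multiple lights for 15 seconds, then set them pink
$bridgeIP = "192.168.1.124"
$username = "your-long-api-username-from-the-account-step"
$lights   = @("5", "6", "7")   # the three kitchen pendants

foreach ($light in $lights) {
    # Turn the light on and start the color loop effect
    $body = @{ on = $true; effect = "colorloop" } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://$bridgeIP/api/$username/lights/$light/state" -Method Put -Body $body
}

Start-Sleep -Seconds 15

foreach ($light in $lights) {
    # Stop the effect and set pink (hue ~56100, saturation 254)
    $body = @{ effect = "none"; hue = 56100; sat = 254 } | ConvertTo-Json
    Invoke-RestMethod -Uri "http://$bridgeIP/api/$username/lights/$light/state" -Method Put -Body $body
}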

The lights transition through the color spectrum for 15 seconds then the effect is turned off and the color is set to pink.

Turn Lights On Color Effect and make them PINK.PNG

Summary

We created an account on the Philips Hue Bridge, were able to enumerate the lights that we have and then orchestrate them via PowerShell.

Here is a short video showing three lights being turned on, changing color, iterating through the color spectrum and then being set to pink.

Now to integrate them with other IoT Devices.

Intro to Site Scripts and Site Designs with a Simple SharePoint Modern Site provisioning

Microsoft announced Site Scripts and Site Designs in late 2017; they became available for Targeted Release in January 2018 and were recently released for general use. They are a quick way to allow users to create custom modern sites without using any scripting hacks. In this blog we will go through the steps of using Site Scripts and Site Designs for a simple SharePoint Modern Site creation.

Before we get into the detailed steps, let’s get a brief overview of Site Designs and Site Scripts.

Site Designs: Site Designs are like a template. They can be used each time a new site is created to apply a consistent set of actions. You create Site Designs and register them in SharePoint against one of the modern site templates: the Team site or the Communication site. A Site Design can run multiple scripts. The script IDs are passed in an array, and they run in the order listed.

Site Scripts: A Site Script is a custom JSON based script that runs when a Site Design is selected. The Site Scripts detail the provisioning items, such as creating new lists or applying a theme. When the actions in the scripts are completed, SharePoint displays detailed results of those actions in a progress pane. They can even call a Microsoft Flow trigger as part of site creation.

Software Prerequisites:

  1. PowerShell 3.0 or above
  2. SharePoint Online Management Shell
  3. Notepad or any notes editor for JSON creation – I prefer Notepad++
  4. Windows System to run PowerShell
  5. And a must – a SharePoint tenant 🙂

Provisioning Process Overview:

The provisioning process is divided into 5 steps:

1. Create a Site Script using a JSON template to define the actions to be run after a Modern Site is created.
2. Add the script to your tenant
3. Create a Site Design and add the Site Script to the design. Associate the Site Design with the Modern Site templates – the Team Site template is 64 and the Communication Site template is 68
4. Create a Modern Site from the SharePoint landing page
5. Wait for the Site Script to apply the changes and refresh your site

Quick and easy, right!? Now let’s get to the “how to”.

Steps

1. JSON Template: We will need to create a JSON template containing the declarations of the site resources that will be provisioned after the site is created. For more details, here is a link to the Microsoft docs. The brief schema is below.

{
  "$schema": "schema.json",
  "actions": [
    ... [one or more verb actions] ...
  ],
  "bindata": { },
  "version": 1
}

For our blog here, we will use a schema that creates a custom list with a few columns, as shown below.
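The original JSON was embedded in the post; a comparable example, modelled on the documented createSPList verb (list and field names are illustrative), is:

{
  "$schema": "schema.json",
  "actions": [
    {
      "verb": "createSPList",
      "listName": "Customer Tracking",
      "templateType": 100,
      "subactions": [
        { "verb": "setDescription", "description": "List of customers" },
        { "verb": "addSPField", "fieldType": "Text", "displayName": "Customer Name", "isRequired": false, "addToDefaultView": true },
        { "verb": "addSPField", "fieldType": "Number", "displayName": "Requisition Total", "isRequired": true, "addToDefaultView": true }
      ]
    }
  ],
  "bindata": { },
  "version": 1
}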

2. Site Script: Add the above Site Script to your tenant using PowerShell (connect first with Connect-SPOService). The below code will return the Site Script GUID; copy that GUID, as it will be used later.

Get-Content '[ JSON Script location ]' -Raw | Add-SPOSiteScript -Title "[ Title of the script ]"

3. Site Design: After the Site Script is added, create a Site Design from the Site Script, to be added to the dropdown menu options for creating the site.

Add-SPOSiteDesign -Title "[ Site design title ]" -WebTemplate "64" -SiteScripts "[ script GUID from above step ]" -Description "[ Description ]"

4. Create a Modern Site: After the Site Design is registered, you will see the design when creating a site, as shown below

ModernTeamSiteWIthcustom

5. Click on the Manual Refresh button, as per the screenshot below, after the site upgrade process is complete.

SiteScriptFinish

When ready, the final Team site will look like the screenshot below after provisioning is complete.

CustomTeamSiteWithScriptResult

In this blog, we learned what Site Scripts and Site Designs are and how to use them to provision modern team sites.

Migrate SharePoint contents using ShareGate

Background

The first step in any migration project is to do an inventory and see the size of the data you are looking to migrate. For simplicity, this post assumes you have already done this activity and have already planned new destination sites for your existing content. This means you have an Excel sheet somewhere identifying where your content is going to reside in the migrated site, and every existing site (in scope) has at least a new home pointer if it qualifies for migration.

The project I was working on had 6 levels of sites and subsites within a site collection. For simplicity we consider just one site collection in scope for this post, and after discussing it with the business we had finalised that it would initially have a maximum of 3 levels of sites and subsites.

In this blog post, we will be migrating SharePoint 2010 and 2013 contents from on-premises to SharePoint Online (SPO) in our customer’s Office 365 tenant.

Creating Structure

After doing the magic of mapping the existing site collection, sites and subsites, we came up with an Excel sheet containing the mapping for our desired structure, as shown below:

Level1.csv

Level2.csv

Level3.csv

The above files will be used as the reference master sheet to create the site structure before pulling the trigger on migrating content. We will use the PowerShell script below to create the structure for our desired state within the new site collection.

SiteCreation.ps1

$url = "https://your-client-tenant-name.sharepoint.com/"

function CreateLevelOne(){
    Connect-PnPOnline -Url $url -Credentials 'O365-CLIENT-CREDENTIALS'

    filter Get-LevelOne {
        New-PnPWeb -Title $_.SiteName -Url $_.URL1 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level1.csv | Get-LevelOne
}

function CreateLevelTwo(){

    filter Get-LevelTwo {
        $connectUrl = $url + $_.Parent

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL2 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level2.csv | Get-LevelTwo
}

function CreateLevelThree(){

    filter Get-LevelThree {
        $connectUrl = $url + $_.GrandParent + '/' + $_.Parent

        Connect-PnPOnline -Url $connectUrl -Credentials 'O365-CLIENT-CREDENTIALS'
        New-PnPWeb -Title $_.SiteName -Url $_.URL3 -Template BLANKINTERNETCONTAINER#0 -InheritNavigation -Description "Site Structure created as part of Migration"
    }

    Import-Csv C:\_temp\Level3.csv | Get-LevelThree
}

CreateLevelOne

CreateLevelTwo

CreateLevelThree

Migrating Contents

Once you have successfully created your site structure, now is the time to start migrating contents to the newly created structure as per the mapping you identified earlier. The CSV file format looks like below:

MG-Batch.csv

The final step is to execute the PowerShell script and migrate content using ShareGate commands from your source site to your destination site (as defined in your mapping file above).

Migration-ShareGate.ps1

# folder where files will be produced
$folderPath = "C:\_Kloud\SGReports\"
$folderPathBatches = "C:\_Kloud\MigrationBatches\"

filter Migrate-Content {
    # URLs for source and destination
    $urlDes = $_.DesintationURL
    $urlSrc = $_.SourceURL 

    # file where a migration log entry will be appended on each run of this script
    $itemErrorFolderPath = $folderPath + 'SG-Migration-Log-Webs.csv'

    # migration settings used by ShareGate commands
    $copysettings = New-CopySettings -OnContentItemExists IncrementalUpdate -VersionOrModerationComment "updated while migration to Office 365" 

    # connect to the destination site with stored credentials

    $pwdDest = ConvertTo-SecureString "your-user-password" -AsPlainText -Force
    $siteDest = Connect-Site -Url $urlDes -Username your-user-name -Password $pwdDest

    $listsToCopy = @() 

    $siteSrc = Connect-Site -Url $urlSrc

    $listsInSrc = Get-List -Site $siteSrc 

    foreach ($list in $listsInSrc)
    {
        if ($list.Title -ne "Content and Structure Reports" -and
            $list.Title -ne "Form Templates" -and
            $list.Title -ne "Master Page Gallery" -and
            $list.Title -ne "Web Part Gallery" -and
            $list.Title -ne "Pages" -and
            $list.Title -ne "Style Library" -and
            $list.Title -ne "Workflow Tasks"){

            $listsToCopy += $list
        }
    }

    # building a log entry with details for migration run

    $date = Get-Date -format d
    $rowLog = '"' + $siteSrc.Address + '","' + $siteSrc.Title + '","' + $listsInSrc.Count + '","' +  $siteDest.Address + '","' +  $siteDest.Title + '","' + $listsToCopy.Count + '","' + $date + '"'
    $rowLog | Out-File $itemErrorFolderPath -Append -Encoding ascii

    Write-Host Copying $listsToCopy.Count out of $listsInSrc.Count lists and libraries to '('$siteDest.Address')'
    #Write-Host $rowLog

    $itemLogFolderPath = $folderPath + $siteSrc.Title + '.csv'
    #Write-Host $itemLogFolderPath

    $result = Copy-List -List $listsToCopy -DestinationSite $siteDest -CopySettings $copysettings -InsaneMode -NoCustomPermissions -NoWorkflows
    Export-Report $result -Path $itemLogFolderPath
}

function Start-Migration($batchFileName)
{
    $filePath = $folderPathBatches + $batchFileName

    Import-Csv $filePath | Migrate-Content
}

Start-Migration -batchFileName "MG-Batch.csv"

Conclusion

In this blog post, we used ShareGate’s silent mode, with which we can run multiple migration jobs in parallel from different workstations (depending on your ShareGate licensing).

For a complete list of ShareGate PowerShell commands, you can refer to the list at this URL.

I hope you have found this post useful for creating a site structure and migrating contents (lists/libraries) to the content’s new home inside SharePoint Online.

Automating the submission of WordPress Blog Posts to your Microsoft MVP Community Activities Profile using PowerShell

Introduction

In November last year (2017) I was honored to be awarded Microsoft MVP status for Enterprise Mobility – Identity and Access. MVP status is awarded based on community activities, and even once you’ve attained MVP status you need to keep your community activity contributions updated on your profile.

Up until recently this was done by accessing the portal and updating your profile. However, mid last year an MVP PowerShell module (big thanks to Francois-Xavier Cat and Emin Atac) was released that allows for some automation.

In this post I’ll detail using the MVP PowerShell module to retrieve your latest WordPress blog post and submit it to your MVP profile as an MVP Community Contribution.

Prerequisites

In order for this to work you will need;

  • to be a Microsoft MVP
  • Register at the MS MVP API Developer Portal using your MVP registered email/profile address
    • subscribe to the MVP Production Application
    • copy your API key (you’ll need this in the script below)

The Script

Update the script below for;

  • MVP API Key
    • Update line 5 with your API key as detailed above
  • WordPress Blog URL (mine is blog.darrenjrobinson.com)
  • Your Award Category
    • Update line 17 with your category, e.g. $contributionTechnology = "Identity and Access"
    • Type New-MVPContribution -ContributionTechnology and you’ll get a list of the MVP Award Categories

Award Categories.PNG


Here is the simple script. Run it after publishing a new entry that you also want added to your Community Contributions profile. No error handling, nothing complex, but you’re an MVP, so you can plagiarise away to make it suit your needs.
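That script was embedded in the original post; a minimal sketch of the approach, assuming the MVP module's cmdlets and the WordPress public REST API (parameter names and contribution values are assumptions to check against the module's documentation), might look like this:

# Sketch: submit the latest WordPress post as an MVP Community Contribution
Import-Module MVP

# Line 5: your MVP API subscription key from the MVP API Developer Portal
Set-MVPConfiguration -SubscriptionKey "your-api-key"

# Get the latest post from the blog via the WordPress public REST API
$blog   = "blog.darrenjrobinson.com"
$latest = Invoke-RestMethod -Uri "https://public-api.wordpress.com/wp/v2/sites/$blog/posts?per_page=1"

# Line 17: your MVP Award Category
$contributionTechnology = "Identity and Access"

# Create the contribution from the post's title, date and link
New-MVPContribution -StartDate $latest.date `
    -Title $latest.title.rendered `
    -Description "Blog post" `
    -ReferenceUrl $latest.link `
    -AnnualQuantity 1 -AnnualReach 0 `
    -ContributionType "Blog Site Posts" `
    -ContributionTechnology $contributionTechnology `
    -Visibility "Everyone"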

Summary

Hopefully that makes it simple to keep your MVP profile up to date. If you’re using a different blogging platform, I’m sure the basic process will work with a few tweaks to the query that retrieves the content. Enjoy.

Automating the creation of Azure IoT Hubs and the registration of IoT Devices with PowerShell and VS Code

The creation of an Azure IoT Hub is quick and simple, either through the Azure Portal or using PowerShell. But what can get more time-consuming is the registration of IoT Devices with the IoT Hub and generation of SAS Tokens for them for authentication.

In my experiments with micro-controllers and their integration with Azure IoT Services I often find I keep having to manually do tasks that should have just been automated. So I did. In this post I’ll cover using PowerShell to;

  • create an Azure IoT Hub
  • register an Azure IoT Device
  • generate a SAS Token for the IoT Device to use for authentication to an Azure IoT Hub from a Mongoose OS enabled ESP8266 micro controller

IoT Integration

Prerequisites

In order to fully test this, ideally you will have a micro-controller. I’m using an ESP8266 based micro-controller like this one. If you want to test this out without physical hardware, you could generate your own DeviceID (any text string) and use the AzureIoT Library detailed further on to send MQTT messages.

You will also require an Azure subscription. I detail using a Free Tier Azure IoT Hub, which is limited to 8,000 messages per day. And instead of PowerShell/PowerShell ISE, get/use Visual Studio Code.

Finally you will need the AzureRM and AzureIoT PowerShell modules. With WMF 5.x you can get them from the PowerShell Gallery with;

Install-Module AzureRM
Install-Module AzureIoT

Create an Azure IoT Hub

The script below will create a Free Tier Azure IoT Hub. Change the location (line 15) to the Azure Region you will use (the commands on the lines above it list the available regions), the Resource Group name that will be created to hold it (line 18) and the name of the IoT Hub (line 23), and let it rip.
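The full script was embedded in the original post (so the line numbers above refer to it); a minimal sketch of its core is:

# Sketch: create a Free Tier (F1) Azure IoT Hub - names and region are placeholders
Login-AzureRmAccount

# List the Azure Regions that support IoT Hub
((Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Devices).ResourceTypes |
    Where-Object { $_.ResourceTypeName -eq "IotHubs" }).Locations

$location      = "Australia East"
$resourceGroup = "IoT-ResourceGroup"
$iotHubName    = "myIoTHub"

New-AzureRmResourceGroup -Name $resourceGroup -Location $location
New-AzureRmIotHub -ResourceGroupName $resourceGroup -Name $iotHubName -SkuName F1 -Units 1 -Location $location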

From your micro-controller we will need the DeviceID. I’m using the ID generated by the device which I obtained from the Device Configuration => Expert View of my Mongoose OS enabled ESP8266.

Device Config.PNG

Register the IoT Device with our Azure IoT Hub

Using the AzureIoT PowerShell module we can automate the creation/registration of the IoT Device. Update the script below for the name of your IoT Hub and the Resource Group that contains it, which you created earlier (lines 7 and 11). Update line 21 with the DeviceID of your new IoT Device. I’m using the AzureIoT module to do this. With WMF 5.x you can install it quickly from the gallery with Install-Module AzureIoT.
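As a sketch of that script (names are placeholders; New-IoTDevice per the AzureIoT module, treat the cmdlet name as an assumption; $IoTConnectionString is re-used by the later scripts):

# Sketch: register an IoT Device with the IoT Hub using the AzureIoT module
$resourceGroup = "IoT-ResourceGroup"   # line 7 in the original script
$iotHubName    = "myIoTHub"            # line 11
$deviceID      = "esp8266_xxxxxx"      # line 21: the DeviceID from your micro controller

# Build the IoT Hub owner connection string
$iotKey = Get-AzureRmIotHubKey -ResourceGroupName $resourceGroup -Name $iotHubName -KeyName "iothubowner"
$IoTConnectionString = "HostName=$iotHubName.azure-devices.net;SharedAccessKeyName=$($iotKey.KeyName);SharedAccessKey=$($iotKey.PrimaryKey)"

# Register the device
New-IoTDevice -iotConnString $IoTConnectionString -deviceId $deviceID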

Looking at our IoTHub in the Azure Portal we can see the newly registered IoT Device.

DeviceCreated.png

Generate an IoT Device SAS Token

The final step is to create a SAS Token for our IoT Device to use to connect to the Azure IoT Hub. Historically you would use the IoT Device Explorer to do that. Alternatively you can also use the code samples to implement SAS Device Token generation via an Azure Function App; examples exist for JavaScript and C#. However, as of mid-January 2018 you can do it directly from VS Code or the Azure Cloud Shell using the Azure CLI and the IoT Extension. I’m using this method here as it is the quickest and simplest way of generating the Device SAS Token.

The command to generate a token that would work for all Devices on an IoT Hub is

az iot hub generate-sas-token --hub-name <your IoT Hub name>

Here I show executing it via the Azure Cloud Shell after installing the IOT Extensions as detailed here. To open the Bash Cloud Shell select the >_ icon next to the notification bell in the right top menu list.

Generate IOT Device SAS Token.PNG

As we have done everything else via PowerShell and VS Code, we can also do it easily from VS Code. Install the Azure CLI Tools (v0.4.0 or later) in VS Code as detailed here. Then from within VS Code press Control + Shift + P to open the Command Palette and enter Azure: Sign In. Sign in to Azure. Then press Control + Shift + P again and enter Azure: Open Bash in Cloud Shell to open a Bash Azure CLI Shell. You can check to see if you have the Azure CLI IoT Extension (if you’ve previously used the Azure CLI for IoT operations) by typing;

az extension show --name azure-cli-iot-ext

and install it if you don’t with;

az extension add --name azure-cli-iot-ext

Then run the same command from VS Code to generate the SAS Token

az iot hub generate-sas-token --hub-name <your IoT Hub name>

VSCode Generate SAS Token.PNG

NOTE: That token can then be used for any device registered with that IoT Hub. Best practice is to have a token per device. To do that, type

az iot hub generate-sas-token --hub-name <your IoT Hub name> --device-id <your device ID>

Generate SAS Token VS Code Per Device.PNG

By default you will get a token valid for 1 hour. Use the --duration switch to specify the duration of the token you require for your environment.

We can now take the SAS Token and put it into our MQTT Config on our Mongoose OS IoT Device. Update the Device Configuration using Expert View and Save.

Mongoose SAS Config.PNG

We can then test our IoT Device sending updates to our Azure IoT Hub. Update Init.js using the telemetry sample code from Mongoose.

load('api_config.js');
load('api_mqtt.js');
load('api_sys.js');
load('api_timer.js');

let topic = 'devices/' + Cfg.get('device.id') + '/messages/events/';

Timer.set(1000, true /* repeat */, function() {
  let msg = JSON.stringify({ ram: Sys.free_ram() });
  let ok = MQTT.pub(topic, msg, 1);
  print(ok, topic, '->', msg);
}, null);

We can then see the telemetry being sent to our Azure IoT Hub using MQTT. In the device logs, after the datestamp and before device/, if you see a 0 instead of a 1 (as shown below) then your connection information or SAS Token is not correct.

Mongoose IOT Events.png

On the Azure IoT side we can then check the metrics and see the incoming telemetry using the counter Telemetry Metrics Sent, as shown below.

Telemetry Metrics Sent.PNG

If you don’t have an IoT Device you can simulate one using PowerShell. The following example shows sending a message to our IoT Hub (using variables from previous scripts).

# Device key and client (using variables from the previous scripts)
$deviceParams = @{
    iotConnString = $IoTConnectionString
    deviceId      = $deviceID
}
$deviceKeys = Get-IoTDeviceKey @deviceParams

# Get Device
$device = Get-IoTDeviceClient -iotHubUri $IOTHubDeviceURI -deviceId $deviceID -deviceKey $deviceKeys.DevicePrimaryKey

# Send Message (splat the parameters to Send-IoTDeviceMessage)
$deviceMessageParams = @{
    deviceClient  = $device
    messageString = "Azure IOT Hub"
}
Send-IoTDeviceMessage @deviceMessageParams

Summary

Using PowerShell we have quickly been able to;

  • Create an Azure IoT Hub
  • Register an IoT Device
  • Generate the SAS Token for the IoT Device to authenticate to our IoT Hub with
  • Configure our IoT Device to send telemetry to our Azure IoT Hub and verify integration/connectivity

We are now ready to implement logic onto our IoT Device for whatever it is you are looking to achieve.