Sending Events from IoT Devices to Azure IoT Hub using HTTPS and REST

Overview

Different IoT Devices have different capabilities. Whether it is a micro-controller or a single-board computer, your options will vary. In this post I detailed using MQTT to send messages from an IoT Device to an Azure IoT Hub, as well as using the AzureIoT PowerShell Module.

For a current project I needed to send events from an IoT Device that runs Linux and has Python support. The Azure IoT Hub includes an HTTPS REST endpoint. For this particular application, using the HTTPS REST endpoint is much easier than compiling the Azure SDK for the particular flavour of Linux running on my device.

Python isn’t my language of choice, so I first got it working in PowerShell and then converted it to Python. I detail both scripts here as a guide for anyone else trying to do something similar, but also for myself, as I know I’m going to need these snippets in the future.

Prerequisites

You’ll need to have configured an Azure IoT Hub and registered an IoT Device with it.

Follow this post to get started.

PowerShell Device to Cloud Events using HTTPS and REST Script

Here is the PowerShell version of the script. Update Line 3 for your DeviceID, Line 5 for your IoT Hub Name and Line 11 for your SAS Token.
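
The script itself is not reproduced in this extract, but a minimal sketch of the approach looks like the following. The device ID, hub name and SAS token are placeholders you must substitute, and the api-version may differ for your hub.

# Send a Device to Cloud message to an Azure IoT Hub over HTTPS REST
$deviceID = "myIoTDevice"                    # your Device ID
$iotHub = "myIoTHub"                         # your IoT Hub name
$sasToken = "SharedAccessSignature sr=..."   # your pre-generated SAS Token

$uri = "https://$iotHub.azure-devices.net/devices/$deviceID/messages/events?api-version=2018-04-11"

# Build a simple JSON payload for the event
$body = @{ deviceId = $deviceID; temperature = 21.5 } | ConvertTo-Json

Invoke-RestMethod -Uri $uri -Method Post -Headers @{ Authorization = $sasToken } -Body $body -ContentType "application/json"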

Using Device Explorer to monitor the device on the associated IoT Hub, I can see that the message is received.

Device Explorer

Python Device to Cloud Events using HTTPS and REST Script

Here is my Python version of the same script. Again update Line 5 for your IoT DeviceID, Line 7 for your IoT Hub and Line 12 for the SAS Token.

And in Device Explorer we can see the message is received.

Device Explorer Python

Summary

When you have a device that has the ability to run Python, you can use the IoT Hub HTTPS REST API to send messages from the client to the cloud, negating the need to build and compile the Azure IoT SDK to generate client libraries.

Auto-redirect ADFS 4.0 home realm discovery based on client IP

In my previous post here I mentioned that I would explain how to auto-redirect the home realm discovery page to an ADFS namespace (claims provider trust) based on the client’s IP, so here it is.

Let’s say you have many ADFS servers (claims provider trusts) linked to a central ADFS 4.0 server, and you want to auto-redirect the user to a linked ADFS server login page based on the user’s IP instead of letting the user choose an ADFS server from the list on the home realm discovery page, as explained in the request flow diagram below.

You can do so by doing some customization as mentioned below:

  1. Create a database of IP ranges mapped to ADFS namespaces
  2. Develop a Web API which returns the relevant ADFS namespace based on request IP
  3. Add custom code in onload.js file on the central ADFS 4.0 server to call the Web API and do the redirection

It is assumed that all the boxes, including the central ADFS, linked ADFS, web server and SQL Server, are set up, and that the nitty-gritty details such as firewall rules, DNS lookups and SSL certificates are sorted out. If not, you can get help from an infrastructure specialist.

Let’s perform the required actions on the SQL, web and ADFS servers.

SQL Server

Perform the following actions on the SQL Server:

  1. Create a new database
  2. Create a new table called Registration as shown below
  3. Insert some records in the table for the linked ADFS server IP range, for example:

Start IP: 172.31.117.1, End IP: 172.31.117.254, Redirect Name: http://adfs.adminlab.com/adfs/services/trust
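
The table definition is not shown in this extract, but based on the example record above it might look like the following sketch, run here via Invoke-Sqlcmd. The server, database and column names are assumptions.

Import-Module SqlServer

# Create the Registration table and seed it with the example range above
$createTable = @"
CREATE TABLE dbo.Registration (
    Id           INT IDENTITY(1,1) PRIMARY KEY,
    StartIP      VARCHAR(15)  NOT NULL,
    EndIP        VARCHAR(15)  NOT NULL,
    RedirectName VARCHAR(255) NOT NULL
);
INSERT INTO dbo.Registration (StartIP, EndIP, RedirectName)
VALUES ('172.31.117.1', '172.31.117.254', 'http://adfs.adminlab.com/adfs/services/trust');
"@

Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "AdfsRedirectDb" -Query $createTable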

Web Server

Perform the following actions for the Web API development and deployment:

  1. Create a new ASP.NET MVC Web API project using Visual Studio
  2. Create a new class called Redirect.cs as shown below (I would have used the same name as the database table, ‘Registration’, but it’s OK for now)
  3. Insert a new Web API controller class called ResolverController.cs as shown below. What we are doing here is getting the request IP address, fetching the IP ranges from the database, and comparing the request IP with those ranges by converting both to long IP addresses. If the request IP is in a range, the matching redirect object is returned (a sketch of this range check follows this list).
  4. Add a connection string in the web.config named DbConnectionString pointing to the database we created above
  5. Deploy this Web API project to the web server IIS
  6. Configure the HTTPS binding for this Web API project using the SSL certificate
  7. Note down the URL of the Web API, something like ‘https://{Web-Server-Web-API-URL}/api/resolver/get’; this will be used in onload.js
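
The controller code is not included in this extract, but the range comparison it implements can be sketched in PowerShell as follows. This is purely illustrative; the real implementation is C# inside the Web API.

# Convert a dotted IPv4 address to a long so ranges can be compared numerically
function ConvertTo-LongIP {
    param([string]$IPAddress)
    $octets = $IPAddress.Split('.')
    return ([long]$octets[0] * 16777216) + ([long]$octets[1] * 65536) + ([long]$octets[2] * 256) + [long]$octets[3]
}

$requestIP = ConvertTo-LongIP "172.31.117.42"
$startIP = ConvertTo-LongIP "172.31.117.1"
$endIP = ConvertTo-LongIP "172.31.117.254"

if ($requestIP -ge $startIP -and $requestIP -le $endIP) {
    Write-Host "In range - return the mapped ADFS namespace for redirection"
}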

Central ADFS 4.0 Server

Perform the following actions on the central ADFS 4.0 server:

  1. Run the following PowerShell command to export the current theme to a location

Export-AdfsWebTheme -Name default -DirectoryPath D:\Themes\Custom

  2. Run the following PowerShell command to create a new custom theme based on the current theme

New-AdfsWebTheme -Name custom -SourceName default

  3. Update the onload.js file extracted in step 1 at D:\Themes\Custom\theme\script with the following code added at the end of the file. What we are doing here is calling the Web API, which returns the matched Redirect object with its RedirectName as the ADFS namespace, and setting HRD.selection to that redirect name.

  4. Run the following PowerShell command to update the onload.js file back into the theme

Set-AdfsWebTheme -TargetName custom -AdditionalFileResource @{Uri='/adfs/portal/script/onload.js';path="D:\Themes\Custom\theme\script\onload.js"}

  5. Run the following PowerShell command to make the custom theme your default theme

Set-AdfsWebConfig -ActiveThemeName custom -HRDCookieEnabled $false

Now when you test from your linked ADFS server, or from a client machine joined to a linked ADFS server (which in turn is linked to the central ADFS server), the auto-redirect kicks in from onload.js and calls the Web API. The Web API gets the client IP, matches it to the relevant ADFS server, and redirects the user to the relevant ADFS login page, instead of the user selecting the relevant ADFS namespace from the list on the home realm discovery page.

If no relevant match is found, the default home realm discovery page with the list of available ADFS namespaces is shown.

Utilising Azure Speech to Text Cognitive Services with PowerShell

Introduction

Yesterday I posted about using Azure Cognitive Services to convert text to speech. I also alluded to the fact that I’ve been leveraging Cognitive Services to do the conversion from speech to text. I detail that in this post.

Just as with Text to Speech, we will need an API key to use Cognitive Services. You can get one from Azure Cognitive Services here.

Source Audio File

I created an audio file in Audacity for testing purposes. In my real application it is direct spoken text, but that’s a topic for another time. I set the project rate to 16,000 Hz for the conversion source file, then exported the file as a .wav file.

Capture Audio

The Script

The script below needs to be updated for your input file (line 2) and your API key (line 7). Run it line by line in VSCode or PowerShell ISE.
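
The script itself is not reproduced in this extract; a minimal sketch of the call, assuming the 2018-era Bing Speech endpoint and the API key from above, looks like this.

# Convert a 16 kHz .wav file to text using the Cognitive Services Speech API
$audioFile = "C:\temp\capture.wav"    # your source audio file
$apiKey = "<yourAPIKey>"              # your Cognitive Services API key

$uri = "https://speech.platform.bing.com/speech/recognition/interactive/cognitiveservices/v1?language=en-US&format=simple"

# Read the audio file as raw bytes and post it to the recognition endpoint
$audioBytes = [System.IO.File]::ReadAllBytes($audioFile)

$result = Invoke-RestMethod -Uri $uri -Method Post -Headers @{ "Ocp-Apim-Subscription-Key" = $apiKey } `
    -Body $audioBytes -ContentType "audio/wav; codec=audio/pcm; samplerate=16000"

# The 'simple' format returns RecognitionStatus and DisplayText
$result.DisplayText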

Summary

That’s it. Pretty simple once you have a reference script to work with. Enjoy.

Converted


Automate network share migrations to Sharepoint Online using ShareGate PowerShell

ShareGate supports PowerShell scripting, which can be used to automate and schedule migrations. In this post, I am going to demonstrate an example of end-to-end automation to migrate network shares to SharePoint Online. The process effectively reduces the task of executing migrations to “just flicking a switch”.

Pre-Migration

The following pre-migration activities were conducted before the actual migration:

  1. Analysis of Network Shares
  2. Discussions with stakeholders from different business units to identify content needs
  3. Pilot migrations to identify average throughput capability of migration environment
  4. Identification of acceptable data filtration criteria, and preparation of ShareGate migration template files based on business requirements
  5. Derivation of a migration plan from the above steps

Migration Automation flow

The diagram represents a high-level flow of the process:


The migration automation was implemented to execute the following steps:

  1. Migration team indicates that migration(s) are ready to be initiated by updating the list item(s) in the SharePoint list
  2. Updated item(s) are detected by a PowerShell script polling the SharePoint list
  3. The list item data is downloaded as a CSV file, one CSV file per list item. The list item status is updated to “started” so that it will not be read again
  4. The CSV file(s) are picked up by another migration PowerShell script to initiate migration using Sharegate
  5. The required migration template is selected based on the options specified in the migration list item / csv to create a migration package
  6. The prepared migration task is queued for migration with Sharegate, and migration is executed
  7. Information mails are “queued” to be dispatched to migration team
  8. Emails are sent out to the recipients
  9. The migration reports are extracted out as CSV and stored at a network location.

Environment Setup

Software Components

The following software components were utilized for implementing the automation:

  1. SharePoint Online Management shell
  2. SharePoint PnP PowerShell
  3. Sharegate migration Tool

Environment considerations

Master and Migration terminals hosted as virtual machines – Each terminal is a Windows 10 virtual machine. The use of virtual machines provides the following advantages over using desktops:

  • VMs are generally deployed directly in datacenters, hence near the data source
  • Are available all the time and are not affected by power outages or manual shutdowns
  • Can be easily scaled up or down based on the project requirements
  • Benefit from better internet connectivity, and separate internet routing can be drawn up based on requirements

Single Master Terminal – A single master terminal is useful to centrally manage all other migration terminals. Using a single master terminal offers the following advantages:

  • Single point of entry to migration process
  • Acts as central store for scripts, templates, aggregated reports
  • Acts as a single agent to execute non-sequential tasks such as sending out communication mails

Multiple Migration terminals – It is optimal to initiate parallel migrations across multiple machines (terminals) to expedite overall throughput within the available migration window (generally non-business hours). ShareGate has the option to use either 1 or 5 licenses at once during migration. We utilized 5 ShareGate licenses on 5 separate migration terminals.

PowerShell Remoting – PowerShell remoting allows opening remote PowerShell sessions to other Windows machines. This allows the migration team to control and manage migrations using just one terminal (the master terminal) and simplifies monitoring of simultaneous migration tasks. More information about PowerShell remoting can be found here.
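
As a quick illustration, remoting lets the master terminal query or control a task on a migration terminal; the terminal and task names below are assumptions.

# Run once on each migration terminal to allow remote sessions
Enable-PSRemoting -Force

# From the master terminal, check a scheduled migration task on a migration terminal
Invoke-Command -ComputerName "MIGTERMINAL01" -ScriptBlock {
    Get-ScheduledTask -TaskName "ExecuteMigration" | Select-Object TaskName, State
}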

PowerShell execution policy – The scripts running on the migration terminals are stored at a network location on the master terminal. This allows changing and updating scripts on the fly without copying them over to the other migration terminals. The script execution policy of the PowerShell window needs to be set to “Bypass” to allow execution of scripts stored at a network location (for quick reference, the command is “Set-ExecutionPolicy -ExecutionPolicy Bypass”).

Windows Scheduled Tasks – The PowerShell scripts are scheduled as tasks through the Windows Task Scheduler, and these tasks can be managed remotely using scripts running on the migration terminals. The scripts are stored at a network location on the master terminal.

Basic task in a windows scheduler

PowerShell script file configured to run as a Task

Hardware specifications

Master terminal (Manage migrations)

  • 2 cores, 4 GB RAM, 100 GB HDD
  • Used for managing scripts execution tasks on other terminals (start, stop, disable, enable)
  • Used for centrally storing all scripts and ShareGate property mapping and migration templates
  • Used for Initiating mails (configured as basic tasks in task scheduler)
  • Used for detecting and downloading migration configuration of tasks ready to be initiated (configured as basic tasks in task scheduler)
  • Windows 10 virtual machine installed with the required software.
  • Script execution policy set as “Bypass”

Migration terminals (Execute migrations)

  • 8 cores, 16 GB RAM, 100 GB HDD
  • Used for processing migration tasks (configured as basic tasks in windows task scheduler)
  • Multiple migration terminals may be set up based on the available Sharegate licenses
  • Windows 10 Virtual machines each installed with the required software.
  • Activated Sharegate license on each of the migration terminals
  • PowerShell remoting needs to be enabled
  • Script execution policy set as “Bypass”

Migration Process

Initiate queueing of Migrations

Before migration, the migration team must perform any manual pre-migration tasks defined by the migration process agreed with stakeholders. Some of the pre-migration tasks / checks may be:

  • Inform other teams about a possible network surge
  • Confirming that no other activity is consuming bandwidth (scheduled updates)
  • Inform the impacted business users about the migration – this would be generally set up as the communication plan
  • Freezing the source data store as “read-only”

A list was created on a SharePoint Online site to enable users to indicate that a migration is ready to be processed. Updates to this list trigger the actual migration downstream. The migration plan is pre-populated in this list as part of the migration planning phase. The migration team can then update one of the fields (ReadyToMigrate in this case) to initiate the migration, or skip a planned migration if so desired. Migration status is also updated back to this list by the automation process.

The list serves as a single point of entry to initiate and monitor migrations. In other words, we are abstracting out migration processing with this list, and it can be an effective tool for the migration and communication teams.

The list was created with the following columns:

  • Source => Network share root path
  • Destination site => https://yourtenant.sharepoint.com/sites/<Sitename>
  • Destination Library => Destination library on the site
  • Ready to migrate => Indicates that the migration is ready to be triggered
  • Migrate all data => Indicates whether all data from the source is to be migrated (default is No; only filtered data based on the predefined options will be migrated – more on filtered options can be found here)
  • Started => Updated by automation when the migration package has been downloaded
  • Migrated => Updated by automation after migration completion
  • Terminal Name => Updated by automation, specifying the terminal being used to migrate the task


Migration configuration list


After the migration team is ready to initiate the migration, the field “ReadyToMigrate” for the migration item in the SharePoint list is updated to “Yes”.


“Flicking the switch”


Script to create the migration configuration list

The script below creates the source list in SharePoint Online.
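
The script is not reproduced in this extract, but a minimal SharePoint PnP PowerShell sketch creating a list with the columns above might look like this. The site URL, list name and internal field names are assumptions.

Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/MigrationAdmin" -Credentials (Get-Credential)

# Create the configuration list and add the columns described above
New-PnPList -Title "MigrationConfiguration" -Url "lists/MigrationConfiguration" -Template GenericList

Add-PnPField -List "MigrationConfiguration" -DisplayName "Source" -InternalName "Source" -Type Text -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "Destination site" -InternalName "DestinationSite" -Type Text -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "Destination Library" -InternalName "DestinationLibrary" -Type Text -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "ReadyToMigrate" -InternalName "ReadyToMigrate" -Type Boolean -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "MigrateAllData" -InternalName "MigrateAllData" -Type Boolean -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "Started" -InternalName "Started" -Type Boolean -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "Migrated" -InternalName "Migrated" -Type Boolean -AddToDefaultView
Add-PnPField -List "MigrationConfiguration" -DisplayName "TerminalName" -InternalName "TerminalName" -Type Text -AddToDefaultView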

Script to store credentials

This file stores the credentials so they can be used by subsequent scripts.
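
A common pattern for this is a short script like the sketch below; the storage path is an assumption. Note that Export-Clixml protects the password with DPAPI, so the file can only be decrypted by the same user on the same machine that created it.

# Capture the migration account credentials once and store them for later use
$credential = Get-Credential -Message "Enter the migration account credentials"
$credential | Export-Clixml -Path "\\masterterminal\c$\AutomatedMigrationData\creds\migration.cred"

# Subsequent scripts re-hydrate the credential like this
$credential = Import-Clixml -Path "\\masterterminal\c$\AutomatedMigrationData\creds\migration.cred"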

Queuing the migration tasks

A PowerShell script polls the migration configuration list in SharePoint at regular intervals to determine if a migration task is ready to be initiated. The available migration configurations are then downloaded as CSV files, one item per file, and stored in a migration packages folder on the master terminal. Each CSV file maps to one migration task to be executed by a migration terminal, which ensures that the same migration task is not executed by more than one terminal. It is important that this script runs on a single terminal so that only one migration is executed per source item.
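
A condensed sketch of this polling step, assuming the list and columns created earlier, might look like the following.

# Poll the configuration list for items flagged as ready but not yet started
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/MigrationAdmin" -Credentials $credential

$readyItems = Get-PnPListItem -List "MigrationConfiguration" | Where-Object {
    $_["ReadyToMigrate"] -eq $true -and $_["Started"] -ne $true
}

foreach ($item in $readyItems) {
    # One CSV per list item; the file name carries the item ID so a terminal can re-query it
    [PSCustomObject]@{
        Id                 = $item.Id
        Source             = $item["Source"]
        DestinationSite    = $item["DestinationSite"]
        DestinationLibrary = $item["DestinationLibrary"]
        MigrateAllData     = $item["MigrateAllData"]
    } | Export-Csv -Path "C:\AutomatedMigrationData\packages\$($item.Id).csv" -NoTypeInformation

    # Mark the item as started so it is not picked up again
    Set-PnPListItem -List "MigrationConfiguration" -Identity $item.Id -Values @{ Started = $true } | Out-Null
}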


Execute Migration

The downloaded migration configuration CSV files are detected by migration script tasks executing on each of the migration terminals. Based on the specified source, destination and migration options, the following tasks are executed:

  1. Reads the item from the configuration list to retrieve updated data based on the item ID
  2. Verifies the source, and sends a failure mail if the source is invalid or not available
  3. Revalidates that the migration has not already been initiated by another terminal
  4. Updates the “TerminalName” field in the SharePoint list to indicate an initiated migration
  5. Checks if the destination site is created. Creates if not already available
  6. Checks if the destination library is created. Creates if not already available
  7. Triggers an information mail informing migration start
  8. Loads the required configurations based on the required migration outcome. The migration configurations specify migration options such as cut over dates, source data filters, security and metadata. More about this can be found here.
  9. Initiates the migration task
  10. Extracts the migration report and stores as CSV
  11. Extracts the secondary migration report as CSV to derive the paths of all files successfully migrated. These CSVs can be read by an optional downstream process.
  12. Triggers an information mail informing migration is completed
  13. Checks for another queued migration to repeat the procedure.

The automation script is given below.
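
The full script is not reproduced in this extract, but the core ShareGate step might look like the sketch below. The cmdlet usage is based on the Sharegate PowerShell module; the package file, credentials, template name and report path are assumptions.

Import-Module Sharegate

# Read one downloaded migration package (see the polling script above)
$package = Import-Csv -Path "C:\AutomatedMigrationData\packages\42.csv"
$credential = Import-Clixml -Path "\\masterterminal\c$\AutomatedMigrationData\creds\migration.cred"

# Connect to the destination site and resolve the target library
$dstSite = Connect-Site -Url $package.DestinationSite -Username $credential.UserName -Password $credential.Password
$dstList = Get-List -Site $dstSite -Name $package.DestinationLibrary

# TemplateName refers to a pre-saved ShareGate migration template (filters, metadata, security)
$result = Import-Document -SourceFolder $package.Source -DestinationList $dstList -TemplateName "StandardMigration"

# Extract the migration report for downstream processing
Export-Report $result -Path "\\masterterminal\c$\AutomatedMigrationData\reports\42-report.csv"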

Additional Scripts

Send Mails

The script triggers emails to required recipients.

This script polls a folder (‘\\masterterminal\c$\AutomatedMigrationData\mails\input’) to check for any files to be sent out as emails. The CSV files specify the subject and body to be sent out as emails to the recipients configured in the script. Processed CSV files are moved to a final folder.
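
A simplified sketch of this dispatch loop is below; the SMTP server, addresses and folder names are assumptions.

# Poll the input folder and send each CSV out as an email
$inputFolder = '\\masterterminal\c$\AutomatedMigrationData\mails\input'
$finalFolder = '\\masterterminal\c$\AutomatedMigrationData\mails\final'

foreach ($file in Get-ChildItem -Path $inputFolder -Filter *.csv) {
    $mail = Import-Csv -Path $file.FullName
    Send-MailMessage -From "migrations@yourcompany.com" -To "migrationteam@yourcompany.com" `
        -Subject $mail.Subject -Body $mail.Body -SmtpServer "smtp.yourcompany.com"

    # Move the processed file so it is not sent again
    Move-Item -Path $file.FullName -Destination $finalFolder
}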


Manage migration tasks (scheduled tasks)

This PowerShell script utilizes PowerShell remoting to manage Windows Task Scheduler tasks configured on the other terminals.
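
For example, a task on a migration terminal can be paused from the master terminal like this (terminal and task names are assumptions).

# Pause the migration task on a terminal (use Enable-ScheduledTask to resume it)
Invoke-Command -ComputerName "MIGTERMINAL01" -ScriptBlock {
    Disable-ScheduledTask -TaskName "ExecuteMigration"
}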

Conclusion

The migration automation process as described above helps in automating the migration project and reduces manual overhead during the migration. Since the scripts utilize pre-configured migration options / templates, the outcome is consistent with the plan. Controlling and monitoring migration tasks utilizing a SharePoint list introduces transparency in the system and abstracts the migration complexity. Business stakeholders can review migration status easily from the SharePoint list and this ensures an effective communication channel. Automated mails informing about migration status provide additional information about the migration. The migration tasks are executed in parallel across multiple migration machines which aids in a better utilization of available migration window.


Utilising Azure Text to Speech Cognitive Services with PowerShell

Introduction

Recently I’ve been building an IoT project that leverages Azure Cognitive Services. A couple of the services I needed were for converting text to speech and speech to text. The guides from Microsoft were pretty good, but not obvious for use with native PowerShell. I’ve got it all working, so I’m documenting it for my future self, but also to help anyone else trying to work it out.

Accessing the Cognitive Services Text to Speech API

Azure Cognitive Services Text to Speech is a great service that provides the ability, as the name suggests, to convert text to speech.

First you’ll need to get an API key. Head to the Cognitive Services Getting Started page and select Try Text to Speech, then Get API Key. It will give you a trial key with 5,000 transactions, limited to 20 per minute. If you want to use it for longer, provision a Speech service using the Azure Portal.

The Script

I’m using a female voice in English for my output format. All the available output languages and genders are available here.

There are also 8 audio output formats. The two I’ve used most are raw-16khz-16bit-mono-pcm for .wav output and audio-16khz-128kbitrate-mono-mp3 for MP3 output, as highlighted below. The script further below is configured for MP3.

  • ssml-16khz-16bit-mono-tts
  • raw-16khz-16bit-mono-pcm
  • audio-16khz-16kbps-mono-siren
  • riff-16khz-16kbps-mono-siren
  • riff-16khz-16bit-mono-pcm
  • audio-16khz-128kbitrate-mono-mp3
  • audio-16khz-64kbitrate-mono-mp3
  • audio-16khz-32kbitrate-mono-mp3

The script below is pretty self-explanatory. Update line 5 for your API key, and lines 11 and 13 if you want the output audio file to go to a different directory or filename. The text to be converted is on line 59.
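
The script itself is not reproduced in this extract; a minimal sketch of the flow, assuming the 2018-era Bing Speech endpoints, looks like this.

# Exchange the API key for a short-lived bearer token
$apiKey = "<yourAPIKey>"           # your Cognitive Services API key
$output = "C:\temp\speech.mp3"     # output audio file
$text = "Hello from Azure Cognitive Services."

$token = Invoke-RestMethod -Uri "https://api.cognitive.microsoft.com/sts/v1.0/issueToken" `
    -Method Post -Headers @{ "Ocp-Apim-Subscription-Key" = $apiKey }

# Build the SSML body with a female en-US voice
$ssml = @"
<speak version='1.0' xml:lang='en-US'>
  <voice xml:lang='en-US' xml:gender='Female' name='Microsoft Server Speech Text to Speech Voice (en-US, ZiraRUS)'>$text</voice>
</speak>
"@

# Request MP3 output and save the audio to disk
Invoke-WebRequest -Uri "https://speech.platform.bing.com/synthesize" -Method Post `
    -Headers @{ "Authorization" = "Bearer $token"; "X-Microsoft-OutputFormat" = "audio-16khz-128kbitrate-mono-mp3" } `
    -ContentType "application/ssml+xml" -Body $ssml -OutFile $output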

Step through it using VSCode or PowerShell ISE.

Summary

Using Azure Cognitive Services you can quickly convert text to audio. Enjoy.

Some advanced ADFS 4.0 branding customization

As you are aware, you can use PowerShell commands to update the logo and banner/illustration images, as well as the home, privacy and other links, of the ADFS 4.0 home realm discovery or sign in page. Below is an example of doing so:

Set-AdfsWebTheme -TargetName custom -Logo @{path="P:\Theme\Logo\logo.png"}

The above command would update the current logo image on the custom theme.

Set-AdfsGlobalWebContent -HomeLink https://{www.YourWebsite.Com}/ -HomeLinkText Home

The above command would update the “Home” link on all pages of your ADFS theme.

How about doing the customizations below?


You can do the above customizations by writing custom code in onload.js. The process of exporting the onload.js file and importing it back into the ADFS 4.0 theme is explained in my previous post here.

So our requirements are mentioned below:

  1. Change the home realm discovery page sign in label message from “Sign in with one of these accounts” to something else
  2. Change the home realm discovery page title
  3. Add “User agreement” and “Code of conduct” links along with some text message on the home realm discovery page
  4. Change the sign in page title
  5. Change the sign in label message on the sign in page from “Sign in with your organizational account” to something else
  6. Add “User Agreement” and “Code of Conduct” links along with some text message on the sign in page
  7. Add staff and student password reset links on the sign in page
  8. Update the Home, Privacy and Disclaimer links at the bottom of the pages

But just to repeat some of the required steps to export/import the onload.js file: run the following PowerShell command to export the current theme to a location

Export-AdfsWebTheme -Name default -DirectoryPath D:\Themes\Custom

Run the following PowerShell command to create a new custom theme based on the current theme

New-AdfsWebTheme -Name custom -SourceName default

Update the onload.js file extracted at D:\Themes\Custom\theme\script with the following code snippets added at the end of the file:

Run the following PowerShell command to update the onload.js file back into the theme

Set-AdfsWebTheme -TargetName custom -AdditionalFileResource @{Uri='/adfs/portal/script/onload.js';path="D:\Themes\Custom\theme\script\onload.js"}

Run the following PowerShell command to make the custom theme your default theme

Set-AdfsWebConfig -ActiveThemeName custom -HRDCookieEnabled $false

You must be thinking that we have missed points 6, 7 and 8. No, we haven’t. We don’t need to update anything in onload.js for these; they can be achieved with the PowerShell commands mentioned below:

Requirements 6 and 7 (combined) can be met with the below PowerShell command

Set-AdfsGlobalWebContent -SignInPageDescriptionText "View the {Your Company Name} <A href='{User Agreement URL}' target='_blank'>User Agreement</A> and <A href='{Code of Conduct URL}' target='_blank'>Code of Conduct</A>. </br></br>Forgotten your password? Please click <A href='{Staff password change URL}' target='_blank'>staff</A> or <A href='{Student password change URL}' target='_blank'>student</A> to reset your password."

Requirement no. 8: Run the following PowerShell commands to set the Home, Privacy and HelpDesk link texts

Set-AdfsGlobalWebContent -HomeLink {Your company URL} -HomeLinkText Home

Set-AdfsGlobalWebContent -PrivacyLink {Your company privacy policy URL} -PrivacyLinkText Privacy

Set-AdfsGlobalWebContent -HelpDeskLink {Your company disclaimer URL} -HelpDeskLinkText Disclaimer

In the next article I will explain how to auto-redirect the home realm discovery page to an ADFS namespace (claims provider trust) based on the client’s IP.

Exporting IoT Device Information from Azure IoT Hub(s) using PowerShell

Introduction

I have a number of Azure IoT Hubs, each with a number of devices configured on them. I wanted to export the details for each IoT device. This can’t be done via the Azure Portal (as of May 2018), so I looked to leverage the Azure.IoTHub New-AzureRmIotHubExportDevices cmdlet.

Now, the documentation for New-AzureRmIotHubExportDevices is a little light on. When I was running New-AzureRmIotHubExportDevices I kept getting the error ‘Operation returned an invalid status code “InternalServerError”’.

After many attempts (over weeks) I was finally able to export my IoT devices using PowerShell. The key was to generate the SAS storage token for the container, rather than creating a blob file to export to and generating a SAS token for that file. Simply specify the storage container to export to.

Overview

My sample script below uses the latest (as of May 2018) version of the Azure.IoTHub module (v3.1.3). It:

  • enumerates all Resource Groups in an Azure Subscription, looks for IoT Hubs and puts them into a collection
  • iterates through each IoT Hub and creates an associated Storage Account (if one doesn’t exist)
  • exports the IoT Devices associated with the IoT Hub to Azure Storage
  • downloads the IoT Devices blob file, opens it, and displays the IoT Device names and status through PowerShell console output

Export IoT Devices 640px

To use the script you will just need to:

  • change your Subscription Name in Line 4
  • change the location where you want to download the blob files to in Line 31
  • change Line 71 if you want to display additional info on each device or do something else with the info
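
The full script is not reproduced in this extract, but the key step looks something like the sketch below. The resource group, hub and storage names are assumptions.

# Generate a SAS token for the whole container (not an individual blob) and export to it
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName "MyIoTRG" -Name "myiotstorage"
$containerSasUri = New-AzureStorageContainerSASToken -Name "iotexport" -Permission rwd `
    -ExpiryTime (Get-Date).AddHours(2) -Context $storageAccount.Context -FullUri

# Export the device registry; the cmdlet writes a devices.txt blob (one JSON object per line)
New-AzureRmIotHubExportDevices -ResourceGroupName "MyIoTRG" -Name "MyIoTHub" `
    -ExportBlobContainerUri $containerSasUri -ExcludeKeys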

The exported file can be found using Azure Storage Explorer as shown below.

Output Blob File.PNG

And the script outputs the status to the PowerShell console as shown below.

Exported IoT Devices.PNG

The exported object contains all the details for each IoT Device as shown below in the IoT Device PSObject.

IoT Device PSObj.PNG

Summary

It is obvious once you work out how the cmdlet works. Hopefully this working example will save someone else a few hours of head scratching.


Removing Specific Azure Tags – PowerShell

Azure Tags

You apply tags to your Azure resources to logically organize them by categories. Each tag consists of a name and a value. For example, you can apply the name “Environment” and the value “Production” to all the resources in production.

After you apply tags, you can retrieve all the resources in your subscription with that tag name and value. Tags enable you to retrieve related resources from different resource groups. This approach is helpful when you need to organize resources for billing or management.


Problem:

Sometimes tags are applied in environments prior to developing a tagging strategy. The problem increases exponentially with the size of the environment and the number of users creating resources.

In this case, we were looking for a solution to remove specific unwanted tags from Virtual Machines.

For this purpose, the script below was developed to solve the problem.

Solution :

The script below performs the following tasks:

  • Gets the list of all the VMs based on the scope
  • Gets all the tag values for each VM into a $VMtags variable
  • Copies all the values from $VMtags to $newtag except the $TagtoRemove value
  • Updates the resources with the $newtag values using the Set-AzureRmResource command


Code:

# Get the list of VMs based on the resource group. The scope can be changed to include more resources.
$VMs = Get-AzureRmVM -ResourceGroupName ResourceGroupName

# Details of the tag to remove are stored in the $TagtoRemove variable.
$TagtoRemove = @{ Key = "TestVmName"; Value = "abcd" }

foreach ($VM in $VMs) {
    $VMtags = $VM.Tags   # All the tags for the current VM
    $newtag = @{}        # New hashtable to hold the tag values we are keeping

    foreach ($KVP in $VMtags.GetEnumerator()) {
        if ($KVP.Key -eq $TagtoRemove.Key) {
            Write-Host $TagtoRemove.Key "exists in" $VM.Name "and will be removed"
        }
        else {
            # Keep every tag except the $TagtoRemove.Key entry
            $newtag.Add($KVP.Key, $KVP.Value)
        }
    }

    # Update the virtual machine with the filtered tag values in $newtag
    Set-AzureRmResource -ResourceGroupName $VM.ResourceGroupName -ResourceName $VM.Name `
        -Tag $newtag -Force -ResourceType Microsoft.Compute/VirtualMachines
}


How to make Property Bag Values indexed and searchable in SharePoint Online

In an earlier post here we saw how to set Property Bag values in modern SharePoint sites. One of the major reasons for setting Property Bag values in SharePoint sites is to make the sites searchable based on custom metadata.

However, Property Bag values are not crawled by the SharePoint Online search index directly. To make a Property Bag value searchable, we must explicitly set it to be indexed by the search crawler. To do this, we simply set the Property Bag values to indexed using SharePoint PnP PowerShell.

The “-Indexed” attribute of the Set-PnPPropertyBagValue cmdlet makes an entry in a hidden system property bag value, vti_indexedpropertykeys, which then makes these Property Bag values searchable. A screenshot of the result is below.

SetPnPPropertyBagIndexed.png
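
For reference, the call itself is a one-liner; in the sketch below the site URL and the key/value pair are assumptions.

Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/TeamSite" -Credentials (Get-Credential)

# -Indexed registers the key in vti_indexedpropertykeys so the search crawler picks it up
Set-PnPPropertyBagValue -Key "BusinessUnit" -Value "Finance" -Indexed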

After performing the above, we flag the site to be re-indexed so that the SharePoint search crawler will pick up the new Property Bag values in the next crawl. This can be done from “Site Settings -> Search and offline availability” (in the Search group), then clicking the “Reindex site” button (screenshot below).

Note: Since SharePoint Online is a massive SaaS system, it can take up to 24 hours for the crawler to pick up this property. Unfortunately, there’s no workaround for this delay; you simply must be patient.

Search and Offline Availability

Conclusion

In the above post we saw how we can enable Property Bag values to be searchable.


Report of All Taxonomy Fields containing a term in SharePoint Tenancy

Recently we had a request to find the fields/columns in all lists across the tenancy that use a specific Taxonomy term, because we needed to report on field usage across all site collections. However, we found that getting a report of all Taxonomy fields in a SharePoint tenancy linked to a specific Term Set can be quite daunting, because there is no direct SharePoint query to fetch the associations.

The technical challenge is that, using PnP PowerShell, Taxonomy fields are returned as the generic SP.Field type and not as SP.TaxonomyField, so Taxonomy field metadata values such as the Group ID and Termset ID are absent. To work around this limitation, we used the field’s SchemaXml ($field.SchemaXml) to find the specified values.

Note: Querying the Term store while searching for a specific termset using Get-PnPTerm can add a lot of latency. To avoid this additional time, we can download the entire term store to a CSV file and use that file as the master data for matching. Below is the command to get an export of all Taxonomy values as a CSV file.

Export-PnPTaxonomy -Path "[path]\taxonomyreport.csv" -Delimiter "," -IncludeID

Steps:

The steps to retrieve and check for taxonomy fields can be found below.

1. Get all lists in a web site
2. Get the Taxonomy fields for a list
3. Read the SchemaXml and search for the TermsetID and AnchorID (thanks to Colin Philips (http://itgroove.net/mmman/) for finding the correct XPath for parsing the XML)
4. Match the data with the above Taxonomy report, comparing the Group ID, Termset ID, and Anchor ID with the Term ID that the column is linked to
5. In case of a match, save the values into the report.

The code below uses PnP PowerShell. For a quick set up of PnP PowerShell, please refer to this blog.
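
The full script is not included in this extract, but a condensed sketch of the per-list check might look like the following. The site URL, report path and exact XPath expressions are assumptions and may need adjusting for your schema.

# Inspect every list's taxonomy columns via their SchemaXml
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/TeamSite" -Credentials (Get-Credential)

$report = @()
foreach ($list in Get-PnPList) {
    # Taxonomy columns come back as generic SP.Field, so filter on the field type name
    $taxFields = Get-PnPField -List $list | Where-Object { $_.TypeAsString -like "TaxonomyFieldType*" }
    foreach ($field in $taxFields) {
        $schema = [xml]$field.SchemaXml
        $report += [PSCustomObject]@{
            List      = $list.Title
            Field     = $field.Title
            TermSetId = $schema.SelectSingleNode("//Property[Name='TermSetId']/Value").InnerText
            AnchorId  = $schema.SelectSingleNode("//Property[Name='AnchorId']/Value").InnerText
        }
    }
}
$report | Export-Csv -Path "C:\reports\taxonomyfields.csv" -NoTypeInformation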

Conclusion:

Hence with the above approach, we can retrieve the taxonomy fields for all site collections. Be aware that the above process can take half a day or more to run, depending on the number of site collections and taxonomy fields in your tenancy. Be sure to give it enough time to run before prematurely cancelling it.

Happy Coding!! 😊