Automating the submission of WordPress Blog Posts to your Microsoft MVP Community Activities Profile using PowerShell

Introduction

In November last year (2017) I was honored to be awarded Microsoft MVP Status for Enterprise Mobility – Identity and Access. MVP Status is awarded based on community activities, and even once you’ve attained MVP Status you need to keep your community activity contributions updated on your profile.

Up until recently this was done by accessing the portal and updating your profile; however, mid last year an MVP PowerShell Module (big thanks to Francois-Xavier Cat and Emin Atac) was released that allows for some automation.

In this post I’ll detail using the MVP PowerShell Module to retrieve your latest WordPress Blog Post and submit it to your MVP Profile as an MVP Community Contribution.

Prerequisites

In order for this to work you will need;

  • to be a Microsoft MVP
  • Register at the MS MVP API Developer Portal using your MVP registered email/profile address
    • subscribe to the MVP Production Application
    • copy your API key (you’ll need this in the script below)

The Script

Update the script below for;

  • MVP API Key
    • Update Line 5 with your API key as detailed above
  • WordPress Blog URL (mine is blog.darrenjrobinson.com)
  • Your Award Category
    • Update Line 17 with your category $contributionTechnology = “Identity and Access”
    • type New-MVPContribution -ContributionTechnology and you’ll get a list of the MVP Award Categories

Award Categories.PNG


Here is the simple script. Run it after publishing a new entry you also want added to your Community Contributions Profile. There’s no error handling and nothing complex, but you’re an MVP, so plagiarise away and make it do whatever suits your needs.
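
As a rough sketch of the approach, assuming the MVP module’s Set-MVPConfiguration and New-MVPContribution cmdlets and a WordPress blog with the REST API enabled (the contribution values are placeholders to adjust for your profile):

# Sketch only - adjust the key, blog URL and contribution values to suit
Import-Module MVP
Set-MVPConfiguration -SubscriptionKey 'your MVP API key'

$blogURL = 'https://blog.darrenjrobinson.com'
$contributionTechnology = 'Identity and Access'

# Get the most recent post via the WordPress REST API
$latestPost = Invoke-RestMethod -Uri "$blogURL/wp-json/wp/v2/posts?per_page=1" | Select-Object -First 1

# Submit it as a new community contribution
New-MVPContribution -StartDate $latestPost.date -Title $latestPost.title.rendered `
    -ReferenceUrl $latestPost.link -Description 'New blog post' -AnnualQuantity 1 `
    -ContributionType 'Blog Site Posts' -ContributionTechnology $contributionTechnology `
    -Visibility 'Everyone'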

Summary

Hopefully that makes it simple to keep your MVP profile up to date. If you’re using a different Blogging platform I’m sure the basic process will work with a few tweaks after returning a query to get the content. Enjoy.

Automating the creation of Azure IoT Hubs and the registration of IoT Devices with PowerShell and VS Code

The creation of an Azure IoT Hub is quick and simple, either through the Azure Portal or using PowerShell. But what can get more time-consuming is the registration of IoT Devices with the IoT Hub and generation of SAS Tokens for them for authentication.

In my experiments with micro-controllers and their integration with Azure IoT Services I often find I keep having to manually do tasks that should have just been automated. So I did. In this post I’ll cover using PowerShell to;

  • create an Azure IoT Hub
  • register an Azure IoT Device
  • generate a SAS Token for the IoT Device to use for authentication to an Azure IoT Hub from a Mongoose OS enabled ESP8266 micro-controller

IoT Integration

Prerequisites

In order to fully test this, ideally you will have a micro-controller. I’m using an ESP8266 based micro-controller like this one. If you want to test this out without physical hardware, you could generate your own DeviceID (any text string) and use the AzureIoT Library detailed further on to send MQTT messages.

You will also require an Azure Subscription. I detail using a Free Tier Azure IoT Hub, which is limited to 8,000 messages per day. Instead of PowerShell/PowerShell ISE, get and use Visual Studio Code.

Finally you will need the AzureRM and AzureIoT PowerShell modules. With WMF 5.x you can get them from the PowerShell Gallery with;

Install-Module AzureRM
Install-Module AzureIoT

Create an Azure IoT Hub

The script below will create a Free Tier Azure IoT Hub. Change the location (line 15) to the Azure Region you will use (the commands on the lines above it list the available regions), the Resource Group Name that will be created to hold it (line 18) and the name of the IoT Hub (line 23), and let it rip.
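
A minimal sketch of what the script does, using the AzureRM cmdlets (resource names are placeholders):

# Minimal sketch using the AzureRM module; names are placeholders
Login-AzureRmAccount

# List the regions that support IoT Hub
((Get-AzureRmResourceProvider -ProviderNamespace Microsoft.Devices).ResourceTypes | Where-Object {$_.ResourceTypeName -eq 'IotHubs'}).Locations

$location = 'West US'               # line 15 equivalent - pick from the list above
$resourceGroupName = 'MyIoTRG'      # line 18 equivalent
$iotHubName = 'MyIoTHub'            # line 23 equivalent

New-AzureRmResourceGroup -Name $resourceGroupName -Location $location
# F1 = the Free Tier (one per subscription, 8,000 messages/day)
New-AzureRmIotHub -ResourceGroupName $resourceGroupName -Name $iotHubName -SkuName F1 -Units 1 -Location $location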

From your micro-controller we will need the DeviceID. I’m using the ID generated by the device which I obtained from the Device Configuration => Expert View of my Mongoose OS enabled ESP8266.

Device Config.PNG

Register the IoT Device with our Azure IoT Hub

Using the AzureIoT PowerShell module we can automate the creation/registration of the IoT Device. Update the script below for the name of your IoT Hub and the Resource Group that contains it that you created earlier (lines 7 and 11). Update line 21 with the DeviceID of your new IoT Device. With WMF 5.x you can install the AzureIoT module quickly from the gallery with Install-Module AzureIoT.
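
A rough sketch of the registration step is below. Note that New-IoTDevice is my assumption for the AzureIoT module’s registration cmdlet (its key/client cmdlets appear later in this post); Get-AzureRmIotHubConnectionString comes from the AzureRM module:

# Sketch only - New-IoTDevice is assumed; verify the cmdlet names in the AzureIoT module
$resourceGroupName = 'MyIoTRG'     # line 7 equivalent
$iotHubName = 'MyIoTHub'           # line 11 equivalent
$deviceID = 'myESP8266Device'      # line 21 equivalent - your DeviceID

# Get the iothubowner connection string for the hub
$IoTConnectionString = (Get-AzureRmIotHubConnectionString -ResourceGroupName $resourceGroupName -Name $iotHubName | Where-Object {$_.KeyName -eq 'iothubowner'}).PrimaryConnectionString

# Register the device with the IoT Hub
New-IoTDevice -iotConnString $IoTConnectionString -deviceId $deviceID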

Looking at our IoTHub in the Azure Portal we can see the newly registered IoT Device.

DeviceCreated.png

Generate an IoT Device SAS Token

The final step is to create a SAS Token for our IoT Device to use to connect to the Azure IoTHub. Historically you would use the IoT Device Explorer to do that. Alternatively you can use the code samples to implement the SAS Device Token generation via an Azure Function App; examples exist for JavaScript and C#. However as of mid-January 2018 you can do it directly from VS Code or Azure Cloud Shell using the Azure CLI and the IoT Extension. I’m using this method here as it is the quickest and simplest way of generating the Device SAS Token.

The command to generate a token that would work for all Devices on an IoT Hub is

az iot hub generate-sas-token --hub-name <your-iot-hub-name>

Here I show executing it via the Azure Cloud Shell after installing the IoT Extension as detailed here. To open the Bash Cloud Shell select the >_ icon next to the notification bell in the top-right menu.

Generate IOT Device SAS Token.PNG

As we have done everything else via PowerShell and VS Code we can also do it easily from VS Code. Install the Azure CLI Tools (v0.4.0 or later) in VS Code as detailed here. Then from within VS Code press Control + Shift + P to open the Command Palette and enter Azure: Sign In. Sign in to Azure. Then press Control + Shift + P again and enter Azure: Open Bash in Cloud Shell to open a Bash Azure CLI shell. You can check whether you have the Azure CLI IoT Extension (if you’ve previously used the Azure CLI for IoT operations) by typing;

az extension show --name azure-cli-iot-ext

and install it if you don’t with;

az extension add --name azure-cli-iot-ext

Then run the same command from VS Code to generate the SAS Token

az iot hub generate-sas-token --hub-name <your-iot-hub-name>

VSCode Generate SAS Token.PNG

NOTE: That token can then be used for any Device registered with that IoT Hub. Best practice is to have a token per device. To do that, type;

az iot hub generate-sas-token --hub-name <your-iot-hub-name> --device-id <your-device-id>

Generate SAS Token VS Code Per Device.PNG

By default you will get a token valid for 1 hour. Use the --duration switch to specify the duration of the token you require for your environment, as shown below.
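
For example, to issue a device token valid for 24 hours (the duration is specified in seconds):

az iot hub generate-sas-token --hub-name <your-iot-hub-name> --device-id <your-device-id> --duration 86400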

We can now take the SAS Token and put it into our MQTT Config on our Mongoose OS IoT Device. Update the Device Configuration using Expert View and Save.

Mongoose SAS Config.PNG

We can then test our IoT Device sending updates to our Azure IoT Hub. Update Init.js using the telemetry sample code from Mongoose.

load('api_config.js');
load('api_mqtt.js');
load('api_sys.js');
load('api_timer.js');

let topic = 'devices/' + Cfg.get('device.id') + '/messages/events/';

Timer.set(1000, true /* repeat */, function() {
  let msg = JSON.stringify({ ram: Sys.free_ram() });
  let ok = MQTT.pub(topic, msg, 1);
  print(ok, topic, '->', msg);
}, null);

We can then see the telemetry being sent to our Azure IoT Hub using MQTT. In the Device Logs, after the datestamp and before device/, if you see a 0 instead of a 1 (as shown below) then your connection information or SAS Token is not correct.

Mongoose IOT Events.png

On the Azure IoT side we can then check the metrics and see the incoming telemetry using the counter Telemetry Metrics Sent as shown below.

Telemetry Metrics Sent.PNG

If you don’t have an IoT Device you can simulate one using PowerShell. The following example shows sending a message to our IoT Hub (using variables from previous scripts).

$deviceParams = @{
    iotConnString = $IoTConnectionString
    deviceId = $deviceID
}
$deviceKeys = Get-IoTDeviceKey @deviceParams
# Get Device Client
$device = Get-IoTDeviceClient -iotHubUri $IOTHubDeviceURI -deviceId $deviceID -deviceKey $deviceKeys.DevicePrimaryKey

# Send Message (note the hashtable is splatted with @, not passed with $)
$deviceMessageParams = @{
    deviceClient = $device
    messageString = "Azure IOT Hub"
}
Send-IoTDeviceMessage @deviceMessageParams

Summary

Using PowerShell we have quickly been able to;

  • Create an Azure IoT Hub
  • Register an IoT Device
  • Generate the SAS Token the IoT Device uses to authenticate to our IoT Hub
  • Configure our IoT Device to send telemetry to our Azure IoT Hub and verify integration/connectivity

We are now ready to implement logic onto our IoT Device for whatever it is you are looking to achieve.


Quick start guide for PnP PowerShell

It is quite easy and quick to set up PnP PowerShell on a local system and start using it. Considering that PnP PowerShell has been gaining a lot of momentum among Devs and Admins, I thought it would be good to have a post for everyone’s reference.

So why PnP PowerShell? Because it is the recommended and most actively updated PowerShell module for IT Pros to work with SharePoint Online, SharePoint on-premises and Office 365 Groups. It allows us to remotely maintain a SharePoint Online and Office 365 tenancy, as we will see below.

Installation:

First, we have to make sure that we have the latest and greatest version of PowerShell (v4.0 or later) on the system. We can check it using the command $PSVersionTable.PSVersion, which will give the output below.

PowerShell Version Check

If your version isn’t sufficient, then go to this link to upgrade – https://www.microsoft.com/en-us/download/details.aspx?id=40855. Since each PowerShell version has accompanying OS requirements, please check the System Requirements to make sure your OS matches the specs.

Also, I would recommend installing the SharePoint Online Management Shell because there are various Tenant commands that you may not find in PnP PowerShell. The link can be found here – https://www.microsoft.com/en-au/download/details.aspx?id=35588. Again, check the System Requirements to make sure your OS matches the specs.

After you have the proper PowerShell version, run the following commands to install the PnP PowerShell module, depending on whether you’re using the Online or On-Premises version. You only need to install the version(s) you need to interact with (no need to install on-premises versions if you’re only working online).

SharePoint Version    Command to install
SharePoint Online     Install-Module SharePointPnPPowerShellOnline
SharePoint 2016       Install-Module SharePointPnPPowerShell2016
SharePoint 2013       Install-Module SharePointPnPPowerShell2013

The above command(s) will likely take a few minutes to run and complete.

Getting Started Guide:

To start using PnP PowerShell, first we will have to understand how PnP PowerShell works.

PnP PowerShell works in the context of the current connection, that is, the site it is connected to. Assuming it is connected to a Site Collection, all commands refer by default to that Site Collection. This is different from the SPO commands, which require you to have Tenant Admin rights. A benefit of the PnP approach is that you can have separate Admin personnel manage separate site collections or sites if needed.

Overview of Basic SharePoint Operations with PnP PowerShell

1. Connect to PnP Online

Connect-PnPOnline -Url https://<tenant>.sharepoint.com/sites/<site>

If you are not given a prompt to enter your credentials in the above case, then create a credential object and pass it to the Connect command (as seen below).

$siteAdminURL = ""   # your site URL
$user = ""
$secpasswd = ConvertTo-SecureString "" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($user, $secpasswd)
Connect-PnPOnline -Url $siteAdminURL -Credentials $mycreds

2. Get the web object using Get-PnPSite and then get the subsites in it

## For Site Collection (root web)
$web = Get-PnPSite 
## For Sub web
$web = Get-PnPWeb -Identity ""

Remember, PnP works in the context of the current connection.

3. To work on lists or objects of the site, use the “Includes” parameter to request them; otherwise they will not be initialized.

$web = Get-PnPSite -Includes RootWeb.Lists

PnPPowerShell output

After you get the list object, you can traverse it as needed using the object.
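
For example, a quick sketch that enumerates the loaded lists (assuming the default list properties such as Title and ItemCount are populated by the query above):

# Enumerate the lists loaded via -Includes RootWeb.Lists
$site = Get-PnPSite -Includes RootWeb.Lists
foreach ($list in $site.RootWeb.Lists) {
    Write-Output "$($list.Title): $($list.ItemCount) items"
}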

4. To work on list items, you must request the list items explicitly; the same goes for updating the items.

$listItem = Get-PnPListItem -List "Site Assets"

DocumentResults

Set-PnPListItem -List "Site Assets" -Identity 2 -Values @{"" = ""}

PnPPowerShell_UpdateItem

So, as we saw above, we can use PnP PowerShell to maintain SharePoint assets remotely.

Please feel free to let me know if there are any specific questions you may have regarding the approach above and I will post about those as well.

Automating the generation of Microsoft Identity Manager Configuration Documentation

Introduction

Last year Microsoft released the Microsoft Identity Manager Configuration Documenter which is available here. It is a fantastic little tool from Microsoft that supersedes its predecessor from the Microsoft Identity Manager 2003 Resource Toolkit (which only documented the Sync Server Configuration).

Running the tool (a PowerShell module) reconciles a base out-of-the-box reference configuration for FIM/MIM servers against a configuration exported from an implementation’s MIM Sync and Service servers, and generates an HTML report that details the existing configuration of the MIM Service and MIM Sync.

Overview

Last year I wrote this post based on an automated solution I implemented to perform nightly backups of a FIM/MIM environment during development.

This post details how I’ve automated another daily task for a large development environment where a number of changes are going on and I wanted documentation generated that detailed the configuration for each day. Partly to quickly work out what has changed when needing to roll back/re-validate changes, and partly to keep the individual configs from each day so they can also be used if we need to roll back.

The process uses an Azure Function App that uses Remote PowerShell into MIM to;

  1. Leverage a modified (streamlined) version of my nightly backup Azure Function to generate the Schema.xml and Policy.xml MIM Service configuration files, and the Lithnet MIIS Automation PowerShell Module installed on the MIM Sync Server to export the MIM Sync Server configuration
  2. Create a sub-directory for each day under the MIM Documenter Tool to hold the daily configs
  3. Execute the generation of the Report and have the Report copied to the daily config/documented solution

Obtaining and configuring the MIM Configuration Documenter

Download the MIM Configuration Documenter from here and extract it to somewhere like c:\FIMDoco on your FIM/MIM Sync Server. In this example in my Dev environment I have the MIM Sync and Service/Portal all on a single server.

Then update the Invoke-Documenter-Contoso.ps1 (or whatever you’ve renamed the script to) to make the following changes;

  • Update the following lines for your version and include the new variable $schedulePath and add it to the $pilotConfig variable. Create the C:\FIMDoco\Customer and C:\FIMDoco\Customer\Dev directories (replace Customer with something appropriate).
######## Edit as appropriate ####################################
$schedulePath = Get-Date -format dd-MM-yyyy
$pilotConfig = "Customer\Dev\$($schedulePath)" # the path of the Pilot / Target config export files relative to the MIM Configuration Documenter "Data" folder.
$productionConfig = "MIM-SP1-Base_4.4.1302.0" # the path of the Production / Baseline config export files relative to the MIM Configuration Documenter "Data" folder.
$reportType = "SyncAndService" # "SyncOnly" # "ServiceOnly"
#################################################################
  • Remark out the Host Settings as these won’t work via a WebJob/Azure Function
#$hostSettings = (Get-Host).PrivateData
#$hostSettings.WarningBackgroundColor = "red"
#$hostSettings.WarningForegroundColor = "white"
  • Remark out the last line as this will be executed as part of the automation and we want it to complete silently at the end.
# Read-Host "Press any key to exit"

It should then look something like this;

Azure Function to Automate execution of the Documenter

As per my nightly backup process;

  • I configured my MIM Sync Server to accept Remote PowerShell Sessions. That involved enabling WinRM, creating a certificate, creating the listener, opening the firewall port and enabling the incoming port on the NSG. You can easily do all that by following my instructions here. From the same post I set up the encrypted password file, uploaded it to my Function App and set the Function App Application Settings for MIMSyncCredUser and MIMSyncCredPassword.
  • I created an Azure PowerShell Timer Function App. Pretty much the same as I show in this post, except choose Timer.
    • I configured my Schedule for 6am every morning using the following CRON configuration
0 0 6 * * *
  • I also needed to increase the timeout for the Azure Function, as the generation of the config files and the execution of the report exceed the default timeout of 5 mins in my environment (19 Management Agents). I increased the timeout to the maximum of 10 mins as detailed here, essentially adding the following to the host.json file in the wwwroot directory of my Function App.
{
 "functionTimeout": "00:10:00"
}

Azure Function PowerShell Timer Script (Run.ps1)

This is the Function App PowerShell Script that uses Remote PowerShell into the MIM Sync/Service Server to export the configuration using the Lithnet MIIS Automation and Microsoft FIM Automation PowerShell modules.

Note: If your MIM Service is on a different host you will need to install the Microsoft FIM Automation PowerShell Module on your MIM Sync Server and update the script below to change references to http://localhost:5725 to whatever your MIM Service host is.
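
As a rough skeleton of what Run.ps1 does (the server name is a placeholder, and the credential handling is simplified here; my original uses the encrypted password file approach from the nightly backup post):

# Skeleton sketch only - credential handling simplified
$username = $env:MIMSyncCredUser
$password = $env:MIMSyncCredPassword | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($username, $password)

# Remote PowerShell into the MIM Sync Server (WinRM HTTPS listener configured earlier)
$session = New-PSSession -ComputerName 'yourMIMSyncServer' -Credential $credential -UseSSL

Invoke-Command -Session $session -ScriptBlock {
    # Export the MIM Service Schema/Policy config (FIMAutomation) and the Sync
    # Server config (Lithnet MIIS Automation) into today's sub-directory, then
    # execute the Documenter script, e.g.
    & C:\FIMDoco\Invoke-Documenter-Contoso.ps1
}
Remove-PSSession $session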

Testing the Function App

With everything configured, manually running the Function App and checking the output window (if you’ve configured everything correctly) will show success in the Logs as shown below. In this environment with 19 Management Agents it takes 7 minutes to run.

Running the Azure Function.PNG

The Report

The outcome every day just after 6am is that I have (via automation);

  • an Export of the Policy and Schema Configuration from my MIM Service
  • an Export of the MIM Sync Server Configuration (the Metaverse and all Management Agents)
  • I have the MIM Configuration Documenter Report generated
  • If I need to roll back changes I have the ability to do that on a daily interval (either for a MIM Service change or an individual Management Agent change)

Under the c:\FIMDoco\Data\Customer\Dev\Report directory is the HTML Configuration Report.

Report Output.PNG

Opening the report in a browser we have the configuration of the MIM Sync and MIM Service.

Report


Exchange Online & Splunk – Automating the solution

NOTES FROM THE FIELD:

I have recently been consulting on what I think is a pretty cool engagement to integrate some Office 365 mailbox data into the Splunk reporting platform.

I initially thought about using a .csv export methodology; however, through trial & error (more error than trial if I’m being honest), and realising that this method still required some manual interaction, I decided to embark on finding a fully automated solution.

The final solution comprises the below components:

  • Splunk HTTP event collector
    • Splunk hostname
    • Token from HTTP event collector config page
  • Azure automation account
    • Azure Run As Account
    • Azure Runbook
    • Exchange Online credentials (registered to the Azure automation account)

I’m not going to run through the creation of the automation account, or required credentials as these had already been created, however there is a great guide to configuring the solution I have used for this customer at  https://www.splunk.com/blog/2017/10/05/splunking-microsoft-cloud-data-part-3.html

What the PowerShell script we are using will achieve is the following:

  • Connect to Azure and Exchange Online – Azure run as account authentication
  • Configure variables for connection to Splunk HTTP event collector
  • Collect mailbox data from the Exchange Online environment
  • Split the mailbox data into parts for faster processing
  • Specify SSL/TLS protocol settings for self-signed cert in test environment
  • Create a JSON object to be posted to the Splunk environment
  • HTTP POST the data directly to Splunk

The Code:

#Clear Existing PS Sessions
Get-PSSession | Remove-PSSession | Out-Null

#Create Split Function for the mailbox collection
function Split-Array {
    param($inArray, [int]$parts, [int]$size)
    if ($parts) {
        $PartSize = [Math]::Ceiling($inArray.Count / $parts)
    }
    if ($size) {
        $PartSize = $size
        $parts = [Math]::Ceiling($inArray.Count / $size)
    }
    $outArray = New-Object 'System.Collections.Generic.List[psobject]'
    for ($i = 1; $i -le $parts; $i++) {
        $start = (($i - 1) * $PartSize)
        $end = (($i) * $PartSize) - 1
        if ($end -ge $inArray.Count) { $end = $inArray.Count - 1 }
        $outArray.Add(@($inArray[$start..$end]))
    }
    return ,$outArray
}

function Connect-ExchangeOnline {
    param(
        $Creds
    )
    #Connect to Exchange Online (note the function parameter $Creds is used here)
    $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection
    $Commands = @("Add-MailboxPermission","Add-RecipientPermission","Remove-RecipientPermission","Remove-MailboxPermission","Get-MailboxPermission","Get-User","Get-DistributionGroupMember","Get-DistributionGroup","Get-Mailbox")
    Import-PSSession -Session $Session -DisableNameChecking:$true -AllowClobber:$true -CommandName $Commands | Out-Null
}

#Create Variables
$SplunkHost = "Your Splunk hostname or IP Address"
$SplunkEventCollectorPort = "8088"
$SplunkEventCollectorToken = "Splunk Token from HTTP Event Collector"
$servicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
$credentials = Get-AutomationPSCredential -Name 'Exchange Online'

#Connect to Azure
Add-AzureRMAccount -ServicePrincipal -Tenant $servicePrincipalConnection.TenantID -ApplicationId $servicePrincipalConnection.ApplicationID -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

#Connect to Exchange Online
Connect-ExchangeOnline -Creds $credentials

#Collect the mailbox data
$mailboxes = Get-Mailbox -ResultSize Unlimited | Select-Object -Property DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Get Current Date & Time
$time = Get-Date -Format s

#Convert Timezone to Australia/Brisbane
$bnetime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($time, [System.TimeZoneInfo]::Local.Id, 'E. Australia Standard Time')

#Add Time Column to Output
$mailboxes = $mailboxes | Select-Object @{expression = {$bnetime}; Name = 'Time'}, DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Split the mailboxes into parts for faster processing
$recipients = Split-Array -inArray $mailboxes -parts 5

#Create JSON objects and HTTP POST them to the Splunk HTTP Event Collector
foreach ($recipient in $recipients) {
    foreach ($r in $recipient) {
        #Create SSL Validation Bypass for Self-Signed Certificate in Testing
        $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
        [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
        [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
        #Build JSON string to post to Splunk
        $StringToPost = "{ `"Time`": `"$($r.Time)`", `"DisplayName`": `"$($r.DisplayName)`", `"PrimarySMTPAddress`": `"$($r.PrimarySmtpAddress)`", `"IsMailboxEnabled`": `"$($r.IsMailboxEnabled)`", `"ForwardingSmtpAddress`": `"$($r.ForwardingSmtpAddress)`", `"GrantSendOnBehalfTo`": `"$($r.GrantSendOnBehalfTo)`", `"ProhibitSendReceiveQuota`": `"$($r.ProhibitSendReceiveQuota)`", `"AddressBookPolicy`": `"$($r.AddressBookPolicy)`" }"
        $uri = "https://" + $SplunkHost + ":" + $SplunkEventCollectorPort + "/services/collector/raw"
        $header = @{"Authorization" = "Splunk " + $SplunkEventCollectorToken}
        #Post to Splunk HTTP Event Collector
        Invoke-RestMethod -Method Post -Uri $uri -Body $StringToPost -Header $header
    }
}
Get-PSSession | Remove-PSSession | Out-Null


The final output that can be seen in Splunk looks like the following:

11/13/17 12:28:22.000 PM
{
   AddressBookPolicy:
   DisplayName: Shane Fisher
   ForwardingSmtpAddress:
   GrantSendOnBehalfTo:
   IsMailboxEnabled: True
   PrimarySMTPAddress: shane.fisher@xxxxxxxx.com.au
   ProhibitSendReceiveQuota: 50 GB (53,687,091,200 bytes)
   Time: 11/13/2017 12:28:22
}

I hope this helps some of you out there.

Cheers,

Shane.


MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, and it covers all data changes, not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration”, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great: it doesn’t require administrative access to install/use, doesn’t impact other installations of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config user.name "Service Account"
git config user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in the gitk history viewer.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script
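
The full script is in the gist cloned earlier into C:\MIMBackup\Backup-MIMConfig; a sketch of its shape, assuming the Lithnet module’s Export-MimSyncServerConfiguration cmdlet, looks like this:

# Sketch of the script's shape; see the gist for the real thing
$env:PSModulePath += ';C:\MIMBackup\Modules'
Import-Module LithnetMIISAutomation
Import-Module FIMAutomation

Set-Location -Path C:\MIMBackup\MIMConfig

# Export the MIM Service policy, schema and portal config to XML (FIMAutomation)
Export-FIMConfig -Uri http://localhost:5725 -PolicyConfig -PortalConfig | ConvertFrom-FIMResource -File C:\MIMBackup\MIMConfig\PolicyConfig.xml
Export-FIMConfig -Uri http://localhost:5725 -SchemaConfig | ConvertFrom-FIMResource -File C:\MIMBackup\MIMConfig\SchemaConfig.xml

# Export the Sync Server configuration (the File => Export Server Configuration equivalent)
Export-MimSyncServerConfiguration -Path C:\MIMBackup\MIMConfig\SyncConfig

# Commit whatever changed to the local repository
$git = 'C:\MIMBackup\PortableGit\cmd\git.exe'
& $git add --all
& $git commit --message "MIM config backup $(Get-Date -Format s)"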

Restoring deleted OneDrive sites in Office365

A customer asked whether it was possible to restore a OneDrive site that had been deleted when the user’s account was marked for deletion in AD. After a bit of research, I was able to restore the site and retrieve the files (luckily it had been deleted less than 30 days ago).


Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Previously, when using Virtual Machines, Web Apps and Azure Functions, that meant implementing methods to obfuscate the credentials stored within them. I touched on one method that I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it and then from the main pane select the Platform Features tab, then select Managed service identity.

Platform Features

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

ManagedServiceIdentity

Back in Platform Features under General Settings select Application Settings. 

General Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there. So let’s keep digging.

App Settings

Under Platform Features select Console.

DevelopmentTools

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment. The next morning they were present, so I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

Endpoint

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using a HttpTrigger PowerShell Function. Give it a name and select Create.

NewFunction

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Vars

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Vault

Select Add new, select Principal, locate your Function App and click Select.

Access Policy 1

As my vault contains multiple credential types, I enabled the policy for Get across all types. Select Ok, then select Save.

Policy - GET

We now have our Function App enabled to access the Key Vault.

Access Policy 2

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Vault Secret URI

Function App Script

Here is my sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update, with the Secret Identifier of the Key Vault Secret you want to retrieve. Ensure you keep the API version at the end (it isn’t in the URI you copy from the Key Vault): /?api-version=2015-06-01
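
A condensed sketch of the pattern the script uses (the secret URI is a placeholder for your Secret Identifier):

# Sketch - the secret URI below is a placeholder (line 12 equivalent)
$vaultSecretURI = 'https://yourvault.vault.azure.net/secrets/yoursecret/?api-version=2015-06-01'

# Request an access token for Key Vault from the local MSI endpoint
$header = @{'Secret' = $env:MSI_SECRET}
$response = Invoke-RestMethod -Method Get -Headers $header -Uri ($env:MSI_ENDPOINT + '?resource=https://vault.azure.net&api-version=2017-09-01')

# Use the token to retrieve the secret from the Key Vault
$authHeader = @{'Authorization' = "Bearer $($response.access_token)"}
$credential = Invoke-RestMethod -Method Get -Headers $authHeader -Uri $vaultSecretURI
$credential.value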

When run the output if you have everything correct will look below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Easier portability of the FIMAutomation PowerShell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use 3rd-party or open-source code, or other tools may expect exports to be in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out, no internet access, just work with what is currently there. I have limited access on the servers and management VM, with only access to 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0).

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted these files and renamed them:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation

New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)

Import-Module .\FIMAutomation.psd1

Pop-Location

Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see everyday, but is also a useful approach to other legacy PSSnapin problems.

Quickly creating and using an Azure Key Vault with PowerShell

Introduction

A couple of weeks back I was messing around with the Azure Key Vault looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up and getting credentials in and out was a little more cumbersome than I thought it should be. At that same point this tweet appeared in my timeline via a retweet. I’m not too sure why, but maybe because I’ve been migrating to VSCode myself, I checked out Axel’s project.

Tweet

Axel’s PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I’m creating vaults, adding and removing credentials in the simplified way I’d wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel’s module. If the Resource Group and/or Storage Account you want associated with your Key Vault doesn’t exist, then it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify if you want to use different names for each.
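
For reference, a rough equivalent of what gets created using the stock AzureRM cmdlets rather than Axel’s module (all names are placeholders):

# Rough stock-AzureRM equivalent; the module wraps this for you
$location = 'AustraliaEast'    # line 8 equivalent
$name = 'myuniquevault01'      # line 10 equivalent - used for the SA, RG and KV

Login-AzureRmAccount
New-AzureRmResourceGroup -Name $name -Location $location
New-AzureRmStorageAccount -ResourceGroupName $name -Name $name -Location $location -SkuName Standard_LRS
New-AzureRmKeyVault -VaultName $name -ResourceGroupName $name -Location $location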

Done, Key Vault created.

Create Azure KeyVault

Key Vault Created

Connect to the Azure Key Vault

This script assumes you’re now in a new session and want to connect to the Key Vault. Again, it is a simplified version whereby the SG, RG and KV names are all the same. Update it for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

To retrieve a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name

Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault

Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credential

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.