
How to configure a Graphical PowerShell Dev/Admin/Support User Interface for Azure/Office365/Microsoft Identity Manager

During the development of an identity management solution I find myself with multiple PowerShell/RDP sessions connected to multiple environments using different credentials, often just to obtain trivial data/information. It is easy to trip yourself up with remote PowerShell sessions to differing environments. If only there was a simple UI that could front-end a set of PowerShell modules and make those simple queries quick and painless, and likewise allow support staff to execute a canned set of queries without providing them elevated permissions.

I figured someone would have already solved this problem, and after some searching with the right keywords I found the powershell-command-executor-ui from bitsofinfo. Looking into it, he had solved a lot of the issues with building a UI front-end for PowerShell with the powershell-command-executor and the stateful-process-command-proxy. That solution provided the framework for what I was thinking. The ability to provide a UI for PowerShell using PowerShell modules, including remote PowerShell, was exactly what I was after. And it was built on NodeJS and AngularJS, so it was simple enough to customize.

Introduction

In this blog post I’ll detail how I’ve leveraged the projects listed above for integration with Azure, Office 365 and Microsoft Identity Manager.

Initially I had a vision of serving up the UI from an Azure WebApp. NodeJS on Azure WebApps is supported, however with all the solution dependencies I just couldn’t get it working.

My fallback was to then look to serve up the UI from a Windows Server 2016 Nano Server. I learnt from my efforts that a number of the PowerShell modules I was looking to provide a UI for have .NET Framework dependencies. Nano Server does not have full .NET Framework support; Microsoft state that to add it would mean the server would no longer be Nano.

For now I’ve deployed an Azure Windows Server 2016 Server secured by an Azure NSG to only allow my machine to access it. More on security later.

Overview

Simply put, the details in GitHub for the powershell-command-executor provide the architecture and integration. What I will detail is the modifications I’ve made to utilise the more recent AzureADPreview PowerShell Module over the MSOL PowerShell Module. I also updated the dependencies of the solution to the latest versions, hooked it into Microsoft Identity Manager, and made a few changes to allow different credentials to be used for Azure and Microsoft Identity Manager.

Getting Started

I highly recommend you start with your implementation on a local development workstation/development virtual machine. When you have a working version you’re happy with you can then look at other ways of presenting and securing it.

NodeJS

NodeJS is the webserver for this solution. Download NodeJS for your Windows host here. I’m using the 64-bit version, but have also implemented the solution on 32-bit. Install NodeJS on your local development workstation/development virtual machine.

You can accept all the defaults.

Following the installation of NodeJS, download the powershell-command-executor-ui from GitHub. Select Clone or Download, then Download ZIP, and save it to your machine.

Right click the download when it has finished and select Extract All. Select Browse and create a folder at the root of C:\ named nodejs. Extract powershell-command-executor-ui.

Locate the c:\nodejs\powershell-command-executor-ui-master\package.json file.

Using an editor such as Notepad++, update the package.json file so that it references the latest versions of the dependencies for the solution.

From an elevated (Administrator) command prompt in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\program files\nodejs\npm” install. This will read the package.json file you edited and download the dependencies for the solution.

You can see in the screenshot below NodeJS has downloaded all the items in package.json including the powershell-command-executor and stateful-process-command-proxy.

When you now list the directories under C:\nodejs\powershell-command-executor-ui-master\node_modules you will see those packages and all their dependencies.

We can now test that we have a working PowerShell UI NodeJS website. From an elevated command prompt whilst still in the c:\nodejs\powershell-command-executor-ui-master directory run “c:\Program Files\nodejs\node.exe” bin\www

Open a browser on the same host and go to http://localhost:3000. You should see the default UI.

Configuration and Customization

Now it is time to configure and customize the PowerShell UI for our needs.

The files we are going to edit are:

  • C:\nodejs\powershell-command-executor-ui-master\routes\index.js
    • Update the paths to the encrypted credentials files used to connect to Azure and MIM. We’ll create the encrypted credentials files soon.
  • C:\nodejs\powershell-command-executor-ui-master\public\console.html
    • Update for your customizations for CSS etc.
  • C:\nodejs\powershell-command-executor-ui-master\node_modules\powershell-command-executor\O365Utils.js
    • Update for PowerShell Modules to Import
    • Update for Commands to make available in the UI

We also need to get a couple of PowerShell Modules installed on the host so they are available to the site. The two I’m using I’ve mentioned earlier. With WMF5 installed, we can simply install them using PowerShell as per the commands below.

Install-Module AzureADPreview
Install-Module LithnetRMA

In order to connect to our Microsoft Identity Manager Synchronization Server we are going to need to enable Remote PowerShell on it. This post I wrote here details all the setup tasks to make that work. Test that you can connect via Remote PowerShell to your MIM Sync Server before updating the scripts below.
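
For a quick sanity check outside the UI, something like this sketch (the server name is a placeholder) confirms Remote PowerShell is working:

# Sketch: open a remote session to the MIM Sync Server and run a trivial command
$mimSyncServer = "mimsync.mydomain.com"
$creds = Get-Credential      # Domainname\Username format
$session = New-PSSession -ComputerName $mimSyncServer -Credential $creds
Invoke-Command -Session $session -ScriptBlock { $env:COMPUTERNAME }
Remove-PSSession $session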

Likewise for the Microsoft Identity Manager Service Server. Make sure after installing the LithnetRMA Powershell Module you can connect to the MIM Service using something similar to:

# Import LithnetRMA PS Module
Import-Module LithnetRMA

# MIM AD User Admin
$username = "mimadmin@mim.mydomain.com"
# Password
$password = "Secr3tSq1rr3l!" | ConvertTo-SecureString -AsPlainText -Force
# PS Creds
$credentials = New-Object System.Management.Automation.PSCredential $username,$password

# Connect to the FIM service instance
# Requires an inbound rule for TCP 5725 (or your MIM Service Server port) in your Resource Group Network Security Group config
Set-ResourceManagementClient -BaseAddress http://mymimportalserver:5725 -Credentials $credentials


\routes\index.js

This file details the encrypted credentials the site uses. You will need to generate the encrypted credentials for your environment. You can do this using the powershell-credentials-encryption-tools. Download that script to your workstation and unzip it. Open the credentialEncryptor.ps1 script using an Administrator PowerShell ISE session.

I’ve changed the index.js to accept two sets of credentials. This is because your Azure Admin Credentials are going to be different from your MIM Administrator Credentials (both in name and password). The username for my Azure account looks something like myname@mycompany.com whereas for MIM it is Domainname\Username.

Provide an account name for your Azure environment and the associated password.

The tool will create the encrypted credential files.

Rename the encrypted.credentials file to whatever makes sense for your environment. I’ve renamed it creds1.encrypted.credentials.

Now we re-run the script to create another set of encrypted credentials. This time for Microsoft Identity Manager. Once created, rename the encrypted.credentials file to something that makes sense in your environment. I’ve renamed the second set to creds2.encrypted.credentials.
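
If you’d rather script the renames than do them in Explorer, a couple of Rename-Item lines (run from the directory holding the generated files) will do it:

# Rename the Azure credentials file after the first run of credentialEncryptor.ps1
Rename-Item -Path .\encrypted.credentials -NewName creds1.encrypted.credentials
# Re-run credentialEncryptor.ps1 with the MIM credentials, then rename again
Rename-Item -Path .\encrypted.credentials -NewName creds2.encrypted.credentials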

We now need to copy the following files to your UI Website C:\nodejs\powershell-command-executor-ui-master directory:

  • creds1.encrypted.credentials
  • creds2.encrypted.credentials
  • decryptUtil.ps1
  • secret.key

Navigate back to routes\index.js and open the file in an editor such as Notepad++.

Update the index.js file for the path to your credentials files. We also need to add in the additional credentials file.

The changes to the file are the paths to the files we just copied above, along with the additional variable PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE for the second set of credentials used for Microsoft Identity Manager.

var PATH_TO_DECRYPT_UTILS_SCRIPT = "C:\\nodejs\\powershell-command-executor-ui-master\\decryptUtil.ps1";
var PATH_TO_ENCRYPTED_CREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds1.encrypted.credentials";
var PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE = "C:\\nodejs\\powershell-command-executor-ui-master\\creds2.encrypted.credentials";
var PATH_TO_SECRET_KEY = "C:\\nodejs\\powershell-command-executor-ui-master\\secret.key";


Also update initCommands to pass through the additional credentials file:


initCommands: o365Utils.getO365PSInitCommands(
 PATH_TO_DECRYPT_UTILS_SCRIPT,
 PATH_TO_ENCRYPTED_CREDENTIALS_FILE,
 PATH_TO_ENCRYPTED_RPSCREDENTIALS_FILE,
 PATH_TO_SECRET_KEY,
 10000,30000,3600000),

Here is the full index.js file for reference.


public/console.html

The public/console.html file is for formatting and associated UI components. The key things I’ve updated are the Bootstrap and AngularJS versions. Those are contained in the top of the html document. A summary is below.

<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.1/angular.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.6.1/angular-resource.min.js"></script>
<script src="javascripts/ui-bootstrap-tpls-2.4.0.min.js"></script>
<script src="javascripts/console.js"></script>
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap-theme.min.css">

You will also need to download the updated Bootstrap UI (ui-bootstrap-tpls-2.4.0.min.js). I’m using v2.4.0 which you can download from here. Copy it to the javascripts directory.

I’ve also updated the table types, buttons, colours, header, logo etc in the appropriate locations (CSS, tables, divs etc). Here is my full file for reference. You’ll need to update it for your colours, branding etc.

powershell-command-executor\O365Utils.js

Finally the O365Utils.js file. This contains the commands that will be displayed along with their options, as well as the connection information for your Microsoft Identity Manager environment.

You will need to change:

  • Line 52 for the address of your MIM Sync Server
  • Line 55 for the addresses of your MIM Service Server
  • Line 141 on-wards for what commands and parameters for those commands you want to make available in the UI

Here is an example with a couple of AzureAD commands, a MIM Sync and a MIM Service command.

Show me my PowerShell UI Website

Now that we have everything configured, let’s start the site and browse to it. If you haven’t stopped the NodeJS site from earlier, go to the command window and press Ctrl+C a couple of times. Run “c:\Program Files\nodejs\node.exe” bin\www again from the C:\nodejs\powershell-command-executor-ui-master directory (unless you have restarted the host and now have NodeJS in your environment path).

In a browser on the same host go to http://localhost:3000 again and you should see the site as it is below.

Branding and styling from the console.html, menu options from the o365Utils.js and when you select a command and execute it data from the associated service …….

… you can see results. From the screenshot below a Get-AzureADUser command for the associated search string executed in milliseconds.


Summary

The powershell-command-executor-ui from bitsofinfo is a very extensible and powerful NodeJS website as a front-end to PowerShell.

With a few tweaks and updates the look and feel can be easily changed along with the addition of any powershell commands that you wish to have a UI for.

As it sits though, keep in mind you have a UI with hard-coded credentials that can run whatever commands you expose.

Personally I am running one for my use only and I have it hosted in Azure in its own Resource Group with an NSG allowing outgoing traffic to Azure and my MIM environment. Incoming traffic is only allowed from my personal management workstation’s IP address. I also needed to allow port 3000 into the server on the NSG, as well as on the firewall on the host. I did the latter quickly using the command below.

# Enable the WebPort NodeJS is using on the firewall 
netsh advfirewall firewall add rule name="NodeJS WebPort 3000" dir=in action=allow protocol=TCP localport=3000
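
For the NSG side, here’s a sketch of the equivalent rule via AzureRM PowerShell; the NSG name, resource group, priority and source IP are placeholders for your environment.

# Sketch: allow TCP 3000 inbound from a single management IP on the NSG
$nsg = Get-AzureRmNetworkSecurityGroup -Name "myNSG" -ResourceGroupName "myResourceGroup"
$nsg | Add-AzureRmNetworkSecurityRuleConfig -Name "Allow-NodeJS-3000" `
        -Access Allow -Protocol Tcp -Direction Inbound -Priority 110 `
        -SourceAddressPrefix "203.0.113.10/32" -SourcePortRange "*" `
        -DestinationAddressPrefix "*" -DestinationPortRange "3000" |
    Set-AzureRmNetworkSecurityGroup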

Follow Darren on Twitter @darrenjrobinson

Completing an Exchange Online Hybrid individual MoveRequest for a mailbox in a migration batch

I can’t remember for certain, however I would say that since at least Exchange Server 2010 Hybrid there has been the ability to manually complete a MoveRequest (via PowerShell) from on-premises to Exchange Online for a mailbox that was within a migration batch. It’s really important for all customers to have this feature and something I have used on every enterprise migration to Exchange Online.

What are we trying to achieve here?

With enterprise customers and the potential for thousands of mailboxes to move from on-premises to Exchange Online, business analysts get their “kid in a candy store” on and sift through data to come up with relationships between mailboxes, so these mailboxes can be grouped together in migration batches for synchronised cutovers.

This is the tried and true method to ensure that not only business units, departments or teams cutover around the same timeframe, but also any specialised mailbox and shared mailbox relationships. This keeps business processes working nicely and with minimal disruption to users.

Sidebar – As of March 2016, it is now possible to have permissions to shared mailboxes cross organisations from cloud to on-premises, in hybrid deployments with Exchange Server 2013 or 2016 on-premises. Cloud mailboxes can access on-premises shared mailboxes, which can streamline migrations where not all shared mailboxes need to be cutover with their full access users.

Reference: https://blogs.technet.microsoft.com/mconeill/2016/03/20/shared-mailboxes-in-exchange-hybrid-now-work-cross-premises/

How can we achieve this?

In the past, and from what I had been doing for what feels like the last four years, this required one or two PowerShell cmdlets. The main reference that I’ve recently gone to is Paul Cunningham’s ExchangeServerPro blog post, which conveniently is also the main Google search result. The two lines of PowerShell are as follows:

Step 1 – Change the Suspend When Ready to Complete switch to $false or turn it off

Get-MoveRequest mailbox@domain.com | Set-MoveRequest -SuspendWhenReadyToComplete:$false

Step 2 – Resume the move request to complete the cutover of the mailbox

Get-MoveRequest mailbox@domain.com | Resume-MoveRequest

Often I don’t even do step 1 mentioned above. I just resumed the move request. It all worked and got me the result I was after. This was across various migration projects with both Exchange Server 2010 and Exchange Server 2013 on-premises.

Things changed recently!?!? I’ve been working with a customer on their transition to Exchange Online for the last month and we’re now up to moving a pilot set of mailboxes to the cloud. Things were going great, as expected, nothing out of the ordinary. That is, until I wanted to cutover one initial mailbox in our first migration batch.

Why didn’t this work?

It’s been a few months, three quarters of a year, since I’ve done an Office 365 Exchange Online mail migration. I did the trusted Google and found Paul’s blog post. Had that “oh yeah, that’s it” moment and remembered the PowerShell cmdlets. Our pilot user was cutover! Wrong…

The mailbox went from a status of “synced” to “incrementalsync” and back to “synced” with no success. I ran through the PowerShell again. No bueno. Here’s a screen grab of what I was seeing.

SERIOUS FACE > Some names and identifying details have been changed to protect the privacy of individuals

And yes, that is my PowerShell colour theme. I have to get my “1337 h4x0r” going and make it green and stuff.

I see where this is going. There’s a workaround. So Lucian, hurry up and tell me!

Yes, dear reader, you would be right. After a bit of back and forth with Microsoft, there is indeed a solution. It still involves two PowerShell cmdlets, but there are two additional parameters. One of those parameters is actually hidden: -PreventCompletion. The other is a date and time value normally represented like “28/11/2016 12:00:00 PM”.

Let’s cut to the chase; here are the two cmdlets:

Get-MoveRequest -Identity mailbox@domain.com | Set-MoveRequest -SuspendWhenReadyToComplete:$false -preventCompletion:$false -CompleteAfter 5
Get-MoveRequest -Identity mailbox@domain.com | Resume-MoveRequest

After running that, the mailbox, along with another four to be sure it worked, had all successfully cutover to Exchange Online. My -CompleteAfter value of “5” was meant to be 5 minutes. It could likely be set more correctly with a proper date and time value of the current date and time plus 5 minutes. For me, it worked with the above PowerShell.
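
For reference, a sketch of that date-and-time variant (my assumption, not something I ran) would be:

Get-MoveRequest -Identity mailbox@domain.com |
    Set-MoveRequest -SuspendWhenReadyToComplete:$false -PreventCompletion:$false `
        -CompleteAfter (Get-Date).AddMinutes(5)
Get-MoveRequest -Identity mailbox@domain.com | Resume-MoveRequest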

Final words

I know it’s been some time since I moved mailboxes from on-premises to the cloud, however, it’s like riding a bike: you never forget. In this case, I knew the option was there to do it. I pestered Office 365 support for two days. I hope the awesome people there don’t hate me for it. Although, in the long run, it’s good that I did, as figuring this out and using this method of manual mailbox cutover for synced migration batches is crucial to any transition to the cloud.

Enjoy!

Lucian

How to assign and remove user Office365 licenses using the AzureADPreview Powershell Module

A couple of months ago the AzureADPreview module was released. The first cmdlet that I experimented with was Set-AzureADUserLicense. It didn’t work, there were no working examples, and I gave up and used the Graph API instead.

Since then the AzureADPreview has gone through a number of revisions and I’ve been messing around a little with each update. The Set-AzureADUserLicense cmdlet has been my litmus test. Now that I have both removing and assigning Office 365 licenses working I’ll save others the pain of working it out and give a couple of working examples.


If like me you have been experimenting with the AzureADPreview module you’ll need to force the install of the newest one. And for whatever reason I was getting an error informing me that it wasn’t signed. As I’m messing around in my dev sandpit I skipped the publisher check.
Install-Module -Name AzureADPreview -MinimumVersion 2.0.0.7 -Force -SkipPublisherCheck
Import-Module AzureADPreview -RequiredVersion 2.0.0.7

Removing an Office 365 License from a User

Removing a license with Set-AzureADUserLicense looks something like this.
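
Here’s a sketch using the AzureAD module’s AssignedLicenses object (the UPN and SKU part number are placeholders for your tenant):

# Sketch: remove a single license from a user
$user = Get-AzureADUser -ObjectId "user@mytenant.onmicrosoft.com"
$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.RemoveLicenses = (Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq "ENTERPRISEPACK" }).SkuId
Set-AzureADUserLicense -ObjectId $user.ObjectId -AssignedLicenses $licenses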

What if there are multiple licenses? Similar concept, but just looping through each one to remove.
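
A sketch of that loop (same placeholder UPN):

# Sketch: strip every license the user currently holds
$user = Get-AzureADUser -ObjectId "user@mytenant.onmicrosoft.com"
foreach ($assigned in $user.AssignedLicenses) {
    $licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
    $licenses.RemoveLicenses = $assigned.SkuId
    Set-AzureADUserLicense -ObjectId $user.ObjectId -AssignedLicenses $licenses
}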

Assigning an Office 365 License to a User

Now that we have the removal of licenses sorted, how about adding licenses?

Assigning a license with Set-AzureADUserLicense looks something like this:
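
Again as a sketch, with a placeholder UPN and SKU part number:

# Sketch: assign a license to a user
$user = Get-AzureADUser -ObjectId "user@mytenant.onmicrosoft.com"
$license = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicense
$license.SkuId = (Get-AzureADSubscribedSku | Where-Object { $_.SkuPartNumber -eq "ENTERPRISEPACK" }).SkuId
$licenses = New-Object -TypeName Microsoft.Open.AzureAD.Model.AssignedLicenses
$licenses.AddLicenses = $license
Set-AzureADUserLicense -ObjectId $user.ObjectId -AssignedLicenses $licenses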

Moving forward, this AzureAD PowerShell Module will replace the older MSOL Module, as I wrote about here. If you’re writing new scripts it’s a good time to start using the new modules.

Follow Darren on Twitter @darrenjrobinson


How to make a copy of a virtual machine running Windows in Azure


I was called upon recently to help a customer create copies of some of their Windows virtual machines. The idea was to quickly deploy copies of these hosts at any time as opposed to using a system image or point in time copy.

The following PowerShell will therefore allow you to make a copy or clone of a Windows virtual machine using a copy of its disks, in Azure Resource Manager mode.

Create a new virtual machine from a copy of the disks of another

Having finalized the configuration of the source virtual machine the steps required are as follows.

  1. Stop the source virtual machine, then using Storage Explorer copy its disks to a new location and rename them in line with the target name of the new virtual machine.

  2. Run the following in PowerShell making the required configuration changes.

Login-AzureRmAccount
Get-AzureRmSubscription -SubscriptionName "<subscription-name>" | Select-AzureRmSubscription

$location = (Get-AzureRmLocation | Out-GridView -PassThru).Location
$rgName = "<resource-group>"
$vmName = "<vm-name>"
$nicname = "<nic-name>"
$subnetID = "<subnetID>"
$datadisksize = "<sizeinGB>"
$vmsize = (Get-AzureRmVMSize -Location $location | Out-GridView -PassThru).Name
$osDiskUri = "https://<storage-account>.blob.core.windows.net/vhds/<os-disk-name.vhd>"
$dataDiskUri = "https://<storage-account>.blob.core.windows.net/vhds/<data-disk-name.vhd>"

Notes: The URIs above belong to the copies, not the original disks, and the SubnetID refers to its resource ID.

$nic = New-AzureRmNetworkInterface -Name $nicname -ResourceGroupName $rgName -Location $location -SubnetId $subnetID
$vmConfig = New-AzureRmVMConfig -VMName $vmName -VMSize $vmsize
$vm = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic.Id
$osDiskName = $vmName + "os-disk"
$vm = Set-AzureRmVMOSDisk -VM $vm -Name $osDiskName -VhdUri $osDiskUri -CreateOption attach -Windows
$dataDiskName = $vmName + "data-disk"
$vm = Add-AzureRmVMDataDisk -VM $vm -Name $dataDiskName -VhdUri $dataDiskUri -Lun 0 -Caching 'none' -DiskSizeInGB $datadisksize -CreateOption attach
New-AzureRmVM -ResourceGroupName $rgName -Location $location -VM $vm

List virtual machines in a resource group.

$vmList = Get-AzureRmVM -ResourceGroupName $rgName
$vmList.Name

Having run the above, log on to the new host in order to make the required changes.

Automate ADFS Farm Installation and Configuration

Originally posted on Nivlesh’s blog @ nivleshc.wordpress.com

Introduction

In this multi-part blog, I will be showing how to automatically install and configure a new ADFS Farm. We will accomplish this using Azure Resource Manager templates, Desired State Configuration scripts and Custom Script Extensions.

Overview

We will use Azure Resource Manager to create a virtual machine that will become our first ADFS Server. We will then use a desired state configuration script to join the virtual machine to our Active Directory domain and to install the ADFS role. Finally, we will use a Custom Script Extension to install our first ADFS Farm.

Install ADFS Role

We will be using the xActiveDirectory and xPendingReboot experimental DSC modules.

Download these from

https://gallery.technet.microsoft.com/scriptcenter/xActiveDirectory-f2d573f3

https://gallery.technet.microsoft.com/scriptcenter/xPendingReboot-PowerShell-b269f154

After downloading, unzip the file and place the contents in the PowerShell modules directory located at $env:ProgramFiles\WindowsPowerShell\Modules (unless you have changed your systemroot folder, this will be located at C:\Program Files\WindowsPowerShell\Modules).

Open your Windows PowerShell ISE and let’s create a DSC script that will join our virtual machine to the domain and also install the ADFS role.

Copy the following into a new Windows Powershell ISE file and save it as a filename of your choice (I saved mine as InstallADFS.ps1)

In the above, we are declaring some mandatory parameters and some variables that will be used within the script.

$MachineName is the hostname of the virtual machine that will become the first ADFS server

$DomainName is the name of the domain where the virtual machine will be joined

$AdminCreds contains the username and password for an account that has permissions to join the virtual machine to the domain

$RetryCount and $RetryIntervalSec hold values that will be used to check if the domain is available

We need to import the experimental DSC modules that we had downloaded. To do this, add the following lines to the DSC script

Import-DscResource -Module xActiveDirectory, xPendingReboot

Next, we need to convert the supplied $AdminCreds into a domain\username format. This is accomplished by the following lines (the converted value is held in $DomainCreds).
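
The line typically used for this in ARM DSC scripts is (a sketch consistent with the parameters above):

# Sketch: prefix the supplied username with the domain name
[System.Management.Automation.PSCredential]$DomainCreds = New-Object System.Management.Automation.PSCredential ("${DomainName}\$($AdminCreds.UserName)", $AdminCreds.Password)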

Next, we need to tell DSC that the command needs to be run on the local computer. This is done by the following line (localhost refers to the local computer)

Node localhost

We need to tell the LocalConfigurationManager that it should reboot the server if needed, continue with the configuration after reboot, and apply the settings only once. (DSC can apply a setting and constantly monitor it to check that it has not been changed; if the setting is found to be changed, DSC can re-apply it. In our case we will not do this, we will apply the setting just once.)
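
Inside the Node block, that looks something like this sketch:

# Sketch: LCM settings matching the behaviour just described
LocalConfigurationManager
{
    RebootNodeIfNeeded = $true
    ActionAfterReboot  = 'ContinueConfiguration'
    ConfigurationMode  = 'ApplyOnly'
}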

Next, we need to check if the Active Directory domain is ready. For this, we will use the xWaitForADDomain function from the xActiveDirectory experimental DSC module.
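
A sketch of the wait resource, wired to the parameters declared earlier:

# Sketch: wait for the AD domain to become available before joining
xWaitForADDomain DscForestWait
{
    DomainName           = $DomainName
    DomainUserCredential = $DomainCreds
    RetryCount           = $RetryCount
    RetryIntervalSec     = $RetryIntervalSec
}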

Once we know that the Active Directory domain is available, we can go ahead and join the virtual machine to the domain.

The JoinDomain resource depends on xWaitForADDomain; if xWaitForADDomain fails, JoinDomain will not run.

Once the virtual machine has been added to the domain, it needs to be restarted. We will use xPendingReboot function from the xPendingReboot experimental DSC module to accomplish this.

Next, we will install the ADFS role on the virtual machine.
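
A sketch using the built-in WindowsFeature DSC resource (ADFS-Federation is the feature name for the role):

# Sketch: install the ADFS role
WindowsFeature installADFS
{
    Ensure = "Present"
    Name   = "ADFS-Federation"
}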

Our script has now successfully added the virtual machine to the domain and installed the ADFS role on it. Next, create a zip file with InstallADFS.ps1 and upload it to a location that Azure Resource Manager can access (I would recommend uploading to GitHub). Include the xActiveDirectory and xPendingReboot experimental DSC module directories in the zip file as well. Also add a folder called Certificates inside the zip file and put the ADFS certificate and the encrypted password files (discussed in the next section) inside the folder.

In the next section, we will configure the ADFS Farm.

The full InstallADFS.ps1 DSC script is pasted below

Create ADFS Farm

Once the ADFS role has been installed, we will use Custom Script Extensions (CSE) to create the ADFS farm.

One of the requirements to configure ADFS is a signed certificate. I used a 90 day trial certificate from Comodo.

There is a trick that I am using to make my certificate available on the virtual machine. If you bootstrap a DSC script to your virtual machine in an Azure Resource Manager template, the script along with all the non out-of-box DSC modules have to be packaged into a zip file and uploaded to a location that ARM can access. ARM will then download the zip file, unzip it, and place all directories inside the zip file into $env:ProgramFiles\WindowsPowerShell\Modules (C:\Program Files\WindowsPowerShell\Modules). ARM assumes the directories are PowerShell modules and puts them in the appropriate directory.

I am using this feature to sneak my certificate on to the virtual machine. I create a folder called Certificates inside the zip file containing the DSC script and put the certificate inside it. Also, I am not too fond of passing plain passwords from my ARM template to the CSE, so I created two files, one to hold the encrypted password for the domain administrator account and the other to contain the encrypted password of the adfs service account. These two files are named adminpass.key and adfspass.key and will be placed in the same Certificates folder within the zip file.

I used the following to generate the encrypted password files.
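
A sketch of that generation (assuming $AdminPlainTextPassword and $ADFSPlainTextPassword hold the plain text values, and $Key is the array shown below):

# Sketch: encrypt each plain text password to a file using the shared key
$AdminPlainTextPassword | ConvertTo-SecureString -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File adminpass.key
$ADFSPlainTextPassword | ConvertTo-SecureString -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key | Out-File adfspass.key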

AdminPlainTextPassword and ADFSPlainTextPassword are the plain text passwords that will be encrypted.

$Key is used to convert the secure string into an encrypted standard string. Valid key lengths are 16, 24, and 32 bytes.

For this blog, we will use

$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Open Windows PowerShell ISE and paste the following (save the file with a name of your choice. I saved mine as ConfigureADFS.ps1)

param (
 $DomainName,
 $DomainAdminUsername,
 $AdfsSvcUsername
)

These are the parameters that will be passed to the CSE

$DomainName is the name of the Active Directory domain
$DomainAdminUsername is the username of the domain administrator account
$AdfsSvcUsername is the username of the ADFS service account

Next, we will define the value of the Key that was used to encrypt the password and the location where the certificate and the encrypted password files will be placed

$localpath = "C:\Program Files\WindowsPowerShell\Modules\Certificates\"
$Key = (3,4,2,3,56,34,254,222,1,1,2,23,42,54,33,233,1,34,2,7,6,5,35,43)

Now we have to read the encrypted passwords from the adminpass.key and adfspass.key files and then convert them into a domain\username format.
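
A sketch of that step (the variable names are my own):

# Sketch: decrypt each .key file and build domain-qualified PSCredential objects
$adminPass  = Get-Content ($localpath + "adminpass.key") | ConvertTo-SecureString -Key $Key
$adminCreds = New-Object System.Management.Automation.PSCredential ("$DomainName\$DomainAdminUsername", $adminPass)
$adfsPass   = Get-Content ($localpath + "adfspass.key") | ConvertTo-SecureString -Key $Key
$adfsCreds  = New-Object System.Management.Automation.PSCredential ("$DomainName\$AdfsSvcUsername", $adfsPass)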

Next, we will import the certificate into the local computer certificate store. We will mark the certificate exportable and set the password same as the domain administrator password.
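
Something along these lines (the PFX filename is a placeholder; $adminPass is the secure string from the previous sketch):

# Sketch: import the PFX as exportable into the local machine store
$cert = (Import-PfxCertificate -FilePath ($localpath + "adfscert.pfx") `
            -CertStoreLocation Cert:\LocalMachine\My -Exportable -Password $adminPass).Thumbprint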

In the above, after the certificate is imported, $cert is used to hold the certificate thumbprint.

Next, we will configure the ADFS Farm.
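
A sketch of the farm build (the cmdlet comes from the ADFS module; the display name and service name match the values noted below):

# Sketch: configure the first node of the ADFS farm
Import-Module ADFS
Install-AdfsFarm -CertificateThumbprint $cert `
    -FederationServiceDisplayName "Active Directory Federation Service" `
    -FederationServiceName "fs.adfsfarm.com" `
    -ServiceAccountCredential $adfsCreds `
    -Credential $adminCreds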

The ADFS Federation Service display name is set to “Active Directory Federation Service” and the Federation Service Name is set to fs.adfsfarm.com.

Upload the CSE to a location that Azure Resource Manager can access (I uploaded my script to GitHub)

The full ConfigureADFS.ps1 CSE is shown below

Azure Resource Manager Template Bootstrapping

Now that the DSC and CSE scripts have been created, we need to add them in our ARM template, straight after the virtual machine is provisioned.

To add the DSC script, create a DSC extension and link it to the DSC Package that was created to install ADFS. Below is an example of what can be used

The extension will run after the ADFS virtual machine has been successfully created (referred to as ADFS01VMName)

The MachineName, DomainName and domain administrator credentials are passed to the DSC extension.

Below are the variables that have been used in the json file for the DSC extension (I have listed my GitHub repository location)

Next, we have to create a Custom Script Extension to link to the CSE for configuring ADFS. Below is an example that can be used

The CSE depends on the ADFS virtual machine being successfully provisioned and the DSC extension that installs the ADFS role to have successfully completed.

The DomainName, Domain Administrator Username and the ADFS Service Username are passed to the CSE script

The following contains a list of the variables being used by the CSE (the example below shows my GitHub repository location)

"repoLocation": "https://raw.githubusercontent.com/nivleshc/arm/master/",
"ConfigureADFSScriptUrl": "[concat(parameters('repoLocation'),'ConfigureADFS.ps1')]",

That’s it Folks! You now have an ARM Template that can be used to automatically install the ADFS role and then configure a new ADFS Farm.

In my next blog, we will explore how to add another node to the ADFS Farm and we will also look at how we can automatically create a Web Application Proxy server for our ADFS Farm.

Creating Organizational Units and Groups in AD with GUID

A recent client of Kloud wanted the ability to create new organizational units and groups automatically, with a unique ID (GUID) for each organizational unit. The groups created needed to share the GUID of the OU.

In this blog, I will demonstrate how you could achieve the aforementioned, through a simple PowerShell script, naturally.

Before you start however, you may have to run PowerShell (run as Administrator), and execute the following cmdlet:

Set-ExecutionPolicy RemoteSigned

This is to allow PowerShell scripts to run on the computer.

How to define GUID?

To define the GUID in PowerShell, we use the [GUID] type accelerator: [guid]::NewGuid()


Defining the OU, and Groups.

First part of the script consists of the following parameters:


param()
#Define GUID

$Guid = [guid]::NewGuid()

This is to define the GUID, in which a unique ID resembling this “0fda3c39-c5d8-424f-9bac-654e2cf045ed” will be added to the OU, and groups within that OU.


#Define OU Name

$OUName = (Read-Host 'Input OU name') + '-' + $Guid


Defining the name of the OU. This needs an administrator’s input.



#Define Managers Group Name

$MGroupPrefix = (Read-Host -Prompt 'Input Managers Group name')

if ($MGroupPrefix) {write-host 'The specified Managers Name will be used.' -foreground 'yellow'}

else {

$MGroupPrefix = 'Managers';

Write-Host 'Default Managers Group name will be used.' -foreground 'yellow'

}

$MGroupPrefix = $MGroupPrefix + '-' + $Guid

$MGroupPrefix

This is to define the name of the first group (Managers); it will also be followed by a “-” and the GUID.

If the Administrator selects a name, then the selected name will be used.


#Define Supervisors Group Name

$SGroupPrefix = (Read-Host -Prompt 'Input Supervisors Group name')

if ($SGroupPrefix) {write-host 'The specified Supervisors Group name will be used.' -foreground 'yellow'}

else {

$SGroupPrefix = 'Supervisors'

Write-Host 'Default Supervisors name will be used.' -foreground 'yellow'

}

$SGroupPrefix = $SGroupPrefix + '-' + $Guid

$SGroupPrefix

This will define the second group name, “Supervisors”; again, it will be followed by “-” and a GUID.

In this section however, I didn’t specify a name, to demonstrate how the default name “Supervisors” + “-” + GUID comes into place, as defined in the “else” block. Simply press “Enter”.

Defining the OU Name, group name, and path

#Defining Main OU Name, OUs, and path

$OUPath = 'OU=Departments,DC=contoso,DC=lab'

$GroupsOUName = 'Groups'

$GroupOUPath = 'OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'

$UsersOUName = 'Users'

$UsersOUPath = 'OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'

The above parameters define the OU Path in which the main OU will be created, e.g. Finance.

It will also create two other OUs, “Groups” and “Users”, in the specified path.

Defining the Groups (that will be created in “Groups” OU), the group names and path.


#Defining Groups, groups name and path

$MGroupPath = 'OU=Groups, OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'

$MGroupSamAccountName = 'Managers-' + $Guid

$MGroupDisplayName = 'Managers'

$SGroupPath = 'OU=Groups, OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab' 

$SGroupSamAccountName = 'Supervisors-' + $Guid

$SGroupDisplayName = 'Supervisors'

The above parameters define the groups’ path, SamAccountName and Display Name; the groups will be created in the main OU, e.g. Finance.

Creating the Organizational units, and groups.

As you may have noticed, everything that we have defined above is just parameters. Nothing will be created without calling the cmdlets.


#Creating Tenant OU, Groups & Users OU, and Admins Groups

Import-Module ActiveDirectory

New-ADOrganizationalUnit -Name $OUName -Path $OUPath

New-ADOrganizationalUnit -Name $GroupsOUName -Path $GroupOUPath

New-ADOrganizationalUnit -Name $UsersOUName -Path $UsersOUPath

New-ADGroup -Name $MGroupPrefix -SamAccountName $MGroupSamAccountName -GroupCategory Security -GroupScope Global -DisplayName $MGroupDisplayName -Path $MGroupPath -Description "Members of this group are Managers of $OUName"

New-ADGroup -Name $SGroupPrefix -SamAccountName $SGroupSamAccountName -GroupCategory Security -GroupScope Global -DisplayName $SGroupDisplayName -Path $SGroupPath -Description "Members of this group are Supervisors of $OUName"

Once completed you will have the following OU/Department


And in the Groups OU, you will have two groups: Managers (as defined in the previous step) and “SupervisorsGroup”, the default name, as the administrator did not specify a name for the group.


Notice how the GUID is shared across the Departments, and the Groups.

Putting it all together:

Note: The SYNOPSIS, DESCRIPTION, EXAMPLE, and NOTES are used for get-help.


<#

.SYNOPSIS
This is a PowerShell script to create Organizational Units (OUs), and groups.

.DESCRIPTION
The script will create:
1) An OU, defined by the administrator. The OU name is defined by the administrator, followed by a unique ID generated through GUID. It will be created in a specific OU path.
2) Two OUs consisting of 'Groups' and 'Users'.
3) Two groups: 'Managers' and 'Supervisors'. These names are the default names, followed by the GUID. An administrator can define group names as desired; the GUID will be added by default.

.EXAMPLE
./CreateOUs.ps1

.NOTES
This PowerShell script should be run as administrator.

#>

param()

#Define GUID
$Guid = [guid]::NewGuid()

#Define OU Name
$OUName  = (Read-Host 'Input OU name') + '-' + $Guid

#Define Managers Group Name
$MGroupPrefix = (Read-Host -Prompt 'Input Managers Group name')
if ($MGroupPrefix) {write-host 'The specified Managers Name will be used.' -foreground 'yellow'}
else {
$MGroupPrefix = 'Managers'
Write-Host 'Default Managers Group name will be used' -foreground 'yellow'
}
$MGroupPrefix = $MGroupPrefix + '-' + $Guid

$MGroupPrefix
#Define Supervisors Group Name
$SGroupPrefix = (Read-Host -Prompt 'Input Supervisors Group name')
if ($SGroupPrefix) {write-host 'The specified Supervisors Group name will be used.' -foreground 'yellow'}
else {
$SGroupPrefix = 'SupervisorsGroup'
Write-Host 'Default SupervisorsGroup name will be used' -foreground 'yellow'
}

$SGroupPrefix = $SGroupPrefix + '-' + $Guid

$SGroupPrefix

#Defining Main OU Name, OUs, and path
$OUPath  = 'OU=Departments,DC=contoso,DC=lab'
$GroupsOUName = 'Groups'
$GroupOUPath = 'OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'
$UsersOUName = 'Users'
$UsersOUPath = 'OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'

#Defining Groups, groups name and path
$MGroupPath = 'OU=Groups, OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'
$MGroupSamAccountName = 'Managers-' + $Guid
$MGroupDisplayName = 'Managers'
$SGroupPath = 'OU=Groups, OU=' + $OUName + ',OU=Departments,DC=contoso,DC=lab'
$SGroupSamAccountName = 'Supervisors-' + $Guid
$SGroupDisplayName = 'Supervisors'

#Creating Tenant OU, Groups & Users OU, and Admins Groups
Import-Module ActiveDirectory

New-ADOrganizationalUnit -Name $OUName -Path $OUPath

New-ADOrganizationalUnit -Name $GroupsOUName -Path $GroupOUPath

New-ADOrganizationalUnit -Name $UsersOUName -Path $UsersOUPath

New-ADGroup -Name $MGroupPrefix -SamAccountName $MGroupSamAccountName -GroupCategory Security -GroupScope Global -DisplayName $MGroupDisplayName -Path $MGroupPath -Description "Members of this group are Managers of $OUName"

New-ADGroup -Name $SGroupPrefix -SamAccountName $SGroupSamAccountName -GroupCategory Security -GroupScope Global -DisplayName $SGroupDisplayName -Path $SGroupPath -Description "Members of this group are Supervisors of $OUName"



Leveraging the Microsoft Graph API with PowerShell and OAuth 2.0

Background

Microsoft Graph is the evolution of the APIs into Microsoft Cloud Services. For me, not being a developer, a key difference is interacting with the Graph API using OAuth 2.0 via PowerShell. Through a number of my previous posts I’ve interacted with the Graph API using client libraries such as the Microsoft.IdentityModel.Clients.ActiveDirectory library. This post details using PowerShell to talk directly to the Graph API and managing Authentication and Authorization using OAuth 2.0 and an Azure WebApp.

Leveraging the Graph API opens up access to the continually evolving Azure services as shown in the graphic below. Source graph.microsoft.io

Getting started with the Graph API, PowerShell and OAuth 2.0

The key difference between using a client library and going direct is you need to register and configure an Azure WebApp. It is super simple to do. Jump on over to the Office 365 App Registration Tool here. Sign in with an account associated with the Azure Tenant you are going to interact with. Depending on what you’re doing you’ll need to select the appropriate access.

Here are the settings I selected for access to user profiles. Give the WebApp a name; this is the name that you’ll see in the OAuth Authorization step later on. I’ve highlighted the other key settings. Don’t worry about the SignIn and Redirect URLs other than configuring HTTPS, as we’ll be using PowerShell to access the WebApp.

Once you’ve registered the WebApp successfully you’ll get a Client ID and Client Secret. RECORD/WRITE THESE DOWN NOW as you won’t get access to your Client Secret again.

If you want to change what your WebApp has access to after creating it you can access it via the Classic Azure Portal. Select your Active Directory => Select Applications => Select the WebApp you created earlier  => Select Configure => (scroll to the bottom) Select Add Application. Depending on what you have in your subscription you can add access to your services as shown below.

Authenticating & Authorizing

In order to access the Graph API we need to get our Authorization Code. We only need to do this the first time.

This little script (modify with your Client ID and your Client Secret obtained earlier when we registered our WebApp) will prompt you to authenticate.
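
A minimal sketch of the idea is below. It builds the authorize URL (note the &prompt=admin_consent discussed shortly) and launches a browser to sign in; the redirect URI here is a placeholder and must match your WebApp registration.

# Sketch: build the OAuth 2.0 authorize URL and launch a browser to sign in
$clientId    = "<your Client ID>"
$redirectUri = [System.Uri]::EscapeDataString("https://localhost")
$resource    = [System.Uri]::EscapeDataString("https://graph.microsoft.com")
$authUrl = "https://login.microsoftonline.com/common/oauth2/authorize" +
           "?response_type=code&client_id=$clientId&redirect_uri=$redirectUri" +
           "&resource=$resource&prompt=admin_consent"
Start-Process $authUrl
# After authorising, copy the ?code= value from the URL you are redirected to - that is your AuthCode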

Authenticate using the account associated with the Web App you registered in the previous step.

You’ll be requested to authorize your application.

You will then have your AuthCode.

One thing to note above is admin_consent. This comes from the URL passed to get the Authorization Code: &prompt=admin_consent grants Admin Consent to all entities configured on the WebApp, rather than just access for and to a single user.

Accessing the Graph API with OAuth 2.0 Access Token

Using your ClientID, ClientSecret and AuthCode you can now get your access token. I tripped up here and was getting Invoke-RestMethod : {“error”:”invalid_client”,”error_description”:”AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided.”} Tracing back my steps and looking at my “Client Secret” I noticed the special characters in it that I hadn’t URL encoded. After doing that I was successfully able to get my access token.
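
The token request itself looks something like this sketch ($clientSecret and $authCode are assumed to hold the values recorded earlier; note the client secret being URL encoded in the body):

# Sketch: exchange the AuthCode for an access token
$body = "grant_type=authorization_code&client_id=$clientId" +
        "&client_secret=$([System.Uri]::EscapeDataString($clientSecret))" +
        "&code=$authCode&redirect_uri=$redirectUri&resource=$resource"
$Authorization = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/common/oauth2/token" `
    -ContentType "application/x-www-form-urlencoded" -Body $body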

Looking at our AuthZ we can see that the Scope is what was selected when registering the WebApp.

Now using the Access Token I can query the Graph API. Here is getting my AAD Object.
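
A sketch of that call (the /me endpoint returns the signed-in user’s object):

# Sketch: query the Graph API with the Bearer token from above
Invoke-RestMethod -Method Get -Uri "https://graph.microsoft.com/v1.0/me" `
    -Headers @{ Authorization = "Bearer $($Authorization.access_token)" }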

Summary

In my case I can now access all users via the API. Here is what’s available. Using the Access Token and modifying the Invoke-RestMethod URI and Method (including -Body if you are doing a Post/Patch action) you are ready to rock and roll, all via PowerShell.

Your Access Token is valid for an hour. Before then you will need to refresh it. Just run the $Authorization = Invoke-RestMethod ….. line again. 

Follow Darren on Twitter @darrenjrobinson



Leveraging the PowerBI Beta API for creating PowerBI Tables with Relationships via PowerShell

If anyone actually reads my posts you will have noticed that I’ve been on a bit of a deep dive into PowerBI and how I can use it to provide visualisation of data from Microsoft Identity Manager (here via CSV, and here via API). One point I noticed going direct to PowerBI via the API (v1.0) though was that it is not possible to provide relationships (joins) between tables within datasets (you can via PowerBI Desktop). After a lot of digging and searching I’ve worked out how to actually define the relationships between tables in a dataset via the API and PowerShell. This post is about that.

Overview

In order to define relationships between tables in a dataset (via the API), there are a couple of key points to note:

  • You’ll need to modify the PowerBIPS PowerShell Module to leverage the Beta API
    • see the screenshot in the “How to” section

Prerequisites

To use PowerBI via PowerShell and the PowerBI API you’ll need to get:

How to leverage the PowerBI Beta API

In addition to the prerequisites above, in order to leverage the PowerShell PowerBI module for the PowerBI Beta API I just changed the beta flag to $True in the PowerBI PowerShell (PowerBIPS.psm1) module to make all calls on the Beta API. See the screenshot below. It will probably be located somewhere like ‘C:\Program Files\WindowsPowerShell\Modules\PowerBIPS’.

Example Dataset Creation

The sample script below will create a new Dataset in PowerBI with two tables. Each table holds “Internet of Things” environmental data: one for data from a Seeed WioLink IoT device located outside, and one for inside. Both tables contain a column with a DateTime that is extracted before the sensors are read; their data is then added to their respective tables referencing that DateTime.

That is where the ‘relationships’ part of the schema comes in. The options for the direction of the relationship filter are OneDirection (default), BothDirections and Automatic. For this simple example I’m using OneDirection on DateTime.
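
To illustrate, here’s a cut-down sketch of a dataset definition with a relationship on DateTime (table names are my own; the shape follows the PowerBI dataset REST schema):

# Sketch: two tables joined OneDirection on DateTime
$dataset = @{
    name   = "IoTEnvironment"
    tables = @(
        @{ name = "OutsideReadings"; columns = @(
            @{ name = "DateTime"; dataType = "DateTime" },
            @{ name = "Temperature"; dataType = "Double" }) },
        @{ name = "InsideReadings"; columns = @(
            @{ name = "DateTime"; dataType = "DateTime" },
            @{ name = "Temperature"; dataType = "Double" }) }
    )
    relationships = @(
        @{ name = "OutsideToInside"
           fromTable = "OutsideReadings"; fromColumn = "DateTime"
           toTable = "InsideReadings"; toColumn = "DateTime"
           crossFilteringBehavior = "OneDirection" }
    )
}
$datasetJson = $dataset | ConvertTo-Json -Depth 6   # body for the dataset creation call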

To put some data into the tables here is my simple script. I only have a single IoT unit hooked up so I’m fudging the more constant Indoor Readings. The Outside readings are coming from a Seeed WioLink unit.

Visualisation

So there we have it. Something I thought would have been part of the minimum viable product (MVP) release of the API is available in the Beta, and using PowerShell we can define relationships between tables in a dataset and use that in our visualisation.


Follow Darren on Twitter @darrenjrobinson

Office365 Licensing Management Agent for Microsoft Identity Manager

Licensing for Office365 has always been a moving target for enterprise customers. Over the years I’ve implemented a plethora of solutions to keep licensing consistent with entitlement logic. For some customers this is as simple as everyone getting, say, an E3 license. For other institutions there is often a mix of ‘E’ and ‘K’ licenses depending on EmployeeType.

Using the Granfeldt PowerShell Management Agent to import Office365 Licensing info

In this blog post I detail how I’m using Søren Granfeldt’s extremely versatile PowerShell Management Agent yet again. This time to import Office365 licensing information into Microsoft Identity Manager.

I’m bringing in the licenses associated with users as attributes on the user account. I’m also bringing in the licenses from the tenant as their own ObjectType into the Metaverse. This includes the information about each license such as how many licenses have been purchased, how many licenses have been issued etc.

Overview

I’m not showing assigning licenses. In the schema I have included the LicensesToAdd and LicensesToRemove attributes. Check out my Adding/Removing User Office365 Licences using PowerShell and the Azure AD Graph RestAPI post to see how to assign and remove licenses using Powershell. From that you can workout your logic to implement an Export flow to manage Office365 licenses.

Getting Started with the Granfeldt PowerShell Management Agent

If you don’t already have it, what are you waiting for? Go get it from here. Søren’s documentation is pretty good, but does assume you have a working knowledge of FIM/MIM, and this blog post is no different.

Three items I had to work out that I’ll save you the pain of are:

  • You must have a Password.ps1 file. Even though we’re not doing password management on this MA, the PS MA configuration requires a file for this field. The .ps1 doesn’t need to have any logic/script inside it. It just needs to be present
  • The credentials you give the MA to run this MA are the credentials for the account that has permissions to the Office365 Tenant. Just a normal account is enough to enumerate it, but you’ll need additional permissions to assign/remove licenses.
  • The path to the scripts in the PS MA Config must not contain spaces and be in old-skool 8.3 format. I’ve chosen to store my scripts in an appropriately named subdirectory under the MIM Extensions directory. Tip: from a command shell use dir /x to get the 8.3 directory format name. Mine looks like C:\PROGRA~1\MICROS~2\2010\SYNCHR~1\EXTENS~2\O365Li~1

Schema.ps1

My Schema is based around the core Office365 Licenses function. You’ll need to create a number of corresponding attributes in the Metaverse Schema on the Person ObjectType to flow the attributes into. You will also need to create a new ObjectType in the Metaverse for the O365 Licenses. I named mine LicensePlans. Use the Schema info below for the attributes that will be imported and the attribute object types to make sure what you create in the Metaverse aligns, so you can import the values. Note the attributes that are multi-valued.

Import.ps1

I’m not going to document the logic which the Import.ps1 implements here, as this post, Enumerating all Users/Groups/Contacts in an Azure tenant using PowerShell and the Azure Graph API ‘odata.nextLink’ paging function, goes into all the details.

Password Script (password.ps1)

Empty as not implemented

Export.ps1

Empty as not implemented

Management Agent Configuration

As per the tips above, the format for the script paths must be without spaces etc. I’m using 8.3 format and I’m using an Office 365 account to connect to Office365 and import the user and license data.

As per the Schema script earlier in this post I’m bringing user licensing metadata as well as the Office365 Tenant Licenses info.

Attributes to bring through aligned with what is specified in the Schema file.

Flow through the attributes to the attributes I created in the Metaverse on the Person ObjectType and to the new ObjectType LicensePlans.

Wiring it up

To finish it up you’ll need to do the usual tasks of creating run profiles, staging the connector space from Office365 and importing into the Metaverse.

Enjoy.

Follow Darren on Twitter @darrenjrobinson

How to quickly recover a FAILED AzureRM Virtual Machine using PowerShell

Problem

I have a development sandpit in Azure which I use quite a lot to test and mess with different ideas and concepts. This week when shutting it down things didn’t go that smoothly. All but one virtual machine stopped and de-allocated; that last virtual machine just didn’t make it. I tried resizing the VM. I tried changing its configuration, and obviously tried starting it up many times via the portal and PowerShell, all without any success.

I realised I was at the point where I just needed to build a new VM but I wanted to attach the OS vhd from the failed VM to it so I could recover my work and not have to re-install the applications and tools on the OS drive.

Solution

I created a little script that does the basics of what I needed to recover a simple single-HDD VM and keep most of the core settings. If your VM has multiple disks you can probably attach the other HDDs manually after you have a working VM again.

Via the AzureRM Portal select your busted VM and get the values highlighted in the screenshot below for VM Name, VM Resource Group and VM VNet Subnet.

The Script then:

  • Takes the values for the information you got above as variables for the;
    • Broken VM Name
    • Broken VM Resource Group
    • Broken VM VNet Subnet
  • Queries the Failed VM and obtains the
    • VM Name
    • VM Location
    • VM Sizing
    • VM HDD information (location etc)
    • VM Networking
  • Copies the VHD from the failed virtual machine to a new VHD file, so you don’t have to bite the bullet and kill the failed VM off straight away to release the VHD from the VM (see the sketch after this list).
    • Names the copied VHD <BustedVMName-TodaysDate(YYYYMMDD)>
  • Creates a new Virtual Machine with;
    • Original Name suffixed by ‘2’
    • Same VM Sizing as the failed VM
    • Same VM Location as the failed VM
    • Assigns the copy of the failed VM OS Disk to the new VM
    • Creates a new Public IP for the VM named <NEWVMName-IP2>
    • Creates a new NIC for the VM named <NEWVMName-NIC2>
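
As a rough sketch of the disk-copy step above (the storage account, key and blob names are placeholders):

# Sketch: copy the failed VM's OS disk blob to a new date-stamped VHD
$date = Get-Date -Format "yyyyMMdd"
$ctx  = New-AzureStorageContext -StorageAccountName "<storage-account>" -StorageAccountKey "<key>"
Start-AzureStorageBlobCopy -Context $ctx -SrcContainer "vhds" -SrcBlob "BustedVM.vhd" `
    -DestContainer "vhds" -DestBlob "BustedVM-$date.vhd" -DestContext $ctx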

Summary

With only three inputs, using this script you can quickly recover your busted and broken failed AzureRM virtual machine and be back up and running quickly. Once you’ve verified your VM is all good, delete the failed one.

Take this script and change lines 2, 4 and 6 to reflect the Resource Group, Virtual Machine Name and Virtual Machine Subnet of the failed VM (or the VM you want to clone). Step through it and let it loose.

Follow Darren on Twitter @darrenjrobinson