Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials to connect to the Azure Key Vault to retrieve credentials. Until now, using Virtual Machines, Web Apps and Azure Functions meant having to implement methods to obfuscate the credentials stored within them. I touched on one method I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it, then from the main pane select the Platform Features tab, then select Managed service identity.

Platform Features

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

ManagedServiceIdentity

Back in Platform Features under General Settings select Application Settings. 

General Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there. So let's keep digging.

App Settings

Under Platform Features select Console.

DevelopmentTools

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment; the next morning they were present. So I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity, and that it takes more than a couple of hours.

Endpoint

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using an HttpTrigger PowerShell Function. Give it a name and select Create.

NewFunction

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Vars

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Policies from the left menu list.

Vault

Select Add new, select Principal, locate your Function App and click Select.

Access Policy 1

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Policy - GET

We now have our Function App enabled to access the Key Vault.

Access Policy 2

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Vault Secret URI

Function App Script

Here is my sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update, for the Key Vault Secret that you want to retrieve. Ensure you still have the API version at the end (which isn’t in the URI you copy from the Key Vault): /?api-version=2015-06-01
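The embedded script itself isn't reproduced here, but a minimal sketch of the approach is below: request a token for Key Vault from the local MSI endpoint, then call the Secret Identifier you copied above. The vault and secret names are placeholders.

# MSI variables exposed to the Function App via Application Settings
$msiEndpoint = $env:MSI_ENDPOINT
$msiSecret   = $env:MSI_SECRET

# Request an access token for Key Vault from the local MSI endpoint
$tokenUri  = $msiEndpoint + "?resource=https://vault.azure.net&api-version=2017-09-01"
$tokenAuth = Invoke-RestMethod -Method Get -Uri $tokenUri -Headers @{ Secret = $msiSecret }

# Retrieve the secret - replace with your own Secret Identifier and keep the api-version on the end
$secretUri = "https://<yourVault>.vault.azure.net/secrets/<yourSecret>/?api-version=2015-06-01"
$secret    = Invoke-RestMethod -Method Get -Uri $secretUri -Headers @{ Authorization = "Bearer $($tokenAuth.access_token)" }

# The credential value retrieved from the Key Vault
$secret.value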

When run, if you have everything correct, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Ok Google Email me the status of all vms – Part 2

First published at https://nivleshc.wordpress.com

In my last blog, we configured the backend systems necessary for accomplishing the task of asking Google Home “OK Google Email me the status of all vms” and it sending us an email to that effect. If you haven’t finished doing that, please refer back to my last blog and get that done before continuing.

In this blog, we will configure Google Home.

Google Home uses Google Assistant to do all the smarts. You will be amazed at all the tasks that Google Home can do out of the box.

For our purposes, we will be using the platform IF This Then That or IFTTT for short. IFTTT is a very powerful platform as it lets you create actions based on triggers. This combination of triggers and actions is called a recipe.

Ok, let's dig in and create our IFTTT recipe to accomplish our task.

1.1   Go to https://ifttt.com/ and create an account (if you don’t already have one)

1.2   Login to IFTTT and click on My Applets menu from the top

IFTTT_MyApplets_Menu

1.3   Next, click on New Applet (top right hand corner)

1.4   A new recipe template will be displayed. Click on the blue + to choose a service

IFTTT_Reicipe_Step1

1.5   Under Choose a Service type “Google Assistant”

IFTTT_ChooseService

1.6   In the results Google Assistant will be displayed. Click on it

1.7   If you haven’t already connected IFTTT with Google Assistant, you will be asked to do so. When prompted, login with the Google account that is associated with your Google Home and then approve IFTTT to access it.

IFTTT_ConnectGA

1.8   The next step is to choose a trigger. Click on Say a simple phrase

IFTTT_ChooseTrigger

1.9   Now we will put in the phrases that Google Home should trigger on.

IFTTT_CompleteTrigger

For

  • What do you want to say? enter “email me the status of all vms”
  • What do you want the Assistant to say in response? enter “no worries, I will send you the email right away”

All the other sections are optional; however, you can fill them in if you prefer to do so.

Click Create trigger

1.10   You will be returned to the recipe editor. To choose the action service, click on + that

IFTTT_That

1.11  Under Choose action service, type webhooks. From the results, click on Webhooks

IFTTT_ActionService

1.12   Then for Choose action click on Make a web request

IFTTT_Action_Choose

1.13   Next the Complete action fields screen is shown.

For

  • URL – paste the webhook url of the runbook that you had copied in the previous blog
  • Method – change this to POST
  • Content Type – change this to application/json

IFTTT_CompleteActionFields

Click Create action

1.14   In the next screen, click Finish

IFTTT_Review

 

Woo hoo. Everything is now complete. Let's do some testing.

Go to your Google Home and say “email me the status of all vms”. Google Home should reply by saying “no worries. I will send you the email right away”.

I have noticed some delays in receiving the email; however, the most I have had to wait is 5 minutes. If this is unacceptable, modify the Send-MailMessage command in the runbook script by adding the parameter -Priority High. This sends all emails with high priority, which should make things faster. Also, the runbook is currently running in Azure; better performance might be achieved by using Hybrid Runbook Workers.

To monitor the status of the automation jobs, or to access their logs, in the Azure Automation Account, click on Jobs in the left hand side menu. Clicking on any one of the jobs shown will provide more information about that particular job. This can be helpful during troubleshooting.

Automation_JobsLog

There you go. All done. I hope you enjoy this additional task you can now do with your Google Home.

If you don’t own a Google Home yet, you can do the above automation using Google Assistant as well.

Ok Google Email me the status of all vms – Part 1

First published at https://nivleshc.wordpress.com

Technology is evolving at a breathtaking pace. For instance, the phone in your pocket has more grunt than the desktop computers of 10 years ago!

One of the upcoming areas in Computer Science is Artificial Intelligence. What seemed like science fiction in the days of Isaac Asimov, when he penned I, Robot, seems closer to reality now.

Lately the market is popping up with virtual assistants from the likes of Apple, Amazon and Google. These are “bots” that use Artificial Intelligence to help us with our daily lives, from telling us about the weather, to reminding us about our shopping lists or letting us know when our next train will be arriving. I still remember my first virtual assistant Prody Parrot, which hardly did much when you compare it to Siri, Alexa or Google Assistant.

I decided to test drive one of these virtual assistants, and so purchased a Google Home. First impressions, it is an awesome device with a lot of good things going for it. If only it came with a rechargeable battery instead of a wall charger, it would have been even more awesome. Well maybe in the next version (Google here’s a tip for your next version 😉 )

Having played with Google Home for a bit, I decided to look at ways of integrating it with Azure, and I was pleasantly surprised.

In this two-part blog, I will show you how you can use Google Home to send an email with the status of all your Azure virtual machines. This functionality can be extended to stop or start all virtual machines; however, I would caution against doing this in your production environment, in case you turn off a machine that is running critical workloads.

In this first blog post, we will setup the backend systems to achieve the tasks and in the next blog post, we will connect it to Google Home.

The diagram below shows how we will achieve what we have set out to do.

Google Home Workflow

Below is a list of tasks that will happen

  1. Google Home will trigger when we say “Ok Google email me the status of all vms”
  2. As Google Home uses Google Assistant, it will pass the request to the IFTTT service
  3. IFTTT will then trigger the webhooks service to call a webhook url attached to an Azure Automation Runbook
  4. A job for the specified runbook will then be queued up in Azure Automation.
  5. The runbook job will then run, and obtain a status of all vms.
  6. The output will be emailed to the designated recipient

Ok, enough talking 😉 let's get cracking.

1. Create an Azure AD Service Principal Account

In order to run our Azure Automation runbook, we need to create a security object for it to run under. This security object provides permissions to access the Azure resources. For our purposes, we will be using a service principal account.

Assuming you have already installed the Azure PowerShell module, run the following in a PowerShell session to login to Azure

Import-Module AzureRm
Login-AzureRmAccount

Next, to create an Azure AD Application, run the following command

$adApp = New-AzureRmADApplication -DisplayName "DisplayName" -HomePage "HomePage" -IdentifierUris "http://IdentifierUri" -Password "Password"

where

DisplayName is the display name for your AD Application eg “Google Home Automation”

HomePage is the home page for your application eg http://googlehome (or you can ignore the -HomePage parameter as it is optional)

IdentifierUri is the URI that identifies the application eg http://googleHomeAutomation

Password is the password you will give the service principal account

Now, let's create the service principal for the Azure AD Application

New-AzureRmADServicePrincipal -ApplicationId $adApp.ApplicationId

Next, we will give the service principal account read access to the Azure subscription. If you need something more restrictive, please find the appropriate role from https://docs.microsoft.com/en-gb/azure/active-directory/role-based-access-built-in-roles

New-AzureRmRoleAssignment -RoleDefinitionName Reader -ServicePrincipalName $adApp.ApplicationId

Great, the service principal account is now ready. The username for your service principal is actually the ApplicationId suffixed by your Azure AD domain name. To get the ApplicationId, run the following, providing the IdentifierUri that was supplied when creating it above

Get-AzureRmADApplication -IdentifierUri {identifierUri}

Just to be pedantic, let's check to ensure we can log in to Azure using the newly created service principal account and the password. To test, run the following commands (when prompted, supply the username for the service principal account and the password that was set when it was created above)

$cred = Get-Credential 
Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantId {TenantId}

where TenantId is your Azure Tenant’s ID
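If you're not sure of your Tenant ID, it is returned with your subscription details while you are still logged in with your own account; for example:

# Returns the TenantId associated with your subscription(s)
(Get-AzureRmSubscription).TenantId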

If everything was setup properly, you should now be logged in using the service principal account.

2. Create an Azure Automation Account

Next, we need an Azure Automation account.

2.1   Login to the Azure Portal and then click New

AzureMarketPlace_New

2.2   Then type Automation and click search. From the results click the following.

AzureMarketPlace_ResultsAutomation

2.3   In the next screen, click Create

2.4   Next, fill in the appropriate details and click Create

AutomationAccount_Details

3. Create a SendGrid Account

Unfortunately Azure doesn’t provide relay servers that can be used by scripts to email out. Instead you have to either use EOP (Exchange Online Protection) servers or SendGrid to achieve this. SendGrid is an Email Delivery Service that Azure provides, and you need to create an account to use it. For our purposes, we will use the free tier, which allows the delivery of 2500 emails per month, which is plenty for us.

3.1   In the Azure Portal, click New

AzureMarketPlace_New

3.2   Then search for SendGrid in the marketplace and click on the following result. Next click Create

AzureMarketPlace_ResultsSendGrid

3.3   In the next screen, for the pricing tier, select the free tier and then fill in the required details and click Create.

SendGridAccount_Details

4. Configure the Automation Account

Inside the Automation Account, we will be creating a Runbook that will contain our PowerShell script that will do all the work. The script will be using the Service Principal and SendGrid accounts. To ensure we don’t expose their credentials inside the PowerShell script, we will store them in the Automation Account under Credentials, and then access them from inside our PowerShell script.

4.1   Go into the Automation Account that you had created.

4.2   Under Shared Resources click Credentials

AutomationAccount_Credentials

4.3    Click on Add a credential and then fill in the details for the Service Principal account. Then click Create

Credentials_Details

4.4   Repeat step 4.3 above to add the SendGrid account

4.5   Now that the Credentials have been stored, under Process Automation click Runbooks

Automation_Runbooks

Then click Add a runbook and in the next screen click Create a new runbook

4.6   Give the runbook an appropriate name. Change the Runbook Type to PowerShell. Click Create

Runbook_Details

4.7   Once the Runbook has been created, paste the following script inside it, click on Save and then click on Publish

Import-Module Azure
$cred = Get-AutomationPSCredential -Name 'Service Principal account'
$mailerCred = Get-AutomationPSCredential -Name 'SendGrid account'

Login-AzureRmAccount -Credential $cred -ServicePrincipal -TenantID {tenantId}

$outputFile = $env:TEMP+ "\AzureVmStatus.html"
$vmarray = @()

#Get a list of all vms 
Write-Output "Getting a list of all VMs"
$vms = Get-AzureRmVM
$total_vms = $vms.count
Write-Output "Done. VMs Found $total_vms"

$index = 0
# Add info about VM's to the array
foreach ($vm in $vms){ 
 $index++
 Write-Output "Processing VM $index/$total_vms"
 # Get VM Status
 $vmstatus = Get-AzurermVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Status

# Add values to the array:
 $vmarray += New-Object PSObject -Property ([ordered]@{
 ResourceGroupName=$vm.ResourceGroupName
 Name=$vm.Name
 OSType=$vm.StorageProfile.OSDisk.OSType
 PowerState=(get-culture).TextInfo.ToTitleCase(($vmstatus.statuses)[1].code.split("/")[1])
 })
}
$vmarray | Sort-Object PowerState,OSType -Desc

Write-Output "Converting Output to HTML" 
$vmarray | Sort-Object PowerState,OSType -Desc | ConvertTo-Html | Out-File $outputFile
Write-Output "Converted"

$fromAddr = "senderEmailAddress"
$toAddr = "recipientEmailAddress"
$subject = "Azure VM Status as at " + (Get-Date).toString()
$smtpServer = "smtp.sendgrid.net"

Write-Output "Sending Email to $toAddr using server $smtpServer"
Send-MailMessage -Credential $mailerCred -From $fromAddr -To $toAddr -Subject $subject -Attachments $outputFile -SmtpServer $smtpServer -UseSsl
Write-Output "Email Sent"

where

  • ‘Service Principal Account’ and ‘SendGrid Account’ are the names of the credentials that were created in the Automation Account (include the ‘ ‘ around the name)
  • senderEmailAddress is the email address that the email will show it came from. Keep the domain of the email address same as your Azure domain
  • recipientEmailAddress is the email address of the recipient who will receive the list of vms

4.8   Next, we will create a Webhook. A webhook is a special URL that will allow us to execute the above script without logging into the Azure Portal. Treat the webhook URL like a password since whoever possesses the webhook can execute the runbook without needing to provide any credentials.

Open the runbook that was just created and from the top menu click on Webhook

Webhook_menu

4.9   In the next screen click Create new webhook

4.10  A security message will be displayed informing that once the webhook has been created, the URL will not be shown anywhere in the Azure Portal. IT IS EXTREMELY IMPORTANT THAT YOU COPY THE WEBHOOK URL BEFORE PRESSING THE OK BUTTON.

Enter a name for the webhook and when you want the webhook to expire. Copy the webhook URL and paste it somewhere safe. Then click OK.

Once the webhook has expired, you can’t use it to trigger the runbook, however before it expires, you can change the expiry date. For security reasons, it is recommended that you don’t keep the webhook alive for a long period of time.

Webhook_details
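Before wiring the webhook up to IFTTT in Part 2, you can sanity-check it from PowerShell. A quick sketch is below; the URL is a placeholder for the one you just copied.

# Paste in the webhook URL you copied when creating the webhook
$webhookUrl = "https://s1events.azure-automation.net/webhooks?token=<your token>"

# Trigger the runbook - a successful call returns the Id(s) of the queued automation job
$response = Invoke-RestMethod -Method Post -Uri $webhookUrl
$response.JobIds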

That's it folks! The stage has been set and we have successfully configured the backend systems to handle our task. Give yourselves a big pat on the back.

Follow me to the next blog, where we will use the above with IFTTT, to bring it all together so that when we say “OK Google, email me the status of all vms”, an email is sent out to us with the status of all the vms 😉

I will see you in Part 2 of this blog. Ciao 😉

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing when it works straight out of the box, which it does when you’re in the same domain. Getting it working across Forests though can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM, this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewalls in between, then you’ll have additional hoops to jump through.

GALSync

This article here is the most definitive I’ve found about what is required, but it isn’t easily found even when you know it exists. In the last few months I’ve had to set up GALSync with FIM/MIM a number of times, and I know I’ll be needing to do it again in the future. So here is my consolidated version of the process, using PowerShell to make the configuration changes. If nothing else it’ll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites all sorted. They are pretty clear in the linked article above, such as a one-way cross-forest trust and connectivity on the necessary ports if there are firewalls in between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Give that account (temporarily) Remote Desktop permissions to the remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to, or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be opened from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/Shell session to enable Kerberos authentication and allow unencrypted traffic
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true 
  1. then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must have Kerberos authentication enabled also. Run the following commands in an elevated (Administrator) Powershell ISE/Shell session. The first is to enable Kerberos.
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server
    • Set-item wsman:localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic
    • set-item wsman:\localhost\client\AllowUnencrypted -value true 
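With both ends configured, a quick check from the MIM Sync Server that WinRM on the CAS Server answers Kerberos authentication can save time before the full verification below. A sketch; substitute your Exchange CAS FQDN and the ADMA service account.

# Credentials for the GALSync AD MA service account (NetBIOSDOMAIN\Username)
$creds = Get-Credential

# Confirm the WinRM service on the Exchange CAS Server responds to Kerberos authentication
Test-WSMan -ComputerName ExchangeCASFQDN -Authentication Kerberos -Credential $creds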

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote Powershell Tab
  2. enter the ExchangeCASFQDN for the Computer field
  3. enter the  Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for User name in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote powershell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and your AD MA Service account has the necessary permissions in Exchange) run the following script line-by-line. If you have configured Remote PowerShell correctly and have met all the prerequisites you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://ExchangeCASFQDN/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

Receive Push Notifications from Microsoft Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment a daily automated process was executing but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. However Push Notifications I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP Server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google Account that your team can use to connect their devices with. Now that you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test that you can access the Push Bullet API using your Access Token and PowerShell. Update the following script with your Access Token (line 3) and execute it. You should see information returned that is associated with your new Push Bullet account.
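The embedded test script isn't shown here; a minimal sketch of it, assuming the standard Push Bullet REST API, looks like the following (replace the Access Token placeholder with your own).

# Push Bullet API base URI and the Access Token from your Account Settings page (placeholder)
$apiURI      = "https://api.pushbullet.com/"
$accessToken = "o.xxxxxxxxxxxxxxxxxxxxxxxx"
$header      = @{ 'Access-Token' = $accessToken }

# Return the details of the account associated with the Access Token
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")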

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query to get the devices connected to the account. In the same PowerShell session you tested the API with above run this API call

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was device two, so I could get its target Iden with:

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one I’ll be using. More info on the other types is here.

Sending Test Notification

To perform a notification test, update the following script with your Access Token (line 3). I’ve omitted the Device Identifier so the message is sent to all devices. I also had to log out of the iOS Push Bullet App and log in again to get the notifications to show.
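Again the embedded script isn't reproduced; a sketch of the push itself is below, reusing the $apiURI and $header variables from the API test above. The title and body text are just examples.

# Build a simple 'note' push - omit device_iden to send to all devices on the account
$push = @{
    type  = 'note'
    title = 'MIM Automation Notification'
    body  = 'Test push notification sent from PowerShell'
    # device_iden = $iphoneIden   # uncomment to target a single device
} | ConvertTo-Json

Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $push -ContentType 'application/json'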

Success. I received the notification on my iPhone and also in my Chrome browser.

IMG-5194

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected, I updated my PowerShell Function App (based off this blog post here) to evaluate what it processed and, if it didn’t find what was expected, send me a notification. I already had some error handling in my implementation based off that blog post, but it only covered full failure, not partial failure (which is what I was experiencing, whereby only one part of the process wasn’t returning data).

NOTE: I had to also add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification as I was getting the dreaded following PowerShell Invoke-RestMethod / Invoke-WebRequest error when sending the notification via the Function App. I didn’t get that error on my dev workstation which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps) insert this line before your invoke-restmethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Exchange Online – Mapi over Http Transition

Microsoft has announced that from 31st October 2017, Outlook clients using the RPC over HTTP protocol to connect to Office 365 will no longer be supported; only MAPI over HTTP clients will work from then on. This announcement has left many administrators wondering: what exactly does that mean for my organization? What actions are required to avoid any business impact? Is it time to update Outlook clients, and up to what level? And last but not least, how can I verify that all necessary steps have been taken to ensure business as usual? Let's try to answer these questions one by one.

So what does this announcement mean for your organization? In simple words, any Outlook client which still uses RPC over HTTP to connect to Office 365 will stop working and hence needs to be updated where possible. This means that Outlook 2007 and earlier versions will be retired and will no longer be able to connect to Exchange Online. This requires the following actions from Office 365 administrators.

  1. Update Outlook 2007 or earlier versions of Outlook to the latest Outlook version.
  2. For Outlook 2010 and higher, the minimum required updates are as follows:
  • Office 2016 – the December 8, 2015 update (build 16.0.6568.20xx for subscription installs; 16.0.4312.1001 for MSI installs)
  • Office 2013 – Office 2013 Service Pack 1 (SP1) and the December 8, 2015 update (build 15.0.4779.1002)
  • Office 2010 – Office 2010 Service Pack 2 (SP2) and the December 8, 2015 update (build 14.0.7164.5002)

Note: The December 8, 2015 updates for Office are listed in Microsoft Knowledge Base article 3121650: “December 8, 2015, update for Office”. It is recommended that you keep Outlook clients updated with the most recent product updates, as several MAPI over HTTP issues have been fixed since December 2015.

Additionally, you may have to make sure that Outlook clients aren’t using a registry key to disable MAPI over HTTP. For more information, see Microsoft Knowledge Base article 2937684: “Outlook 2013 or 2016 may not connect using MAPI over HTTPs as expected”.

Now, while you make every effort to meet the deadline and take all the necessary steps to update your environment, you still need assurance that you have completed the job. A simple report from Office 365 showing which Outlook clients are connecting to your tenant should do the trick. Let's get this report by following the steps below.

To retrieve this information, enable owner access auditing for each mailbox, and then query the audit log for the Outlook version that’s used to log on to the mailbox. To do this, follow these steps:

  1. Connect to Exchange Online using remote PowerShell.
  2. Enable mailbox auditing for the owner. To do this, run one of the following commands (sketched after this list):
    • For one mailbox:

    • For all mailboxes:

Note: Mailbox auditing may take up to 24 hours to get enabled.

  3. Search the audit log. To do this, run one of the following commands (sketched after this list):
    • For one mailbox:

    • For all mailboxes and export results to a .csv file
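The embedded commands aren't included above; the sketch below shows the general approach using Exchange Online remote PowerShell and the mailbox audit cmdlets. Mailbox identities and the export path are placeholders, so adjust to suit.

# Step 1 - connect to Exchange Online remote PowerShell
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential (Get-Credential) -Authentication Basic -AllowRedirection
Import-PSSession $Session

# Step 2 - enable owner auditing of MailboxLogin events
Set-Mailbox -Identity "user@yourtenant.com" -AuditOwner MailboxLogin -AuditEnabled $true              # one mailbox
Get-Mailbox -ResultSize Unlimited | Set-Mailbox -AuditOwner MailboxLogin -AuditEnabled $true          # all mailboxes

# Step 3 - search the audit log; ClientInfoString shows the Outlook version used to log on
Search-MailboxAuditLog -Identity "user@yourtenant.com" -LogonTypes Owner -ShowDetails |
    Where-Object { $_.ClientInfoString -like "*Outlook*" } |
    Select-Object MailboxOwnerUPN, LastAccessed, ClientInfoString                                     # one mailbox

Get-Mailbox -ResultSize Unlimited | ForEach-Object {
    Search-MailboxAuditLog -Identity $_.UserPrincipalName -LogonTypes Owner -ShowDetails |
        Where-Object { $_.ClientInfoString -like "*Outlook*" }
} | Select-Object MailboxOwnerUPN, LastAccessed, ClientInfoString |
    Export-Csv -Path "C:\Temp\OutlookClientVersions.csv" -NoTypeInformation                           # all mailboxes, to .csv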

The above PowerShell commands will produce a comprehensive report which you can use as a guideline to ensure that all your clients are ready for the switch to MAPI over HTTP. Here is a sample output.

Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I’ve posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways. Two such examples I’ve listed below. The first uses a Username and Password method for Authentication, whilst the second uses a registered application and therefore ClientID and Client Secret.

As time has gone on I have accumulated numerous WebApps doing all sorts of automation. However they all rely on accounts with a username and password, or a ClientID and secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which depending on the environment roll every 30-45 days.

However using Certificates would allow for a script that is part of an automated process to run for much longer than the key lifetime available for WebApps and definitely longer than passwords. Obviously there is security around the certificate to be considered so do keep that in mind.

Overview

This post is going to detail a couple of simple but versatile scripts;

  1. Using PowerShell we will;
    1.  Configure AzureAD
      1. Create a Self Signed 10yr Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks as you step through the process. You will need the AzureRM and Azure AD Utils Powershell Modules installed on the machine you run this script on.

Change;

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and location it’ll be stored
  • Line 8 for a valid location to export it to
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp

Before running line 37, log in to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory Permissions. When you then run line 37 it will trigger a GUI for AuthN and AuthZ to be presented. Sign in as an Admin and accept the oAuth2 Permission Authorizations for whatever you have requested on the WebApp.

e.g Graph API Read/Write Permissions
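The full creation script is embedded in the original post; a condensed sketch of the main steps, assuming the AzureAD module, is below. All names, paths, passwords and the certificate subject are placeholders.

Import-Module AzureAD

# 1. Create a self-signed 10 year certificate in the current user store and export it
$startDate = Get-Date
$endDate   = $startDate.AddYears(10)
$certPwd   = ConvertTo-SecureString -String 'YourCertPassword' -Force -AsPlainText
$cert = New-SelfSignedCertificate -Subject "CN=AzureADWebAppCert" -CertStoreLocation "Cert:\CurrentUser\My" -KeyExportPolicy Exportable -KeySpec KeyExchange -NotAfter $endDate
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\AzureADWebAppCert.pfx" -Password $certPwd

# 2. Create the AzureAD WebApp and assign the certificate to it as a key credential
Connect-AzureAD
$app = New-AzureADApplication -DisplayName "MyAutomationWebApp" -IdentifierUris "http://MyAutomationWebApp"
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
New-AzureADApplicationKeyCredential -ObjectId $app.ObjectId -CustomKeyIdentifier "MyAutomationWebAppCert" -Type AsymmetricX509Cert -Usage Verify -Value $keyValue -StartDate $startDate -EndDate $endDate.AddDays(-1)
New-AzureADServicePrincipal -AppId $app.AppId

# 3. Record the parameters needed later to connect
$tenantId = (Get-AzureADTenantDetail).ObjectId
"TenantId: $tenantId  AppId: $($app.AppId)  Thumbprint: $($cert.Thumbprint)"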

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 as you step through lines 40-43 from the configuration script above which copies key configuration settings to the clipboard.

The following script then gets our certificate out of the local store and takes the Tenant and WebApp parameters and passes them to Connect-AzureAD in Line 15 which will connect you to AAD and allow you to run AzureAD cmdlets.

If you wish to go direct to the GraphAPI, lines 20 and 23 show leveraging the AzureADUtils Module to connect to AzureAD via the GraphAPI.
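The connection script isn't embedded here either; a minimal sketch of connecting with the certificate, assuming the AzureAD module, is below. The IDs are placeholders for the values recorded from the creation step.

# Parameters recorded from the creation script above (placeholders)
$tenantId   = "<your tenant id>"
$appId      = "<the WebApp ApplicationId>"
$thumbprint = "<the certificate thumbprint>"

# Confirm the certificate is present in the local store
$cert = Get-ChildItem -Path Cert:\CurrentUser\My | Where-Object { $_.Thumbprint -eq $thumbprint }

# Connect to AzureAD using the certificate instead of a password or client secret
Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $cert.Thumbprint
Get-AzureADUser -Top 1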

Notes on creating your Self-Signed Certificate in PowerShell

I’m using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If when you run New-SelfSignedCertificate you get the error shown below, make sure you have Windows Management Framework 5.1, and if you don’t have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and just install the base SDK as shown further below.

Once the install is complete copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32

Summary

The two scripts above show how, using PowerShell, we can quickly create a Self Signed Certificate, create an Azure AD WebApp and grant it some permissions. Then using a small PowerShell script we can connect and query AAD/GraphAPI using our certificate and not be concerned about passwords or keys expiring for 10 years (in this example, but it can be any timeframe you wish).

Synchronizing Passwords from Active Directory to the IBM/Lotus Domino Identity Vault using Microsoft Identity Manager – Part 3

Introduction

As the title suggests this is Part 3, and the final part in a three-part post on configuring FIM/MIM to synchronise users’ passwords from AD to the Domino ID Vault via PCNS and FIM/MIM.
Part 1 here detailed the creation of a PowerShell Management Agent to join users from Domino to the MIM Sync Metaverse.
Part 2 here detailed the creation and configuration of the Domino Agents to receive password changes via the PS MA into the ID Vault.

This post will wrap it all up with the details on calling the Domino Agents on password sync events (from PCNS via MIM)

Prerequisites

You will need the IBM Notes client installed and configured on your MIM Sync Server in order to put a document in the database we created in Part 2 and start the agent to process the document(s).

Overview

Essentially this is the process;

  • Password changed for a user (either by an admin, or by the user via their domain joined workstation, password reset or any other password change mechanism)
  • Password change is captured by the AD PCNS Filter installed and configured on each (writeable) Domain Controller
  • The DC using the PCNS Config in the domain locates the MIM Sync Server to send the password change to
  • The MIM Sync Server has the associated AD Domain configured as a Password Sync Source
  • Our new PowerShell ID Vault Notes MA is configured as a Password Target
  • MIM Sync passes off the password change event for MIM joined users to the ID Vault Password Change MA which initiates the Password.ps1 script (below)
  • The password.ps1 script creates a document (that contains the details for the password change) in our ID Vault Password Sync Database we created in Part 2 of this series and then tells the MIMPwdTrigger Agent to start
  • The MIMPwdTrigger Agent picks up the document, passes it to the MIMPasswordSync Agent which sends the password change to the ID Vault

Domino PowerShell Management Agent Password.ps1 Script

Put this Password.ps1 script in the same location you put the Schema, Import and Export scripts earlier.
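The Password.ps1 script itself is embedded in the original post; the skeleton below sketches what it does, using the Granfeldt PSMA password script parameter convention and the Notes COM API. The server, database, form and field names are all illustrative, as is the Notes client password.

# Parameters passed in by the Granfeldt PowerShell MA for a password event
# (names follow the PSMA password script convention - check the PSMA documentation for your version)
param (
    $Username,
    $Password,
    $Action,               # 'Set' or 'Change'
    $OldPassword,
    $UnlockAccount,
    $ForceChangeAtLogon,
    $ValidatePassword
)

# Open a Notes session using the locally installed and configured IBM Notes client
$notesSession = New-Object -ComObject Lotus.NotesSession
$notesSession.Initialize("NotesClientPassword")

# Open the ID Vault password sync database created in Part 2 (server and filename are placeholders)
$passwordDb = $notesSession.GetDatabase("xxxNotes1/xxxxx-Aus", "mimpwdsync.nsf", $false)

# Create the password change document that the Domino agents will process
$doc = $passwordDb.CreateDocument()
[void]$doc.ReplaceItemValue("Form", "PasswordChange")
[void]$doc.ReplaceItemValue("FullName", $Username)
[void]$doc.ReplaceItemValue("NewPassword", $Password)
[void]$doc.ReplaceItemValue("Action", $Action)
[void]$doc.Save($true, $false)

# Kick off the trigger agent, which passes the document to the MIMPasswordSync agent
$agent = $passwordDb.GetAgent("MIMPwdTrigger")
[void]$agent.Run()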

Testing Password Sync End to End (Active Directory to the ID Vault)

The following screen shots show me tracing through the logs for a password change as it makes it way from the AD Domain Controller to MIM Sync to the MA to the MA Password script to the Notes DB as a document triggered to be process by the Notes Agent and the user updated in the ID Vault.

First the password change event is initiated to the MIM Sync Service by the Domain Controller that captured the password change.

PCNS provides all the details for the password change.

The MIM Sync Server determines where to send the change which includes our PS Notes MA.

Our PS Notes MA logged the process.

Notes MA LOG

=============================================================

Display Name: Jane XXX/xxx/xxxxx-Aus

Action: Set

Old pwd:

New pwd: Password123456

Unlock: False

Force change: False

Validate: False

Database: System.__ComObject

As did the Notes Agent as it process the change.

Notes Agent Log

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Reseting password …

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:22 PM: Server: xxxNotes1/xxxxx-Aus User:Jane xxx/xxx/xxxxx-Aus

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Return value: true

MIMPasswordSync|mimpasswordsync: 08/03/2017 02:56:23 PM: Removed User ID Vault change document from ‘xxxNotes1/xxxxx-Aus’

And finally we see the change reflected in the ID Vault. Looking at the time-stamps along the way we see that it all happened in approximately 2 seconds.

Summary

This three-part blog post has shown how to get passwords from Active Directory to the MIM Sync connected source across to IBM Domino and into the ID Vault using the Granfeldt PowerShell Management Agent and some configuration with a Database in Domino with two Domino Agents.

What have you synchronised passwords to using FIM/MIM?

UPDATED: Identifying Active Directory Users with Pwned Passwords using Microsoft/Forefront Identity Manager

Earlier this week I posted this blog post that showed a working example of using a custom Pwned Password FIM/MIM Management Agent to flag a boolean attribute in the MIM Service to indicate whether a user’s password is in the pwned password dataset or not. If you haven’t read that post, this one won’t make a lot of sense, so read that then come back.

The solution when receiving a new password for a user (via Microsoft Password Change Notification Service) was checking against the Have I Been Pwned API. The disclaimer at the start of the blog post detailed why this is a bad idea for production credentials. The intent was to show a working example of what could be achieved.

This update post shows a working solution that you can implement internal to a network. Essentially taking the Pwned Password Datasets available here and loading them into a local network SQL Server and then querying that from the FIM/MIM Pwned Password Management Agent rather than calling the external public API.

Creating an SQL Server Database for the Pwned Passwords

On my SQL Server using SQL Server Management Studio I right-clicked on Databases and chose New Database. I gave it the name PwnedPasswords and told it where I wanted my DB and Logs to go to.

Then in a Query window in SQL Server Management Studio I used the following script to create a table (dbo.pwnedPasswords).

use PwnedPasswords;
CREATE TABLE dbo.pwnedPasswords
( password_id int IDENTITY(1,1) NOT NULL,
  passwords varchar(max) NOT NULL,
  CONSTRAINT passwords_pk PRIMARY KEY (password_id)
);

Again using a query window in SQL Server Management Studio I used the following script to create an index for the passwords.

USE [PwnedPasswords]
GO

SET ANSI_PADDING ON
GO

CREATE UNIQUE NONCLUSTERED INDEX [PasswordIndex] ON [dbo].[pwnedPasswords]( [password_id] ASC) INCLUDE ( [passwords]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
GO

The last thing I did on the DB was to take the MIM Sync Server Active Directory Service Account (that was already in the SQL Server Logins) and give that account Reader Access to my new PwnedPasswords Database. I gave this account access as I’m using Integrated Authentication for login to SQL and as the MA is initiated by the MIM Sync Service Account, that is the account that needs the access.

Getting the Pwned Password Datasets into the new Database

I’m far from a DBA. I’m an identity guy. So using tools I was most familiar with (PowerShell) I created a simple script to open the password dump files as a stream (as Get-Content wasn’t going to handle the file sizes), read in the lines, convert the format and insert the rows into SQL. I performed the inserts in batches of 1000 and I performed it locally on the SQL Server.

In order to get the content from the dump file, add another column and get it in a format quickly to insert into the SQL DB I used the Out-DataTable function available from here.

The script could probably be improved, as I only spent about 20-30 minutes on it. It opens and closes a connection to the SQL DB each time it inserts 1000 rows; that could be moved outside the Insert2DB function and maybe the batch size increased. Either way it is a starting point and I used it to write millions of rows into the DB successfully.
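That script isn't embedded here; a simplified re-sketch of the idea is below. It streams the dump file and inserts in batches of 1000 as described, but skips the Out-DataTable step; the path and connection string are placeholders.

# Placeholders - adjust for your environment
$connectionString = "Server=localhost;Database=PwnedPasswords;Integrated Security=True"
$dumpFile         = "C:\Temp\pwned-passwords-1.0.txt"
$batchSize        = 1000

function Insert2DB {
    param([System.Collections.Generic.List[string]]$hashes)
    $conn = New-Object System.Data.SqlClient.SqlConnection $connectionString
    $conn.Open()
    # The hashes are plain hex strings, so a simple multi-row INSERT is good enough here
    $sql = "INSERT INTO dbo.pwnedPasswords (passwords) VALUES " + (($hashes | ForEach-Object { "('$_')" }) -join ',')
    $cmd = New-Object System.Data.SqlClient.SqlCommand $sql, $conn
    [void]$cmd.ExecuteNonQuery()
    $conn.Close()
}

# Stream the dump file rather than loading it all with Get-Content
$reader = New-Object System.IO.StreamReader $dumpFile
$batch  = New-Object 'System.Collections.Generic.List[string]'
while (($line = $reader.ReadLine()) -ne $null) {
    $batch.Add($line.Trim().ToUpper())
    if ($batch.Count -ge $batchSize) { Insert2DB $batch; $batch.Clear() }
}
if ($batch.Count -gt 0) { Insert2DB $batch }
$reader.Close()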

Updated FIM/MIM Pwned Passwords Management Agent Password.ps1 script

This then is the only other change to the solution. The Password.ps1 script, rather than querying the PwnedPasswords API, queries the SQL DB and sets the pwned boolean flag accordingly.
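The updated script is embedded in the original post; the core of the change is sketched below, assuming the 2017 pwned password dumps (which contain SHA-1 hashes). The connection string is a placeholder.

# SHA-1 hash the password received from PCNS, matching the format of the pwned password dump
$sha1      = [System.Security.Cryptography.SHA1]::Create()
$hashBytes = $sha1.ComputeHash([System.Text.Encoding]::UTF8.GetBytes($Password))
$hash      = ($hashBytes | ForEach-Object { $_.ToString("X2") }) -join ''

# Query the local PwnedPasswords database using integrated authentication
$conn = New-Object System.Data.SqlClient.SqlConnection "Server=yourSQLServer;Database=PwnedPasswords;Integrated Security=True"
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT COUNT(*) FROM dbo.pwnedPasswords WHERE passwords = @hash"
[void]$cmd.Parameters.AddWithValue("@hash", $hash)
$pwned = ([int]$cmd.ExecuteScalar()) -gt 0
$conn.Close()

# $pwned is the boolean the MA then flows to the MIM Service attribute
$pwned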

Summary

This enhancement shows a working concept that will be more appealing to Security Officers within corporate organisations if you have an appetite to know what your potential exposure is based on your Active Directory Users Passwords.

SharePoint Online forecast storage requirements

Background

Office 365 users get quite a bit of storage on SharePoint Online for content, be it files, metadata, etc. Even so, managing that storage and forecasting when additional storage will have to be added is a challenge, with very limited analytics available in SharePoint Online. Since adding more storage costs money, adding it before you actually need it, or a bit too late, is not ideal.

SharePoint Online provides two ways to track storage: one from the admin center, and the other from within the site collection using Storage Metrics.

The SharePoint Online tenant admin center gives us a bar matrix with details of the total storage, used storage and available storage. Here customers can manage the total amount of space allocated to each site collection, and see the total amount of space utilised and available.

To check Storage Metrics for a site collection, go to Site Settings and find the Storage Metrics link. Storage Metrics provides the space utilisation breakdown by each sub site, library and list in that site collection.

As shown in the image below, sub sites, libraries, lists etc. are listed with their size and % of parent. This can help in finding the items which are consuming the most storage.

There are some third party storage monitoring utilities available to connect to SharePoint online to determine usage patterns and trends.

But if we need to forecast storage usage and find out which site collections are growing the most over a period of time, in a tenant with hundreds of site collections, it becomes a challenge to keep track.

Solution

We can use a PowerShell script to get the storage information from the tenant and store it in a SharePoint list on a periodic basis. Once we have this data available, Power BI can be used to build a report which shows the growth % for each site collection over a period and gives some insight into the storage.

Using PowerShell, we can get heaps of information from the SharePoint tenant with the Get-SPOSite command, but here we will use it to get the list of site collections and their storage information.

Get-SPOSite -Detailed -Limit All | select *

Once the information is retrieved from SPO, use PowerShell to iterate over the data and save the relevant details to a SharePoint list, as sketched below.

SharePoint list Columns: Site Name/Title, Site Url, Storage, Report Run Date.
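A rough sketch of that script is below, using Get-SPOSite for the tenant data and, in this example, the SharePoint PnP cmdlets to write the rows. Tenant URLs, list name and internal column names are placeholders.

# Gather site collection storage details from the tenant (StorageUsageCurrent is reported in MB)
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com"
$sites = Get-SPOSite -Detailed -Limit All | Select-Object Title, Url, StorageUsageCurrent

# Write one row per site collection into the tracking list via the PnP cmdlets
Connect-PnPOnline -Url "https://yourtenant.sharepoint.com/sites/StorageReports" -UseWebLogin
$runDate = Get-Date
foreach ($site in $sites) {
    Add-PnPListItem -List "SPO Storage Tracking" -Values @{
        "Title"         = $site.Title
        "SiteUrl"       = $site.Url
        "Storage"       = $site.StorageUsageCurrent
        "ReportRunDate" = $runDate
    } | Out-Null
}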

Data in the SharePoint list can be filtered, grouped and sorted in list views to make it more usable for the business to analyse and predict storage growth and requirements.

To get detailed analytics on the data, use Power BI, which can help in putting the data together in the form of reports and predicting the growth percentage for each site collection. Power BI provides many connectors; one of them is for SharePoint Online, to easily connect with lists. To set up the connector in Power BI check this Kloud Blog.

Below is the Power BI report that was built on the SharePoint list data pulled in using PowerShell. Data in the report is sorted by the sites with the maximum storage increase within the selected interval.

Using Power BI, more smarts such as filters and graphs can be added to make the data more usable.

Report details

  • x-axis: list of site collections
  • y-axis: storage values in MB

As we hover the cursor over the graph, it shows the max and min storage values over the selected period of time. The graph lists the sites in decreasing order of storage growth.

Tabular view for the report.

The PowerShell script can be scheduled to run periodically and gather the data weekly or fortnightly, from which we can calculate the average weekly or fortnightly storage requirement for the tenant.

Secondly, the captured data is used to forecast the weekly storage requirement. From the SPO tenant we get details of the total storage, used storage and available storage.

The average weekly storage requirement can be taken from a list view grouped by week with an aggregate of the storage, as shown in the image below.

Once we have the above information, the average weekly requirement works out to 24,349 MB, i.e. 24 GB (the difference in storage growth between two weeks) in this case. This can be used to predict the weekly storage requirement, which helps forecast when the tenant storage is going to run out and additional storage will be required.