Exchange Online & Splunk – Automating the solution

NOTES FROM THE FIELD:

I have recently been consulting on what I think is a pretty cool engagement: integrating Office365 mailbox data into the Splunk reporting platform.

I initially thought about using a .csv export methodology; however, through trial and error (more error than trial, if I’m being honest), and after realising that this method still required some manual interaction, I decided to find a fully automated solution.

The final solution comprises the below components:

  • Splunk HTTP event collector
    • Splunk hostname
    • Token from HTTP event collector config page
  • Azure automation account
    • Azure Run As Account
    • Azure Runbook
    • Exchange Online credentials (registered to the Azure Automation account)

I’m not going to run through the creation of the Automation account or the required credentials, as these had already been created; however, there is a great guide to configuring the solution I have used for this customer at https://www.splunk.com/blog/2017/10/05/splunking-microsoft-cloud-data-part-3.html

The PowerShell script we are using achieves the following:

  • Connect to Azure and Exchange Online – Azure run as account authentication
  • Configure variables for connection to Splunk HTTP event collector
  • Collect mailbox data from the Exchange Online environment
  • Split the mailbox data into parts for faster processing
  • Specify SSL/TLS protocol settings for self-signed cert in test environment
  • Create a JSON object to be posted to the Splunk environment
  • HTTP POST the data directly to Splunk

The Code:

#Clear Existing PS Sessions
Get-PSSession | Remove-PSSession | Out-Null

#Create split function for the mailbox data
function Split-Array {
    param($inArray, [int]$parts, [int]$size)
    if ($parts) {
        $PartSize = [Math]::Ceiling($inArray.Count / $parts)
    }
    if ($size) {
        $PartSize = $size
        $parts = [Math]::Ceiling($inArray.Count / $size)
    }
    $outArray = New-Object 'System.Collections.Generic.List[psobject]'
    for ($i = 1; $i -le $parts; $i++) {
        $start = (($i - 1) * $PartSize)
        $end = (($i) * $PartSize) - 1
        if ($end -ge $inArray.Count) { $end = $inArray.Count - 1 }
        $outArray.Add(@($inArray[$start..$end]))
    }
    return , $outArray
}

function Connect-ExchangeOnline {
    param(
        $Creds
    )
    #Connect to Exchange Online
    $Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $Creds -Authentication Basic -AllowRedirection
    $Commands = @("Add-MailboxPermission","Add-RecipientPermission","Remove-RecipientPermission","Remove-MailboxPermission","Get-MailboxPermission","Get-User","Get-DistributionGroupMember","Get-DistributionGroup","Get-Mailbox")
    Import-PSSession -Session $Session -DisableNameChecking:$true -AllowClobber:$true -CommandName $Commands | Out-Null
}

#Create Variables
$SplunkHost = "Your Splunk hostname or IP Address"
$SplunkEventCollectorPort = "8088"
$SplunkEventCollectorToken = "Splunk Token from HTTP Event Collector"
$servicePrincipalConnection = Get-AutomationConnection -Name 'AzureRunAsConnection'
$credentials = Get-AutomationPSCredential -Name 'Exchange Online'

#Connect to Azure
Add-AzureRMAccount -ServicePrincipal -Tenant $servicePrincipalConnection.TenantID -ApplicationId $servicePrincipalConnection.ApplicationID -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint

#Connect to Exchange Online
Connect-ExchangeOnline -Creds $credentials

#Collect mailbox data from the Exchange Online environment
$mailboxes = Get-Mailbox -ResultSize Unlimited | Select-Object -Property DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Get Current Date & Time
$time = Get-Date -Format s

#Convert Timezone to Australia/Brisbane
$bnetime = [System.TimeZoneInfo]::ConvertTimeBySystemTimeZoneId($time, [System.TimeZoneInfo]::Local.Id, 'E. Australia Standard Time')

#Add Time Column to Output
$mailboxes = $mailboxes | Select-Object @{Expression = {$bnetime}; Name = 'Time'}, DisplayName, PrimarySMTPAddress, IsMailboxEnabled, ForwardingSmtpAddress, GrantSendOnBehalfTo, ProhibitSendReceiveQuota, AddressBookPolicy

#Split the mailbox data into parts for faster processing
$recipients = Split-Array -inArray $mailboxes -parts 5

#Create JSON objects and HTTP POST to Splunk HTTP Event Collector
foreach ($recipient in $recipients) {
    foreach ($r in $recipient) {
        #Allow SSL validation bypass for the self-signed certificate in the test environment
        $AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12'
        [System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols
        [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}
        #Build the JSON string to post to Splunk
        $StringToPost = "{ `"Time`": `"$($r.Time)`", `"DisplayName`": `"$($r.DisplayName)`", `"PrimarySMTPAddress`": `"$($r.PrimarySmtpAddress)`", `"IsMailboxEnabled`": `"$($r.IsMailboxEnabled)`", `"ForwardingSmtpAddress`": `"$($r.ForwardingSmtpAddress)`", `"GrantSendOnBehalfTo`": `"$($r.GrantSendOnBehalfTo)`", `"ProhibitSendReceiveQuota`": `"$($r.ProhibitSendReceiveQuota)`", `"AddressBookPolicy`": `"$($r.AddressBookPolicy)`" }"
        $uri = "https://" + $SplunkHost + ":" + $SplunkEventCollectorPort + "/services/collector/raw"
        $header = @{"Authorization" = "Splunk " + $SplunkEventCollectorToken}
        #Post to Splunk HTTP Event Collector
        Invoke-RestMethod -Method Post -Uri $uri -Body $StringToPost -Header $header
    }
}

#Clean up PS Sessions
Get-PSSession | Remove-PSSession | Out-Null
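
Before scheduling the runbook it’s worth confirming the HTTP Event Collector endpoint and token independently. A minimal hedged test (the hostname and token below are placeholders, not values from this engagement) from any machine that can reach Splunk looks like this:

#Hedged connectivity test for the Splunk HTTP Event Collector (placeholder hostname and token)
#Add the ServerCertificateValidationCallback bypass from the main script if Splunk uses a self-signed certificate
$header = @{"Authorization" = "Splunk YOUR-HEC-TOKEN"}
$uri = "https://your-splunk-host:8088/services/collector/raw"
Invoke-RestMethod -Method Post -Uri $uri -Body '{"event":"HEC connectivity test"}' -Header $header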

 

The final output that can be seen in Splunk looks like the following:

11/13/17
12:28:22.000 PM
{ [-]
AddressBookPolicy:
DisplayName: Shane Fisher
ForwardingSmtpAddress:
GrantSendOnBehalfTo:
IsMailboxEnabled: True
PrimarySMTPAddress: shane.fisher@xxxxxxxx.com.au
ProhibitSendReceiveQuota: 50 GB (53,687,091,200 bytes)
Time: 11/13/2017 12:28:22
}

I hope this helps some of you out there.

Cheers,

Shane.

 

 

 

MIM configuration version control with Git

The first question usually asked when something goes wrong: What changed?

Some areas of FIM/MIM make it easy to answer that question, some more difficult. If the Reporting Services components haven’t been installed (pretty common), history within the Portal/Service is only retained for 30 days by default, but also contains all data changes not just configuration changes. So, how do we track configuration change?

I was inspired by colleague Darren Robinson’s post “Automate the nightly backup of your Development FIM/MIM Sync and Portal Servers Configuration“, but wanted more detail, automatic differences, and handy visualisation. This is my first rough version and hasn’t been deployed ‘in anger’ at a client, so I expect I haven’t found all the pros/cons as yet. It also doesn’t implement all the recommendations from Microsoft (Check FIM Service Backup and Restore and FIM 2010: Planning Disaster recovery for details).

Approach

Similar to Darren’s post, we’ll export various Sync and MIM Service config to text files, then use a local git repository (no, not GitHub) to store and track the differences.

Assumptions

The script is written with the assumption that you have an all-in-one MIM-in-a-box. I’ll probably extend it at some point to cater for expanded installations. I’m also assuming PowerShell 5 for easier module package management, but it is not a strict requirement.

Pre-requisites

You will need:

  • “Allow log on locally” (and ideally, “Allow log on through Remote Desktop Services”) rights on your FIM/MIM all-in-one server, with access to create directories and files under C:\MIMBackup (or a similar backup location)
    New-Item -ItemType Directory -Path C:\MIMBackup
  • Access to your FIM/MIM Synchronisation Service with MIM Sync Admin rights (can you open the Synchronisation Service Console?). Yes, Admin. I’d love to do this with minimum privileges, but it just doesn’t seem achievable with the permissions available
  • Access to your FIM/MIM Service with either membership of the Administrators set, or a custom set created with Read access to members of set “All Resources”
  • Portable Git for Windows (https://github.com/git-for-windows/git/releases/latest)
    The Portable version is great, doesn’t require administrative access to install/use, doesn’t impact other installation of Git (if any), and is easy to update/maintain with no impact on any other software. Perfect for use in existing environments, and good for change control

    Unpack it into C:\MIMBackup\PortableGit
  • Lithnet FIM/MIM Service PowerShell Module (https://github.com/lithnet/resourcemanagement-powershell)
    The ‘missing commandlets’ for FIM/MIM. Again, they don’t have to be installed with administrative access and can be copied to specific use locations so that other installations/copies will not be affected by version differences/updates

    New-Item -ItemType Directory -Path C:\MIMBackup\Modules
    Save-Module -Name LithnetRMA -Path C:\MIMBackup\Modules
  • Lithnet PowerShell Module for FIM/MIM Synchronization Service (https://github.com/lithnet/miis-powershell)
    More excellent cmdlets for working with the Synchronisation service

    Save-Module -Name LithnetMIISAutomation -Path C:\MIMBackup\Modules
  • FIMAutomation Module (or PSSnapin)
    The ‘default’ PowerShell commandlets for FIM/MIM. Not the fastest tools available, but they do make exporting the FIM/MIM Service configuration easy. If you create a module from the PSSnapin [Check my previous post], you don’t need any special tricks to install it

    Store the module in C:\MIMBackup\Modules\FIMAutomation
  • The Backup-MIMConfig.ps1 script
    C:\MIMBackup\PortableGit\cmd\git.exe clone https://gist.github.com/Froosh/bd17ff4675f945dc7dc3bbb6bbda036d C:\MIMBackup\Backup-MIMConfig

Prepare the Git repository

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
New-Item -ItemType Directory -Path C:\MIMBackup\MIMConfig
Set-Location -Path C:\MIMBackup\MIMConfig
git init
git config --local user.name "MIM Config Backup"
git config --local user.email "MIMConfigBackup@$(hostname)"

Since the final script will likely be running as a service account, I’m cheating a little and using a default identity that will be used by all users to commit changes to the git repository. Alternatively, you can log in as the service account and set the user.name and user.email in ‘normal’ git per-user mode.

git config --global user.name "Service Account"
git config --global user.email "ServiceAccount@$(hostname)"

Give it a whirl!

C:\MIMBackup\Backup-MIMConfig\Backup-MIMConfig.ps1

Now, make a change to your config, run the script again, and look at the changes in Git GUI.

Set-Location -Path C:\MIMBackup\MIMConfig
C:\MIMBackup\PortableGit\cmd\gitk.exe

As you can see here, I changed the portal timezone config:

TimezoneChangeLarge

Finally, the whole backup script
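
The full Backup-MIMConfig.ps1 script is in the gist cloned above. As a rough, hedged sketch of the core idea only (export the config to text files, then commit whatever changed), and not a copy of the actual script, the loop looks something like this; the export cmdlet, depth and paths are assumptions:

New-Alias -Name Git -Value C:\MIMBackup\PortableGit\cmd\git.exe
Import-Module C:\MIMBackup\Modules\FIMAutomation\FIMAutomation.psd1
Set-Location -Path C:\MIMBackup\MIMConfig

#Export the MIM Service (Portal) configuration to a text file the repository can track
$portalConfig = Export-FIMConfig -Uri http://localhost:5725 -PortalConfig -AllLocales
$portalConfig | Export-Clixml -Path .\PortalConfig.xml -Depth 10

#Commit whatever changed since the last backup run
git add --all
git commit --message "MIM config backup $(Get-Date -Format s)"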

Restoring deleted OneDrive sites in Office365

A customer asked whether it was possible to restore a OneDrive site that had been deleted when the user’s account was marked for deletion in AD. After a bit of research, I was able to restore the site and retrieve the files (luckily it had been deleted less than 30 days earlier).
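
For those wanting to try this themselves, a hedged sketch of the sort of restore involved, using the SharePoint Online Management Shell (the tenant and user URLs below are placeholders), looks like this:

#Connect to the SharePoint Online admin centre (placeholder tenant URL)
Connect-SPOService -Url https://yourtenant-admin.sharepoint.com
#List deleted OneDrive (personal) sites still within the retention window
Get-SPODeletedSite -IncludePersonalSite | Where-Object { $_.Url -like "*-my.sharepoint.com/personal/*" }
#Restore the deleted OneDrive site (placeholder user URL)
Restore-SPODeletedSite -Identity https://yourtenant-my.sharepoint.com/personal/first_last_yourtenant_com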


Enabling and using Managed Service Identity to access an Azure Key Vault with Azure PowerShell Functions

Introduction

At the end of last week (14 Sept 2017) Microsoft announced a new Azure Active Directory feature – Managed Service Identity. Managed Service Identity helps solve the chicken-and-egg bootstrap problem of needing credentials in order to connect to the Azure Key Vault to retrieve credentials. Previously, when working with Virtual Machines, Web Apps and Azure Functions, that meant having to implement methods to obfuscate the credentials stored within them. I touched on one method that I’ve used a lot in this post here, whereby I encrypt the credential and store it in the Application Settings, but it still required a keyfile to allow reversing of the encryption as part of the automation process. Thankfully those days are finally behind us.

I strongly recommend you read the Managed Service Identity announcement to understand more about what MSI is.

This post details using Managed Service Identity in PowerShell Azure Function Apps.

Enabling Managed Service Identity on your Azure Function App

In the Azure Portal navigate to your Azure Function Web App. Select it and then from the main-pane select the Platform Features tab then select Managed service identity.

Platform Features

Turn the toggle switch to On for Register with Azure Active Directory, then select Save.

ManagedServiceIdentity

Back in Platform Features under General Settings select Application Settings. 

General Settings

Under Application Settings you will see a subset of the environment variables/settings for your Function App. In my environment I don’t see the Managed Service Identity variables there. So let’s keep digging.

App Settings

Under Platform Features select Console.

DevelopmentTools

When the Console loads, type Set. Scroll down and you should see MSI_ENDPOINT and MSI_SECRET.

NOTE: These variables weren’t immediately available in my environment. The next morning they were present. So I’m assuming there is a back-end process that populates them once you have enabled Managed Service Identity. And it takes more than a couple of hours 

Endpoint

Creating a New Azure Function App that uses Managed Service Identity

We will now create a new PowerShell Function App that will use Managed Service Identity to retrieve credentials from an Azure Key Vault.

From your Azure Function App, next to Functions select the + to create a New Function. I’m using a HttpTrigger PowerShell Function. Give it a name and select Create.

NewFunction

Put the following lines into the top of your function and select Save and Run.

# MSI Variables via Function Application Settings Variables
# Endpoint and Password
$endpoint = $env:MSI_ENDPOINT
$endpoint
$secret = $env:MSI_SECRET
$secret

You will see in the output the values of these two variables.

Vars

Key Vault

Now that we know we have Managed Service Identity all ready to go, we need to allow our Function App to access our Key Vault. If you don’t have a Key Vault already then read this post where I detail how to quickly get started with the Key Vault.

Go to your Key Vault and select Access Polices from the left menu list.

Vault

Select Add new, Select Principal and locate your Function App and click Select.

Access Policy 1

As my vault contains multiple credential types, I enabled the policy for Get for all types. Select Ok. Then select Save.

Policy - GET

We now have our Function App enabled to access the Key Vault.

Access Policy 2
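
If you’d rather script the access policy than click through the portal, a hedged AzureRM equivalent (the Function App and vault names below are placeholders) is:

#Find the service principal created for the Function App's Managed Service Identity (placeholder name)
$funcSp = Get-AzureRmADServicePrincipal -SearchString "MyFunctionApp"
#Grant the Function App Get access to secrets in the vault (placeholder vault name)
Set-AzureRmKeyVaultAccessPolicy -VaultName "MyKeyVault" -ObjectId $funcSp.Id -PermissionsToSecrets Get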

Finally in your Key Vault, select a secret you want to retrieve via your Function App and copy out the Secret Identifier from the Properties.

Vault Secret URI

Function App Script

Here is my sample PowerShell Function App script that will connect to the Key Vault and retrieve credentials. Line 12 should be the only line you need to update for the Key Vault Secret you want to retrieve. Ensure you keep the API version at the end (which isn’t in the URI you copy from the Key Vault): /?api-version=2015-06-01
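
The pattern the script follows is: call the local MSI endpoint to get an access token for the Key Vault resource, then call the Secret Identifier URI with that token. A hedged sketch of that pattern (the secret URI below is a placeholder, not my actual vault) looks like this:

# MSI variables via Function Application Settings
$apiVersion = "2017-09-01"
$resourceURI = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret" = "$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# Placeholder Secret Identifier copied from the Key Vault, with the API version appended
$secretURI = "https://myvault.vault.azure.net/secrets/MySecret/?api-version=2015-06-01"
$secretResponse = Invoke-RestMethod -Method Get -Uri $secretURI -ContentType "application/json" -Headers @{"Authorization" = "Bearer $accessToken"}
$secretResponse.value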

When run, if you have everything configured correctly, the output will look like the below.

KeyVault Creds Output

Summary

We now have the basis of a script that we can use in our Azure Functions to allow us to use the Managed Service Identity function to connect to an Azure Key Vault and retrieve credentials. We’ve limited the access to the Key Vault to the Azure Function App to only GET the credential. The only piece of information we had to put in our Function App was the URI for the credential we want to retrieve. Brilliant.

Easier portability of the FIMAutomation powershell snap-in

I am a fan of Ryan Newington’s MIM PowerShell modules; I think they are the missing tools that Microsoft should have provided in the box from day one. Sometimes though, for various reasons, we may not have approval or access to use 3rd party or open source code, or other tools may expect exports to be in a specific format.

Using the FIMAutomation PSSnapin is easy … on servers with the MIM Service installed. There are several documented methods for copying the FIMAutomation PSSnapin files to another machine and registering them, but they all require local admin access and access to the required files on the server.

For example, I’m working in a locked-down development environment with no file copy in/out, no internet access, just working with what is currently there. I have limited access on the servers and management VM, with access only to 7-Zip (thankfully!) and a recent hotfix roll-up for MIM (4.4.1459.0).

Handily, PowerShell has already addressed this problem. Working from Learn how to load and use PowerShell snap-ins it seems simple enough to create a module to wrap around the snap-in.

Opening the hotfix roll-up FIMService_x64_KB4012498.msp in 7-Zip reveals a CAB with the files we need.

We’re looking for:

  • Microsoft.IdentityManagement.Logging.dll
  • Microsoft.ResourceManagement.Automation.dll
  • Microsoft.ResourceManagement.dll

But there are no exact matches, so I guessed a little, extracted these files and renamed them:

  • Common.Microsoft.IdentityManagement.Logging.dll
  • Common.Microsoft.RM.Automation.dll
  • Common.Microsoft.RM.dll

Now to bundle them as a module and get on with the real work, which PowerShell makes ridiculously easy. Make a new directory $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation, drop in the DLLs, then run the following few commands:

Push-Location -Path $HOME\Documents\WindowsPowerShell\Modules\FIMAutomation

New-ModuleManifest -Path .\FIMAutomation.psd1 -RootModule Microsoft.ResourceManagement.Automation.dll -RequiredAssemblies (dir *.dll)

Import-Module .\FIMAutomation.psd1

Pop-Location

Export-FIMConfig -Uri http://mimservice:5725 -PortalConfig

Awesome, success!

If you’re not working from such a constrained environment, I’ve made a version of the wrapper module available below; you’ll have to source the MIM DLLs yourself though, as I don’t have any special distribution rights 🙂

This is a pretty niche problem, not something you’ll see everyday, but is also a useful approach to other legacy PSSnapin problems.

Quickly creating and using an Azure Key Vault with PowerShell

Introduction

A couple of weeks back I was messing around with the Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up, and getting credentials in and out, was a little more cumbersome than I thought it should be. At that same point this tweet appeared in my Twitter timeline via a retweet. I’m not too sure why, but maybe because I’ve been migrating to VSCode myself, I checked out Axel’s project.

Tweet

Axel’s PowerShell Module simplifies creating and integrating with the Azure Key Vault. After messing with it and suggesting a couple of enhancements that Axel graciously entertained, I’m creating vaults, adding and removing credentials in the simplified way I’d wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel’s module. If the Resource Group and/or Storage Account you want associated with your Key Vault doesn’t exist, it creates them.

Update the following script for the location (line 8) and the name (line 10) that will be given to your Storage Account, Resource Group and Vault. Modify if you want to use different names for each.
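
As a rough guide to what the script does under the hood, the equivalent with the plain AzureRM cmdlets (which Axel’s module wraps and simplifies for you) would look something like the hedged sketch below; the location and name values are placeholders:

#Placeholders for the location and the shared name used for the Storage Account, Resource Group and Vault
$location = "Australia Southeast"
$name = "myvault001"

#Create the Resource Group, Storage Account and Key Vault
New-AzureRmResourceGroup -Name $name -Location $location
New-AzureRmStorageAccount -ResourceGroupName $name -Name $name -Location $location -SkuName Standard_LRS
New-AzureRmKeyVault -VaultName $name -ResourceGroupName $name -Location $location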

Done, Key Vault created.

Create Azure KeyVault

Key Vault Created

Connect to the Azure Key Vault

This script assumes you’re now in a new session and wanting to connect to the Key Vault. Again, a simplified version whereby the SG, RG and KV names are all the same.  Update for your location and Key Vault name.

Connected.

Connect to Azure Key Vault

Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault use the command below. It will prompt you for your certificate password and add the cert to the key vault.

Add Cert to Vault

Certificate added to Key Vault.

Cert Added to Vault

Retrieve a Certificate from the Azure Key Vault

To retrieve a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name

Retrieve a Cert

Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to vault

Add Creds to Key Vault

Creds Added to Vault

Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.

Retrieve Account Creds

Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.

Remove Credential

Summary

Hopefully this gets you started quickly with the Azure Key Vault. Credit to Axel for creating the module. It’s now part of my toolkit that I’m using a lot.

Configuring Remote PowerShell to a Remote Active Directory Forest for FIM/MIM GalSync

Introduction

Windows Remote Management (aka Remote PowerShell) is a wonderful thing; it works straight out of the box when you’re in the same domain. Getting it working across Forests, though, can feel like jumping through hoop after hoop, and sometimes like the hoops are on fire. When configuring GALSync ([Exchange] Global Address List Synchronisation) with FIM/MIM, this always means working across AD Forests. The graphic below shows the simplest relationship. If there are firewalls in between, you’ll have additional hoops to jump through.

GALSync

This article here is the most definitive I’ve found about what is required, but it isn’t easily found even when you know it exists. In the last few months I’ve had to set up GALSync with FIM/MIM a number of times, and I have visibility that I’ll be needing to do it again in the future. So here is my consolidated version of the process using PowerShell to make the configuration changes. If nothing else it’ll help me find it quickly next time I need to do it.

This post assumes you have the other prerequisites all sorted. They are pretty clear in the linked article above such as a One-way Cross Forest Trust, connectivity on the necessary ports if there are firewalls in-between FIM/MIM and the Exchange CAS Server and Domain Controllers in the remote environment.

Configuring Remote PowerShell for FIM/MIM GALSync

My tip is to start from the MIM Sync Server.

  1. Get the details for the Service Account that you have/will specify on your GALSync Active Directory Management Agent that connects to the Remote Forest.
  2. Have that account be given (temporarily) Remote Desktop permissions to the Remote Exchange CAS Server that you will be configuring the Active Directory Management Agent to connect to.  Or use another Admin account that has permissions to Remote Desktop into the CAS Server, then …
  3. … start a Remote Terminal Services Session to the Exchange CAS Server in the Remote Forest

On the Exchange CAS Server (non SSL WinRM)

  • WinRM must have Kerberos authentication enabled
    • Kerberos requires TCP and UDP port 88 to be opened from the FIM/MIM server to ALL Domain Controllers in the target Forest. Run the following two commands in an elevated (Administrator) PowerShell ISE/shell session to enable Kerberos authentication and allow unencrypted WinRM traffic
      • set-item wsman:\localhost\service\auth\Kerberos -value true
      • set-item wsman:\localhost\service\AllowUnencrypted -value true 
  1. then on the MIM Sync Server perform the following …

On the MIM Sync Server (non SSL WinRM)

  • WinRM on the MIM Sync Server must have Kerberos authentication enabled also. Run the following commands in an elevated (Administrator) Powershell ISE/Shell session. The first is to enable Kerberos.
    • set-item wsman:\localhost\client\auth\Kerberos -value true
  • Add the Exchange Server to the list of trusted hosts on the FIM Server
    • Set-item wsman:localhost\client\trustedhosts -value ExchangeCASFQDN
  • Allow unencrypted traffic
    • set-item wsman:\localhost\client\AllowUnencrypted -value true 
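
At this point a quick sanity check from the MIM Sync Server with Test-WSMan will confirm WinRM and Kerberos are talking before you attempt the full verification. ExchangeCASFQDN and the credential below are placeholders for your environment:

#Quick WinRM/Kerberos connectivity check from the MIM Sync Server (placeholder host and account)
$creds = Get-Credential   # NetBIOSDOMAINName\GALSyncADMAServiceAccount
Test-WSMan -ComputerName ExchangeCASFQDN -Authentication Kerberos -Credential $creds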

Verification (from the MIM Sync Server)

  1. Using PowerShell ISE select File => New Remote Powershell Tab
  2. enter the ExchangeCASFQDN for the Computer field
  3. enter the  Service Account that you have specified on your GALSync Active Directory Management Agent that connects to the Remote Forest for User name in the format NetBIOSDOMAINName\Username
  4. If you have done everything correctly you will get a remote powershell command prompt on the Exchange CAS host.
  5. To confirm you have all your other Exchange dependencies correct (and your AD MA Service account has the necessary permissions in Exchange) run the following script line-by-line. If you have configured Remote PowerShell correctly and have met all the prerequisites you should have a remote session into Exchange.
Set-ExecutionPolicy RemoteSigned
$Creds = Get-Credential
# NBDomain\ADMAServiceAccountUser
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://.customer.com/PowerShell/ -Credential $Creds -Authentication kerberos
Import-PSSession $Session
# Get a list of Exchange Servers
Get-ExchangeServer
# Get a list of Mailboxes
Get-Mailbox
# Get a list of Mail Users
Get-MailUser

# Close and remove the session 
Remove-PSSession $Session

Cleanup

Remove Remote Desktop permissions from the Active Directory Management Agent Service Account if you enabled it to configure the Exchange CAS Server.

Receive Push Notifications from Microsoft Identity Manager on your Mobile/Tablet/Computer

Background

Recently in a FIM/MIM environment a daily automated process was executing but the task it was performing was dependent on an upstream process that generates a feed, and the schedule for that feed had changed (without notice to me). Needless to say FIM/MIM wasn’t getting the information it needed to process. This got me thinking about notifications.

If you’re anything like me you probably have numerous email accounts and your subconscious has all but programmed itself to ignore “new email” notifications. However Push Notifications I typically do notice. Whilst in the example above I did have some error handling in place if the process completely failed (it is a development environment), I didn’t have anything for partial failures. Anyway it did get me thinking that I’d like to receive a notification if something that should happen didn’t.

Overview

This post details using push notifications to advise when expected events don’t transpire. In this particular example, I have an Azure Function App that connects once a day to an FTP server, retrieves a series of exports and puts them on my FIM/MIM Synchronisation Server. The Push Notification service I am using is Push Bullet. Free Push Bullet accounts (without a Pro subscription) are limited to 500 pushes per month. That should be more than enough; if I’ve got errors in excess of 500 per month I’ve got much bigger problems.

Getting Started

First up you will need to sign up for Push Bullet. It is very straightforward if you have a Facebook or Google account. As you probably want multiple people to receive the notifications, it would pay to set up a shared Google account that your team can use to connect their devices. Once you have an account, head to your new Account Settings page and create an Access Token. Record it for use in the scripts below.

Connecting to the API

Test you can access the Push Bullet API using your Access Token and PowerShell. Update the following script for your Access Token in line 3 and execute. You should see information returned associated with your new Push Bullet account.
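
As a hedged illustration of that test (not the embedded script itself; the access token value is a placeholder, and the $apiURI and $header variables are reused in the calls further down):

#Placeholder Push Bullet Access Token from your Account Settings page
$accessToken = "YOUR-PUSHBULLET-ACCESS-TOKEN"
$apiURI = "https://api.pushbullet.com/"
$header = @{ "Access-Token" = $accessToken }
#Returns details of your Push Bullet account if the token is valid
Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI + "v2/users/me")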

Next you will want to install the Push Bullet App on the device(s) you want to get the notification(s) on. I installed it on my Apple iPhone and also installed the Chrome Browser extension.

Using PowerShell we can then query to get the devices connected to the account. In the same PowerShell session you tested the API with above run this API call

$devices = Invoke-RestMethod -Method Get -Headers $header -Uri ($apiURI +"v2/devices")
$devices

This will return your registered devices.

Devices

If we want a notification to target a particular device we need to provide the Iden value associated with that device. If we don’t specify a target, the push notification will hit all devices. In my example above, with two devices registered, my iPhone was the second device, so I could get its target Iden with;

$iphoneIden = $devices.devices[1].iden

Push Bullet allows for different notification types (Note, Link and File). Note is the one I’ll be using. More info on the other types is available here.

Sending Test Notification

To perform a notification test, update the following script for your Access Token (line 3). I’ve omitted the Device Identifier to send the message to all devices. I also had to logout of the iOS Push Bullet App and log in again to get the notifications to show.
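
A hedged sketch of that test push (assuming the $header and $apiURI variables from the connectivity test earlier; the title and body text are just illustrative) looks like this:

#Send a test Note push to all registered devices
$pushBody = @{
    type  = "note"
    title = "FIM/MIM Notification Test"
    body  = "If you can read this, Push Bullet notifications are working."
} | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $header -Uri ($apiURI + "v2/pushes") -Body $pushBody -ContentType "application/json"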

Success. I received the notification on my iPhone and also in my Chrome browser.

IMG-5194

Implementation

Getting back to my requirement of being notified when a job didn’t find what it expected, I updated my PowerShell Function App that is based off this blog post here to evaluate what it processed and if it didn’t find what is expected, it sends me a notification. I already had some error handling in my implementation based off that blog post but it was based on full failure, not partial (which is what I was experiencing whereby only one part of the process wasn’t returning data).

NOTE: I had to also add the ServerCertificateValidationCallback line into my Function App script before calling the API POST to send the notification as I was getting the dreaded following PowerShell Invoke-RestMethod / Invoke-WebRequest error when sending the notification via the Function App. I didn’t get that error on my dev workstation which is a bit weird.

Invoke-WebRequest : The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure 
channel.

If you also receive the error above (or you will be sending Push Notifications via Azure Function Apps) insert this line before your invoke-restmethod call.

 [System.Net.ServicePointManager]::ServerCertificateValidationCallback = {$true}

Summary

Essentially this is my first foray into enabling anything for Push Notifications and this post is food for thought on what can be easily enabled within FIM/MIM to give timely visibility to automated scheduled functions when they don’t perform as expected. It was incredibly simple to set up and get working. I see myself enabling more FIM/MIM functions with Push Notifications in the future.

Certificate Management using PowerShell and Lambda Functions

Certificate Management

1. Why Certificate Management is required.

Certificates installed on client machines are one of the critical resources in the client’s infrastructure. Monitoring certificates is critical for any company that wants to successfully provide a Certificate Management service. The process of manually reporting certificate details is tedious and time consuming, so it is better to automate it.

The following document will explain the steps to configure AWS services to provide certificate management for customers with AWS hosted infrastructure.

2. Solution Requirements

Mentioned below are the requirements that should be fulfilled to provide proper certificate management services.

  • The Solution should scan all instances for installed Certificates
  • The solution should be able to generate Reports which include the following information
    • Instance ID of the machine the certificate is installed on
    • Subject of the Certificate
    • Days till expiration
  • The solution should be able to generate an alert if the “days to expire” falls below 30 Days.
  • The alert should be able to create a ticket which can later be worked on to rectify the situation

Note: The requirements for certificate management are prone to change based on the requirements for the clients.

3. Assumptions:

The following assumptions are considered while configuring certificate management:

  • All the EC2 instances are running Windows with PowerShell available.
  • All the EC2 instances have SSM agent installed.
  • The personnel responsible for the configuration of certificate management have some understanding of IAM Roles, S3 buckets and lambda functions

4. Solutions Description:

The following services provided by Amazon will be utilized to provide certificate management

  • Windows PowerShell
  • AWS S3
  • AWS Lambda
  • AWS IAM Roles
  • AWS Systems Manager Maintenance Windows

4.1      Windows PowerShell

PowerShell will be utilized to generate information about the certificates and the instance ID for each instance.

The script below needs to be executed on all instances to generate the certificate information.

Set-Location Cert:\LocalMachine\my #Sets the location
$CertificateDetails = get-childitem # Get all the machine certificates
$instanceId = Invoke-WebRequest -usebasicparsing -Uri http://169.254.169.254/latest/meta-data/instance-id # Gets the instance ID
foreach ($i in $CertificateDetails) # writes the output with certificate Thumbprint , Subject, Instance ID and Expiration Date
{ 
Write-host "Thumbprint=" $i.Thumbprint " Expiration Date="$i.NotAfter " InstanceID ="$instanceID.Content" Subject="$i.Subject 
}

The following output is generated as a result of the PowerShell script.

 

Thumbprint= ****************** Expiration Date= 1/1/2040 10:59:59 AM InstanceID = i-******* Subject= CN=SubjectName Blah Blah Blah
Thumbprint= ****CYXZ********** Expiration Date= 1/1/2040 10:59:59 AM InstanceID = i-******* Subject= CN=SubjectName Blah Blah Blah
Thumbprint= ****************** Expiration Date= 1/1/2040 10:59:59 AM InstanceID = i-******* Subject= CN=SubjectName Blah Blah Blah
Thumbprint= ****************** Expiration Date= 1/1/2040 10:59:59 AM InstanceID = i-******* Subject= CN=SubjectName Blah Blah Blah
Thumbprint= ****************** Expiration Date= 1/1/2040 10:59:59 AM InstanceID = i-******* Subject= CN=SubjectName Blah Blah Blah

Where * represents different values for different certificates.

4.2      AWS S3

The result of the PowerShell script will be posted to an S3 bucket for further use.

The EC2 instances will need write access to the nominated S3 bucket for certificate Maintenance.

S3 Bucket Name: CertificateManagement
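
In this solution the maintenance window Run Command delivers the script output to the bucket. As an alternative hedged sketch, an instance could also push its own report with the AWS Tools for PowerShell, assuming its instance role allows s3:PutObject on the bucket (the key name and report format below are illustrative, not the exact format the Lambda function parses):

#Hedged sketch: build the certificate report locally and upload it to the nominated bucket
$instanceId = (Invoke-WebRequest -UseBasicParsing -Uri http://169.254.169.254/latest/meta-data/instance-id).Content
$report = Get-ChildItem Cert:\LocalMachine\My | ForEach-Object {
    "Thumbprint= $($_.Thumbprint)  Expiration Date= $($_.NotAfter)  InstanceID = $instanceId  Subject= $($_.Subject)"
} | Out-String
Write-S3Object -BucketName CertificateManagement -Key "$instanceId.txt" -Content $report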

4.3      AWS Lambda Functions

Lambda Functions will be used to perform the following activities.

  • Acquire the result of the PowerShell script from the S3 bucket
  • Process the data to determine the Days till expiration of all the certificates
  • Generate a Report
  • Email the report to the relevant recipient

The Lambda Functions would need read access to the S3 bucket and access to AWS SES to send emails to recipients.

The Lambda function below performs the tasks mentioned above.

import boto3
import codecs
import pprint
from datetime import datetime, date, time
def lambda_handler(event,Context):
    s3 = boto3.resource('s3')
    mybucket = s3.Bucket('CertificateManagement')
    result=["this is the result","\n"]
    for file_key in mybucket.objects.all():
        print("\n")
        body = file_key.get()['Body'].read().decode('utf-8')
        output_lines=body.splitlines() #splits data into lines.
        resulthtml = ["<h1>Client Certificate Report</h1>"] # Adds heading to the email body
        resulthtml.append('<html><body><table border="1">') # Creates a table
        resulthtml.append('<tr><b><td>InstanceID</td><td>Thumbprint</td><td>Subject</td><td>Days to Expiration</td></b></tr>')
        try:
            for line in output_lines:
                x=1
                output_word=line.split() # splits every line into words
                cnlist=output_word[10:] # gets the complete subject name and adds into an array
                cnstr=" ".join(cnlist) # joins the array to create a string.
                dt = datetime.strptime(str(output_word[4]), "%m/%d/%Y") # Converts the Date values string into a date format.
                ct=datetime.now()
                difference=dt-ct # Gets the difference between the date value and current date.
                Values = str(difference) # converts the difference into string
                Valuelist=Values.split() # converts the string into arrays.
                Days = Valuelist[0] # selects the first value in the array, which is the number of days.
                result.append(("Certificate with '{}' will expire in '{}'").format(cnstr,difference)) # to display output when testing the script in the lambda section.
                resulthtml.append(("<td>'{}'</td><td>'{}'</td><td>'{}'</td><td>'{}'</td></tr>").format(output_word[9],output_word[1],cnstr,Days)) # for the HTML email to be sent.
            resulthtml.append('</table></body></html>')
        except:
            print("some errors encountered")
        for i in result: # prints the result in the testing window in the Lambda console
            print(i)
    myhtmllist = str(''.join(resulthtml)) # converts all the resulthtml into string to be used in Email.
    myhtmllist = myhtmllist.replace("'","") #removes the "'" from the data. 
    textbody="this is a text body default"
    sender = "syed.naqvi@kloud.com.au"
    recipient = "syed.naqvi@kloud.com.au"
    awsregion = "us-east-1"
    subject = "Certificate Update list"
    charset = "UTF-8"
    client = boto3.client('ses',region_name=awsregion)
    try:
        response = client.send_email(
           Destination={
               'ToAddresses': [
                   recipient,
                ],
            },
         Message={
                  'Body': {
                      'Html': {
                        'Charset': charset,
                        'Data': myhtmllist,
                             },
                    'Text': {
                     'Charset': charset,
                     'Data': textbody,
                    },
                },
                'Subject': {
                    'Charset': charset,
                    'Data': subject,
                },
            },
            Source=sender,
        )
    # Display an error if something goes wrong.
    except Exception as e:
        print( "Error: ", e)
    else:
       print("Email sent!")

4.4      AWS IAM Roles

Roles will be used to grant

  • AWS S3 write access to all the EC2 instances as they will submit the output of the PowerShell script in the S3 bucket
  • AWS SES access to Lambda Functions to send emails to relevant recipients.

4.5      AWS SES

Amazon Simple Email Service (Amazon SES) evolved from the email platform that Amazon.com created to communicate with its own customers. In order to serve its ever-growing global customer base, Amazon.com needed to build an email platform that was flexible, scalable, reliable, and cost-effective. Amazon SES is the result of years of Amazon’s own research, development, and iteration in the areas of sending and receiving email.( Ref. From https://aws.amazon.com/ses/).

We will be utilizing AWS SES to send emails from AWS Lambda.

The configuration of the Lambda function can be modified to send emails to a distribution group to provide certificate reporting, or to send emails to a ticketing system in order to provide alerting and ticket creation when a certificate expiration date crosses a configured threshold.

5. Solution Work Flow

Mentioned below is the workflow for the tasks.

workflow

6. Solution Configuration

6.1 Configure IAM Roles

The following Roles should be configured

  • IAM role for Lambda Function.
  • IAM for EC2 instances for S3 bucket Access

6.1.1 Role for Lambda Function

The Lambda function needs the following access:

  • Read data from the S3 bucket
  • Send emails using Amazon SES

To accomplish the above the following policy should be created and attached to the IAM Role

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1501474857000",
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::S3BucketName/*"
            ]
        },
        {
            "Sid": "Stmt1501474895000",
            "Effect": "Allow",
            "Action": [
                "ses:SendEmail"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}

6.1.2  Role for EC2 instance

All EC2 instances should have access to store the PowerShell output in the S3 bucket.

To accomplish the above, the following policy should be assigned to the EC2 role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1501475224000",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::CertificateManagement/*"
            ]
        }
    ]
}

6.2 Configure Maintenance Window.

The following tasks need to be performed for the maintenance window

  • Register a Run Command task using the AWS-RunPowerShellScript document with the script in section 4.1
  • Register targets based on the requirements
  • Select the schedule based on your requirement

Maintenance Window Ref : 

http://docs.aws.amazon.com/systems-manager/latest/userguide/what-is-systems-manager.html

6.3 Configure Lambda Function:

The following tasks need to be performed for the Lambda Function

  • Create a blank Lambda function with the S3 PUT event as the trigger
  • Click on Next
  • Enter the Name and Description
  • Select runtime Python 3.6
  • Copy and paste the lambda function mentioned in section 4.3

6.4 Configuring AWS SES

The following tasks need to be completed before the execution of the Run-commands.

  • Email Addresses should be added to the AWS SES section of the tenant.
  • The email addresses should be verified.

 7. Result:

Based on the above configuration, whenever the run command is executed, the following report is generated and sent to the nominated email account.

report

8. Next Steps:

We can modify the script to only report on certificates expiring in 45 days or less.

We can also use SNS instead of SES to send out SMS to relevant teams as well.

That will be covered in another blog.

Creating an AzureAD WebApp using PowerShell to leverage Certificate Based Authentication

Introduction

Previously I’ve posted about using PowerShell to access the Microsoft AzureAD/Graph API in a number of different ways. Two such examples I’ve listed below. The first uses a Username and Password method for Authentication, whilst the second uses a registered application and therefore ClientID and Client Secret.

As time has gone on, I have accumulated numerous WebApps doing all sorts of automation. However, they all rely on accounts with a username and password, or a ClientID and secret, where the passwords and secrets expire. Granted, the secrets have a couple of years of life and are better than passwords, which, depending on the environment, roll every 30-45 days.

However using Certificates would allow for a script that is part of an automated process to run for much longer than the key lifetime available for WebApps and definitely longer than passwords. Obviously there is security around the certificate to be considered so do keep that in mind.

Overview

This post is going to detail a couple of simple but versatile scripts;

  1. Using PowerShell we will;
    1.  Configure AzureAD
      1. Create a Self Signed 10yr Certificate
      2. Create an AzureAD WebApp and assign the Certificate to it
      3. Apply permissions to the WebApp (this is manual via the Azure Portal)
      4. Record the key parameters for use in the second script
    2. Connect to AzureAD using our Certificate and new WebApp

Creating the AzureAD WebApp, Self Signed Certificate and Assigning Application Permissions

The script below does everything required. Run it line by line, or in small chunks as you step through the process. You will need the AzureRM and Azure AD Utils Powershell Modules installed on the machine you run this script on.

Change;

  • Lines 3 & 4 if you want a certificate with a time-frame other than 10yrs
  • Line 5 for the password you want associated with the certificate for exporting/importing the private key
  • Line 6 for the certificate subject name and location it’ll be stored
  • Line 8 for a valid location to export it too
  • Line 11 for the same path as provided in Line 8
  • Lines 24 & 25 for an account to automatically connect to AAD with
  • Line 31 for the name of your WebApp

Before running line 37, log in to the Azure Portal and assign permissions to the WebApp, e.g. AzureAD Directory Permissions. When you then run line 37, a GUI for AuthN and AuthZ will be presented. Sign in as an Admin and accept the OAuth2 Permission Authorizations for whatever you have requested on the WebApp.

e.g. Graph API Read/Write Permissions
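
As a hedged sketch of the key steps the configuration script performs (the certificate subject, password, export path and app names below are placeholders, not the values from my script):

#Create a self-signed certificate valid for 10 years (placeholder subject and password)
$certSubject = "CN=AzureADWebAppCert"
$certPassword = ConvertTo-SecureString -String "YourStrongPassword!" -Force -AsPlainText
$cert = New-SelfSignedCertificate -CertStoreLocation "Cert:\CurrentUser\My" -Subject $certSubject -KeySpec KeyExchange -NotAfter (Get-Date).AddYears(10)
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\AzureADWebAppCert.pfx" -Password $certPassword

#Create the AzureAD WebApp and attach the certificate as its key credential (placeholder names)
$keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
$app = New-AzureRmADApplication -DisplayName "MyAutomationWebApp" -IdentifierUris "https://myautomationwebapp" -CertValue $keyValue -StartDate $cert.NotBefore -EndDate $cert.NotAfter
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

#Record the key parameters needed later to connect
$app.ApplicationId
$cert.Thumbprint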

Connecting to AzureAD using our Certificate and new WebApp

Update lines 3, 4, 6 and 7 as you step through lines 40-43 from the configuration script above which copies key configuration settings to the clipboard.

The following script then gets our certificate out of the local store and takes the Tenant and WebApp parameters and passes them to Connect-AzureAD in Line 15 which will connect you to AAD and allow you to run AzureAD cmdlets.

If you wish to go direct to the GraphAPI, lines 20 and 23 show leveraging the AzureADUtils Module to connect to AzureAD via the GraphAPI.
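
A hedged sketch of that connection (the tenant ID, application ID and certificate subject below are placeholders you would swap for the values recorded from the configuration script) looks like this:

#Placeholders recorded from the configuration script
$tenantId = "your-tenant-id-guid"
$applicationId = "your-webapp-application-id-guid"
$certSubject = "CN=AzureADWebAppCert"

#Get the certificate from the local store and connect to AzureAD with it
$cert = Get-ChildItem -Path Cert:\CurrentUser\My | Where-Object { $_.Subject -eq $certSubject }
Connect-AzureAD -TenantId $tenantId -ApplicationId $applicationId -CertificateThumbprint $cert.Thumbprint
#Run an AzureAD cmdlet to confirm the connection
Get-AzureADUser -Top 5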

Notes on creating your Self-Signed Certificate in PowerShell

I’m using the PowerShell New-SelfSignedCertificate cmdlet to create the self-signed certificate. If, when you run New-SelfSignedCertificate, you get the error shown below, make sure you have Windows Management Framework 5.1; and if you don’t have Visual Studio or the Windows 8.1/10 SDK, get the Windows 8.1 SDK from here and just install the base SDK as shown further below.

Once the install is complete copy C:\Program Files (x86)\Windows Kits\8.1\bin\x86\makecert.exe to C:\windows\system32

Summary

The two scripts above show how, using PowerShell, we can quickly create a self-signed certificate, create an Azure AD WebApp and grant it some permissions. Then, using a small PowerShell script, we can connect and query AAD/GraphAPI using our certificate and not be concerned about passwords or keys expiring for 10 years (in this example; it can be any timeframe you wish).