Office 365 & Windows 10 Profile Pictures

At a customer recently, we were asked if we could provide a non-technical way of controlling profile pictures for both Office 365 and Windows 10. So straight away I thought: time for some PowerShell. I came up with the solution of having a number of shares on a server, which can be permissioned as required…

.\Source – for adding images
.\Replace – if a user wanted to change their picture
.\Remove – if a user opted out of the profile picture setup

As this was a new setup, I requested that they name the images UPN.jpg, as that way I could grab the username during the process. The source images were going to be the ones used for ID badges, so they came from a high-resolution camera. I then cracked on and uploaded to Office 365 and it looked the business; here's me thinking this is a piece of cake. However, as soon as I tried to resize and use these for Windows 10, well, you guessed it, pixel horribleness. It turns out Office 365 applies some nice squaring technology to the uploaded images.

Not to be deterred, I had a play with some PowerShell functions I found on the internet to resize images and while they worked, I just wasn’t happy with the dimensions. Then, bing, the lightbulb went off, let’s use Office 365 to resize the images for me!

So basically, if I resized the image to 767×767 using a PowerShell function (which worked well for my source images), I could upload this to Office 365. Then, using the OWA photo service URL, you can download the squared image at set resolutions. In my case I wanted 96×96, as that's the size I was after for the thumbnailPhoto attribute in AD. So I plumbed the UserPrincipalName and ImageSize into the URL using variables and grabbed the image ala …"+$UserPrincipalName+"&size=HR"+$ImageSize

I’ve provided part of the script below. This essentially followed the logic of…

    • A user’s profile picture should be added to the .\Source share
    • This picture should be named with the user's email address, for example UPN.jpg
    • The nightly schedule will then poll that directory for all pictures and obtain the username from each file
    • It will resize the source image to 767×767 and then upload this to Office 365
    • It will then download the Office 365 image in 96×96 format, taking advantage of the squaring process within Office 365, and save it in preparation for on-premises Active Directory
    • Finally, if the user does not have an Active Directory picture set, it will then write this image to the thumbnailPhoto attribute ready for the logon script to download
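The script below calls a ResizeImage helper which isn't shown; a minimal sketch of one, adapted from the sort of System.Drawing functions mentioned earlier (the parameter order matches the call in the script), might look like this:

```powershell
# Sketch of a ResizeImage helper using System.Drawing.
# Parameters: source path, target width, target height, output path.
Function ResizeImage($SourcePath, $Width, $Height, $OutputPath) {
    Add-Type -AssemblyName System.Drawing
    $Source   = [System.Drawing.Image]::FromFile($SourcePath)
    $Target   = New-Object System.Drawing.Bitmap($Width, $Height)
    $Graphics = [System.Drawing.Graphics]::FromImage($Target)
    # High-quality bicubic interpolation keeps faces from going blocky
    $Graphics.InterpolationMode = [System.Drawing.Drawing2D.InterpolationMode]::HighQualityBicubic
    $Graphics.DrawImage($Source, 0, 0, $Width, $Height)
    $Target.Save($OutputPath, [System.Drawing.Imaging.ImageFormat]::Jpeg)
    $Graphics.Dispose(); $Target.Dispose(); $Source.Dispose()
}
```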
log "INFO" 100 "Grabbing all images from the Source Directory"
$OriginalImages = Get-ChildItem .\Source -Filter "*.jpg" -Recurse
$OriginalImagesCount = $OriginalImages.Count
log "INFO" 100 "Found $OriginalImagesCount images"
$ImageUserCounter = 0

foreach ($ImageUser in $OriginalImages) {
    $ImageUserCounter++
    $UserPrincipalName = $null
    try {
        $UserPrincipalName = [io.path]::GetFileNameWithoutExtension($ImageUser.Name)
        log "INFO" 100 "Grabbing the UserPrincipalName for $UserPrincipalName. $ImageUserCounter/$OriginalImagesCount"
    }
    catch {
        log "WARNING" 100 "Unable to grab the UserPrincipalName for $ImageUserCounter/$OriginalImagesCount. The value of the variable is: $UserPrincipalName. Server said: $_"
    }

    try {
        log "INFO" 100 "Resizing $UserPrincipalName Office365 image"
        $SkypeOutput = $null
        $SkypeOutput = $SkypeWorkingDir + "\$UserPrincipalName 767x767.jpg"
        ResizeImage $ImageUser.VersionInfo.FileName 767 767 $SkypeOutput
    }
    catch {
        log "WARNING" 100 "Unable to resize $UserPrincipalName Office365 image, server said: $_"
    }

    try {
        log "INFO" 100 "Setting $UserPrincipalName Office365 picture"
        Set-o365UserPhoto -Identity $UserPrincipalName -PictureData ([System.IO.File]::ReadAllBytes($SkypeOutput)) -Confirm:$false -ErrorAction Stop
    }
    catch {
        log "WARNING" 100 "Unable to set $UserPrincipalName's Office365 image, server said: $_"
    }

    try {
        log "INFO" 100 "Resizing $UserPrincipalName AD image"
        $ImageSize = "96x96"
        # The OWA GetPersonaPhoto service hands back the squared image at the requested size
        $DownloadURL = "https://outlook.office365.com/owa/service.svc/s/GetPersonaPhoto?email=" + $UserPrincipalName + "&size=HR" + $ImageSize
        $DownloadPath = $ADWorkingDir + "\" + $UserPrincipalName + " " + $ImageSize + ".jpg"

        $WebClient = New-Object System.Net.WebClient
        $WebClient.Credentials = New-Object System.Net.NetworkCredential($adminuser, $adminpwd)
        $WebClient.DownloadFile($DownloadURL, $DownloadPath)
        $ADPhoto = [System.IO.File]::ReadAllBytes($DownloadPath)
    }
    catch {
        log "WARNING" 100 "Unable to resize $UserPrincipalName's AD image, server said: $_"
    }

    try {
        $ADUser = $null
        $ADUser = Get-ADUser -Filter {UserPrincipalName -eq $UserPrincipalName} -Properties * -ErrorAction Stop
        if ($ADUser.thumbnailPhoto -eq $null) {
            log "INFO" 100 "$UserPrincipalName does not currently have an AD picture, setting this now"
            Set-ADUser $ADUser -Replace @{thumbnailPhoto=$ADPhoto} -ErrorAction Stop
        }
        else {
            log "INFO" 100 "$UserPrincipalName already has a picture set in AD, skipping this section"
        }
    }
    catch {
        log "WARNING" 100 "Unable to set $UserPrincipalName's AD image, server said: $_"
    }
}
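The log function used throughout isn't shown either; a stand-in could be as simple as the below (the log file name is my own invention):

```powershell
# Hypothetical stand-in for the log helper - writes a timestamped line
# both to screen and to a log file
Function log($Level, $EventId, $Message) {
    $Line = "{0} [{1}] ({2}) {3}" -f (Get-Date -Format s), $Level, $EventId, $Message
    $Line | Tee-Object -FilePath .\ProfilePictures.log -Append
}
```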

A quick note on a gotcha I found as well, you need to change the PSSession string to include a proxy method which is described here ala…

$ProxyOption = New-PSSessionOption -ProxyAccessType IEConfig
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential $credential -Authentication Basic -AllowRedirection -SessionOption $ProxyOption

And there you go – this approach not only uploaded photos to Office 365, but took advantage of their image manipulation and re-used it for on-premises applications such as AD. Enjoy!

Office 365 Import Service via PowerShell

UPDATE 10/02/2017

Ok, so sorry everyone, I’ve been a bit slack with this one and Microsoft have made some significant changes in this space since I blogged on it. I thought it best to get this page updated so anyone who googled it would have current info!

Firstly, Microsoft have changed the BLOB they give you for the ingestion service to write-once. This of course means if you don't place things in the right location (a folder, for example) it's not going anywhere! While a little inconvenient, I think it's still thoroughly acceptable; they are still giving you unlimited BLOB storage at no cost to upload and ingest your PSTs. While I have read other blogs which talk about signing up for your own BLOB (which will work really well with my original blog listing), I'm not sure it's really necessary for the majority of customers as it will incur cost.

Also, I found that the old 'Azure Storage Explorer 6' wasn't able to interact with the new BLOB, so you couldn't visualise the uploads with a GUI. Well, they have released a new version in preview (with the '6' dropped) which can be downloaded here instead – oh, and it's really straightforward to use!

Finally, Microsoft now recommend using the GUI and I must admit, overall, it works pretty well. I used it for a recent project and found it far superior to what it was originally. In the end I automated the AzCopy commands and the CSV generation, but then loaded the CSV and ran the import via the GUI. If anyone wants to see some examples, leave some comments below and I'll write a new blog on it in the coming weeks! Original blog below…

If you’ve been involved in an Exchange migration to Office 365 of late, I’m sure you’ll be well aware of the new Microsoft Office 365 Import Service. Simply put, give them your PSTs via hard disk or over the network, and they’ll happily ingest them into the mailbox of your choice – what’s not to like! If you’ve been hiding under a rock, the details are available here!

Anyway, rather than marvelling at the new Microsoft offering, I thought I’d share my recent experience with the network route option.  When I first started playing [read testing] I soon noticed that the Portal GUI was, well, struggling to keep up with the requests.  So instead, I reached out to MS and found that this is all possible programmatically using PowerShell!

So first of all, you want to get the files into an Azure Blob.  That’s achieved via AzCopy which is available on this link.  Then you’ll want to grab the Azure Storage Explorer, granted, you could probably do this all via PowerShell, but when there is a good GUI sometimes it’s just easier, it’s available here.

When you fire up the Storage Explorer, you’re looking for the Storage Key. This is available via the Portal | Admin | Import. From there, you’ll see a key icon which is referred to as ‘Copy Office 365 Storage Key and URL’. Once you’re connected you should see some default folders which I have screenshotted below…

Blob Storage

Select one and click Security, select the Shared Access Key tab and generate a Signature. So now we’ve got the necessary items, we can plumb them into variables as I’ve demonstrated below…

cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"

$Source = "\\Server\Share"
$Destination = "https://55f6dd****"
$StorageKey = "9XAsHc/OmkZgdrSXkE+***+njq9qQWvt*******rXysz//CvB2j6mlb6XSJgi4bfh+Br6Onn*****AEVtA=="
$LogFile = "C:\Users\dyoung\Documents\Uploads.log"

& .\AzCopy.exe /Source:$Source /Dest:$Destination /DestKey:$StorageKey /S /V:$LogFile

This will grab all the PSTs from the \\Server\Share UNC path and look to upload them to the ingestiondata directory. There is a log file which is pretty useful, and the PowerShell window will let you know once the upload is complete.

Now we’ve got our PSTs up in Azure, but we need to let MS know where to ingest them. For this, we need to generate a CSV. When I was creating the PST names I found it good to name them ADDisplayName.pst, simply so we can automate the CSV creation. I also added incremental numbers at the end of each file, as it’s possible to import multiple PSTs into a single mailbox…

e.g. ‘Dave Young.pst‘, ‘Dave Young1.pst‘ and ‘Dave Young2.pst‘ would all be imported into Dave Young’s mailbox.
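That trailing-number convention can be stripped back to the display name with a simple regex; a quick sketch (this also handles double digits, should a user have that many PSTs):

```powershell
# Strip any trailing digits from the PST base name to recover the display name
'Dave Young', 'Dave Young1', 'Dave Young12' | ForEach-Object {
    $_ -replace '\d+$'
}
# All three produce 'Dave Young'
```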

I also decided to import to the user’s archive mailbox (as I’ve already given the users an archive) and I wanted the PSTs in the root directory… these options are all covered in the TechNet article here

$FilePath = "/"

$AllPSTs = Get-ChildItem '\\Server\Share' -Filter "*.pst"

$CSVTable = @()
for ($Cycle = 0; $Cycle -lt $AllPSTs.Count; $Cycle++) {
    $DisplayName = [io.path]::GetFileNameWithoutExtension($AllPSTs[$Cycle])
    # Strip any trailing digits so 'Dave Young1' resolves to Dave Young's mailbox
    if ($DisplayName -match '\d+$') {
        $DisplayName = $DisplayName -replace '\d+$'
    }
    $SMTPAddress = $null
    $Mailbox = Get-Mailbox $DisplayName
    $SMTPAddress = $Mailbox.PrimarySmtpAddress.ToString()
    $SMTPAddress = $SMTPAddress.ToLower()
    $AzureBSAccountUri = 'https://55f6ddf********' + $AllPSTs[$Cycle].Name
    $CSVColumn = New-Object PSObject
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'Workload' -Value 'Exchange'
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'FilePath' -Value $FilePath
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'Name' -Value $AllPSTs[$Cycle].Name
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'Mailbox' -Value $Mailbox.UserPrincipalName
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'AzureSASToken' -Value '?sv=2014-02-14&sr=c&sig=2dSSbF9CsQ******HpwHOJy4pmGFvsniv46hZh%2BA%3D&st=2016-02-01T13%3A30%3A00Z&se=2016-02-09T13%3A30%3A00Z&sp=r'
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'AzureBlobStorageAccountUri' -Value $AzureBSAccountUri
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'IsArchive' -Value 'TRUE'
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'TargetRootFolder' -Value '/'
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'SPFileContainer' -Value ''
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'SPManifestContainer' -Value ''
    Add-Member -InputObject $CSVColumn -MemberType NoteProperty -Name 'SPSiteUrl' -Value ''
    $CSVTable += $CSVColumn
}

$CSVTable | Export-Csv "PstImportMappingFile-$((Get-Date).ToString("hh-mm-dd-MM-yy")).csv" -NoTypeInformation

Hopefully the output of this script is a pretty decent csv.  Now it’s time to create the import jobs.  You might be thinking ‘there is no New-o365MailboxImportRequest cmdlet’ and you’d be right.  I’ve used a prefix so I could load both on-premises and cloud cmdlets into the same PowerShell session; it’s achieved by ‘Import-PSSession $Session -Prefix o365′, which is described in point 2 here
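For reference, loading the cloud session with a prefix looks something like this (a sketch, where $Session is your Exchange Online PSSession from earlier):

```powershell
# Load the Exchange Online cmdlets with an 'o365' prefix so they can live
# alongside the on-premises cmdlets in the same PowerShell session
Import-PSSession $Session -Prefix o365 -AllowClobber

# On-premises: Get-Mailbox, New-MailboxImportRequest
# Cloud:       Get-o365Mailbox, New-o365MailboxImportRequest
```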

Add-Type -AssemblyName System.Windows.Forms
$FileBrowser = New-Object System.Windows.Forms.OpenFileDialog -Property @{
    InitialDirectory = [Environment]::GetFolderPath('MyDocuments')
    Filter = 'CSV (*.csv)|*.csv'
}
$null = $FileBrowser.ShowDialog()

Import-Csv $FileBrowser.FileName | foreach {New-o365MailboxImportRequest -Mailbox $_.Mailbox -AzureBlobStorageAccountUri $_.AzureBlobStorageAccountUri -BadItemLimit unlimited -AcceptLargeDataLoss -AzureSharedAccessSignatureToken $_.AzureSASToken -TargetRootFolder $_.TargetRootFolder -IsArchive}

These jobs run in the cloud and can of course be checked using the cmdlets ‘Get-MailboxImportRequest’ & ‘Get-MailboxImportRequestStatistics’ (prefixed with o365 if you’ve loaded your session that way).
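If you want a rough progress view while the jobs run, something along these lines works (assuming the o365 prefix from earlier):

```powershell
# List each import request with its status and percentage complete
Get-o365MailboxImportRequest |
    Get-o365MailboxImportRequestStatistics |
    Select-Object Identity, Status, PercentComplete
```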

And that’s pretty much it, yes it was quick but I reckon it’s pretty self explanatory and it’s considerably more efficient than doing it via the GUI – Enjoy!


Programmatically interacting with Yammer via PowerShell – Part 2

In my last post I foolishly said that part 2 would be ‘coming in the next few days’. This of course didn’t happen, but I guess it’s better late than never!

In part 1 which is available here, I wrote how it was possible to post to a Yammer group via a *.ps1 using a ‘Yammer Verified Admin’ account. While this worked a treat, it soon became apparent that this approach had limited productivity rewards. Instead, I wanted to create groups and add users to these groups, all while providing minimal inputs.

Firstly, there isn’t a documented create-group json, but a quick hunt round the tinterweb with Google helped me uncover groups.json. This simply needs a name and whether it’s open or closed: open = $false, closed = $true. So building on my example from Part 1, the below code should create a new group…

$clientID = "fvIPx********GoqqV4A"
$clientsecret = "5bYh6vvDTomAJ********RmrX7RzKL0oc0MJobrwnc"
$Token = "AyD********NB65i2LidQ"
$Group = "Posting to Yammer Group"
$GroupType = $True
# Escape the group name so spaces survive the query string
$CreateGroupUri = "https://www.yammer.com/api/v1/groups.json?name=" + [uri]::EscapeDataString($Group) + "&private=$GroupType"

$Headers = @{
    "Accept" = "*/*"
    "Authorization" = "Bearer " + $Token
    "accept-encoding" = "gzip"
    "content-type" = "application/json"
}

Invoke-WebRequest -Method POST -Uri $CreateGroupUri -Headers $Headers
    You’ll notice I’ve moved away from Invoke-RestMethod to Invoke-WebRequest. This is due to finding a bug where the script would hang and eventually time out, which is detailed in this link.

All going well, you should end up with a new group which has your ‘Yammer Verified Admin’ as the sole member ala…


Created Yammer Group

Great, but as I’ve just highlighted, there is only one person in that group, and that’s the admin account we’ve been using. To add other Yammer registered users to the group we need to impersonate. This is only possible via a ‘Yammer Verified Admin’ account for obvious chaos avoiding reasons. So firstly you need to grab the token of the user…

$GetUsersUri = "https://www.yammer.com/api/v1/users.json"
$YammerUPN = ""  # UPN of the user to impersonate
$YammerUsers = (Invoke-WebRequest -Uri $GetUsersUri -Method Get -Headers $Headers).content | ConvertFrom-Json

foreach ($YammerUser in $YammerUsers) {
    if ($YammerUser.email -eq $YammerUPN) {
        $YammerUserId = $YammerUser.id
    }
}

$GetUserTokUri = "https://www.yammer.com/api/v1/oauth/tokens.json?user_id=$YammerUserId&consumer_key=$clientID"
$YammerUserDave = (Invoke-WebRequest -Uri $GetUserTokUri -Method Get -Headers $Headers).content | ConvertFrom-Json

To step you through the code: I’ve changed the URI to users.json, provided the UPN of the user that I want to impersonate, and I’m using the headers from the previously provided code. I grab all the users into the $YammerUsers variable and then use a foreach/if to obtain the id of the user. Now we’ve got that, we can use tokens.json to perform a Get request. This will bring back a lot of information about the user, but most importantly you’ll get the token!

    user_id : 154**24726
    network_id : 20**148
    network_permalink :
    network_name :
    token : 18Lz3********Nu0JlvXYA
    secret : Wn9ab********kellNnQgvSfbGJjBfRMWZNICW0JTA
    view_members : True
    view_groups : True
    view_messages : True
    view_subscriptions : True
    modify_subscriptions : True
    modify_messages : True
    view_tags : True
    created_at : 2015/06/15 23:59:19 +0000
    authorized_at : 2015/06/15 23:59:19 +0000
    expires_at :

Storing this in the $UserToken variable allows you to append it to the Authorization header so you can impersonate/authenticate on behalf of the user. The code looks like…


$UserToken = $YammerUserDave.token
$YammerGroupId = "61***91"

$UserHeaders = @{
    "Accept" = "*/*"
    "Authorization" = "Bearer " + $UserToken
    "accept-encoding" = "gzip"
    "content-type" = "application/json"
}

$PostGroupUri = "https://www.yammer.com/api/v1/group_memberships.json?group_id=$YammerGroupId"
$AddYammerUser = Invoke-WebRequest -Uri $PostGroupUri -Method Post -Headers $UserHeaders

So using the group that we created earlier and the correct variables we then successfully add the user to the group…


Dave in the group

Something to be mindful of: when you pull the groups or the users, it will be done in pages of 50. I found a Do/While worked nicely to build up the variables so they could then be queried, like this…

If ($YammerGroups.Count -eq 50) {
    $GroupCycle = 1
    Do {
        $GetMoreGroupsUri = "https://www.yammer.com/api/v1/groups.json?page=$GroupCycle"
        $MoreYammerGroups = (Invoke-WebRequest -Uri $GetMoreGroupsUri -Method Get -Headers $AdminHeaders).content | ConvertFrom-Json
        $YammerGroups += $MoreYammerGroups
        $GroupCycle++
        $GroupCount = $YammerGroups.Count
    } While ($MoreYammerGroups.Count -gt 0)
}

Once you’ve got your head around this, the rest of the jsons on the REST API are really quite useful. My only gripe right now is that they are missing a delete-group json – hopefully it’ll be out soon!



Programmatically interacting with Yammer via PowerShell – Part 1

For my latest project I was asked to automate some Yammer activity. I’m first to concede that I don’t have much of a Dev background, but I instantly fired up PowerShell ISE in tandem with Google only to find…well not a lot! After a couple of weeks fighting with a steep learning curve, I thought it best to blog my findings, it’s good to share ‘n all that!

    It’s worth mentioning at the outset, if you want to test this out you’ll need an E3 Office 365 Trial and a custom domain. It’s possible to trial Yammer, but not with the default *.onmicrosoft.com domain.

First things first, there isn’t a PowerShell Module for Yammer. I suspect it’s on the todo list over in Redmond since their 2012 acquisition. So instead, the REST API is our interaction point. There is some very useful documentation along with examples of the json queries over at the Yammer developer site, linked here.

The site also covers the basics of how to interact using the REST API. Following the instructions, you’ll want to register your own application. This is covered perfectly in the link here.

When registering you’ll need to provide an Expected Redirect. For this I simply put my Yammer site address. For the purposes of my testing, I’ve not had any issues with this setting. This URL is important and you’ll need it later, so make sure to take a note of it. From the registration you should also grab your Client ID & Client secret.

While we’ve got what appears to be the necessary tools to authenticate, we actually need to follow some steps to retrieve our Admin token.

    It is key to point out that I use the Yammer Verified Admin. This will be more critical to follow for Part 2 of my post, but it’s always good to start as you mean to go on!

So the following script will load Internet Explorer and compile the necessary URL. You will of course simply change the entries in the variables to the ones you created during your app registration. I have obfuscated some of the details in my examples, for obvious reasons 🙂

$clientID = "fvIPx********GoqqV4A"
$clientsecret = "5bYh6vvDTomAJ********RmrX7RzKL0oc0MJobrwnc"
$RedirURL = ""  # your Expected Redirect from the app registration

$ie = New-Object -ComObject internetexplorer.application
$ie.Visible = $true
$ie.Navigate("https://www.yammer.com/dialog/oauth?client_id=$clientID&redirect_uri=$RedirURL")

From the IE Window, you should login with your Yammer Verified Admin and authorise the app. Once logged in, proceed to this additional code…

$UrlTidy = $ie.LocationURL -match 'code=(......................)'; $Authcode = $Matches[1]
$ie = New-Object -ComObject internetexplorer.application
$ie.Visible = $true
$ie.Navigate("https://www.yammer.com/oauth2/access_token.json?client_id=$clientID&client_secret=$clientsecret&code=$Authcode")

This script simply captures the code in the 302 return and extracts the $Authcode, which is required for the token request. It will then launch an additional Internet Explorer session and prompt you to download an access_token.json. Within here you will find your Admin Token, which does not expire and can be used for all further admin tasks. I found it useful to load this into a variable using the code below…

$Openjson = $(Get-Content 'C:\Tokens\access_token.json' ) -join "`n" | ConvertFrom-Json
$token = $Openjson.access_token.token

Ok, so we seem to be getting somewhere, but our Yammer page is still looking rather empty! Well, now all the prerequisites are complete, we can make our first post. A good example json to use is posting a message, which is detailed in the link here.

I started with this one mainly because all we need is the Group_ID of one of the groups in Yammer and the message body in json format. I created a group manually and then just grabbed the Group_ID from the end of the URL in my browser. I have provided an example below…

$uri = "https://www.yammer.com/api/v1/messages.json"

$Payloadjson = '{
"body": "Posting to Yammer!",
"group_id": 59***60
}'

$Headers = @{
"Accept" = "*/*"
"Authorization" = "Bearer "+$token
"accept-encoding" = "gzip"
}

Invoke-RestMethod -Method Post -Uri $uri -Headers $Headers -Body $Payloadjson

Yammer Post

    It’s at this stage you’ll notice that I’ve only used my second cmdlet, Invoke-RestMethod. Both this and ConvertFrom-Json were introduced in PowerShell 3.0 and are specifically designed for REST web services like this.

A key point to highlight here is the Authorization attribute in the $Headers. This is where the $Token is passed to Yammer for authentication. Furthermore, this $Headers construct is all you need going forward. It’s simply a case of changing the -Method, the $uri and the $Payload, and you can play around with all the different json queries listed on the Yammer site.
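To illustrate that point, the whole pattern can be wrapped in a small helper. This is a hypothetical convenience function of my own rather than anything from the Yammer documentation:

```powershell
# Hypothetical wrapper - swap the method, URI and payload to hit any of the
# documented Yammer REST endpoints with the same admin token
Function Invoke-Yammer {
    param($Method, $Uri, $Payload)
    $Headers = @{
        "Accept"          = "*/*"
        "Authorization"   = "Bearer " + $token
        "accept-encoding" = "gzip"
    }
    Invoke-RestMethod -Method $Method -Uri $Uri -Headers $Headers -Body $Payload
}

# e.g. post the same message as above
Invoke-Yammer -Method Post -Uri "https://www.yammer.com/api/v1/messages.json" -Payload $Payloadjson
```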

While this was useful for me, it soon became apparent that I wanted to perform actions on behalf of other users. This is something I’ll look to cover in Part 2 of this Blog, coming in the next few days!