Azure Logic App – Evaluating IF condition with the help of JSON expression by passing null

Introduction

Yes, you read the title right: this blog is about evaluating an IF condition. You might be wondering what is special about IF – even a novice developer knows how it works.

Allow me to explain a specific scenario that helps us understand its behavior in Logic Apps – it might blow your mind.

Many of us come from years of development experience and, when we skill up on other technologies, we carry across the mindset and programming habits we gained over those years. When we approach client requirements with that background, we expect the code to work with a certain flow – and this is where the rules get broken when using the IF condition in Azure Logic Apps.

Understanding JSON expression

The json() expression converts a string into a JSON object, using the syntax shown below.

json('{"Person":{"Name": "Simpson"}}') evaluates to a JSON object, so json('{"Person":{"Name": "Simpson"}}')['Person']['Name'] returns Simpson.

But json(null) throws an error (important) – avoid passing null to it wherever possible.

Understanding IF condition

IF doesn't need any special introduction; we know how it works. It takes a condition and two branches, evaluates the condition, and falls into one of the branches. The Logic Apps syntax for it is below.

@if(<condition>, <value-if-true>, <value-if-false>)

To understand IF better, let's also look at @equals(): a simple expression that returns true or false by comparing the given input with the provided value.

Example 1

Below is just an example – please excuse the trivial equality condition.
@if(equals(1,1),"true1","false1")
Result: true1

Example 2

@if(equals(1,2),"true1","false1")
Result: false1

Now, let us take our person JSON and understand it.
@if(equals(1,1),"Merge",json('{"Person":{"Name": "Homer"}}')['Person']['Name'])
Result: Merge

and similarly when the comparison is not equal

@if(equals(1,2),"Merge",json('{"Person":{"Name": "Homer"}}')['Person']['Name'])
Result: Homer

Now, recall that IF normally falls into just one of the branches and returns that result. In Azure Logic Apps, however, it evaluates both branches and then returns the result of the branch it falls into.

Here is the proof

For example, if I write the expression below, it should return "Merge" – but it actually throws an error. This is the current Logic Apps behavior.

@if(equals(1,1),"Merge",json(null)['Name'])
Result: error

And similarly when not equal

@if(equals(1,2),"Merge",json(null)['Name'])
Result: error

The above examples confirm that a Logic App evaluates both branches and returns one.

The actual error thrown by a real Logic App is below:

InvalidTemplate. Unable to process template language expressions in action ‘Compose’ inputs at line ‘1’ and column ‘1525’: ‘The template language function ‘json’ expects its parameter to be a string or an XML. The provided value is of type ‘Null’. Please see https://aka.ms/logicexpressions#json for usage details.’.
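Since both branches are always evaluated, the practical fix is to make sure json() can never receive null in either branch. One way to do that – a sketch assuming the standard coalesce() expression, which returns its first non-null argument – is to substitute a fallback string before json() ever sees the value:

@if(equals(1,1),"Merge",json(coalesce(null,'{"Person":{"Name": "Homer"}}'))['Person']['Name'])
Result: Merge

Here coalesce() swaps the null for a valid JSON string, so both branches evaluate cleanly and the IF returns the result of the branch it falls into.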

Adding Bot to Microsoft Teams

If you have been following my previous blog posts about Bots and integrating LUIS with them, you are almost done building bots and have already had some fun with them. Now it's time to bring them to life and let internal or external users interact with the Bot via a front end channel they can access. If you haven't read my previous posts on the subject yet, please give them a read at Creating a Bot and Creating a LUIS app before reading further.

In this blog post, we will integrate our previously created intelligent Bot into a Microsoft Teams channel. Following the step-by-step process below, you can add your bot to an MS Teams channel.

Bringing Bot to Life

  1. As a first step, you need to create a package as outlined here and build a manifest as per the schema listed here. The package will include your Bot logos and a manifest file as shown below (a trimmed example manifest appears after this list).

  2. Once the manifest file is created, zip it together with the logos, as shown above, to make a package (*.zip).
  3. Open the Microsoft Teams interface, select the team you want to add the Bot to, and go to the Manage team section as highlighted below.

  4. Click on Bots tab, and then select Sideload a bot as highlighted and upload your previously created zip file

  5. Once successful, it will show the bot that you have just added to your selected team as shown below.

  6. If everything went well, your Bot is now ready and available in the team's conversation window to interact with. While addressing the Bot, you need to start with @BotName to direct messages to it, as shown below.

  7. Based on the configuration you provided in the manifest file, your command list will be available against your Bot name.

  8. Now you can ask your Bot the questions you have trained your LUIS app with, and it will respond as programmed.

  9. You just need to ensure your Bot is programmed to respond to the possible questions your end users may ask.

  10. You can program a bot to acknowledge the user first and then respond in detail to the user's question. If the response contains multiple records, you can present them using cards as shown below.

  11. Or if a response requires some additional actions, you can have a link or a button to launch a URL directly from your team conversation.

  12. Besides adding Bot to teams, you can add tabs to a team as well which can show any SPA (single page application) or even a dashboard built as per your needs. Below is just an example of what can be achieved using tabs inside MS Teams.
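As a reference for step 1 above, here is a trimmed sketch of what a bot manifest might look like. Treat it as illustrative only – the field names are drawn from the public Teams manifest schema referenced in step 1, all values are placeholders, and the schema evolves, so always validate your package against the current schema.

{
  "manifestVersion": "1.0",
  "version": "1.0.0",
  "id": "<your-bot-app-guid>",
  "packageName": "com.example.teamsbot",
  "developer": {
    "name": "Example Co",
    "websiteUrl": "https://example.com",
    "privacyUrl": "https://example.com/privacy",
    "termsOfUseUrl": "https://example.com/terms"
  },
  "name": "HelpdeskBot",
  "description": {
    "short": "LUIS-backed helpdesk bot",
    "full": "Answers service questions from within Microsoft Teams."
  },
  "icons": { "outline": "outline.png", "color": "color.png" },
  "accentColor": "#0078D7",
  "bots": [
    { "botId": "<your-bot-app-guid>", "scopes": [ "team", "personal" ] }
  ]
}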

As MS Teams evolves as group chat software, it can be leveraged to build useful integrations as a front face to many of an organisation's needs, capitalising on Bots as an example.

Using a Bot Framework to build LUIS enabled Bots

History

In this post, we are going to build a bot using the Microsoft Bot Framework and add intelligence to it to extract meaning from conversations with users, utilising the Microsoft cognitive service named LUIS. The last post discussed LUIS in detail, so give it a read before you continue. This post assumes you have a basic understanding of Language Understanding Intelligent Service (LUIS) and the Bot Framework; further details can be read about them at LUIS and Bot Framework.

Pre-requisites

You need to download a few items to start your quick bot development; please get all of them before you jump to the next section.

  • Bot template is available at URL (this will help you in scaffolding your solution)
  • Bot SDK is available at NuGet (this is mandatory to build a Bot)
  • Bot emulator is available at GitHub (this helps you in testing your bot during development)

Building a Bot

  1. Create an empty solution in Visual Studio and add the Bot template project to it.
  2. Your solution directory should look like the one below:

  3. Replace the parameters $safeprojectname$ and $guid1$ with a meaningful name for your project and a unique GUID.
  4. Next step is to restore and update NuGet packages and ensure all dependencies are resolved.

  5. Run the application from Visual Studio and you should see bot application up and running

  6. Now open Bot emulator and connect to your Bot as follows:

  7. Once connected, you can send a test text message to see if Bot is responding

  8. At this point your bot is up and running, and in this step you will add a LUIS dialog to it. Add a new class named RootLuisDialog under the Dialogs folder and add a method for each intent that you have defined in your LUIS app. Ensure you have your LUIS app ID and key to decorate your class (a sketch of such a class appears after this list).

  9. Let's implement a basic response from LUIS for the intent 'boot', as shown in the code below.

  10. Open the emulator and try any utterance we have trained our LUIS application with. A sample bot response should be received, as implemented in the code above; LUIS will identify the intent 'boot' from the user message as shown below.

  11. Next we will implement a slightly more advanced response from LUIS for our intent 'status', as shown in the code below.

  12. Now you can send a more complex message to your bot; it will pass the utterance to LUIS to extract the entity and intent, and respond to the user accordingly as per your implementation.
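For reference, below is a sketch of what the RootLuisDialog class from steps 8–11 might look like using the Bot Builder SDK v3 LuisDialog pattern. The intent names ('boot', 'status') and the entity ('service-request') match the ones used in this series; the IDs and response text are placeholders, so treat this as a starting point rather than the exact code from the screenshots.

using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Builder.Luis;
using Microsoft.Bot.Builder.Luis.Models;

// Placeholder IDs - substitute your LUIS Application ID and subscription key
[LuisModel("<luis-app-id>", "<luis-subscription-key>")]
[Serializable]
public class RootLuisDialog : LuisDialog<object>
{
    // Catch-all for utterances LUIS cannot match to a trained intent
    [LuisIntent("None")]
    public async Task None(IDialogContext context, LuisResult result)
    {
        await context.PostAsync("Sorry, I didn't understand that.");
        context.Wait(MessageReceived);
    }

    // Step 9: a basic response against the 'boot' intent
    [LuisIntent("boot")]
    public async Task Boot(IDialogContext context, LuisResult result)
    {
        await context.PostAsync("Booting the service for you now.");
        context.Wait(MessageReceived);
    }

    // Step 11: a more advanced response against the 'status' intent,
    // pulling a tagged entity out of the utterance if LUIS found one
    [LuisIntent("status")]
    public async Task Status(IDialogContext context, LuisResult result)
    {
        EntityRecommendation entity;
        if (result.TryFindEntity("service-request", out entity))
        {
            await context.PostAsync($"Checking the status of {entity.Entity}...");
        }
        else
        {
            await context.PostAsync("Which service would you like the status of?");
        }
        context.Wait(MessageReceived);
    }
}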

And the list of intent implementations goes on and on; you can customise behaviour as per your needs, as your LUIS model is now ready to rock and roll within your bot and users can take advantage of it to issue specific commands or enquire about entities using your Bot. Happy Botting 🙂

How LUIS can help BOTs in understanding natural language

As bots evolve, you need a mechanism to better understand what a user wants from his/her language and to take actions or respond to user queries appropriately. In these days of increasing automation, bots can certainly help, provided they are backed by tools that understand user language both naturally and contextually.

Azure Cognitive Services has an API that can help identify what a user wants by extracting concepts and entities from a sentence (user input): an intelligent service named Language Understanding Intelligent Service (LUIS). It can process natural language using custom-trained language models and can incorporate Active Learning based on how it was trained.

In this blog post, we will be building a LUIS app that can be utilised in a Bot or any other client application to respond to the user in a more meaningful way.

Create a LUIS app

  1. Go to https://www.luis.ai/ and sign up.
  2. You need to create a LUIS app by clicking ‘New App’ – this is the app you will be using in Bot Framework
  3. Fill out a form and give your app a unique name
  4. Your app will be created, and you can see details as below (page will be redirected to Overview)
  5. You need to create entities to identify concepts, which are a very important part of utterances (input from a user). Let's create a few simple entities using the form below.
  6. You can also reuse prebuilt entities like email, URL, date, etc.
  7. The next step is to build intents, which represent a task or an action derived from an utterance (input from a user). By default, you will have None, which is for utterances irrelevant to your LUIS app.
  8. Once you have defined the series of intents, you need to add possible utterances against each intent, which forms the basis of Active Learning. Make sure to include varied terminology and different phrases to help LUIS identify them. You can build a Phrase list to include words that must be treated similarly, like company names or phone models.
  9. As you write utterances, you need to identify or tag entities, like we selected $service-request in the utterance. Remember: you are identifying possible phrases to help LUIS extract intents and entities from utterances.
  10. The next step is to train your LUIS app to help it identify entities and intents from utterances. Ensure you click Train Application when you are done with enough training (you can also train on a per-entity or per-intent basis).
  11. You can repeat step 10 as many times as you like to ensure the LUIS app is trained well enough on your language model.
  12. Publish the app once you have identified all possible entities, intents, utterances and have trained LUIS well to extract them from user input.
  13. Keep a note of the Programmatic API key from the MyKey section and the Application ID from the Settings menu of your LUIS app; you will need these two keys when integrating LUIS with your client application.
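Before wiring LUIS into a client, you can sanity-check the published app by calling its endpoint directly. Below is a minimal PowerShell sketch; it assumes the v2.0 endpoint format current at the time of writing and a westus deployment, so substitute your own region, Application ID and key.

# Query the published LUIS endpoint and inspect the intent/entities it returns
$appId = "<application-id>"        # from the Settings menu
$key   = "<programmatic-api-key>"  # from the MyKey section
$q     = [uri]::EscapeDataString("check the status of my service request")

$uri = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/$appId" +
       "?subscription-key=$key&verbose=true&q=$q"

Invoke-RestMethod -Uri $uri | ConvertTo-Json -Depth 5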

Now you are ready to go ahead and use your LUIS app in your Bot or any other client application to process natural language in a meaningful manner – Cheers!

Quickly creating and using an Azure Key Vault with PowerShell

Introduction

A couple of weeks back I was messing around with Azure Key Vault, looking to centralise a bunch of credentials for my ever-growing list of Azure Functions that are automating numerous tasks. What I found was that getting an Azure Key Vault set up, and getting credentials in and out of it, was a little more cumbersome than I thought it should be. At that same point a tweet appeared in my timeline via a retweet. I'm not too sure why, but maybe because I've been migrating to VSCode myself, I checked out Axel's project.


Axel's PowerShell module simplifies creating and integrating with Azure Key Vault. After messing with it, and suggesting a couple of enhancements that Axel graciously entertained, I'm creating vaults and adding and removing credentials in the simplified way I'd wanted.

This quickstart guide to using this module will get you started too.

Create an Azure Key Vault

This is one of the beauties of Axel's module: if the Resource Group and/or Storage Account you want associated with your Key Vault doesn't exist, it creates them.

Update the script for the location and the name that will be given to your Storage Account, Resource Group and Vault. Modify it if you want to use different names for each.
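If the embedded script doesn't render for you, here is a rough equivalent using the stock AzureRM cmdlets rather than Axel's simplified module – a sketch only, with placeholder names, reusing one name for the Storage Account, Resource Group and Vault as the post does:

# Sketch: stock AzureRM equivalent of what the module wraps for you
$location = "AustraliaSoutheast"   # your region
$name     = "myuniquevault001"     # reused for SA, RG and Vault; must be globally unique

New-AzureRmResourceGroup -Name $name -Location $location
New-AzureRmStorageAccount -ResourceGroupName $name -Name $name -Location $location -SkuName Standard_LRS
New-AzureRmKeyVault -VaultName $name -ResourceGroupName $name -Location $location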

Done, Key Vault created.


Connect to the Azure Key Vault

This script assumes you're now in a new session and want to connect to the Key Vault. Again, it's a simplified version whereby the Storage Account, Resource Group and Key Vault names are all the same. Update it for your location and Key Vault name.
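Again, as a rough sketch with stock cmdlets (not the module's own connect logic; the single shared name is a placeholder):

# Sketch: authenticate and locate the vault in a fresh session
$name = "myuniquevault001"   # the single name used for the SA, RG and Vault
Login-AzureRmAccount
Get-AzureRmKeyVault -VaultName $name -ResourceGroupName $name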

Connected.


Add a Certificate to the Azure Key Vault

To add a certificate to our new Key Vault, use the module's add-certificate command. It will prompt you for your certificate password and add the cert to the Key Vault.


Certificate added to Key Vault.


Retrieve a Certificate from the Azure Key Vault

To retrieve a certificate from the Key Vault is just as simple.

$VaultCert = Get-AzureCertificate -Name "AADAppCert" -ResourceGroupName $name -StorageAccountName $name -VaultName $name


Add Credentials to the Azure Key Vault

Adding username/password or clientID/clientSecret to the Key Vault is just as easy.

# Store credentials into the Azure Key Vault
Set-AzureCredential -UserName "serviceAccount" -Password ($pwd = Read-Host -AsSecureString) -VaultName $name -StorageAccountName $name -Verbose

Credentials added to the vault.


Retrieve Credentials from the Azure Key Vault

Retrieving credentials is just as easy.

# Get credentials from the Azure Key Vault
$AzVaultCreds = Get-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials retrieved.


Remove Credentials from the Azure Key Vault

Removing credentials is also a simple cmdlet.

# Remove credentials from the Azure Key Vault
Remove-AzureCredential -UserName "serviceAccount" -VaultName $name -StorageAccountName $name -Verbose

Credentials removed.


Summary

Hopefully this gets you started quickly with Azure Key Vault. Credit to Axel for creating the module; it's now part of my toolkit and I'm using it a lot.

Building websites with Ionic Framework, Angular and Microsoft Azure App Services

The Ionic Framework (https://ionicframework.com/) is an Angular 4 based framework designed to build beautiful applications quickly and easily, targeting native platforms as well as Progressive Web Apps (PWAs). In this blog post, I'll walk through the steps to stand up your own Ionic PWA hosted on Azure App Services, which will then serve your application.

What is Microsoft Azure App Services?

Microsoft Azure is a cloud platform that lets you take server workloads you'd previously host locally – in a data centre or on a server somewhere – and run them in an environment where massive scale and availability are available at an hourly rate. This is great for this application, because it only needs to serve static HTML and assets, which is very low on CPU overhead; we'll host it as a free site in Azure and it will be fine. Azure App Services are Platform-as-a-Service web applications: your code, hosted on Azure's infrastructure. You don't worry about the operating system or the web server, only the code you want to write.

What is Angular?

Angular is a browser-based framework that allows complex single page applications to be built and run using a common set of libraries and structure. Angular 4 is the current version of the framework and uses TypeScript as the language you write components in, along with SCSS for CSS files, and HTML.

What is Ionic Framework?

Ionic Framework takes the Angular framework and powers it with Cordova, allowing web application developers to build native apps. This means you have one common language and framework set that everyone can use to develop apps, native or web based. It also recently added support for building applications as PWAs, or Progressive Web Apps, which are web-based applications that behave very similarly to native apps. It's this capability that we will take advantage of in this tutorial.

Prerequisites

You’ll need a Microsoft Azure account to create the Web App. You should have some knowledge of git, typescript and web application development, but most of this we’ll step through. You also need to install node and npm (https://nodejs.org/en/download/) which will enable you to develop the application. You can check that node and npm are working correctly by opening terminal or a command prompt and typing “npm -v” which will show you the current version of npm.

Steps to take

  1. First you need to install the ionic framework and cordova on your machine. You do this by opening a command prompt or terminal window and running: “npm install -g ionic cordova”
  2. Once you've done this, you'll be able to run the command "ionic" and you should see the following:
  3. Once this is done, you will need to create a directory and create your ionic app. From your newly created directory, run "ionic start myApp blank". You'll be asked if you want to integrate your new app with Cordova; this means you would like to make a native application. In this instance we don't (we're creating a PWA), so type "n" and press enter. This will download and install the required code for your application. Wait patiently – this will take a few minutes.
  4. Once you’ve seen the success message, then your app is ready to serve locally. Change directory to “./myApp” and then run “ionic serve” and you should see your browser open with your app running. If you get an error saying “Sorry! ionic serve can only be run in an ionic project directory” you aren’t in the right folder.
  5. Now that your application is built, its ready for you to develop. All you do is go and edit the code in your src folder, and build what you need to. There are great generators and assistants that you can use to structure your app.
  6. At this point we need to ready our code for production – this means we need to minify, AoT-compile and tree-shake any wasted code from the TypeScript, and remove our debug maps to reduce the size of the delivered application. To do this we run "ionic build --prod", which produces our production-ready output.
  7. It's worth noting the "--prod" in the above build. This does the magic of reducing your code size. If you don't do this, the app will be megabytes in size (as you will take all of Angular and Ionic and their dependencies, which you won't need). Try checking the size of the "www" folder using both steps; mine went down from 11.1 MB to 2.96 MB.
  8. Our code is ready to commit to git. In fact, Ionic has already initialised a repository for you; there are only a few other items to check in – so run "git add ." and then git commit -m "initial build" and you'll be all good.
  9. The next step is to create your web app in Azure. Go to portal.azure.com and click Add -> Web App, then enter the details and choose your plan (note you will need to force the "free" plan in the settings).
  10. Once you've deployed, your app will be viewable at https://{yourwebappnamehere}.azurewebsites.net/. In this case https://bradleytest.azurewebsites.net/.
  11. Now it's just time to get our running Ionic code from our local machine (or build server, if you use continuous integration/delivery) to our application. I've got a remote GitHub repository I'm going to push this code to (https://github.com/bsmithb2/ionicdemo), so I'll run "git remote add origin https://github.com/bsmithb2/ionicdemo.git" and then "git push origin master".

Connecting git to Microsoft Azure

In this part, we'll use git to connect to Microsoft Azure and continuously build and deploy using Visual Studio Team Services.

  1. Go to your web application you created in the Microsoft Azure portal, and then choose the “Continuous Delivery (preview)” menu option.
  2. Choose your source control repository (GitHub in my case) in the first stage.
  3. Now select Build, then configure Continuous Delivery. This will set up your build in Visual Studio Team Services. You'll need to select NodeJS as your build type. It will take a few minutes to set up the build and perform the first build and deploy. At this stage your app won't work, but don't worry – we'll fix that next.
  4. Once your build is set up, click on "Build Definition". We need to make a change in the build definition, as the build isn't yet running npm, and the folder you wish to package and deploy is actually the "www" subdirectory.
  5. In the build process, add a new task and choose npm. Change the Command to "Custom" and then add "npm build --prod" to the arguments. This matches the build you did with "ionic build --prod" in step 6.
  6. Once done, click the "Archive files" task, and add "/www" to the Root folder (or files) to archive. This tells VSTS to only package our output directory, which is all we need.
  7. Save the build. You can queue a build now if you like, or wait and queue one once we’ve tweaked the release.
  8. Go to Releases, then choose your release (if you are unsure which, there is a "Release Definition" link in the Azure Portal near the Build Definition one).
  9. Turn off web.config creation in File Transforms & Variable Substitution Options.
  10. Turn on the “Remove Additional Files” setting in Additional Deployment Options.
  11. Save the Release Definition.
  12. At this point, you can trigger a build. It should take a few minutes. You can do this in VSTS, or alternatively change a file in your local git repository and push to GitHub.
  13. Once the build has completed, open your web application again, and you’ll see your Ionic application!


Conclusion

Ionic is a great solution for building cross-platform, responsive, mobile applications. Serving these applications is incredibly easy using Visual Studio Team Services and Microsoft Azure. We get great benefits in separation of concerns, while our scalability, security and cost management processes stay simple, as we've only deployed our consumer-side code to this service, and its secured and managed infrastructure saves us time and risk. In this tutorial, we've built our first Ionic application, pushed it to GitHub and then set up our continuous delivery pipeline in a few easy steps.

Getting started with Azure Cloud Shell

A few weeks back I noticed that I now had the option for the Azure Cloud Shell in the Azure Portal.

What is Azure Cloud Shell?

Essentially, rather than having the Azure CLI installed on your local workstation, you can now initiate it from the Portal, with 5 GB of storage automatically assigned (initiated as part of the setup). So you can now create, manage and delete Azure resources using a centrally hosted CLI session. Each time you start your shell, your home drive will mount, and your profile, scripts and whatever else you've stored in it will be available to you. Nice. Let's do it.

Getting Started

Login to the Azure Portal and click on the Cloud Shell icon.

As this is the first time you’ve accessed it, you will not have any storage associated with your Azure Cloud Shell. You will be prompted for storage information.

Azure Files shares must reside in the same region as the machine they are being mounted to. Cloud Shell machines currently (July 2017) exist in the regions below:

  • Americas: East US, South Central US, West US
  • Europe: North Europe, West Europe
  • Asia Pacific: India Central, Southeast Asia

I hit the Advanced Settings to specify creation of a new Resource Group, Storage Account and File Share.

The UI doesn't check the configuration settings for uniqueness until they are written, so you might need a couple of attempts when naming your storage account. As you can see below, it isn't surprising that my attempt to use azcloudshell as a Storage Account Name was already taken.

Providing unique values for these options let the initial creation go through nicely. I now had a home drive created for my profile and any files I create and store for my sessions.

As for commands you can use with the Azure CLI go have a look here for the full list that you can use to create, manage and delete your Azure resources.

Personally I’m currently doing a lot with Azure Functions. A list of the full range of Azure Functions CLI commands is available here.
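To get a feel for the CLI, here are a few illustrative commands to try from the Cloud Shell prompt (standard Azure CLI 2.0 syntax; adjust to taste):

# Confirm which subscription the shell is signed in to
az account show

# List resource groups and storage accounts in table form
az group list --output table
az storage account list --output table

# List any Function Apps in the subscription
az functionapp list --output table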

The next thing I looked to do was put my scripts etc. into the clouddrive. I just navigated to the new Storage Account that I created as part of this and uploaded via the browser.

Below you can see the file I uploaded on the right which appears in the directory in the middle pane.

Using the Azure CLI I changed directories and could see my uploaded file.

And that is pretty much it. Continue as you would with the CLI, but just now with it all centrally stored. Sweet.

The quickest way to create new VMs in Azure from existing VM snapshots, mostly with PowerShell

Originally posted on Lucian’s blog @clouduccino.com. Follow Lucian on Twitter @LucianFrango.


There are probably multiple ways to do this, both right and wrong, but here's a process that I've been using for a while, recently tweaked to take advantage of the new Azure Managed Disks.

Sidebar – standard managed disk warning

Before I go on, though, I wanted to issue a quick warning about the differences between standard unmanaged and managed disks. Microsoft will be pushing you towards Managed Disks more and more. Yes, it's a great feature that makes the management of VM disks simpler. The key bits of information, though, are as follows:

  • If you provision an unmanaged disk that is 1 TB in size but only use 100 GB, you pay for 100 GB of storage – you're only paying for what you use. [1. Unmanaged disk cost – Azure Documentation ]
  • If you provision a managed disk that is 1 TB in size but only use 10 MB, you pay for the privilege of the whole 1 TB disk. [2. Managed disk cost – Azure Documentation ]
  • Additionally, with Premium disks you pay for what you provision, no matter whether the disk is managed or unmanaged.

That aside, Managed Disks are a pretty good feature that makes disk and storage account management considerably simpler. If you're frugal with your VM allocation and have the processes to manage people and technology correctly, Managed Disks are great.

The Process

tl;dr

  • Create a snapshot in Azure
  • Copy the snapshot from snapshot storage location to Blob storage
  • Create a new VM instance based on the blob.vhd file
    • This blog post outlines the use of managed disks
    • However, mounting direct from Blob can also be done

The actual process

I've gone through this recently and updated it so that it's as streamlined, for me, as possible. Again, this is skewed towards managed disk usage, but it can easily be extended to unmanaged disks as well. Let's begin:

Step 0 ?

If you want to do this to create copies of your VM instances, to scale out your workload, remember to generalise or sysprep your VM instance prior to Step 1. In the example I go into below, my use case was to create a copy of a server from a production environment (VNET and subscription) and move it to a different and separate non-production environment (separate VNET and subscription).

Step 1 – Create a snapshot of your VM disk(s)

The first thing we need to do is power off the virtual machine instance. I've seen snapshots taken while the VM instance is running, but I guess you can call me a little bit more old school, a little bit more on the cautious side when it comes to these sorts of things. I've been bitten by this particular bug in the past, and unpleasant it was, so I'm inclined to err on the side of caution.

Once the VM instance is offline, go to the Azure Portal and search for “Snapshots”. Create a new snapshot.

  • NOTE: snapshots in Azure are done per DISK and not per VM INSTANCE
  • Name the snapshot
  • Select the subscription where the VM instance is located
  • Select the resource group you want to save the snapshot to
    • Or create a new one
  • Select the snapshot location
  • Select the source disk
    • If you earlier selected the same resource group where your VM instance is contained, the disk selection will display the resource group member VM instance disks first in the list
  • Select the storage type- standard or premium for your snapshot
    • I usually just use standard as I’ve not had the need for faster speed premium as yet (that will change one day for sure)
  • Create the snapshot

Once the snapshot is created, complete this quick next step to generate an export access URL (we'll need this in step 2):

  • Select the snapshot
  • From the top menu, select Export
  • You'll be presented with a menu item with a time interval (in seconds)
  • The default is 3600, or 1 hour
  • That is fine, but I like to make it 36000 (add another 0) so that I have a whole day to do this again and again if need be
  • Save the generated URL to notepad for later!
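If you'd rather script that export step, the AzureRM.Compute module has a cmdlet for it. A hedged sketch, with placeholder names:

# Generate a read SAS URI for the snapshot, valid for 10 hours (36000 seconds)
$sas = Grant-AzureRmSnapshotAccess -ResourceGroupName "<resource-group>" `
 -SnapshotName "<snapshot-name>" -Access Read -DurationInSecond 36000

# This is the absolute URI you'll use in Step 2
$sas.AccessSAS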

Step 2 – Copy the snapshot to Blob

The next part relies on PowerShell. Update the following PowerShell script with your parameters to copy the snapshot to Blob storage:

$storageAccountName = "<storage account name>"
$storageAccountKey = "<storage account key>"
$absoluteUri = "https://blahblahblah.blob.core.windows.net/blahblahblah/........"
$destContainer = "<container>"
$blobName = "server.vhd"

$destContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
Start-AzureStorageBlobCopy -AbsoluteUri $absoluteUri -DestContainer $destContainer -DestContext $destContext -DestBlob $blobName

Just for your info, here's a quick explanation of the above:

  • Storage account name = the storage account where you want to store the VHD
  • Storage account key = either the primary or secondary key, used for authentication when accessing the storage account
  • Absolute URI = the snapshot URI we generated at the end of step 1
  • Destination container = where you want to store the VHD; usually this is "vhds", or maybe create one called "snapshots"
  • Blob name = the file name of the VHD itself (remember to only use lowercase)

Step 2.5 – Moving around the blob if need be

Before we actually create a new VM instance based on this snapshot blob, there is an additional option we could take: moving the blob to a different subscription. This is particularly handy when you have a development environment that you want to move to production. Other use cases might be the inverse – making a replica of a production system for development purposes.

The absolute fastest way to do this – I don't like being inefficient here – is with the Azure Storage Explorer (ASE) tool. It's an application that provides a quick GUI for completing storage actions. If you add both storage accounts to ASE, it is as easy as this:

  • Select the blob from storage account A (in subscription A)
  • Select copy from the top menu
  • Go to your second storage account (storage account B, in perhaps subscription B)
  • Go to the relevant container
  • Select paste from the top menu
  • Wait for the blob to copy
  • DONE

It can't get any simpler or faster than that. If you're command-line inclined, you may have a go-to PowerShell cmdlet for this (a sketch follows below), but for me, I've found ASE to be pretty damn quick. It isn't broken, so why fix it.
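For completeness, a hedged PowerShell equivalent of the ASE copy – a server-side copy between the two storage accounts, with placeholder account names and keys:

# Contexts for the source (subscription A) and destination (subscription B) accounts
$srcCtx = New-AzureStorageContext -StorageAccountName "<accountA>" -StorageAccountKey "<keyA>"
$destCtx = New-AzureStorageContext -StorageAccountName "<accountB>" -StorageAccountKey "<keyB>"

# Kick off the asynchronous server-side copy
Start-AzureStorageBlobCopy -SrcContainer "vhds" -SrcBlob "server.vhd" -Context $srcCtx `
 -DestContainer "vhds" -DestBlob "server.vhd" -DestContext $destCtx

# Block until the copy completes and report its state
Get-AzureStorageBlobCopyState -Container "vhds" -Blob "server.vhd" -Context $destCtx -WaitForComplete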

Step 3 – Create a new VM with a managed disk based on the snapshot we’ve put into Azure Blob

The final piece of the puzzle, as the cliché goes, is to create a new virtual machine instance. Again, as the wonderfully elusive and vague title of this blog post states, we'll use PowerShell to do this. Sure, ARM templates would work, and likely the Azure Portal can get you pretty far as well; however, I like to be efficient and I've found that the following PowerShell script does this best.

Additionally, you can change this up to mount the VHD from Blob instead of creating a new managed disk. So, for the purpose of creating a new machine, PowerShell is as flexible as it is fast and convenient.

Here’s the script you’ll need to create the new VM instance:

#Prepare the VM parameters 
$rgName = "<resource-group-name>"
$location = "australiaEast"
$vnet = "<virtual-network>"
$subnet = "/subscriptions/xxxxxxxxx/resourceGroups/<resource-group-name>/providers/Microsoft.Network/virtualNetworks/<virtual-network>/subnets/<subnet>"
$nicName = "VM01-Nic-01"
$vmName = "VM01"
$osDiskName = "VM01-OSDisk"
$osDiskUri = "https://<storage-account>.blob.core.windows.net/<container>/server.vhd"
$VMSize = "Standard_A1"
$storageAccountType = "StandardLRS"
$IPaddress = "10.10.10.10"

#Create the VM resources
$IPconfig = New-AzureRmNetworkInterfaceIpConfig -Name "IPConfig1" -PrivateIpAddressVersion IPv4 -PrivateIpAddress $IPaddress -SubnetId $subnet
$nic = New-AzureRmNetworkInterface -Name $nicName -ResourceGroupName $rgName -Location $location -IpConfiguration $IPconfig
$vmConfig = New-AzureRmVMConfig -VMName $vmName -VMSize $VMSize
$vm = Add-AzureRmVMNetworkInterface -VM $vmConfig -Id $nic.Id

$osDisk = New-AzureRmDisk -DiskName $osDiskName -Disk (New-AzureRmDiskConfig -AccountType $storageAccountType -Location $location -CreateOption Import -SourceUri $osDiskUri) -ResourceGroupName $rgName
$vm = Set-AzureRmVMOSDisk -VM $vm -ManagedDiskId $osDisk.Id -StorageAccountType $storageAccountType -DiskSizeInGB 128 -CreateOption Attach -Windows
$vm = Set-AzureRmVMBootDiagnostics -VM $vm -disable

#Create the new VM
New-AzureRmVM -ResourceGroupName $rgName -Location $location -VM $vm

Again, let me explain the parameters we've set at the start of the script:

  • $rgName = the resource group where you want to deploy the VM instance
  • $location = the Azure region
  • $vnet = the virtual network where you want to deploy the VM instance
  • $subnet = the subnet where you want to deploy the VM instance
  • $nicName = the name of the NIC of the server
  • $vmName = the name of the VM instance, the server name
  • $osDiskName = the OS disk name
  • $osDiskUri = the direct URI/URL to the VHD in your storage account
  • $VMSize = the VM size or the service plan for the VM
  • $storageAccountType = what type of storage you would like to have
  • $IPaddress = the static IP address of the server, as I like to set this in Azure rather than use dynamic IPs

And that is pretty much that!

Conclusion

It's Friday in Sydney. It's the pre-kend, and it's a gloomy, cold 9th day of Winter 2017. I hope that the above content helps you out of a jam or gives you the insight you need to run through this process quickly and efficiently. That feeling of giving back, of helping – that's the feeling that should warm me up and get me to lunch time! Counting down!

Cheers!

How to build and deploy an Azure NodeJS WebApp using Visual Studio Code

Introduction

This week I had the need to build a small web application with a reasonably simple front end that will later be integrated inside a portal. The web application isn't going to see high use and didn't necessitate deployment of infrastructure (VMs). I'd messed with NodeJS a while back in this post, where I configured a UI for Microsoft Identity Manager and Azure based functions.

In the back of my mind I knew I didn't want to go for a full Visual Studio Project solution for this, and with the recent updates to Visual Studio Code I figured it must be possible to do it there. There wasn't much around on doing it, so I dived in and worked it out for myself. Here I share the end-to-end process to make it easy for you to get started.

Overview

What you will need on your development workstation before you start are the following components: Visual Studio Code, NodeJS and Git. Download and install them on your development machine.

You will also need an Azure Subscription to where you will publish your NodeJS site.

This post details setting up Visual Studio Code to build a shell NodeJS site and deploy it to Azure using a local GIT Repository. Let’s get started.

Visual Studio Code Extensions

A really smart and handy extension for VS Code is Azure Tools for VS Code. Released a few months ago (January 2017), this extension allows you to quickly create a Web App (Resource Group, App Service, App Service Plan etc.) from within VS Code. With VS Code on your development machine (from the prerequisites above), click on the Extensions icon (bottom left) in the VS Code menu, type Azure Tools, and click the green Install button.


Creating the NodeJS Site in VS Code

I had a couple of attempts at this before I found a quick, neat and repeatable method of getting started. In order to get the Web App deployed and accessible correctly in Azure, I found it easiest to use the sample Azure NodeJS Hello World example from here. Download that sample and extract the contents to a new folder on your workstation. I created a new path on mine named …\NodeJS\nodejssite and dropped the sample in there so it looked like below.

After creating the folder structure and putting the sample in it, while in that sub-directory type:

code .

That will start up Visual Studio Code in the newly created folder with the starter sample.

Install Express for NodeJS

To that base sample site we’ll install Express. From the Terminal tab in the lower pane type:

npm install -g express-generator

Express App

With Express now on our machine, let's add the Express app to our new NodeJS site. Type express in the Terminal window.

express

Accept that the directory is not empty

This will create the folder structure for Express.

Now, to get all the files and modules for our site configured for our app, run npm install.

Now type npm start in the terminal window to start our new site.

The NodeJS site will start. Open a Web Browser and go to http://localhost:3000 and you should see the Express empty site.

Navigate to views => index.jade and update the text like I have below.

Refresh your browser window and you should see the text updated.

In the terminal window press Ctrl + C to stop NodeJS.

Test Deploy to Azure

Now let’s do a test deploy of our shell site as an Azure WebApp.

Press Ctrl + Shift + P or, from the View menu, select Command Palette.

Type Azure: Login 

This will generate a code and give you a link to open in your browser and login

Paste in the code from the clipboard and select continue

Then login with your account for the tenant where you want to deploy the WebApp. You'll then be authorized.

From the Command Palette type azure sub and choose Azure: List Azure Subscriptions and choose the subscription where you will create and deploy the WebApp

Now from the Command Palette type Azure Create a Web App (Simple).

Give the WebApp a name. This will become the WebApp name, and the basis for all the associated WebApp components. Use Create a Web App (Advanced) if you want to be more specific about the names of the app resources.

If you watch the bottom VS Code Status bar you will see the Azure Tools extension create the new Resource Group, Web App and Web App Plan.

Login to the Azure Portal, select the new Web App.

Select Deployment Options and then Local Git Repository. Select OK.

Select Deployment credentials and provide a username and password. You’ll need this shortly to publish your site.

Click Overview. Copy the Git clone url.

Back in VS Code, select the GIT icon (under the magnifying glass) and from the top choose Initialize Repository.

Then, in the terminal window, type git remote add azure <git clone url>, using the URL obtained in the step above.

Type Initial Commit as the message and click the tick icon in the Source Control menu bar.

Select the … (More Actions) menu and select Publish.

Select azure as the remote target we setup earlier.

You’ll be prompted to authenticate. Use the account you created above in Deployment Credentials.

Back in the Azure Portal under the Web App under Deployment Options you will see the initial commit.

Click on Overview and you should see that it is running. Click on URL and the site will open in a new tab in your browser.

Updating our WebApp

Now, let’s make a change to our WebApp.

Back in VS Code, click on the Explorer icon in the top left corner, navigate to views => index.jade and update the title. Hit Ctrl + S (or select Save from the File menu). In the Terminal below, type npm start to start our NodeJS site locally.

Check out the update locally. In a browser navigate to the local NodeJS site on localhost:3000. You’ll see the changed page.

Select the Git icon on the left menu, give the update some text e.g. ‘updated page text’ and select the tick from the top menu.

Select the … (More Actions) menu and choose Push to publish the changes to our Azure WebApp.

Go back to your browser, which was on the Azure WebApp URL, and reload. Our change has been pushed and is reflected in the WebApp.

Summary

Very quickly and easily, using Visual Studio Code (with NodeJS and Git installed locally), we have:

  • Created an Azure WebApp
  • Created a base NodeJS site
  • Got a local NodeJS site we can develop
  • Published it to Azure

Now go create something awesome.

How to create and auto update route tables in Azure for your local Azure datacentre with Azure Automation, bypassing firewall appliances

Originally posted on Lucians blog, at clouduccino.com. Follow Lucian on Twitter @LucianFrango.


When deploying an "edge" or "perimeter" network in Azure, by way of a peered edge VNET or an edge subnet, you'll likely want to deploy virtual firewall appliances of some kind to manage and control that ingress and egress traffic. This comes at a cost, though: Azure services are generally accessed via public IP addresses or hosts, even from within Azure. The most common of those, and one that has come up recently, is Azure Blob storage.

If you have ExpressRoute, you can get around this by implementing Public Peering. This essentially sends all traffic destined for Azure services to your ExpressRoute gateway. A bottleneck? Perhaps.

The problem in detail

Recently I ran into a roadblock at a customer's site around the use of Blob storage. I designed an edge network that met certain traffic monitoring requirements. Azure NSGs were not able to meet all the requirements, so something considerably more complex and time consuming was implemented. It's IT – isn't that what always happens, you may ask?

Here are some reference blog posts:

Getting Azure 99.95% SLA for Cisco FTD virtual appliances in Azure via availability sets and ARM templates

Lessons learned from deploying Cisco Firepower Threat Defence firewall virtual appliances in Azure, a brain dump

We deployed Cisco Firepower Threat Defence virtual appliance firewalls in an edge VNET. Our subnets had route tables with a default route of 0.0.0.0/0 directed to the "VirtualAppliance" tag, so all traffic to a host or network not known by Azure is directed to the firewall(s). How that can be achieved is another blog post.

When implementing this solution, Azure services that are accessed via an external or public IP address or host – most commonly Blob storage, which is accessed via blah.blob.core.windows.net – additionally get directed to the Cisco FTDs. Not a big problem: create some rules to allow the traffic flow and we're cooking with gas.

Not exactly, as the FTDv has a NIC with a throughput of 2 GiB per second. That's plenty fast, but when you have a lot of workloads, a lot of user traffic and a lot of writes to Blob storage, bottlenecks can occur.

The solution

As I mentioned earlier, this can be tackled quickly through a number of methods. The methods discarded in this situation are as follows:

  • Implement ExpressRoute
    • Through ExpressRoute enable public peering
    • All traffic to Azure infrastructure is directed to the gateway
    • This is a “single device” that I have heard whispers is a virtual Cisco appliance similar to a common enterprise router
    • Quick to implement, and in most cases the throughput bottleneck isn't reached and you're fine
  • Implement firewall rules
    • Allow traffic to Microsoft IP ranges
    • Manually enter those IP ranges into the firewall
      • These are subject to change, so a maintenance or managed services runbook should be implemented to update them on a regular basis
    • Or enable URL filtering and basically allow traffic to certain URIs or URLs

Like I said, both above will work.

The solution in this blog post is a little more technical but does away with the above. Rather than any manual work, let's automate this process through Azure Automation. Thankfully, this isn't something new, but it isn't something that is used often either. Using pre-configured Azure Automation modules and PowerShell scripts, a scheduled weekly or monthly (or whatever you like) runbook downloads the publicly available Microsoft .xml file that lists all subnets and IP addresses used in Azure, then uses that file to update a route table (which the script creates) with routes to the Azure subnets and IPs in a specified region.

This process does away with any manual intervention and works to the ethos “work smarter, not harder”. I like that very much, and no, that is not being lazy. It’s being smart.

The high five

I'm certainly not trying to take the credit for this, except for the minor tweak to the runbook code, so cheers to James Bannan (@JamesBannan) who wrote this great blog post (available here) on the solution. He's put together the PowerShell script that expands on a PowerShell module written by Kieran Jacobson (@kjacobsen). Check out their Twitter feeds, their blogs and all-round awesome content. Thank you and high five to both!

The process

I'm going to speed through this as it's really straightforward and nothing too complicated. The only tricky part is the order of operations. Stick to the order and you're guaranteed to succeed:

  • Create a new automation user account
  • Create a new runbook
    • Quick create a new Powershell runbook
  • Go back to the automation account
  • Update some config:
    • Update the modules in the automation account – do this FIRST, as there are dependencies on up-to-date modules (specifically, AzureRM.Network depends on the AzureRM.Profile module)
    • By default you have these runbook modules:

    • Go to Browse Gallery
    • Select the following two modules, one at a time, and add to the automation user account
      • AzureRM.Network
      • AzurePublicIPAddress
        • This is the module created by Kieran Jacobson 
    • Once all are in, for the purposes of being borderline OCD, select Update Azure Modules
      • This should update all modules to the latest version, in case some are lagging a little behind
    • Let's create some variables
      • Select Variables from the menu blade in the automation user account
      • The script will need the following variables for YOUR ENVIRONMENT
        • azureDatacenterRegions
        • VirtualNetworkName
        • VirtualNetworkRGLocation
        • VirtualNetworkRGName
      • For my sample script, I have resources in the Australia, AustraliaEast region
      • Enter in the variables that apply to you here (your RGs, VNET etc)
  • Let's add the PowerShell to the runbook
    • Select the runbook
    • Select EDIT from the properties of the runbook (top horizontal menu)
    • Enter in the following Powershell:
      • this is my slightly modified version
$VerbosePreference = 'Continue'

### Authenticate with Azure Automation account

$cred = "AzureRunAsConnection"
try
{
 # Get the connection "AzureRunAsConnection "
 $servicePrincipalConnection=Get-AutomationConnection -Name $cred

"Logging in to Azure..."
 Add-AzureRmAccount `
 -ServicePrincipal `
 -TenantId $servicePrincipalConnection.TenantId `
 -ApplicationId $servicePrincipalConnection.ApplicationId `
 -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint 
}
catch {
 if (!$servicePrincipalConnection)
 {
 $ErrorMessage = "Connection $cred not found."
 throw $ErrorMessage
 } else{
 Write-Error -Message $_.Exception
 throw $_.Exception
 }
}

### Populate script variables from Azure Automation assets

$resourceGroupName = Get-AutomationVariable -Name 'virtualNetworkRGName'
$resourceLocation = Get-AutomationVariable -Name 'virtualNetworkRGLocation'
$vNetName = Get-AutomationVariable -Name 'virtualNetworkName'
$azureRegion = Get-AutomationVariable -Name 'azureDatacenterRegions'
$azureRegionSearch = '*' + $azureRegion + '*'

[array]$locations = Get-AzureRmLocation | Where-Object {$_.Location -like $azureRegionSearch}

### Retrieve the nominated virtual network and subnets (excluding the gateway subnet)

$vNet = Get-AzureRmVirtualNetwork `
 -ResourceGroupName $resourceGroupName `
 -Name $vNetName

[array]$subnets = $vnet.Subnets | Where-Object {$_.Name -ne 'GatewaySubnet'} | Select-Object Name

### Create and populate a new array with the IP ranges of each datacenter in the specified location

$ipRanges = @()

foreach($location in $locations){
 $ipRanges += Get-MicrosoftAzureDatacenterIPRange -AzureRegion $location.DisplayName
}

$ipRanges = $ipRanges | Sort-Object

### Iterate through each subnet in the virtual network
foreach($subnet in $subnets){

$RouteTableName = $subnet.Name + '-RouteTable'

$vNet = Get-AzureRmVirtualNetwork `
 -ResourceGroupName $resourceGroupName `
 -Name $vNetName

### Create a new route table if one does not already exist
 if ((Get-AzureRmRouteTable -Name $RouteTableName -ResourceGroupName $resourceGroupName) -eq $null){
 $RouteTable = New-AzureRmRouteTable `
 -Name $RouteTableName `
 -ResourceGroupName $resourceGroupName `
 -Location $resourceLocation
 }

### If the route table exists, save as a variable and remove all routing configurations
 else {
 $RouteTable = Get-AzureRmRouteTable `
 -Name $RouteTableName `
 -ResourceGroupName $resourceGroupName
 $routeConfigs = Get-AzureRmRouteConfig -RouteTable $RouteTable
 foreach($config in $routeConfigs){
 Remove-AzureRmRouteConfig -RouteTable $RouteTable -Name $config.Name | Out-Null
 }
 }

### Create a routing configuration for each IP range and give each a descriptive name
 foreach($ipRange in $ipRanges){
 $routeName = ($ipRange.Region.Replace(' ','').ToLower()) + '-' + $ipRange.Subnet.Replace('/','-')
 Add-AzureRmRouteConfig `
 -Name $routeName `
 -AddressPrefix $ipRange.Subnet `
 -NextHopType Internet `
 -RouteTable $RouteTable | Out-Null
 }

### Add default route for Edge Firewalls
 Add-AzureRmRouteConfig `
 -Name 'DefaultRoute' `
 -AddressPrefix 0.0.0.0/0 `
 -NextHopType VirtualAppliance `
 -NextHopIpAddress 10.10.10.10 `
 -RouteTable $RouteTable
 
### Include a routing configuration to give direct access to Microsoft's KMS servers for Windows activation
 Add-AzureRmRouteConfig `
 -Name 'AzureKMS' `
 -AddressPrefix 23.102.135.246/32 `
 -NextHopType Internet `
 -RouteTable $RouteTable

### Apply the route table to the subnet
 Set-AzureRmRouteTable -RouteTable $RouteTable

$forcedTunnelVNet = $vNet.Subnets | Where-Object Name -eq $subnet.Name
 $forcedTunnelVNet.RouteTable = $RouteTable

### Update the virtual network with the new subnet configuration
 Set-AzureRmVirtualNetwork -VirtualNetwork $vnet -Verbose

}

How is this different from James’s?

I’ve made two changes to the original script. These changes are as follows:

I changed the authentication to use an Azure Automation Run As account. This streamlined the deployment process so I could reuse the script across a number of subscriptions. This change was the following:

$cred = "AzureRunAsConnection"
try
{
 # Get the connection "AzureRunAsConnection "
 $servicePrincipalConnection=Get-AutomationConnection -Name $cred

"Logging in to Azure..."
 Add-AzureRmAccount `
 -ServicePrincipal `
 -TenantId $servicePrincipalConnection.TenantId `
 -ApplicationId $servicePrincipalConnection.ApplicationId `
 -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint 
}
catch {
 if (!$servicePrincipalConnection)
 {
 $ErrorMessage = "Connection $cred not found."
 throw $ErrorMessage
 } else{
 Write-Error -Message $_.Exception
 throw $_.Exception
 }
}

Secondly, I added an additional static route: the default route (0.0.0.0/0), which is used to forward traffic to our edge firewalls. This change was the following:

### Add default route for Edge Firewalls
 Add-AzureRmRouteConfig `
 -Name 'DefaultRoute' `
 -AddressPrefix 0.0.0.0/0 `
 -NextHopType VirtualAppliance `
 -NextHopIpAddress 10.10.10.10 `
 -RouteTable $RouteTable

You can re-use this section to add further custom static routes, as in the example below.
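For example, a second static route pointing an on-premises range at the same appliance might look like this (the route name, prefix and next-hop IP are placeholders):

### Example: an additional custom static route
 Add-AzureRmRouteConfig `
 -Name 'OnPremisesRange' `
 -AddressPrefix 192.168.0.0/24 `
 -NextHopType VirtualAppliance `
 -NextHopIpAddress 10.10.10.10 `
 -RouteTable $RouteTable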

Tying it all together

  • Hit the SAVE button and job done
    • Well, almost…
  • Next you should test the script
    • Select the TEST PANE from the top horizontal menu
    • A word of warning- this will go off and create the route tables and associate them with the subnets in your selected VNET!!!
    • Without testing though, you can’t confirm it works correctly
    • Should the test work out nicely, hit the Publish button in the top menu
    • This gets the runbook ready to be used
  • Now go off and create a schedule
    • Azure public IP addresses can update often
    • It’s best to be ahead of the game and keep your route tables up to date
    • A regular schedule is recommended – I do once a week as the script only takes about 10-15min to run 
    • From the runbook top horizontal menu, select schedule
    • Create the schedule as desired
    • Save
  • JOB DONE! No really, that's it!

Enjoy!

Best,

Lucian