Accessing the Camera on Mobile Devices from an ASP.NET Core Application in Vue.js and TypeScript

In the previous post, we built an ASP.NET Core application using Vue.js and TypeScript. As a working example, we’re building a mobile web application. Many modern web browsers supporting HTML5 can access multimedia devices on users’ computers, smartphones or tablets, such as the camera and microphone. The Navigator.getUserMedia() API enables us to access those resources. In this post, we’re actually going to implement camera access on our computers and mobile devices by writing code in VueJs and TypeScript.

The code samples used for this post can be found here.

getUserMedia() API

Most modern web browsers support the getUserMedia() API, as long as they support HTML5. There are two different APIs around this method: one is Navigator.getUserMedia(), which uses callback functions, while the other, MediaDevices.getUserMedia(), came later and returns a Promise so that we can avoid callback hell. However, not all browsers support MediaDevices.getUserMedia(), so we need to support both anyway. For more details around getUserMedia(), we can find some practical samples in this MDN document.
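
As a rough sketch in TypeScript (function and variable names here are illustrative, not from the sample code), supporting both API shapes behind a single Promise might look like this:

const constraints: MediaStreamConstraints = { video: true, audio: false };

// Wraps both API shapes so callers can stay Promise-based.
function getCameraStream(): Promise<MediaStream> {
  // Newer, Promise-based API
  if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
    return navigator.mediaDevices.getUserMedia(constraints);
  }

  // Older, callback-based API (vendor-prefixed in some browsers)
  const legacyGetUserMedia =
    (navigator as any).getUserMedia ||
    (navigator as any).webkitGetUserMedia ||
    (navigator as any).mozGetUserMedia;

  if (!legacyGetUserMedia) {
    return Promise.reject(new Error('getUserMedia is not supported in this browser'));
  }

  return new Promise<MediaStream>((resolve, reject) =>
    legacyGetUserMedia.call(navigator, constraints, resolve, reject));
}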

Prerequisites

  • ASP.NET Core application from the previous post
  • Computer, tablet or smartphone having camera

NOTE 1: This post uses VueJs 2.2.1 and TypeScript 2.2.1. VueJs 2.2.1 introduced some breaking changes in how it interacts with TypeScript. Please have a look at the official guide document.

NOTE 2: vue-webcam written by @smronju was referenced for camera access, and modified to fit in the TypeScript format.

Update Hello.vue

We need a placeholder for camera access and video streaming. Add the following HTML code into the template section of Hello.vue.
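
As a sketch, the template section might look like the following; the bound field names (source, photo, width, height, autoplay) are assumptions that match the Hello.ts fields described below.

<template>
  <div class="hello">
    <!-- camera preview: src, width, height and autoplay are bound to Hello.ts; ref lets the component find this tag -->
    <video ref="video" :src="source" :width="width" :height="height" :autoplay="autoplay"></video>

    <!-- snapshot rendered from the camera input -->
    <img :src="photo" :width="width" :height="height" />

    <!-- mouse click or finger tap invokes takePhoto -->
    <button v-on:click="takePhoto">Take Photo</button>
  </div>
</template>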

  • video accepts the camera input. src, width, height and autoplay are bound to the component in Hello.ts. We also add the ref attribute so that the component can find the video tag.
  • img is where the camera input is rendered. The photo field is used for data binding.
  • button raises the mouse click or finger tap event by invoking the takePhoto function.

The HTML bits are done. Let’s move on to the TypeScript part.

Update Hello.ts

The existing Hello.ts was simple, but this time it has grown to handle the camera API. Here are the bits:
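
Here is a sketch of what the grown-up Hello.ts might look like, loosely adapted from vue-webcam; the field names, defaults and object-URL handling are assumptions rather than the exact original.

import Vue from 'vue'
import Component from 'vue-class-component'

@Component({
  name: 'hello'
})
export default class Hello extends Vue {
  // two-way bound fields with sensible defaults
  source: string = ''   // video src, set to the camera stream's object URL
  photo: string = ''    // img src, set to the snapshot data URL
  width: number = 320
  height: number = 240
  autoplay: boolean = true

  takePhoto(): void {
    // draw the current video frame onto a detached canvas, then hand it to the img tag
    const video = (this.$refs as any).video as HTMLVideoElement
    const canvas = document.createElement('canvas')
    canvas.width = this.width
    canvas.height = this.height
    const context = canvas.getContext('2d')
    if (context) {
      context.drawImage(video, 0, 0, this.width, this.height)
      this.photo = canvas.toDataURL('image/png')
    }
  }

  mounted(): void {
    const nav = navigator as any
    const constraints = { video: true, audio: false }
    const onStream = (stream: any) => { this.source = window.URL.createObjectURL(stream) }
    const onError = (err: any) => console.error(err)

    if (nav.mediaDevices && nav.mediaDevices.getUserMedia) {
      // Promise-based API
      nav.mediaDevices.getUserMedia(constraints).then(onStream).catch(onError)
    } else {
      // callback-based fallback (vendor-prefixed on some browsers)
      const getUserMedia = nav.getUserMedia || nav.webkitGetUserMedia || nav.mozGetUserMedia
      if (getUserMedia) {
        getUserMedia.call(navigator, constraints, onStream, onError)
      }
    }
  }
}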

We can see many extra data fields for two-way data binding between user input and the application. Some of them come with default values so that we don’t have to worry about their initialisation too much.

  • The takePhoto() function creates a virtual canvas element, converts the current frame of the video into an image, and sends it to the img tag to display the snapshot.

  • The mounted() event function is invoked when this component, Hello.ts, is mounted to its parent. It uses the getUserMedia() API to bind the streaming source to the video tag.
  • The video tag, accessed through this.$refs.video, is the HTML element that has the ref attribute in Hello.vue. Without the ref attribute, VueJs has no way to reach the tag.

NOTE: The original type of the this.$refs instance is { [key: string]: Vue | Element | Vue[] | Element[] }, but we cast it to any. This is to avoid a build failure due to the linting error caused by using the original type while accessing the resource as this.$refs.video. If we don’t want to cast it to any, we can use this.$refs["video"] instead.

We’ve now completed the coding part. Let’s build it, run it on a local IIS Express, and access the web app through http://localhost:port. It works fine.

This time, instead of localhost, use the IP address. If we want to access our local dev website remotely, this post will help.

The browser says we can’t use the camera because the access is insecure. In order to use the getUserMedia() API, we should use an HTTPS connection to prevent private data exposure. This restriction only applies when we’re using Google Chrome, not Firefox or Edge. So, just change the connection to HTTPS.

Now we can use the IP address for camera access. Once we allow it, we can immediately see our face directly on the web like below (yeah, it’s me! lol).

Let’s try this from our mobile devices. The first one is taken from an Android phone, followed by one taken from a Windows Phone, then the ones from an iPhone. Thanks, Boris, for helping take those pictures!

Errr… what happened on the iPhone? The camera is not accessible from either Safari for iOS or Chrome for iOS!!

This is because not all mobile web browsers support the getUserMedia() API.

getUserMedia Browser Compatibility

Here’s the data sheet from http://mobilehtml5.org/.

Unfortunately, we can’t use the getUserMedia API on iOS for now. For iOS users, we have to provide alternative methods for their user experience. There’s another API called HTML Media Capture that is supported by all mobile web browsers. It uses the traditional input type="file" tag. With this, we can access the camera on our mobile devices.

In the next post, we’re going to figure out how to provide a fallback option, if getUserMedia() API is not available.

It’s time to get your head out of the clouds!


For those of you who know me, you are probably thinking “Why on earth would we want to get our heads out of the ‘cloud’ when all you’ve been telling me for years is the need to adopt cloud!”

This is true for the most part, but my point here is that many businesses are being flooded by service providers in every direction to adopt or subscribe to their “cloud” based offerings; furthermore, ICT budgets are being squeezed, forcing organisations into SaaS applications. The question that is often forgotten is the how. What do you need to access any of these services? And the answer is: an identity!

It is for this reason I make the title statement. For many organisations, preparing to utilise a cloud offering can require months, if not years, of planning and implementing organisational changes to better enable the ICT teams to offer such services to the business. Historically, “enterprises” used local datacentres and infrastructure to provide all business services, where everything was centrally managed and controlled locally. Controlling access to applications was as easy as adding a user to a group, for example. But to utilise any cloud service you need to take a step back, get your head back down to ground zero, and look at whether your environment is suitably ready. Do you have an automated, simple method of providing seamless access to these services? Do you need to invest in getting your business “Cloud Ready”?

This is where identity comes front and centre, and it’s really the first time in a long while that identity management has become a centrepiece of the way a business can potentially operate. This identity provides your business with the “how” factor in a more secure model.

So, if you have an existing identity management platform, great; but that’s not to say it’s ready to consume any of these offerings, it just says you have the potential means. For those who don’t have something in place: welcome! Now, I’m not going to tell you which products you should or shouldn’t use; that is dictated by your requirements, and there is no “one size fits all” platform out there. In fact, some businesses may need a multitude of applications to deliver on their requirements. But one thing every business needs is a strategy: a roadmap on how to get to that elusive finish line!

A strategy is not just about the technical changes that need to be made, but also about the organisational changes, processes and policies that will inevitably change as a consequence of adopting a cloud model. One example is the service management model for consumption-based services: the way you manage services provided on a consumption basis is not the same as a typical managed service model.

And these strategies start from the ground up; they work with the business to determine its requirements and use cases. Those two aspects alone can make or break an IAM service. If you do not have a thorough and complete understanding of the requirements of your business, then whatever solution is built will never succeed.

So now you may be asking: what next? What do we as an organisation need to do to get ourselves ready for adopting cloud? All in all, there are basically five principles you need to undertake to set the foundation of being “Cloud Ready”:

  1. Understand your requirements
  2. Know your source/s of truth (there could be more than one)
  3. Determine your central repository (Metaverse/Vault)
  4. Know your targets
  5. Know your universal authentication model

Now, I’m not going to cover all of these in this one post, otherwise you would be reading a novel; I will touch on each of them separately. I would point out that you have probably already read about many of these: they are common discussion points online, and the top two topics are discussed extensively by my peers. The point of difference I will be making is around the cloud, determining the best approach to these discussions so that when you do have your head in the clouds, you’re not going to get any unexpected surprises.

Remote Access to Local ASP.NET Core Applications from Mobile Devices

One of the most popular tools for ASP.NET or ASP.NET Core application development is IIS Express. We can’t deny it. Unless we have specific requirements, IIS Express is the de-facto web server for debugging on developers’ local machines. With IIS Express, we can easily access our local web applications with no problem while debugging.

There are, however, always cases where we need to access our locally running website from other devices, such as mobile web browsers. As we can see in the picture above, localhost is the loopback address, so we can’t use it outside our dev box. Simply replacing the loopback address with a physical IP address doesn’t work either; we need to adjust our dev box to allow this traffic. In this post, we’re going to solve this issue by looking at two different approaches.

At the time of writing this post, we’re using Visual Studio (VS) 2015, as VS 2017 will be launched on March 7, 2017.

Network Sharing Options and Windows Firewall

Note that all screenshots in this section are taken from Windows 10. Open the currently connected network (either wireless or wired).

Make sure that the “Make this PC discoverable” option is turned on.

This option enables our network in “Private” mode on Windows Firewall:

WARNING!!!: If our PC is currently connected to a public network, for better security we need to turn off the private network settings; otherwise our PC will be vulnerable to malicious attacks.

Update Windows Firewall Settings

In this example, the locally running web app uses port number 7314, so we need to register a new inbound firewall rule to allow access through that port. Open “Windows Firewall with Advanced Security” through Control Panel and create a new rule with the options below:

  • Rule Type: Port
  • Protocol: TCP
  • Port Number: 7314
  • Action: Allow the Connection
  • Profile: Private (Domain can also be selected if our PC is bound with domain controllers)
  • Name: Any self-descriptive name, e.g. IIS Express Port Opener
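
If you prefer the command line, an equivalent rule can be created from an elevated PowerShell prompt (the rule name below is only an example):

New-NetFirewallRule -DisplayName "IIS Express Port Opener" -Direction Inbound -Protocol TCP -LocalPort 7314 -Action Allow -Profile Private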

All traffic through this port number is now allowed. So far, we’ve completed the basic environment settings, including the firewall. Let’s move on to the first option, which uses IIS Express itself.

1. Updating IIS Express Configurations Directly

When we install VS, IIS Express is installed at the same time. Its default configuration file lives under the Documents\IISExpress\config folder, but each solution created by VS 2015 has its own settings that override the default, stored in the solution’s hidden .vs folder like:

Open applicationhost.config for update.

Add another binding with the local IP address like:
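
Inside the <site> element for this project, the bindings section ends up looking something like this (the existing localhost binding stays, and we add one for the IP address):

<bindings>
  <!-- existing binding created by Visual Studio -->
  <binding protocol="http" bindingInformation="*:7314:localhost" />
  <!-- additional binding for the local IP address -->
  <binding protocol="http" bindingInformation="*:7314:192.168.1.3" />
</bindings>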

We can easily find our local IP address by running the ipconfig command. We’re using 192.168.1.3 for now.

IIS Express has now been set up. Let’s use a mobile web browser to access the local dev website by IP address.

All good! It seems to be working now. However, if we have more web applications running in our dev environment, every time we create a new web application project we have to register the port number allocated by IIS Express with Windows Firewall. No good. Too repetitive. Is there a more convenient way? Of course there is.

2. Conveyor – Visual Studio Extension

Conveyor can sort out this hassle. At the time of this writing, its version is 1.3.2. After installing this extension, run the debugging mode by pressing the F5 key again and we will see a new window like:

The Remote URL is what we’re going to use. In general, the IP address will look like 192.168.xxx.xxx if we’re on a small network (home, for example), or a different type of IP address if we’re on a corporate network. This is the IP address that the mobile devices use. Another important point is that Conveyor uses port numbers starting from 45455. Whatever port number IIS Express assigns to the web application project, Conveyor forwards it to 45455. If 45455 is already taken, it moves up one by one until a free port number is found. Due to this behaviour, we can easily predict the port number range, instead of relying on the random nature of IIS Express. Therefore, we can register the port number range starting from 45455 up to whatever we want, 45500 for example.

Now, we can access our local dev website using this port number pool like:

Developing a web application over an HTTPS connection isn’t an issue either. If no self-signed certificate is installed on our local dev machine, Conveyor will install one, and that’s it. Visiting the website again through the HTTPS connection will display the initial certificate warning message and finally load the page.

We’ve discussed how to remotely access our local dev website using either IIS Express configuration or Conveyor. Conveyor gets rid of the repetitive firewall registration, so it’s worth installing for our web app development.

Writing Vue.js Applications in TypeScript on ASP.NET Core

In the previous post, we briefly walked through how to build a Vue.js application on ASP.NET Core. Like other modern JavaScript frameworks, VueJs supports TypeScript out-of-the-box. If we can get the full benefit of TypeScript when building a VueJs app, it would be awesome! There are many resources referring to the combination of VueJs and TypeScript. However, they don’t use the basic template that VueJs provides, which gives less confidence to developers who have just started using VueJs. Even worse, due to the recent upgrade of webpack to 2.x, we might need a new tutorial to build a VueJs application using TypeScript. In this post, our goals will be:

  • To use the basic template provided by VueJs,
  • To use Webpack version 2.x, and
  • To run the app on ASP.NET Core.

The sample code used in this post can be found here.

Prerequisites

We have already built a VueJs application running on ASP.NET Core in the previous post. So we’re going to re-use that.

Update on March 6th, 2017: We updated the TypeScript version to 2.2.1 for this post.

Installing npm Packages

TypeScript

We can install TypeScript locally for this application only, or globally. If TypeScript is installed globally, we should link it to this application.
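
The commands are roughly as follows:

# install TypeScript locally, only for this application
npm install typescript --save-dev

# or install it globally
npm install --global typescript

# if TypeScript is installed globally, link it into this application
npm link typescript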

ts-loader

ts-loader lets webpack compile .ts files into .js on the fly, so we don’t have to build them separately during development.
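
It is installed as a dev dependency:

npm install ts-loader --save-dev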

vue-class-component & vue-property-decorator

If we want to use .ts in our VueJs development, as the official document recommends, we should install the vue-class-component library for class decorators.

It may be necessary to install vue-property-decorator to extend vue-class-component, although it’s not needed for this post.
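
Both can be installed together as dev dependencies:

npm install vue-class-component vue-property-decorator --save-dev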

vue-typescript-import-dts

TypeScript needs type definitions. vue-typescript-import-dts helps TypeScript recognise .vue files as .ts ones.
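
It is installed the same way:

npm install vue-typescript-import-dts --save-dev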

All necessary npm packages are installed. Let’s move on.

Configurations for TypeScript

tsconfig.json

In order to use .ts files, we first need tsconfig.json. In this post we use the bare minimum settings to make things work. Further details about tsconfig.json can be found here.
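
A bare-minimum tsconfig.json matching the settings explained below might look like this (a sketch; only the values described in the bullet points are included):

{
  "compilerOptions": {
    "target": "es5",
    "module": "commonjs",
    "lib": ["dom", "es2015", "es2015.promise"],
    "types": ["vue-typescript-import-dts"],
    "experimentalDecorators": true
  },
  "include": [
    "src/**/*.ts"
  ]
}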

Let me explain the configuration in-depth.

  • VueJs supports ECMAScript 5, so we target TypeScript at es5. This means module should be commonjs, and lib should include dom, es2015 and es2015.promise.
  • types declares custom type definitions. As we’ve installed vue-typescript-import-dts, include it here so that the application can recognise .vue files as .ts files.
  • In order to use class decorators, we’ve installed vue-class-component. But this is not enough. We need to enable it by setting the experimentalDecorators value to be true.
  • Within the include property, we declare which directories contain the .ts files.

Update on March 6th, 2017: Due to the version update of VueJs to 2.2.x, tsconfig.json also needs to be updated. This is the recommended configuration from the official guide.

Also, please make sure that we create the template from vue-cli by running vue init webpack. It installs vue@2.2.1 and vue-router@2.2.0. If those versions are different, please update them.


.eslintignore

While developing apps in TypeScript, the .js files are automatically compiled and generated. However, there is no guarantee those generated files pass the linting process, so we can’t be sure whether they are ESLint compliant or not. To avoid linting errors from those generated .js files, we simply exclude them by adding a line, src/**/*.js, to .eslintignore.

We just completed basic configurations for TypeScript compiling. Let’s move on.

Converting JavaScript to TypeScript

It’s time to convert the existing .js files in the src directory to .ts ones. We only need to look at the build and src directories.

build/webpack.base.conf.js

As webpack is the only consumer of this file, it’s not necessary to change it to .ts. But we do need to modify it.

First of all, the entry point should be changed from main.js to main.ts:
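
A sketch of the relevant part of webpack.base.conf.js (the rest of the template’s configuration stays as it is):

module.exports = {
  entry: {
    app: './src/main.ts'   // changed from './src/main.js'
  },
  // ... the remaining settings from the template are unchanged
}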

Then we need to replace the babel-loader part with the ts-loader one:
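
A sketch of the replacement rule, with the appendTsSuffixTo option explained below (path here is Node’s built-in path module):

module: {
  rules: [
    {
      test: /\.ts$/,
      loader: 'ts-loader',
      include: [path.resolve(__dirname, '../src')],
      options: {
        // treat the <script> block of .vue files as TypeScript
        appendTsSuffixTo: [/\.vue$/]
      }
    }
    // ... the vue-loader and asset rules from the template stay unchanged
  ]
}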

Every .ts file is handled by this loader. Here’s an interesting option, appendTsSuffixTo. If we use it, .vue files can be treated as .ts ones. VueJs uses the Single File Component approach: the HTML, JavaScript and CSS sections are all put into one single .vue file. Therefore we need its JavaScript section, in particular, to be handled as TypeScript.

We’ve completed the webpack configuration to enable TypeScript handling. Now let’s actually convert the JavaScript files to TypeScript ones.

src/main.js → src/main.ts

Change the existing JavaScript syntax to the one for TypeScript like:
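
A sketch of main.ts; the import paths assume the default template layout:

import Vue from 'vue'
import App from './App'
import router from './router'

new Vue({
  el: '#app',
  router,
  render: h => h(App)   // render function instead of template/components
})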

Look at the new Vue({ ... }) part. Instead of template and components, the render function is used. Everything has been compiled before hitting this point, and each component needs more control by itself, so we just use the render function. For more details about the render function, please refer to the official document.


Updated on March 6th, 2017: Due to the version update of VueJs to 2.2.x, the import statements part needs to be updated like below:


src/router/index.js → src/router/index.ts

We don’t have to worry about this. Just be cautious when using import ....
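
A sketch of router/index.ts; apart from the import style, it is essentially the template’s router definition:

import Vue from 'vue'
import Router from 'vue-router'
import Hello from '../components/Hello'

Vue.use(Router)

export default new Router({
  routes: [
    {
      path: '/',
      name: 'Hello',
      component: Hello
    }
  ]
})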


Updated on March 6th, 2017: Due to the version update of VueJs to 2.2.x, the import statements part needs to be updated like below:


src/App.vue → src/App.ts

Instead of keeping everything in one single .vue file, we’re separating the TypeScript part from each .vue. Why are we doing this, by the way? We could indeed still use .vue on its own, but for better maintainability it’s worth creating a separate .ts file. Let’s have a look at how we can implement App.ts, extracted from App.vue.
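
A sketch of what App.ts might look like (the name value is illustrative):

import Vue from 'vue'
import Component from 'vue-class-component'

@Component({
  name: 'app'   // the name declaration the router relies on
})
export default class App extends Vue {
}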

@Component decorator contains the name declaration so that the router can easily recognise it. The script part in the original App.vue can be altered like this:
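
As a sketch, the script block shrinks to a single line pointing at the new file:

<script lang="ts" src="./App.ts"></script>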

Make sure that we include lang="ts" as an attribute of the script tag.


Updated on March 6th, 2017: Due to the version update of VueJs to 2.2.x, the import statements part needs to be updated like below:


src/components/Hello.vue → src/components/Hello.ts

Now, we’re going to extract the script section from Hello.vue to Hello.ts. Let’s have a look.
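
A sketch of Hello.ts; the msg field and the /api/values endpoint are placeholders, not necessarily what the previous post used:

import Vue from 'vue'
import Component from 'vue-class-component'
import axios from 'axios'

@Component({
  name: 'hello'
})
export default class Hello extends Vue {
  // previously defined inside the data() function, now a plain class property
  msg: string = 'Welcome to Your Vue.js App'

  // previously a member of the methods object, now a class method
  getMessage(): void {
    axios.get('/api/values')
      .then(response => { this.msg = response.data })
      .catch(error => console.error(error))
  }
}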

Likewise, @Component contains the name declaration. Previously, all two-way binding fields were defined within the data function; defining them as class properties makes them more class-friendly. Functions become methods.

Someone may notice a small change compared to the previous post. In order to handle AJAX requests and responses, we previously used vue-resource; however, it has been changed to axios. According to the official VueJs blog post, vue-resource is no longer supported as an official VueJs extension; instead, axios is recommended because of its richer features. In addition, axios provides TypeScript definitions, so there’s no reason not to use it. Its usage is almost identical to vue-resource.

Once Hello.ts is extracted, the original Hello.vue should now be changed to:
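
As a sketch (the template content shown is only illustrative), Hello.vue keeps its template and styles and simply references the TypeScript file:

<template>
  <div class="hello">
    <h1>{{ msg }}</h1>
  </div>
</template>

<script lang="ts" src="./Hello.ts"></script>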


Updated on March 6th, 2017: Due to the version update of VueJs to 2.2.x, the import statements part needs to be updated like below:


That’s the conversion from JavaScript to TypeScript done! It was a fairly massive conversion. The basic template is optimised for JavaScript, so what we’ve done so far is essentially a conversion job. From now on, we can write all our logic in TypeScript!

Press the F5 key in Visual Studio to run the application and see the result.

The right-hand side of the window on the picture above is Vue.js devtools, which is a Chrome extension. When we install it, we can use it right away through Chrome’s Developer Tools.

One More Thing …

So far, we’ve converted the VueJs app to TypeScript. As this only covers the local development environment, we need one last modification for deployment. Here’s the overall process of building the application for deployment:

  1. To compile .ts files and generate corresponding .js ones.
  2. To modularise and build bundles through webpack.
  3. To build ASP.NET Core libraries.
  4. To generate an artifact for deployment to Azure or IIS.
  5. To deploy.

By updating package.json and project.json we can easily achieve this goal.

package.json

Within package.json, the scripts section originally contained the dev and build commands from the vue-cli webpack template. We need to add another script for TypeScript compilation, so let’s change it.
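
A sketch of what the updated scripts section might look like; the exact script bodies are assumptions based on the vue-cli webpack template:

"scripts": {
  "dev": "node build/dev-server.js",
  "build:ts": "tsc",
  "build:main": "node --no-deprecation build/build.js",
  "build": "npm run build:ts && npm run build:main"
}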

  • build:ts is to compile .ts files.
  • build:main is to be responsible for existing build.
  • build is changed to call both build:ts and build:main consecutively.
  • The --no-deprecation flag deserves attention. When compiling, ts-loader throws a deprecation warning; it’s harmless, but Visual Studio treats it as an error, so the build/deploy fails. Providing this flag lets the build/deploy through Visual Studio succeed.

project.json

Finally, open project.json to confirm the prepublish section.
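
The prepublish section simply needs to run the npm build before ASP.NET Core publishes the app, along these lines (a sketch):

"scripts": {
  "prepublish": [ "npm install", "npm run build" ]
}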

All good now! After the deployment to Azure Web App, we can see the following screen:

Of course, if CI/CD is preferred, we can simply use dotnet publish feature.

We’ve had a quick look at writing a VueJs application in TypeScript, bundling it on ASP.NET Core and deploying it to Azure. As mentioned earlier, the very first part is a bit complicated, but it’s not that different from normal TypeScript development. Let’s build a real-world application using VueJs and TypeScript!!

Send mail to Office 365 via an Exchange Server hosted in Azure

Those of you who have attempted to send mail to Office 365 from Azure know that sending outbound mail directly from an email server hosted in Azure is not supported, due to the elastic nature of public cloud service IPs and the potential for abuse. Therefore, the Azure IP address blocks are added to public block lists, with no exceptions to this policy.

To be able to send mail from an Azure hosted email server to Office 365 you need to send mail via an SMTP relay. There are a number of different SMTP relays you can utilise, including Exchange Online Protection; more information can be found here: https://blogs.msdn.microsoft.com/mast/2016/04/04/sending-e-mail-from-azure-compute-resource-to-external-domains

To configure Exchange Server 2016 hosted in Azure to send mail to Office 365 via SMTP relay through Exchange Online Protection, you need to do the following:

  1. Create a connector in your Office 365 tenant
  2. Configure accepted domains on your Exchange Server in Azure
  3. Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

Create a connector in your Office 365 tenant

  1. Login to Exchange Online Admin Center
  2. Click mail flow | connector
  3. Click +
  4. Select from: “Your organisation’s email server” to: “Office 365”
  5. Enter a Name for the Connector | Click Next
  6. Select “By verifying that the IP address of the sending server matches one of these IP addresses that belong to your organization”
  7. Add the public IP address of your Exchange Server in Azure

Configure accepted domains on your Exchange Server in Azure

  1. Open Exchange Management Shell
  2. Execute the following PowerShell command for each domain you want to send mail to in Office 365:
  3. New-AcceptedDomain -DomainName Contoso.com -DomainType InternalRelay -Name Contoso

Create a send connector on your Exchange Server in Azure that relays to Exchange Online Protection

  1. Execute the following PowerShell command:
  2. New-SendConnector -Name "My company to Office 365" -AddressSpaces * -CloudServicesMailEnabled $true -RequireTLS $true -SmartHosts yourdomain-com.mail.protection.outlook.com -TlsAuthLevel CertificateValidation

Inviting Microsoft Account users to your Azure AD-secured VSTS tenant

siliconvalve

I’ve done a lot of external invite management for VSTS over the last few years, and generally, without fail, we’ll have issues getting everyone on-boarded easily. This blog post is a reference for me (and I guess you too) to understand the invite process and document the experience the invited user has.

There are two sections to this blog post:

1. Admin instructions to invite users.

2. Invited user instructions.

Select whichever one applies to you.

The starting point for this post is that the external user hasn’t yet been invited to your Azure AD tenant. The user doing the inviting is also not an Azure AD Global Admin, but has rights in an Azure tenant.

The Invite to Azure AD

Log into an Azure subscription using your Azure AD account and select Subscriptions. Ideally this shouldn’t be a production tenant!

Select Subscription

I am going to start by…


Free/busy Exchange hybrid troubleshooting with Microsoft Edge

Those of you who have configured Exchange hybrid with Office 365 before know that free/busy functionality can be troublesome at times and not work correctly.

Instead of searching through Exchange logs, I found that you can pinpoint the exact error message through Microsoft Edge to assist with troubleshooting.

To do so:

  1. Open Microsoft Edge and login to Office 365 OWA (https://outlook.office365.com/owa) with an Office 365 account
  2. Create a new meeting request
  3. Press F12 to launch developer tools
  4. Conduct a free/busy lookup on a person with a mailbox on-premises
  5. Select the Network tab
  6. Select the entry with “GetUserAvailability”
  7. Select the body tab (on the right hand side)
  8. The MessageText element will display the exact error message

Exchange Server 2016 in Azure

I recently worked on a project where I had to install Exchange Server 2016 on an Azure VM, and I chose a D2-sized Azure VM (2 cores, 7GB RAM) thinking that would suffice. Well, that was a big mistake.

The installation made it to the last step before a warning appeared informing me that the server was low on memory resources, and it eventually terminated the installation, leaving it incomplete.

Let this be a warning to the rest of you: choose a D3 or above sized Azure VM to save yourself a whole lot of agony.

To try and salvage the Exchange install, I attempted to re-run the installation, as setup detects an incomplete installation and tries to pick up where it failed previously. This did not work.

I then tried to uninstall Exchange completely by running the command “Setup.exe /mode:Uninstall /IAcceptExchangeServerLicenseTerms”. This also did not work, as it was trying to uninstall an Exchange role that never got installed. This left me with one option: manually remove Exchange from Active Directory and rebuild the Azure VM.

To remove the Exchange organisation from Active Directory I had to complete the following steps:

  1. On a Domain Controller | Open ADSI Edit
  2. Connect to the Configuration naming context
  3. Expand Services
  4. Delete CN=Microsoft Exchange and CN=Microsoft Exchange Autodiscover
  5. Connect to the Default naming context
  6. Under the root OU delete OU=Microsoft Exchange Security Groups and CN=Microsoft Exchange System Objects
  7. Open Active Directory Users and Computers
  8. Select the Users OU
  9. Delete the following:
    • DiscoverySearchMailbox{GUID}
    • Exchange Online-ApplicationAccount
    • Migration.GUID
    • SystemMailbox{GUID}

After Exchange was completely removed from Active Directory and my Azure VM was rebuilt with a D3 size I could successfully install Exchange Server 2016.

Exchange Server 2016 install error: “Active Directory could not be contacted”

I recently worked on a project where I had to install Exchange Server 2016 on an Azure VM and received error “Active Directory could not be contacted”.

To resolve the issue, I had to complete the following steps:

  1. Remove the Azure VM public IP address
  2. Disable IPv6 on the NIC
  3. Set the IPv4 DNS suffix to point to your domain. If a public address is being used, it will be set to reddog.microsoft.com by default.

Once done the installation could proceed and Active Directory was contactable.

A [brief] intro to Azure Resource Visualiser (ARMVIZ.io)

Another week, another Azure tool that I’ve come by and thought I’d share with the masses. Though this one isn’t a major revelation or something that I’ve added to my Chrome work profile bookmarks bar like I did with the Azure Resource Explorer (as yet, though I may well add it in the very near future), I certainly have it bookmarked in my Azure folder in Chrome bookmarks.

When working with Azure Resource Manager templates, you’re dealing with long JSON files. These files can get pretty big and span hundreds of lines. Here’s an example of one:

(Word of warning: scroll to the bottom, as there is more content after this ~250-line ARM template sample.)

{
 "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json",
 "contentVersion": "1.0.0.0",
 "parameters": {
 "storageAccountName": {
 "type": "string",
 "metadata": {
 "description": "Name of storage account"
 }
 },
 "adminUsername": {
 "type": "string",
 "metadata": {
 "description": "Admin username"
 }
 },
 "adminPassword": {
 "type": "securestring",
 "metadata": {
 "description": "Admin password"
 }
 },
 "dnsNameforLBIP": {
 "type": "string",
 "metadata": {
 "description": "DNS for Load Balancer IP"
 }
 },
 "vmSize": {
 "type": "string",
 "defaultValue": "Standard_D2",
 "metadata": {
 "description": "Size of the VM"
 }
 }
 },
 "variables": {
 "storageAccountType": "Standard_LRS",
 "addressPrefix": "10.0.0.0/16",
 "subnetName": "Subnet-1",
 "subnetPrefix": "10.0.0.0/24",
 "publicIPAddressType": "Dynamic",
 "nic1NamePrefix": "nic1",
 "nic2NamePrefix": "nic2",
 "imagePublisher": "MicrosoftWindowsServer",
 "imageOffer": "WindowsServer",
 "imageSKU": "2012-R2-Datacenter",
 "vnetName": "myVNET",
 "publicIPAddressName": "myPublicIP",
 "lbName": "myLB",
 "vmNamePrefix": "myVM",
 "vnetID": "[resourceId('Microsoft.Network/virtualNetworks',variables('vnetName'))]",
 "subnetRef": "[concat(variables('vnetID'),'/subnets/',variables('subnetName'))]",
 "publicIPAddressID": "[resourceId('Microsoft.Network/publicIPAddresses',variables('publicIPAddressName'))]",
 "lbID": "[resourceId('Microsoft.Network/loadBalancers',variables('lbName'))]",
 "frontEndIPConfigID": "[concat(variables('lbID'),'/frontendIPConfigurations/LoadBalancerFrontEnd')]",
 "lbPoolID": "[concat(variables('lbID'),'/backendAddressPools/BackendPool1')]"
 },
 "resources": [
 {
 "type": "Microsoft.Storage/storageAccounts",
 "name": "[parameters('storageAccountName')]",
 "apiVersion": "2015-05-01-preview",
 "location": "[resourceGroup().location]",
 "properties": {
 "accountType": "[variables('storageAccountType')]"
 }
 },
 {
 "apiVersion": "2015-05-01-preview",
 "type": "Microsoft.Network/publicIPAddresses",
 "name": "[variables('publicIPAddressName')]",
 "location": "[resourceGroup().location]",
 "properties": {
 "publicIPAllocationMethod": "[variables('publicIPAddressType')]",
 "dnsSettings": {
 "domainNameLabel": "[parameters('dnsNameforLBIP')]"
 }
 }
 },
 {
 "apiVersion": "2015-05-01-preview",
 "type": "Microsoft.Network/virtualNetworks",
 "name": "[variables('vnetName')]",
 "location": "[resourceGroup().location]",
 "properties": {
 "addressSpace": {
 "addressPrefixes": [
 "[variables('addressPrefix')]"
 ]
 },
 "subnets": [
 {
 "name": "[variables('subnetName')]",
 "properties": {
 "addressPrefix": "[variables('subnetPrefix')]"
 }
 }
 ]
 }
 },
 {
 "apiVersion": "2015-05-01-preview",
 "type": "Microsoft.Network/networkInterfaces",
 "name": "[variables('nic1NamePrefix')]",
 "location": "[resourceGroup().location]",
 "dependsOn": [
 "[concat('Microsoft.Network/virtualNetworks/', variables('vnetName'))]",
 "[concat('Microsoft.Network/loadBalancers/', variables('lbName'))]"
 ],
 "properties": {
 "ipConfigurations": [
 {
 "name": "ipconfig1",
 "properties": {
 "privateIPAllocationMethod": "Dynamic",
 "subnet": {
 "id": "[variables('subnetRef')]"
 },
 "loadBalancerBackendAddressPools": [
 {
 "id": "[concat(variables('lbID'), '/backendAddressPools/BackendPool1')]"
 }
 ],
 "loadBalancerInboundNatRules": [
 {
 "id": "[concat(variables('lbID'),'/inboundNatRules/RDP-VM0')]"
 }
 ]
 }
 }
 ]
 }
 },
 {
 "apiVersion": "2015-05-01-preview",
 "type": "Microsoft.Network/networkInterfaces",
 "name": "[variables('nic2NamePrefix')]",
 "location": "[resourceGroup().location]",
 "dependsOn": [
 "[concat('Microsoft.Network/virtualNetworks/', variables('vnetName'))]"
 ],
 "properties": {
 "ipConfigurations": [
 {
 "name": "ipconfig1",
 "properties": {
 "privateIPAllocationMethod": "Dynamic",
 "subnet": {
 "id": "[variables('subnetRef')]"
 }
 }
 }
 ]
 }
 },
 {
 "apiVersion": "2015-05-01-preview",
 "name": "[variables('lbName')]",
 "type": "Microsoft.Network/loadBalancers",
 "location": "[resourceGroup().location]",
 "dependsOn": [
 "[concat('Microsoft.Network/publicIPAddresses/', variables('publicIPAddressName'))]"
 ],
 "properties": {
 "frontendIPConfigurations": [
 {
 "name": "LoadBalancerFrontEnd",
 "properties": {
 "publicIPAddress": {
 "id": "[variables('publicIPAddressID')]"
 }
 }
 }
 ],
 "backendAddressPools": [
 {
 "name": "BackendPool1"
 }
 ],
 "inboundNatRules": [
 {
 "name": "RDP-VM0",
 "properties": {
 "frontendIPConfiguration": {
 "id": "[variables('frontEndIPConfigID')]"
 },
 "protocol": "tcp",
 "frontendPort": 50001,
 "backendPort": 3389,
 "enableFloatingIP": false
 }
 }
 ]
 }
 },
 {
 "apiVersion": "2015-06-15",
 "type": "Microsoft.Compute/virtualMachines",
 "name": "[variables('vmNamePrefix')]",
 "location": "[resourceGroup().location]",
 "dependsOn": [
 "[concat('Microsoft.Storage/storageAccounts/', parameters('storageAccountName'))]",
 "[concat('Microsoft.Network/networkInterfaces/', variables('nic1NamePrefix'))]",
 "[concat('Microsoft.Network/networkInterfaces/', variables('nic2NamePrefix'))]"
 ],
 "properties": {
 "hardwareProfile": {
 "vmSize": "[parameters('vmSize')]"
 },
 "osProfile": {
 "computername": "[variables('vmNamePrefix')]",
 "adminUsername": "[parameters('adminUsername')]",
 "adminPassword": "[parameters('adminPassword')]"
 },
 "storageProfile": {
 "imageReference": {
 "publisher": "[variables('imagePublisher')]",
 "offer": "[variables('imageOffer')]",
 "sku": "[variables('imageSKU')]",
 "version": "latest"
 },
 "osDisk": {
 "name": "osdisk",
 "vhd": {
 "uri": "[concat('http://',parameters('storageAccountName'),'.blob.core.windows.net/vhds/','osdisk', '.vhd')]"
 },
 "caching": "ReadWrite",
 "createOption": "FromImage"
 }
 },
 "networkProfile": {
 "networkInterfaces": [
 {
 "properties": {
 "primary": true
 },
 "id": "[resourceId('Microsoft.Network/networkInterfaces',variables('nic1NamePrefix'))]"
 },
 {
 "properties": {
 "primary": false
 },
 "id": "[resourceId('Microsoft.Network/networkInterfaces',variables('nic2NamePrefix'))]"
 }
 ]
 },
 "diagnosticsProfile": {
 "bootDiagnostics": {
 "enabled": "true",
 "storageUri": "[concat('http://',parameters('StorageAccountName'),'.blob.core.windows.net')]"
 }
 }
 }
 }
 ]
} 


So what is this Azure Resource Visualiser?

For those following along at home who are cluey enough to pick up on what the ARM Visualiser is, it’s a means of visualising the components in an ARM template and the relationships between those components. It does this in a very user-friendly and easy-to-use way.

What you can do is as follows:

  • Play around with ARMVIZ with a pre-build ARM template
  • Import your own ARM template and visualise the characteristics of your components and their relationships
    • This helps validate the ARM template and make sure there are no errors or bugs
  • You can have quick access to the ARM Quickstart templates library in Github
  • You can view your JSON ARM template online
  • You can create a new ARM template from scratch or via copying one of the quick start templates ONLINE in your browser of choice
  • You can freely edit that template and go from JSON view to the Visualiser view quickly
  • You can then download a copy of your ARM template that is built, tested, visualised and working
  • You can provide helpful feedback to the team from Azure working on this service
    • Possibly leading to this being rolled up into the Azure Portal at some point in the future
  • All of the components are able to be moved around the screen as well
  • If you double click on any of the components, for example the Load Balancer, you’ll be taken to the line in the ARM template to view or amend the config

What does ARMVIZ.io look like?

Here’s a sample ARM template visualised

Azure Resource Visualiser

This is a screen grab of the Azure Resource Visualiser site. It shows the default, sample template that can be manipulated and played with.

Final words

If you want a quick, fun ARM template editor and your favourite ISE is just behaving like it does every other day, then spruce up and pimp up your day with the Azure Resource Visualiser. In the coming weeks I’m going to be doing quite a few ARM templates, and I can assure you that those will be run through the Azure Resource Visualiser to validate and check for any errors.

Pro-tip: don’t upload an ARM template to Microsoft that might have sensitive information. I know it’s common sense, but, it happens to the best of us. I thought I’d just quickly mention that as a reminder more than anything else.

Hope you enjoy it,

Best,

Lucian

 

 

***

Originally posted on Lucian’s blog at clouduccino.com. Follow Lucian on Twitter @LucianFrango, or, connect via LinkedIn.