How to use a Powershell Azure Function to Tweet IoT environment data


This blog post details how to use a Powershell Azure Function App to get information from a RestAPI and send a social media update.

The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the difference in this post is outputting the manipulated data to social media (Twitter) whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.


The following are prerequisites for this solution:

Create a folder on your local machine for the Powershell Module, then save the module to your local machine using the Powershell command ‘Save-Module’ as per below.

Save-Module -Name InvokeTwitterAPIs -Path c:\temp\twitter

Create a Function App Plan

If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption so you only pay for what you use, and select an appropriate location and Storage Account.

Create a Twitter App

Head over to the Twitter Apps portal and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don’t worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.

Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.

Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function. For my app I’m changing from the default 5 minute schedule to hourly, on the top of the hour. I did this after I’d already created the Function App, as shown below. To update the schedule I edited the Function.json file and changed the schedule to “schedule”: “0 0 * * * *”
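For reference, the resulting timer binding in Function.json looks something like this (the binding name is illustrative; the schedule is the hourly CRON expression above):

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ],
  "disabled": false
}
```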

Give your Function App a name and select Create.

Configure Azure Function App Application Settings

In your Azure Function App select “Configure app settings”. Create new App Settings for your Twitter Account’s AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.

Deployment Credentials

If you haven’t already configured Deployment Credentials for your Azure Function Plan do that and take note of them so you can upload the Twitter Powershell module to your app in the next step.

Take note of your Deployment Username and FTP Hostname.

Upload the Twitter Powershell Module to the Azure Function App

Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using an FTP client. I’m using WinSCP.

From the Application Settings options, start Kudu.

Traverse the folder structure to get the path to the Twitter Powershell Module and note it.


Validating our Function App Environment

Update the code, replacing the sample from the creation of the TimerTrigger Azure Function, as shown below, to import the Twitter Powershell Module. Include the get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.

Below is my output. I can see the output from the Twitter Module.

Function Application Script

Below is my sample script. It has no error handling etc so isn’t production ready, but gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
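As a rough illustration of the shape of that logic (the function and field names below are mine, not the script’s), the tweet text can be composed from the sensor readings before being handed to the Twitter module’s cmdlet:

```python
# Illustrative sketch only: compose a tweet from IoT sensor readings.
# The real Function App uses the InvokeTwitterAPIs PowerShell module to post it.

def build_tweet(readings, max_len=140):
    """Build a tweet string like 'Inside 22.5C | Outside 18.3C' from
    a dict of sensor name -> temperature reading."""
    parts = ["%s %.1fC" % (name, value) for name, value in sorted(readings.items())]
    tweet = "Environment update: " + " | ".join(parts)
    # Twitter rejects tweets over the length limit, so truncate defensively.
    return tweet[:max_len]

print(build_tweet({"Inside": 22.5, "Outside": 18.3}))
```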

Viewing the Tweet

And here is the successful tweet.


This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways: in this example, a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.

Follow Darren on Twitter @darrenjrobinson

How to use a Powershell Azure Function App to get RestAPI IoT data into Power BI for Visualization


This blog post details using a Powershell Azure Function App to get IoT data from a RestAPI and update a table in Power BI with that data for visualization.

The data can come from anywhere, however in the case of this post I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the major change is to use a TimerTrigger Azure Function to perform the work and leverage the “serverless” Azure Functions model. No need for a reporting server or messing around with Windows scheduled tasks.


The following are the prerequisites for this solution:

  • The Power BI Powershell Module
  • Register an application for RestAPI Access to Power BI
  • A Power BI Dataset ready for the data to go into
  • AzureADPreview Powershell Module

Create a folder on your local machine for the Powershell Modules, then save the modules to your local machine using the Powershell command ‘Save-Module’ as per below.

Save-Module -Name PowerBIPS -Path C:\temp\PowerBI
Save-Module -Name AzureADPreview -Path c:\temp\AzureAD 

Create a Function App Plan

If you don’t already have a Function App Plan create one by searching for Function App in the Azure Management Portal. Give it a Name, Select Consumption Plan for the Hosting Plan so you only pay for what you use, and select an appropriate location and Storage Account.

Register a Power BI Application

Register a Power BI App if you haven’t already using the link and instructions in the prerequisites. Take a note of the ClientID. You’ll need this in the next step.

Configure Azure Function App Application Settings

In this example I’m using Azure Functions Application Settings for the Azure AD AccountName, Password and the Power BI ClientID. In your Azure Function App select “Configure app settings”. Create new App Settings for your UserID and Password for Azure (to access Power BI) and your Power BI Application Client ID. Select Save.

Not shown here, I’ve also placed the URLs for the REST APIs that I’m calling to get the IoT environment data in Application Settings variables.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function App. The default of a 5 min schedule should be perfect. Give it a name and select Create.

Upload the Powershell Modules to the Azure Function App

Now that we have created the base of our Function App we’re going to need to upload the Powershell Modules we’ll be using that are detailed in the prerequisites. In order to upload them to your Azure Function App, go to App Service Settings => Deployment Credentials and set a Username and Password as shown below. Select Save.

Take note of your Deployment Username and FTP Hostname.

Create a sub-directory under your Function App named bin and upload the Power BI Powershell Module using an FTP client. I’m using WinSCP.

To make sure you get the correct path to the Powershell module, start Kudu from the Application Settings options.

Traverse the folder structure to get the path to the Power BI Powershell Module and note the path and the name of the psm1 file.

Now upload the Azure AD Preview Powershell Module in the same way as you did the Power BI Powershell Module.

Again using Kudu, validate the path to the Azure AD Preview Powershell Module. The file you are looking for is the Microsoft.IdentityModel.Clients.ActiveDirectory.dll file. My file after uploading is located in “D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\Microsoft.IdentityModel.Clients.ActiveDirectory.dll”

This library is used by the Power BI Powershell Module.

Validating our Function App Environment

Update the code, replacing the sample from the creation of the TimerTrigger Azure Function, as shown below, to import the Power BI Powershell Module. Include the get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.

Below is my output. I can see the output from the Power BI Module get-help command. I can see that the module was successfully loaded.

Function Application Script

Below is my sample script. It has no error handling etc so isn’t production ready, but gives a working example of getting data in from an API (in this case IoT sensors) and putting the data directly into Power BI.
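The Power BI REST API ingests rows as a JSON body of the form {"rows": [...]} POSTed to the dataset’s table endpoint. A hedged sketch of how the readings might be shaped before that POST (the column names, dataset ID and table name here are placeholders, not the ones from my dataset):

```python
import json

# Sketch only: shape sensor readings into the {"rows": [...]} body that the
# Power BI REST API's "add rows" endpoint expects. Column names are placeholders.
POWERBI_ROWS_URL = "https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/tables/{table}/rows"

def build_rows_payload(readings):
    """readings: list of (timestamp, sensor, value) tuples."""
    rows = [{"Time": ts, "Sensor": sensor, "Temperature": value}
            for ts, sensor, value in readings]
    return json.dumps({"rows": rows})

payload = build_rows_payload([("2017-01-01T00:00:00Z", "Inside", 22.5)])
# The Function App would then POST `payload` to POWERBI_ROWS_URL with the
# bearer token obtained via the AzureADPreview authentication flow above.
```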

Viewing the data in Power BI

In Power BI it is then quick and easy to select our Inside and Outside temperature readings referenced against time. This timescale is overnight so both sensors are reading quite close to each other.


This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways: in this example, a visualization of IoT data in Power BI. The input could easily be business data from an API and the output a real time reporting dashboard.

Follow Darren on Twitter @darrenjrobinson





Command and control with Arduino, Windows Phone and Azure Mobile Services

In most of our posts on the topic of IoT to date we’ve focussed on how to send data emitted from sensors and devices to centralised platforms where we can further process and analyse this data. In this post we’re going to have a look at how we can reverse this model and control our ‘things’ remotely by utilising cloud services. I’m going to demonstrate how to remotely control a light emitting diode (LED) strip with a Windows Phone using Microsoft Azure Mobile Services.

To control the RGB LED strip I’m going to use an Arduino Uno, a breadboard and some MOSFETs (a type of transistor). The LED strip will require more power than the Arduino can supply, so I’m using a 9V battery as a power supply which needs to be separated from the Arduino power circuit, hence why we’re using MOSFET transistors to switch the LEDs on and off.

The Arduino Uno will control the colour of the light by controlling three MOSFETs – one each for the red, blue and green LEDs. The limited programmability of the Arduino Uno means we can’t establish an Azure Service Bus relay connection, or use Azure Service Bus queues. Luckily Azure Mobile Services allow us to retrieve data via plain HTTP.

A Windows Phone App will control the colour of the lights by sending data to the mobile service. Subsequently the Arduino Uno can retrieve this data from the service to control the colour by using a technique called ‘pulse width modulation‘ on the red, green and blue LEDs. Pulse width modulation allows us to adjust the brightness of the LEDs by quickly turning on and off a particular LED colour, thus artificially creating a unique colour spectrum.
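Pulse width modulation reduces to a small calculation: each 0–255 colour channel value becomes a duty cycle, the fraction of time that channel’s MOSFET is switched on. A sketch of just the maths (no hardware involved):

```python
def channel_to_duty(value):
    """Map an 8-bit colour channel (0-255) to a PWM duty cycle (0.0-1.0)."""
    if not 0 <= value <= 255:
        raise ValueError("channel value out of range")
    return value / 255.0

def rgb_to_duty(r, g, b):
    """Duty cycles for the red, green and blue MOSFET gates."""
    return (channel_to_duty(r), channel_to_duty(g), channel_to_duty(b))

# Full-brightness blue: red and green stay off, blue is on 100% of the time.
print(rgb_to_duty(0, 0, 255))  # (0.0, 0.0, 1.0)
```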

For the purpose of this example we won’t incorporate any authentication in our application, though you can easily enforce authentication for your Mobile Service with a Microsoft Account by following these two guides:

A diagram showing our overall implementation is shown below.

Command and Control diagram

Mobile service

We will first start by creating an Azure Mobile Service in the Azure portal and for the purpose of this demonstration we can use the service’s free tier which provides data storage up to 20MB per subscription.

Navigate to the Azure portal and create a new service:

Creating a Mobile Service 1

Next, choose a name for your Mobile Service, your database tier and geographic location. We’ll choose a Javascript backend for simplicity in this instance.

Creating a Mobile Service 2

Creating a Mobile Service 3

In this example we’ll create a table ‘sensordata’ with the following permissions:

Mobile Service Permissions

These permissions allow us to insert records from our Windows Phone app with the application key, and have the Arduino Uno retrieve data without any security. We could make the insertion of new data secure by demanding authentication from our Windows Phone device without too much effort, but for the purpose of this demo we’ll stick to this very basic form of protection.

In the next section we’re going to create a Windows Phone application to send commands to our mobile service.

Windows Phone Application

To control the colour in a user friendly way we will use a colour picker control from the Windows Phone Toolkit, which can be installed as a NuGet package. This toolkit is not compatible with Windows Phone 8.1 yet, so we’ll create a Windows Phone Silverlight project and target the Windows Phone 8.0 platform as shown below.

Visual Studio Create Project 1

Visual Studio Create Project 2

Next, we’ll install the ‘Windows Phone Toolkit’ NuGet package as well as the mobile services NuGet package:

Install Windows Phone Toolkit Nuget

Install Mobile Services NuGet

For the purpose of this demo we won’t go through all the colour picker code in detail here. Excellent guidance on how to use the colour picker can be found on the Microsoft Mobile Developer Wiki.

The code that sends the selected colour to our mobile service table is as follows.

The event data consists of colour data in the RGB model, separated by semicolons.
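A hedged sketch of that wire format (the helper names are mine): the phone serialises the picked colour as “R;G;B”, and the receiving side splits it back apart:

```python
def encode_colour(r, g, b):
    """Serialise an RGB colour as the semicolon-separated event payload."""
    return "%d;%d;%d" % (r, g, b)

def decode_colour(payload):
    """Parse 'R;G;B' back into integer channel values."""
    r, g, b = (int(part) for part in payload.split(";"))
    return r, g, b

# Round trip: full blue survives encoding and decoding unchanged.
assert decode_colour(encode_colour(0, 0, 255)) == (0, 0, 255)
```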

The complete working solution can be found in this Github repository. Make sure you point it to the right Azure Mobile Service and change the Application Key before you use it!

Run the application and pick a colour on the phone as shown below.

Phone ScreenShot

Now that we have a remote control that is sending out data to the Mobile Service it’s time to look at how we can use this information to control our LED strip.

Arduino sketch

In order to receive commands from the Windows Phone App we are going to use OData queries to retrieve the last inserted record from the Azure Mobile Service, which exposes table data via OData out of the box. We can easily get the last inserted record in JSON format via an HTTP GET request to a URL with a query string similar to the following: $top=1&$orderby=__createdAt%20desc

When we send an HTTP GET request, the following HTTP body is returned:


Notice how the colour is set to blue in the RGB data.
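For clarity, here is a Python sketch of the device side (the Arduino does the same with raw HTTP and string handling): build the OData query for the newest record, then pull the colour string out of the returned JSON. The field name "data" is an assumption about the record shape, and note that Python’s urlencode renders spaces as "+" rather than the "%20" shown above:

```python
import json
from urllib.parse import urlencode

def latest_record_query():
    """OData query string for the most recently inserted row."""
    return urlencode({"$top": 1, "$orderby": "__createdAt desc"})

def colour_from_response(body):
    """Extract the 'R;G;B' colour payload from the mobile service's JSON reply.
    Assumes each record carries the colour in a 'data' field (illustrative)."""
    records = json.loads(body)
    return records[0]["data"] if records else None

print(latest_record_query())
print(colour_from_response('[{"data": "0;0;255"}]'))  # 0;0;255
```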

The Arduino schematics for the solution:

Arduino Command Control Schematic

For illustrative purposes I’ve drawn a single LED. In reality I’m using a LED strip that needs more voltage than the Arduino can supply, hence the 9V battery is attached and MOSFET transistors are used. Don’t attach a 9V battery to a single LED or it will have a very short life…

The complete Arduino sketch:

When we run the sketch the JSON data will be retrieved, and the colour of the LED strip set to blue:

The Working Prototype!

In this article I’ve demonstrated how to control a low end IoT device that does not have any HTTPS/TLS capabilities. This scenario is far from perfect, and ideally we want to take different security measures to prevent unauthorised access to our IoT devices and transport data. In a future article I will showcase how we can resolve these issues by using a much more powerful device than the Arduino Uno with an even smaller form factor: the Intel Edison. Stay tuned!

Microsoft Windows IoT and the Intel Galileo

You might have seen one of these headlines a while back: ‘Microsoft Windows now running on Intel Galileo development board’, ‘Microsoft giving away free Windows 8.1 for IoT developers’. Now before we all get too excited, let’s have a closer look beyond these headlines and see what we’re actually getting!

Intel Galileo

With a zillion devices being connected to the Internet by the year 2020 a lot of hardware manufacturers want to have a piece of this big pie, and Intel got into the game by releasing two different development boards / processors: the Intel Galileo and more recently the Intel Edison.

Intel Galileo

Intel Galileo

Intel Edison

Intel Edison

The Galileo is Intel’s first attempt to break into consumer prototyping, or the ‘maker scene’. The board comes in two flavours, Gen 1 and Gen 2 with the latter being a slightly upgraded model of the first release.

Like many other development platforms the board offers hardware and pin compatibility with a range of Arduino shields to catch the interest from a large number of existing DIY enthusiasts. The fundamental difference between boards like the Arduino Uno and the Intel Galileo is that Arduino devices run on a real-time microcontroller (mostly Atmel Atmega processors) whereas the Galileo runs on a System on Chip architecture (SoC). The SoC runs a standard multi-tasking operating system like Linux or Windows, which aren’t real-time.

Both Gen1 and Gen2 boards contain an Intel Quark 32-bit 400 MHz processor, which is compatible with the Intel Pentium processor instruction set. Furthermore we have a full-sized mini-PCI express slot, a 100 Mb Ethernet port, microSD slot and USB port. The Galileo is a headless device, which means you can’t connect a monitor via VGA or HDMI, unlike the Raspberry Pi for example. The Galileo effectively offers Arduino compatibility through hardware pins, and software simulation within the operating system.

The microSD card slot makes it easy to run different operating systems on the device as you can simply write an operating system image on an SD card, insert it into the slot and boot the Galileo. Although Intel offers the Yocto Poky Linux environment there are some great initiatives to support other operating systems. At Build 2014 Microsoft announced the ‘Windows Developer Program for IoT’. As part of this program Microsoft offers a custom Windows image that can run on Galileo boards (there’s no official name yet, but let’s call it Windows IoT for now).

Windows on Devices / Windows Developer Program for IoT

Great, so now we can run .NET Framework applications, and for example utilise the .NET Azure SDK? Well, not really… yet. The Windows image is still in Alpha release stage, only runs a small subset of the .NET CLR, and is not able to support larger .NET applications of any kind. Although a simple “Hello World” application will run flawlessly, applications will throw multiple exceptions as soon as functionality beyond System.Core.dll is called.

So how can we start building our things? You can write applications using the Wiring APIs in exactly the same way as you program your Arduino. Microsoft provides compatibility with the Arduino environment through a set of C++ libraries that are part of a new Visual Studio project type, available once you set up your development environment according to Microsoft’s instructions.

We’ll start off by creating a new ‘Windows for IoT’ project in Visual Studio 2013:

New IoT VS Project

The project template will create a Visual C++ console application with a basic Arduino program that turns the built-in LED on and off in a loop:

Now let’s grab our breadboard and wire up some sensors. For the purpose of this demo I will use the built-in temperature sensor on the Galileo board. The objective will be to transmit the temperature to an Azure storage queue.

Since the Arduino Wiring API is implemented in C++ I decided to utilise some of the other Microsoft C++ libraries on offer: the Azure Storage Client Library for C++, which in return is using the C++ REST SDK. They’re hosted on Github and Codeplex respectively and can both be installed as Nuget packages. I was able to deliver messages to a storage queue with the C++ library in a standard C++ Win32 console application, so assumed this would work on the Galileo. Here’s the program listing of the ‘main.cpp’ file of the project:

The instructions mentioned earlier explain in detail how to setup your Galileo to run Windows, so I won’t repeat that here. We can deploy the Galileo console application to the development board from Visual Studio. This simply causes the compiled executable to be copied to the Galileo via a file share. Since it’s a headless device we can only connect to the Galileo via good old Telnet. Next, we launch the deployed application on the command line:

Windows IoT command line output

Although the console application is supposed to write output to the console, none is shown. I wonder whether certain Win32 features are missing in this Windows on Devices release, since no debug information is output to the console for most commands executed over Telnet. When I tried to debug the application from Visual Studio I was able to extract some further diagnostics:

IoT VS Debug Output

Perhaps this is due to a missing Visual Studio C++ runtime on the Galileo board. When I tried to perform an unattended installation of this runtime it did not seem to install at all, although the lack of command line output makes this guesswork.


Microsoft’s IoT offering is still in its very early days. That applies not only to the Windows IoT operating system, but also to Azure platform features like Event Hubs. Although this is an Alpha release of Windows IoT I can’t say I’m overly impressed. The Arduino compatibility is a great feature, but a lack of easy connectivity makes it just a ‘thing’ without Internet. Although you can use the Arduino Ethernet / HTTP library, I would have liked to benefit from the available C++ libraries to securely connect to APIs over HTTPS, something which is impossible on the Arduino platform.

The Microsoft product documentation looks rather sloppy at times and is generally lacking, and I’m curious to see what the next release will bring. According to Microsoft’s FAQ they’re focussing on supporting the universal app model. The recent announcements around open sourcing the .NET Framework will perhaps enable us to use some .NET Framework features in a Galileo Linux distribution in the not-too-distant future.

In a future blog post I will explore some other scenarios for the Intel Galileo using Intel’s IoT XDK, Node JS and look at how to connect the Galileo board to some of the Microsoft Azure platform services.

IoT – Solar & Azure

Ever since we got our solar system installed about two years ago, I’ve been keeping track of the total power generated by the system. Every month I would write down the totals and add it to my Excel spreadsheet. Although it’s not much work, it’s still manual work… yes all 2 minutes every month.

So when the whole “Internet of Things” discussion started at our office (see Matt’s blog “Azure Mobile Services and the Internet of Things“) I thought it would be a good opportunity to look at doing this using Azure – even if it was only to prove the IoT concept. The potential solution should:

  1. Use a device which connects to the solar inverter to read its data via RS232.
  2. This device needs to be powered by a battery as no power outlet is close to the inverter.
  3. Upload data to Azure without having to rely on a computer running 24/7 to do this.
  4. Use Azure to store and present this data.


The device I built is based on the Arduino Uno and consists of the following components:

Arduino UNO R3
With a little bit of programming these devices are perfectly capable of retrieving data from various data sources, are small in size, expandable with various libraries, add on shields and break-out boards and can be battery powered. Having the inverter on a side of the house with no power outlet close by made this a main requirement.
MAX3232 RS232 Serial to TTL Converter module
As the Arduino Uno doesn’t come with any serial connectors this module adds a DB9 connector to the board. Now the Arduino can be connected to the inverter using a null modem cable.
Adafruit CC3000 WiFi Shield with Onboard Ceramic Antenna
Some of the existing solutions which can send inverter data to a website (e.g. PVOutput) or computer logging those details, all rely on a computer which runs 24/7 which is one of the things I definitely didn’t want to do. I ended up getting this WiFi shield which, after soldering it on top of the Arduino board, turns the Arduino into a WiFi enabled device and allows it to send data to the internet directly. After adding the required libraries and credentials to my script, having access to a wireless router already enables basic access to the internet. Even though it is sitting quite a bit away from the wireless router, connectivity is no issue.
arduinobuild inverter
The Arduino Uno unit… …connected to the inverter


To store and / or display any of the info the Arduino is collecting, an Azure subscription is required. For this project I signed up for a free trial. Once the subscription is sorted, the following Azure services have to be setup:

  • Cloud service: runs the worker roles.
  • Storage Account: hosts the table storage.
  • Service Bus: message queue for the Arduino.
  • Website: displays data in (near) real time.

Putting it all together

So how do all these different components fit together?

The Arduino connects to the inverter via a null-modem cable. Reading data from it is achieved by adding a MODBUS library to the Arduino script. This adds additional functionality to the Arduino which is now able to read (and write) data from MODBUS (an industrial comms standard) enabled devices.
The script is set to run every 30 minutes, and only after a successful connection (the inverter shuts down if there is not enough sunlight) will it set up a wireless internet connection and send the data to the TCP listener worker role in Azure.

In Azure, a service bus message queue was created to hold all incoming data packets sent from the Arduino. A storage table was also created to permanently store data received from the Arduino. The great thing with the storage table is there is no need to create a table schema before being able to use it; just creating the “placeholder” is enough!

Using Visual Studio, two worker roles were created:

  • A TCP listener which “listens” for any device sending information to the specified endpoints. If a message from the Arduino is received it will write it onto the message queue.

service bus explorer screenshot

Using Service Bus Explorer you can see the individual messages arriving in the message queue.

  • A data writer which checks the message queue for new messages. If a new message has arrived, the message will be read, its content stored in the storage table and the message deleted.
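The data writer’s loop is a classic read–store–delete pattern. Here is a minimal in-memory sketch of it (Python stand-ins for the service bus queue and the storage table; the real worker role uses the Azure SDK):

```python
from collections import deque

def drain_queue(queue, table):
    """Process messages the way the data-writer worker role does:
    read a message, store its content, then delete it from the queue."""
    while queue:
        message = queue[0]      # read (peek-lock in the real service bus)
        table.append(message)   # store the content in table storage
        queue.popleft()         # delete the message only after storing it

# Hypothetical inverter packets, format illustrative.
incoming = deque(["2014-05-01 10:00;4.2", "2014-05-01 10:30;4.5"])
stored = []
drain_queue(incoming, stored)
print(stored)
```

Deleting only after a successful store means a crash mid-loop leaves the message on the queue to be retried, rather than losing it.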

Finally, a simple ASP.Net MVC website is used to display data from the storage table in near real-time. The website displays statistics on how many kWh have been generated during that day and how a day compares to previous days.

Energy Today

Stats for current day.


Website display.


This IoT project was a good opportunity to have a play with various Azure components through using multiple worker roles, message queues and the like. It probably sounds like overkill when just using the one device sending one message every 30 minutes, but a similar setup can be used in larger environments such as factories where multiple devices send dozens of messages per minute.

Publishing to Azure Event Hubs using a .NET Micro Framework Device

In previous posts, Kloudies Matt Davies and Olaf Loogman have shown how we connect Arduino based devices to the Azure platform. Preferring the .NET Micro Framework (NETMF) platform myself, I thought it time to show how we can publish sensor data to Azure Event Hubs using a NETMF connected device.

.NET Micro Framework

Like Arduino, the .NET Micro Framework is an open source platform that runs on small, microcontroller based devices or “things” as we call them now in the world of the Internet-of-Things (IoT). However, unlike the Arduino platform, developers in the NETMF world use Visual Studio and C# to develop embedded applications, leveraging the rich developer experience that comes with working within the Visual Studio IDE. Using the .NET Gadgeteer toolkit, we take this experience to the next level using a model driven development approach with graphical designers that abstracts much of the low level “engineering” aspects of embedded device development.

net gadgeteer i VS

Whether we are working with earlier NETMF versions or with the Gadgeteer toolkit, we still get the rich debugging experience and deployment features from Visual Studio which is the big differentiator of the NETMF platform.

FEZ Panda II

The device I have had for a number of years is the FEZ Panda II from GHI Electronics (FEZ stands for Freak’N’Easy).


Although not a member of the newer .NET Gadgeteer family, the FEZ Panda II still provides those in the maker community a solid device platform for DIY and commercial grade applications. The FEZ Panda II sports:

  • 72Mhz 32-bit processor with 512KB of FLASH and 96KB of RAM (compared to the Arduino Yun’s 16 MHz and 32KB of FLASH and  2KB of RAM)
  • Micro SD socket for up to 16GB of memory
  • Real Time Clock (RTC)
  • Over 60 digital inputs and outputs
  • TCP/IP HTTP support
  • Arduino shield compatibility

Note: The FEZ Panda II does not have built in support for TLS/SSL which is required to publish data to Azure Event Hubs. This is not a problem for the newer Gadgeteer boards such as the FEZ Raptor.

Our Scenario

iot cloud gateway

The scenario I will walkthrough in this post will feature our NETMF embedded device with a number of analogue sensors taking readings a couple of times per second and publishing the sensor data to an Azure Event Hub via a field gateway. A monitoring application will act as an event hub consumer to display sensor readings in near real-time.

  • Sensors – Thermometer (degs Celsius) and Light (intensity of light as a % with zero being complete darkness)
  • Device – FEZ Panda II connected to the internet using the Wiznet W5100 ethernet controller.
  • Field Gateway – Simple IIS Application Request Routing rule in an Azure hosted Virtual Machine that routes the request as-is to a HTTPS Azure Event Hub endpoint.
  • Cloud Gateway – Azure Event Hub configured as follows:
    • 8 partitions
    • 1 day message retention
    • Monitoring Consumer Group
    • Publisher access policy for our connected device
    • Consumer access policy for our monitoring application
  • Monitoring Application – Silverlight (long live Ag!) application consuming events off the Monitoring Consumer Group.
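Before diving into the code, it helps to pin down the event payload: each event carries a device ID plus an array of sensor readings, mirroring the SensorEventData structure used in the device code below. A Python sketch of what gets serialised (the device ID and sensor IDs are illustrative):

```python
import json

def build_event(device_id, readings):
    """Shape one Event Hubs message: a device ID plus a list of
    (sensor_id, sensor_type, value) readings."""
    return json.dumps({
        "DeviceId": device_id,
        "SensorData": [
            {"SensorId": sid, "SensorType": stype, "SensorValue": value}
            for sid, stype, value in readings
        ],
    })

event = build_event("fezpanda01", [("An2Temp", "Temp", 21.8),
                                   ("An3Light", "Light", 64)])
print(event)
```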

Creating the Visual Studio Solution

To develop NETMF applications we must first install the .NET Micro Framework SDK and any vendor specific SDK’s. Using Visual Studio we create a NETMF project using the .NET Micro Framework Console Application template (or Gadgeteer template if you are using the newer family of devices).


For the FEZ Panda II, I need to target NETMF version 4.1.


Additionally, I also need to add assembly references to the device manufacturer’s SDK libraries, GHI Electronics in my case.


Sensor Code

Working with sensors and other modules is fairly straight forward using NETMF and the GHI libraries. To initialise an instance of my sensor class I need to know:

  • The analogue pin on the device my sensor is connected to
  • The interval between sensor readings
  • The min/max values of the readings
public Thermometer(FEZ_Pin.AnalogIn pin, int interval, int min, int max)
{
    // Set reading parameters
    _interval = interval;
    _minScale = min;
    _maxScale = max;

    // Initialise thermometer sensor
    _sensor = new AnalogIn((AnalogIn.Pin)pin);
    _sensor.SetLinearScale(_minScale, _maxScale);

    // Set sensor id
    _sensorId = "An" + pin.ToString() + _sensorType;
}

I then use a background thread to periodically take sensor readings and raise an event passing the sensor data as an event argument

void SensorThreadStart()
{
    while (true)
    {
        // Take a sensor reading
        var temp = _sensor.Read();

        // Create sensor event data
        var eventData = new SensorEventData()
        {
            DeviceId = _deviceId,
            SensorData = new SensorData[]
            {
                new SensorData() { SensorId = _sensorId, SensorType = _sensorType, SensorValue = temp }
            }
        };

        // Raise sensor event (event name illustrative; elided in the original)
        if (SensorReadingTaken != null)
            SensorReadingTaken(this, eventData);

        // Pause between readings
        Thread.Sleep(_interval);
    }
}

A critical attribute of any “thing” in the world of IoT is being connected. When working with resource-constrained devices we quickly come to terms with having to perform many lower-level functions that we are not accustomed to in our day-to-day development. Initialising your network stack may be one of them…

public static void InitNetworkStack()
{
    Debug.Print("Network settings...");
    try
    {
        // Enable ethernet (GHI SDK)
        Ethernet.Enable();

        // Enable DHCP
        Dhcp.EnableDhcp(new byte[] { 0x00, 0x5B, 0x1C, 0x51, 0xC6, 0xC7 }, "FEZA");
        Debug.Print("IP Address: " + new IPAddress(NetworkInterface.IPAddress).ToString());
        Debug.Print("Subnet Mask: " + new IPAddress(NetworkInterface.SubnetMask).ToString());
        Debug.Print("Default Gateway: " + new IPAddress(NetworkInterface.GatewayAddress).ToString());
        Debug.Print("DNS Server: " + new IPAddress(NetworkInterface.DnsServer).ToString());
    }
    catch (Exception ex)
    {
        Debug.Print("Network settings...Error: " + ex.ToString());
    }
}


Note the use of the Debug.Print statements. While in debug mode these are written to the output Window for easy troubleshooting and debugging.

Event Hub Client

As I write this, we don’t yet have an Azure SDK for NETMF (but we have been told it is on its way). Like most services in Azure, Event Hubs provides a REST-based API that I can consume using plain old HTTP requests. To handle access control, I assigned a pre-generated SAS token to the device during deployment. This avoids the resource-constrained device having to generate a SAS token itself and use up precious memory doing so.

To construct our request to Event Hubs we need the following details:

  • Service Bus Namespace
  • Event Hub name
  • PartitionKey (I am using a device ID)
  • Authorisation token
public EventHubClient(string serviceNamespace, string eventhub, string deviceName, string accessSignature)
{
    // Assign event hub details
    _serviceNamespace = serviceNamespace;
    _hubName = eventhub;
    _deviceName = deviceName;
    _sas = accessSignature;

    // Generate the url to the event hub
    //_url = "https://" + _serviceNamespace + ".servicebus.windows.net/" + _hubName + "/Publishers/" + _deviceName;

    // Note: As the FEZ Panda (.NET MF 4.1) does not support SSL I need to send
    // this to the field gateway over HTTP
    _url = "" + _serviceNamespace + "/" + _hubName + "/" + _deviceName; // field gateway base URL omitted
}

Note here I have switched my Event Hub client to use an intermediary field gateway URL, as the device does not support SSL and cannot post requests directly to the Azure Event Hubs endpoint.

Finally, the actual payload is the sensor data, which I serialise into JSON. Event Hubs is payload agnostic, so any stream of data may be sent through the hub; anything from sensor data to application logging or perhaps observational data from medical devices can be published to Azure Event Hubs.

public bool SendEvent(SensorEventData sensorData)
{
    var success = false;
    try
    {
        // Format the sensor data as json
        var eventData = sensorData.ToJson();

        Debug.Print("Sending event data: " + eventData);

        // Create an HTTP Web request.
        HttpWebRequest webReq = HttpWebRequest.Create(_url) as HttpWebRequest;

        // Add required headers
        webReq.Method = "POST";
        webReq.Headers.Add("Authorization", _sas);
        webReq.ContentType = "application/atom+xml;type=entry;charset=utf-8";
        webReq.ContentLength = eventData.Length;
        webReq.KeepAlive = true;

        // Write the request body
        using (var writer = new StreamWriter(webReq.GetRequestStream()))
        {
            writer.Write(eventData);
        }

        webReq.Timeout = 3000; // 3 secs
        using (var response = webReq.GetResponse() as HttpWebResponse)
        {
            Debug.Print("HttpWebResponse: " + response.StatusCode.ToString());

            // Check status code
            success = (response.StatusCode == HttpStatusCode.Created);
        }
    }
    catch (Exception ex)
    {
        Debug.Print("SendEvent Error: " + ex.ToString());
    }

    return success;
}

Wiring it all together is the job of our entry point Main(). Here we initialise our network stack, sensors, LEDs and of course our Azure Event Hub client. We then wire up the sensor event handlers and off we go.

public static void Main()
{
    // Initialise device and sensors
    Init();

    // Setup Event Hub client
    client = new EventHubClient(serviceNamespace, hubName, deviceName, sas);

    Debug.Print("Device ready");

    // Start sensor monitoring (the sensors' background threads take it from here)
    Thread.Sleep(Timeout.Infinite);
}

static void Init()
{
    // Enable ethernet
    InitNetworkStack();

    // Init LED
    led = new LED((Cpu.Pin)FEZ_Pin.Digital.Di5, false);

    // Init thermometer sensor
    thermo = new Thermometer(FEZ_Pin.AnalogIn.An2, 500, -22, 56);
    thermo.SensorReadEvent += SensorReadEvent;

    // Init light sensor
    light = new Light(FEZ_Pin.AnalogIn.An3, 500, 0, 100);
    light.SensorReadEvent += SensorReadEvent;

    // Flash once if all is good
    led.Flash(1);
}

static void SensorReadEvent(SensorEventData data)
{
    // Send event to Event Hubs
    if (!client.SendEvent(data))
        // Flash three times if failed to send
        led.Flash(3);
    else
        // Flash once if all is good
        led.Flash(1);
}
Note the use of the LED, connected to digital pin 5, to provide runtime feedback. We flash the LED once for every successful publish of an event and three times if we have a failure. It is this kind of low level controller interaction that makes NETMF development such a satisfying, albeit geeky pastime.

Field Gateway

As mentioned above, the FEZ Panda II does not support TLS/SSL. To overcome this, I posted sensor data to a “field gateway” consisting of a simple IIS Application Request Routing rule to perform the protocol translation from HTTP to HTTPS. The ARR rule only performed a rewrite of the URL and did not need to enrich or modify the request in any other way.
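For reference, such a rule might be sketched in the gateway site's web.config roughly as follows (the namespace and target path here are hypothetical; the rewrite target must match the Event Hubs publisher endpoint the device expects):

```xml
<!-- Sketch of an IIS URL Rewrite / ARR rule for the field gateway.
     ARR forwards the inbound HTTP request to the HTTPS Event Hubs
     endpoint without enriching or modifying it. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="EventHubFieldGateway" stopProcessing="true">
        <match url="(.*)" />
        <action type="Rewrite"
                url="https://myns.servicebus.windows.net/{R:1}" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

With proxying enabled in ARR, the device posts plain HTTP to the gateway and the gateway performs the TLS handshake with Azure on its behalf.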


Our Consumer

Azure Event Hubs provides Consumer Groups that subscribe to events published to the hub. Only one consumer can receive events from each partition at a time, so I have found it good practice to create at least two consumer groups: one that can be used for monitoring as required, while your downstream processing applications/services consume the primary consumer group. To this end, I developed a quick Silverlight application (yes, I know. long live Ag!) to act as a monitoring consumer for the event hub.



The .NET Micro Framework provides a great way for .NET developers to participate in the growing Internet-of-Things movement for a relatively small (< $100) outlay, while retaining the rich developer experience of familiar tools such as Visual Studio. Azure Event Hubs provides the platform for a cloud-based device gateway, allowing the ingestion of millions of events that downstream applications and services can consume for real-time analytics and ordered processing.

IPv6 – Are we there yet??

The topic of IPv6 seems to come up every couple of years. The first time I recall there being a lot of hype about IPv6 was way back in the early 2000s; ever since then the topic gets attention every once in a while and then disappears into insignificance alongside more exciting IT news.

The problem with IPv4 is that there are only about 3.7 billion public IPv4 addresses. Whilst this may initially sound like a lot, take a moment to think about how many devices you currently have that connect to the Internet. Globally we have already experienced a rapid uptake of Internet connected smart-phones and the recent hype surrounding the Internet of Things (IoT) promises to connect an even larger array of devices to the Internet. With a global population of approx. 7 billion people we just don’t have enough to go around.

Back in the early 2000s, hardware and software support for IPv6 was limited. So now that widespread hardware and software IPv6 support exists, why is it that we haven’t all switched?

Like most things in the world, it’s often determined by the capacity to monetise the change. Surprisingly, not all carriers and ISPs are on board, and some are reluctant to spend money to drive the switch. Network address translation (NAT) and Classless Inter-Domain Routing (CIDR) have made it much easier to live with IPv4. NAT, used on firewalls and routers, lets many nodes in a network sit behind a single public IP address. CIDR, sometimes referred to as supernetting, is a way to allocate and specify the Internet addresses used in inter-domain routing in a much more flexible manner than the original system of Internet Protocol (IP) address classes. As a result, the number of available Internet addresses has been greatly increased, and service providers can conserve addresses by divvying up pieces of a full range of IP addresses to multiple customers.

Perceived risk also comes into play. It is plausible that many companies view the introduction of IPv6 as somewhat unnecessary and potentially risky in terms of the effort required to implement it and the loss of productivity during implementation. Most corporations are simply not feeling any pain with IPv4, so it’s not on their short-term radar as being of any level of criticality to their business. From a business perspective, the successful adoption of a new technology is typically accompanied by some form of reward or competitive advantage associated with early adoption, and the potential for financial reward is often what drives significant change.

To IPv6’s detriment, from the layperson’s perspective it has little to distinguish itself from IPv4 in terms of services and service costs. Many of IPv4’s shortcomings have been worked around. Financial incentives to commence widespread deployment just don’t exist.

We have all heard the doom and gloom stories associated with the impending end of IPv4. Surely this should be reason enough for accelerated implementation of IPv6? Why isn’t everyone rushing to implement IPv6 and mitigate future risk? The predicted scenario, where exhaustion of IPv4 addresses causes a rapid escalation in costs to consumers, hasn’t really happened yet, so it has failed to be a significant factor in encouraging further deployment of IPv6 on the Internet.

Another factor to consider is backward compatibility. IPv4 hosts are unable to address IP packets directly to an IPv6 host, and vice versa.

This means it is not realistic to simply switch a network over from IPv4 to IPv6. When implementing IPv6, a significant period of dual-stack IPv4/IPv6 coexistence needs to take place, where IPv6 is turned on and run in parallel with the existing IPv4 network. To most IT decision makers this just sounds like two networks instead of one, with double the administrative overhead.

Networks need to provide continued support for IPv4 for as long as there are significant levels of IPv4-only networks and services still deployed. Many IT decision makers would rather spend their budget elsewhere and ignore the issue for another year.

Only once the majority of the Internet supports a dual stack environment can networks start to turn off their continued support for IPv4. Therefore, while there is no particular competitive advantage to be gained by early adoption of IPv6, the collective internet wide decommissioning of IPv4 is likely to be determined by the late adopters.

So what should I do?

It’s important to understand where you are now and arm yourself with enough information to plan accordingly.

  • Check if your ISP currently supports IPv6 by visiting an IPv6 test website. These sites offer a dual-stack test which will let you know if you are using IPv4 alongside IPv6.
  • Understand if the networking equipment you have in place supports IPv6.
  • Understand if all your existing networked devices (everything that consumes an IP address) supports IPv6.
  • Ensure that all new device acquisitions are fully supportive of IPv6.
  • Understand if the services you consume support IPv6 or have a road map to it (particularly if you are making use of public cloud providers).

The reality is that IPv6 isn’t going away, and as IT decision makers we can’t postpone planning for its implementation indefinitely. Take the time now to understand where your organisation is at. Make your transition to IPv6 a success story!

Azure Mobile Services and the Internet of Things

The IT industry is full of buzzwords, and “The Internet of Things” (IoT) is one that’s getting thrown about a lot lately. The IoT promises to connect billions of devices and sensors to the internet. How this data is stored, sorted, analysed and surfaced will determine how much value it delivers to your business. With this in mind I thought it was time to start playing around with some bits and pieces to see if I could create my very own IoT-connected array of sensors.

To get started, first I’ll need a micro-controller that I can attach some sensors to. Second, I’ll need some kind of web service and storage to accept and store my raw sensor data. I’m not a developer, so I’ve decided to keep things simple to start with. My design goals, however, are to make use of cloud services to accept and store my raw data. Azure Mobile Services seems like a good place to start.

I’ve chosen the following components for my IoT project:

  1. Arduino Yun – the Micro-controller board
  2. Temperature Sensor – to detect ambient temperature
  3. Light Sensor – to detect light levels
  4. Pressure Sensor – to detect barometric pressure
  5. Azure Mobile Services – to connect the Arduino Yun to the cloud
  6. NoSQL database – to store my raw data

Arduino Yun

From the Arduino website the board’s capabilities are as follows: “The Arduino Yún is a microcontroller board based on the ATmega32u4 and the Atheros AR9331. The Atheros processor supports a Linux distribution based on OpenWrt named OpenWrt-Yun. The board has built-in Ethernet and WiFi support, a USB-A port, micro-SD card slot, 20 digital input/output pins (of which 7 can be used as PWM outputs and 12 as analog inputs), a 16 MHz crystal oscillator, a micro USB connection, an ICSP header, and 3 reset buttons.”

Further details on the Arduino Yun can be found on the Arduino website.


The following schematic diagram illustrates the wiring arrangements between the Arduino Yun and the sensors. In this blog I’m not going to provide any specific detail in this area; instead we are going to focus on how the Arduino Yun can be programmed to send its sensor data to a database in Microsoft’s Azure cloud. (There are loads of other blogs that focus specifically on connecting sensors to Arduino boards.)



Azure Mobile Services

To make use of Azure Mobile Services you will need an Azure subscription. Microsoft offer a free one-month trial with $210 credit to spend on all Azure services. So what are you waiting for?

OK, back to Azure Mobile Services. Microsoft define Azure Mobile Services as “a scalable and secure backend that can be used to power apps on any platform–iOS, Android, Windows or Mac. With Mobile Services, it’s easy to store app data in the cloud or on-premises, authenticate users, and send push notifications, as well as add your custom backend logic in C# or Node.js.” For my IoT project I’m just going to use Azure Mobile Services as a place to accept connections from the Arduino Yun and store the raw sensor data.

Create a Mobile Service

Creating the mobile service is pretty straightforward. Within the web management portal select New, Compute, Mobile Service, then Create.


Azure Mobile Services will prompt you for:

  • A URL – This is the endpoint address the Arduino Yun will use to connect to Azure Mobile Services
  • Database – A NoSQL database to store our raw sensor data
  • Region – Which geographic region will host the mobile service and db
  • Backend – the code family used in the back end. I’m using JavaScript

Next you’ll be asked to specify some database settings including server name. You can either choose an existing server (if you have one) or alternatively create a brand new one on the fly. Azure Mobile Services will prompt you for:

  • Name – That’s the name of your database
  • Server – I don’t have an existing one so I’m selecting “New SQL database Server”
  • Server Login Name – A login name for your new db
  • Server Login Password – The password

Now that we have a database it’s time to create a table to store our raw data. The Azure Management Portal provides an easy-to-use UI to create a new table. Go to the Service Management page, select the Data tab, and click the “+” sign at the bottom of the page. As we are creating a NoSQL table there is no need to specify a schema. Simply provide a table name and configure the permissions for the insert/update/delete/read operations.
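With the dynamic schema behaviour of the JavaScript backend, the columns are inferred from the JSON you insert. For example, posting a record like this hypothetical body is enough to create the matching column:

```json
{ "LightLevel": 512 }
```

Mobile Services generates an id column automatically and adds a LightLevel column on the first insert; later inserts with extra fields grow the table the same way.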

Retrieval of the Application Key

Now it’s time to retrieve the Application Key. This will be used to authenticate the REST API calls when we post data to the table. To retrieve the application key go to the Dashboard page and select the “manage keys” button at the bottom of the page. Two keys will be displayed; copy the “application” one.



Create the Table

Within the Mobile Service Management page, select Data

Click Create


Once the table has been created it’s ready to accept values. The Arduino Yun can be programmed to send its sensor values to our new database via the Azure Mobile Services REST API.

The Application key retrieved earlier will be used to authenticate the API calls.


The Arduino Yun Sketch

Here is the basic code inside the Arduino Yun sketch. The code has some core functions, as follows:


Every Arduino Yun sketch contains a setup function; this is where things like the serial port and the bridge are initialized.


Every Arduino Yun sketch contains a loop function; this is where the sensor values are read and the other functions are called.


The send_request function is used to establish a connection with the Azure Mobile Services endpoint. An HTTP POST is formed, authentication takes place and a JSON object is generated with our sensor values and placed in the body. Currently the sample code below sends a single sensor value (lightLevel) to Azure Mobile Services. This could easily be expanded to include all sensor values from the array of sensors connected to the Arduino Yun.


This function waits until response bytes are available on the connection.


This function reads the response bytes from Azure Mobile Services and outputs the HTTP response code to the serial console for debugging / troubleshooting purposes.


/* Arduino Yun sketch writes sensor data to Azure Mobile Services. */

// Include Arduino Yun libraries
#include <Bridge.h>
#include <YunClient.h>
#include <SPI.h>

// Azure Mobile Service address
const char *server = "";

// Azure Mobile Service table name
const char *table_name = "iotarduino_data";

// Azure Mobile Service Application Key
const char *ams_key = "HJRxXXXXXXXXXXXXXXXmuNWAfxXXX";

YunClient client;
char buffer[64];

/* Send HTTP POST request to the Azure Mobile Service data API */
void send_request(int lightLevel)
{
  if (client.connect(server, 80)) {
    Serial.print("sending ");

    // POST request line
    sprintf(buffer, "POST /tables/%s HTTP/1.1", table_name);
    client.println(buffer);

    // Host header
    sprintf(buffer, "Host: %s", server);
    client.println(buffer);

    // Azure Mobile Services application key
    sprintf(buffer, "X-ZUMO-APPLICATION: %s", ams_key);
    client.println(buffer);

    // JSON content type
    client.println("Content-Type: application/json");

    // POST body
    sprintf(buffer, "{\"LightLevel\": %d}", lightLevel);

    // Content length
    client.print("Content-Length: ");
    client.println(strlen(buffer));

    // End of headers
    client.println();

    // Request body
    client.println(buffer);

  } else {
    Serial.println("connection failed");
  }
}

/* Wait for a response */
void wait_response()
{
  while (!client.available()) {
    if (!client.connected()) {
      return;
    }
  }
}

/* Read the response and output to the serial monitor */
void read_response()
{
  bool print = true;

  while (client.available()) {
    char c = client.read();
    // Print only until the first carriage return
    if (c == '\n')
      print = false;
    if (print)
      Serial.print(c);
  }
}

/* Terminate the connection */
void end_request()
{
  client.println();
  client.stop();
}

/* Arduino Yun Setup */
void setup()
{
  Serial.begin(9600);
  Serial.println("Starting Bridge");
  Bridge.begin();
}

/* Arduino Yun Loop */
void loop()
{
  int val = analogRead(A0);
  send_request(val);
  wait_response();
  read_response();
  end_request();
  delay(5000);
}


The Sensor Data in Azure

Once the sketch is uploaded to the Arduino Yun and executed, the sensor data can be viewed within the Azure Mobile Services dashboard.


The Arduino Yun serial monitor displays the serial output (debugging / troubleshooting messages) as the sketch executes.



This was just a bit of fun and obviously not an enterprise-grade solution; however, I hope it goes some way to illustrating the possibilities that are readily available to all of us. Things can be done today in far fewer steps than ever before. Access to powerful compute and storage is easier than ever.

The Arduino Yun is an open source electronic prototyping board that allows people like me, without any real developer skills, to mess around with electronics, explore ideas and interact with the outside world. There are loads of interesting Arduino code samples and ideas for projects with real-world use cases.

My goal here was to illustrate how easy it is to obtain raw sensor data from the outside world and store it in a place where I have loads of options for data processing. By placing my data in Azure I have access to the power and resources of the Microsoft Azure cloud platform literally at my fingertips. Enjoy the possibilities!