How to use a Powershell Azure Function to Tweet IoT environment data

Overview

This blog post details how to use a Powershell Azure Function App to get information from a REST API and send a social media update.

The data can come from anywhere, and in the case of this example I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the difference in this post is outputting the manipulated data to social media (Twitter) whilst still using a TimerTrigger Powershell Azure Function App to perform the work and leverage the “serverless” Azure Functions model.

Prerequisites

The following are the prerequisites for this solution:

Create a folder on your local machine for the Powershell Module, then save the module to your local machine using the Powershell command 'Save-Module' as per below.

Save-Module -Name InvokeTwitterAPIs -Path c:\temp\twitter

Create a Function App Plan

If you don't already have a Function App Plan, create one by searching for Function App in the Azure Management Portal. Give it a name, select Consumption so you only pay for what you use, and select an appropriate location and Storage Account.

Create a Twitter App

Head over to http://dev.twitter.com and create a new Twitter App so you can interact with Twitter using their API. Give your Twitter App a name. Don't worry about the URL too much or the need for the Callback URL. Select Create your Twitter Application.

Select the Keys and Access Tokens tab and take a note of the API Key and the API Secret. Select the Create my access token button.

Take a note of your Access Token and Access Token Secret. We’ll need these to interact with the Twitter API.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function. For my app I'm changing from the default 5 minute schedule to hourly, on the top of the hour. I did this after I'd already created the Function App as shown below. To update the schedule I edited the function.json file and changed the schedule to "schedule": "0 0 * * * *".
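For reference, the complete function.json with that schedule looks along these lines (your binding name may differ):

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 * * * *"
    }
  ],
  "disabled": false
}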

Give your Function App a name and select Create.

Configure Azure Function App Application Settings

In your Azure Function App select "Configure app settings". Create new App Settings for your Twitter AccessToken, AccessTokenSecret, APIKey and APISecret using the values from when you created your Twitter App earlier.
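Inside the Function these App Settings surface as environment variables, so the script can read them like this (the names must match the settings you created):

# App Settings are exposed to the Function as environment variables
$apiKey = $env:APIKey
$apiSecret = $env:APISecret
$accessToken = $env:AccessToken
$accessTokenSecret = $env:AccessTokenSecret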

Deployment Credentials

If you haven't already configured Deployment Credentials for your Azure Function App Plan, do that and take note of them so you can upload the Twitter Powershell module to your app in the next step.

Take note of your Deployment Username and FTP Hostname.

Upload the Twitter Powershell Module to the Azure Function App

Create a sub-directory under your Function App named bin and upload the Twitter Powershell Module using an FTP client. I'm using WinSCP.

From the Application Settings option, start Kudu.

Traverse the folder structure to get the path to the Twitter Powershell Module and note it.

Validating our Function App Environment

Update the code, replacing the sample from the creation of the TimerTrigger Azure Function, to import the Twitter Powershell Module as shown below. Include the get-help line for the module so we can see in the logs that the module was imported and the cmdlets it contains. Select Save and Run.
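As a starting point, a minimal version of that validation code looks like this (the function name in the path is a placeholder; use the path you noted in Kudu):

# Path to the module uploaded under the bin sub-directory; adjust to the path noted in Kudu
Import-Module "D:\home\site\wwwroot\MyTwitterFunction\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# Output the module help to the logs so we can see the cmdlets it contains
Get-Help InvokeTwitterAPIs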

Below is my output. I can see the output from the Twitter Module.

Function Application Script

Below is my sample script. It has no error handling etc so isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and sending a tweet out to Twitter.
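As a guide, a cut-down sketch of that script follows. The sensor URL, app setting names and tweet text are hypothetical, and the Invoke-TwitterRestMethod call follows the InvokeTwitterAPIs module's conventions; check the get-help output from the previous step to confirm parameter names.

# Import the Twitter module from the bin sub-directory (path as noted in Kudu)
Import-Module "D:\home\site\wwwroot\MyTwitterFunction\bin\InvokeTwitterAPIs\2.1\InvokeTwitterAPIs.psm1"

# Get the latest environment reading from the WioLink sensor's REST API (URL held in App Settings)
$sensorData = Invoke-RestMethod -Uri $env:SensorApiUrl
$temperature = [math]::Round($sensorData.temperature, 1)

# OAuth settings built from the App Settings created earlier
$oAuthSettings = @{
    ApiKey            = $env:APIKey
    ApiSecret         = $env:APISecret
    AccessToken       = $env:AccessToken
    AccessTokenSecret = $env:AccessTokenSecret
}

# Send the status update to Twitter
$tweet = "Current temperature at my place is $temperature C #IoT"
Invoke-TwitterRestMethod -ResourceURL 'https://api.twitter.com/1.1/statuses/update.json' `
    -RestVerb 'POST' -Parameters @{ status = $tweet } -OAuthSettings $oAuthSettings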

Viewing the Tweet

And here is the successful tweet.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways; in this example, a social media platform. The input could easily be business data from an API and the output a corporate social platform such as Yammer.

Follow Darren on Twitter @darrenjrobinson

How to use a Powershell Azure Function App to get RestAPI IoT data into Power BI for Visualization

Overview

This blog post details using a Powershell Azure Function App to get IoT data from a RestAPI and update a table in Power BI with that data for visualization.

The data can come from anywhere, however in the case of this post I’m getting the data from WioLink IoT Sensors. This builds upon my previous post here that details using Powershell to get environmental information and put it in Power BI.  Essentially the major change is to use a TimerTrigger Azure Function to perform the work and leverage the “serverless” Azure Functions model. No need for a reporting server or messing around with Windows scheduled tasks.

Prerequisites

The following are the prerequisites for this solution:

  • The Power BI Powershell Module
  • Register an application for RestAPI Access to Power BI
  • A Power BI Dataset ready for the data to go into
  • AzureADPreview Powershell Module

Create a folder on your local machine for the Powershell Modules, then save the modules to your local machine using the Powershell command 'Save-Module' as per below.

Save-Module -Name PowerBIPS -Path C:\temp\PowerBI
Save-Module -Name AzureADPreview -Path c:\temp\AzureAD 

Create a Function App Plan

If you don't already have a Function App Plan, create one by searching for Function App in the Azure Management Portal. Give it a name, select Consumption Plan for the Hosting Plan so you only pay for what you use, and select an appropriate location and Storage Account.

Register a Power BI Application

Register a Power BI App if you haven’t already using the link and instructions in the prerequisites. Take a note of the ClientID. You’ll need this in the next step.

Configure Azure Function App Application Settings

In this example I'm using Azure Functions Application Settings for the Azure AD AccountName, Password and the Power BI ClientID. In your Azure Function App select "Configure app settings". Create new App Settings for your UserID and Password for Azure (to access Power BI) and your Power BI Application Client ID. Select Save.

Not shown here, I've also placed the URLs for the REST APIs that I'm calling to get the IoT environment data in Application Settings variables.

Create a Timer Trigger Azure Function App

Create a new TimerTrigger Azure Powershell Function App. The default of a 5 min schedule should be perfect. Give it a name and select Create.

Upload the Powershell Modules to the Azure Function App

Now that we have created the base of our Function App we’re going to need to upload the Powershell Modules we’ll be using that are detailed in the prerequisites. In order to upload them to your Azure Function App, go to App Service Settings => Deployment Credentials and set a Username and Password as shown below. Select Save.

Take note of your Deployment Username and FTP Hostname.

Create a sub-directory under your Function App named bin and upload the Power BI Powershell Module using an FTP client. I'm using WinSCP.

To make sure you get the correct path to the Powershell module, start Kudu from the Application Settings option.

Traverse the folder structure to get the path to the Power BI Powershell Module and note the path and the name of the psm1 file.

Now upload the Azure AD Preview Powershell Module in the same way as you did the Power BI Powershell Module.

Again using Kudu, validate the path to the Azure AD Preview Powershell Module. The file you are looking for is the Microsoft.IdentityModel.Clients.ActiveDirectory.dll file. My file after uploading is located at "D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\2.0.0.33\Microsoft.IdentityModel.Clients.ActiveDirectory.dll".

This library is used by the Power BI Powershell Module.

Validating our Function App Environment

Update the code, replacing the sample from the creation of the TimerTrigger Azure Function, to import the Power BI Powershell Module as shown below. Include the get-help line for the module so we can see in the logs that the modules were imported and the cmdlets they contain. Select Save and Run.
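As a sketch, that validation code can be as simple as the following (paths and versions per what you uploaded and noted in Kudu):

# Load the ADAL library the Power BI module depends on, then import the module
Add-Type -Path "D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\2.0.0.33\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Import-Module "D:\home\site\wwwroot\MyAzureFunction\bin\PowerBIPS\PowerBIPS.psm1"

# Output the module help to the logs so we can see the cmdlets it contains
Get-Help PowerBIPS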

Below is my output. I can see the output from the Power BI Module's get-help command, confirming the module was successfully loaded.

Function Application Script

Below is my sample script. It has no error handling etc so isn't production ready, but it gives a working example of getting data in from an API (in this case IoT sensors) and putting the data directly into Power BI.
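A trimmed-down sketch of that script follows. The dataset/table names and sensor app settings are hypothetical, and the PowerBIPS cmdlets shown (Get-PBIAuthToken, Get-PBIDataSet, Add-PBITableRows) should be verified against the module's own help from the previous step.

Add-Type -Path "D:\home\site\wwwroot\MyAzureFunction\bin\AzureADPreview\2.0.0.33\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Import-Module "D:\home\site\wwwroot\MyAzureFunction\bin\PowerBIPS\PowerBIPS.psm1"

# Authenticate to Power BI with the account and ClientID held in App Settings
$securePassword = ConvertTo-SecureString $env:AzurePassword -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential ($env:AzureAccount, $securePassword)
$authToken = Get-PBIAuthToken -ClientId $env:PowerBIClientID -Credential $credential

# Get the latest readings from the IoT sensors (URLs held in App Settings)
$inside = Invoke-RestMethod -Uri $env:InsideSensorUrl
$outside = Invoke-RestMethod -Uri $env:OutsideSensorUrl

# Shape a row and add it to the existing dataset's table
$row = @{
    Time    = Get-Date -Format s
    Inside  = $inside.temperature
    Outside = $outside.temperature
}
$dataSet = Get-PBIDataSet -AuthToken $authToken -Name 'IoTEnvironment'
Add-PBITableRows -AuthToken $authToken -DataSetId $dataSet.Id -TableName 'Readings' -Rows @($row)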

Viewing the data in Power BI

In Power BI it is then quick and easy to select our Inside and Outside temperature readings referenced against time. This timescale is overnight so both sensors are reading quite close to each other.

Summary

This shows how easy it is to utilise Powershell and Azure Function Apps to get data and transform it for use in other ways; in this example, a visualization of IoT data in Power BI. The input could easily be business data from an API and the output a real-time reporting dashboard.

Follow Darren on Twitter @darrenjrobinson

Azure API Management Step by Step – Use Cases

Use Cases

In this second post about Azure API Management, let's discuss use cases. Why use cases?

Use cases help to manage complexity, since each focuses on one specific usage aspect at a time. I am grouping and versioning the use cases to facilitate your learning process and to help keep track of future changes. You are welcome to use these diagrams to demonstrate Azure API Management features.

API on-boarding is a key aspect of API governance and the first thing to be discussed. How can I publish my existing and future API back-ends to API Management?

API description formats like the Swagger Specification (aka the Open API Initiative, https://openapis.org/) are fundamental to properly implementing automation and devops in your APIM initiative. APIs can be imported using Swagger, created manually, or created as part of a custom automation/integration process, as the sketch below shows.
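For example, importing an existing Swagger definition can be scripted with the AzureRM.ApiManagement Powershell cmdlets (the resource names below are placeholders):

# Create a context pointing at the APIM instance, then import the API from its Swagger document
$apimContext = New-AzureRmApiManagementContext -ResourceGroupName 'MyResourceGroup' -ServiceName 'MyApimInstance'
Import-AzureRmApiManagementApi -Context $apimContext -SpecificationFormat 'Swagger' `
    -SpecificationUrl 'https://myservice.azurewebsites.net/swagger/docs/v1' -Path 'myservice'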

Azure API Management administrators can group APIs by product, enabling a subscription workflow. Product visibility is linked to user groups, providing restricted access to APIs. You can manage your API policies as code through a dedicated Git source control repository available to your APIM instance. Secrets and constants used by policies are managed by a key/value (string) service called properties.

[Diagram: API on-boarding use cases (administrator)]

The Azure API Management platform provides a rich developers portal. Developers can create an account/profile, discover APIs and subscribe to products. API documentation, source code samples in multiple languages, a console to try APIs, API subscription key management and analytics are the main features provided.

[Diagram: developer use cases]

The management and operation of the platform play an important role in daily tasks. For enterprises, user groups and users (developers) can be fully integrated with Active Directory. Analytics dashboards and reports are available. Email notifications and templates are customizable. The APIM REST API and Powershell commands cover most platform features, including exporting analytics reports.

[Diagram: administrator use cases]

The security administration use cases group different configurations. Delegation allows custom development of portal sign-in, sign-up and product subscription. OAuth 2.0 and OpenID provider registrations are used by the developers portal console, when trying APIs, to generate the required tokens. Client certificate upload and management are done here or via automation. The developers portal identity configuration brings out-of-the-box integration with social providers. Git source control settings/management and APIM REST API tokens are available as well.

[Diagram: security administration use cases]

Administrators can customize the developers portal using built-in content management system functionality. Custom pages and modern JavaScript development are now allowed. The blogs feature provides out-of-the-box blog/post publish/unpublish functionality. Applications submitted by developers can be published/unpublished by the administrator for display in the developers portal.

[Diagram: developers portal administration use cases]

In summary, Azure API Management is a mature and live platform with new features under development, bringing strong integration with the Azure cloud. Click here for the roadmap.

In my next post, I will deep-dive into API on-boarding strategies.

Thanks for reading @jorgearteiro

Posts: 1) Introduction  2) Use Cases

Azure API Management Step by Step

Introduction

As a speaker and cloud consultant, I have learned and received a lot of feedback about Azure API management platform from customers and community members. I will share some of my learnings in this series of blog posts. Let’s get started!

APIs – application programming interfaces – are everywhere! They are already part of many companies' strategies. But how could we consolidate internal and external APIs? How could we productize and monetize them for our companies?

We often build APIs to be consumed by a single application. However, we could also build these APIs to be shared. If you write HTTP APIs around a single, specific business requirement, you can encourage API re-usability and adoption. Bleeding-edge technologies like containers and serverless architecture are pushing this approach even further.

API strategy and governance come into play to help build a gateway on top of your APIs. Companies are developing MVPs (minimum viable products) and time to market is fundamental; we do not have time to write authentication, caching and analytics over and over again. Azure API Management can help you make this happen.

This demo API Management instance that I created for Kloud Solutions illustrates how you could create a unified API endpoint to expose your APIs. Multiple "Services" are published there with a single authentication layer. If your Email Service back-end implementation uses an external API, like SendGrid, you can inject this authentication at the API Management gateway layer, making it transparent to end users.

Azure API Management provides a highly scalable, multi-regional gateway that can be deployed in any Azure region around the world. It is a fully PaaS (platform-as-a-service) API management solution, where you do not have to manage any infrastructure. This, combined with other Azure offerings, like App Services (Web Apps, API Apps, Logic Apps and Functions), provides an enterprise-grade platform for delivering any API strategy.

[Diagram: API Management components]

Looking at the diagram above, we can decouple API Management into 3 main components:

  • Developer Portal – Customizable web site exclusive to your company to allow internal and external developers to engage, discover and consume APIs.
  • Gateway (proxy) – Engine of APIM where policies can be applied to your inbound, back-end and outbound traffic. It's very scalable and allows multi-regional deployment, Azure Virtual Network VPN, Azure Active Directory integration and a native caching solution. Policies are written in XML and C# expressions to define complex rules like rate limiting, quotas, caching, JWT token validation, authentication, XML to JSON and JSON to XML transformations, URL rewriting, CORS, IP restrictions, setting headers, etc. There's a small example after this list.
  • Administration Portal (aka Publisher Portal) – Administration of your APIM instance can be done via the portal. Automation and devops teams can use the APIM management REST API and/or Powershell commands to fully integrate APIM into your on-boarding, build and release processes.
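As a taste of what policies look like, here's a minimal, hypothetical inbound policy that throttles calls and stamps a header onto requests:

<inbound>
    <base />
    <!-- allow 10 calls per 60 seconds per subscription -->
    <rate-limit calls="10" renewal-period="60" />
    <!-- add a header before the request reaches the back-end -->
    <set-header name="x-request-source" exists-action="override">
        <value>apim-gateway</value>
    </set-header>
</inbound>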

Please keep in mind that this strategy can apply to any environment and architecture where HTTP APIs are exposed, whether they are new microservices or older legacy applications.

Feel free to create a user at https://kloud.portal.azure-api.net. I will try to keep this Azure API Management instance usable, for demo purposes only, with no guarantees. You can then create your own development instance from the Azure Portal later.

In my next post, I will talk about API Management use cases and give you a broader view of how deep this platform can go. Click here.

Thanks for reading! @jorgearteiro

Posts: 1) Introduction  2) Use Cases

Querying Skype for Business Online using UCWA and PowerShell

Introduction

Recently a colleague from a previous employer of mine pinged me about connecting to Skype for Business using the Unified Communications Web API (UCWA). UCWA is the REST API that comes with Skype for Business 2015 and exposes Instant Messaging and Presence capabilities. Initially UCWA was only available for the on-premises release of S4B, but it has recently been extended to Skype for Business Online.

The detail on leveraging UCWA is all here; however, when it comes to actually doing it, it gets a little daunting. This blog by Matthew Proctor gives more hope and a few more threads to pull on to get something working. After jumping through a few hoops, and being pointed in the right direction by this post from Adam, I got the sequence sorted to authenticate, access the API using PowerShell and then actually do something with it. I thought I'd write it up so I can save someone else the pain (and myself if I need it in the future).

Getting Started

As per the title I’m going to show you connecting to and consuming information from Skype for Business Online using PowerShell. Naturally you’re going to need to have an Office365 Tenant with S4B and all the pre-requisites associated with that configured (such as your CNAME and SRV S4B DNS settings etc).

This post covers using PowerShell to:

  • Authenticate to the UCWA API
  • Get your S4B account details
  • Change your status between Busy, Away, Available and Do Not Disturb
  • Get your Contacts
  • Get your Contacts status

That should give you enough to get started and munge it for whatever you need.

Authenticating

The following script gets you authenticated to S4B Online via the API. Change the script for your S4BO account and associated password. Also update for your S4BO tenant Autodiscover URL.
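A minimal sketch of that sequence is below. Treat the discovery URL, property names and token flow as a guide only (they're drawn from the UCWA documentation rather than copied verbatim from my script), and substitute your own account details.

$username = 'user@yourtenant.com'
$password = 'YourPassword'

# 1. Autodiscover the UCWA root for the tenant
$discovery = Invoke-RestMethod -Uri 'https://webdir.online.lync.com/autodiscover/autodiscoverservice.svc/root'
$userUrl = $discovery._links.user.href

# 2. An unauthenticated request to the user URL returns a 401 whose WWW-Authenticate
#    header advertises the OAuth token endpoint
try { Invoke-WebRequest -Uri $userUrl -UseBasicParsing }
catch { $wwwAuthenticate = $_.Exception.Response.Headers['WWW-Authenticate'] }
$tokenUrl = [regex]::Match($wwwAuthenticate, 'href="([^"]+)"').Groups[1].Value

# 3. Request a bearer token using the password grant
$token = Invoke-RestMethod -Method Post -Uri $tokenUrl -ContentType 'application/x-www-form-urlencoded' `
    -Body @{ grant_type = 'password'; username = $username; password = $password }

# 4. Retry the user URL with the token; the response carries the applications link
$headers = @{ Authorization = "Bearer $($token.access_token)" }
$userRoot = Invoke-RestMethod -Uri $userUrl -Headers $headers
$applicationsUrl = $userRoot._links.applications.href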

Create the UCWA Application

The following script will create an application on the UCWA endpoint. The Endpoint ID you can make up yourself. Same for the Application name.
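A sketch of that request, following the shape of UCWA's createApplication resource:

# EndpointId is a GUID you make up yourself; UserAgent is likewise your own application name
$applicationBody = @{
    UserAgent  = 'UCWA Powershell Sample'
    EndpointId = [guid]::NewGuid().Guid
    Culture    = 'en-US'
} | ConvertTo-Json

$application = Invoke-RestMethod -Method Post -Uri $applicationsUrl -Headers $headers `
    -Body $applicationBody -ContentType 'application/json'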

Change your Presence, Get your Contacts & their Status

Now you can do what you want or need to with the API. Here are a few examples of changing your status, getting your contacts, getting the status of your contacts.
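The sketches below follow the UCWA resource model (link names such as presence, myContacts and contactPresence come from its documentation); the exact shape of your application object may differ, so inspect it before relying on these paths. Note you may also need to POST to the me/makeMeAvailable link once before presence operations succeed.

# UCWA links are relative, so prefix them with the pool's base URL
$poolBase = ([uri]$applicationsUrl).GetLeftPart([System.UriPartial]::Authority)

# Change my presence; valid availability values include Available, Busy, Away and DoNotDisturb
$presenceUrl = $poolBase + $application._embedded.me._links.presence.href
Invoke-RestMethod -Method Post -Uri $presenceUrl -Headers $headers `
    -Body (@{ availability = 'Away' } | ConvertTo-Json) -ContentType 'application/json'

# Get my contacts, then each contact's presence
$contactsUrl = $poolBase + $application._embedded.people._links.myContacts.href
$contacts = Invoke-RestMethod -Uri $contactsUrl -Headers $headers
foreach ($contact in $contacts._embedded.contact) {
    $presence = Invoke-RestMethod -Uri ($poolBase + $contact._links.contactPresence.href) -Headers $headers
    Write-Output "$($contact.name) : $($presence.availability)"
}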

Contacts Statuses

Contacts List

Follow Darren on Twitter @darrenjrobinson

Moving SharePoint Online workflow task metadata into the data warehouse using Nintex Flows and custom Web API

This post suggests the idea of automatically copying SharePoint Online (SPO) workflow task metadata into an external data warehouse. In this scenario, workflow tasks become the subject of another workflow that automatically copies the task's data into the external database, using a custom Web API endpoint as the interface to that database. Commonly, the requirement to move workflow task data elsewhere arises from limitations of SPO. In particular, SPO throttles requests for access to workflow data, making it virtually impossible to create a meaningful workflow reporting system with large numbers of workflow tasks. The easiest approach to solve the problem is to use a Nintex workflow to "listen" to the changes in the workflow tasks, then request the task data via the SPO REST API and, finally, send the data to the external data warehouse's Web API endpoint.

Some SPO solutions require the creation of a reporting system that includes workflow task metadata. For example, it could be a report about documents with the statuses of the workflows linked to those documents. Using a conventional approach (e.g. the SPO REST API) to obtain the data seems unfeasible as SPO throttles requests for workflow data. In fact, the throttling is so tight that generating reports with more than a hundred records is unrealistic. In addition to that, many companies would like to create Business Intelligence (BI) systems analysing workflow task data. Having a data warehouse with all the workflow task metadata can assist in this job very well.

To be able to implement the solution a few prerequisites must be met. You must know the basics of Nintex workflow creation and be able to create a backend solution with the database of your choice and a custom Web API endpoint that allows you to write the data model to that database. In this post we have used Visual Studio 2015 and created an ordinary REST Web API 2.0 project with an Azure SQL Database.

The solution involves the following steps:

  1. Get sample of your workflow task metadata and create your data model.
  2. Create a Web API capable of writing data model to the database.
  3. Expose one POST endpoint method of the Web REST API that accepts JSON model of the workflow task metadata.
  4. Create Nintex workflow in the SPO list storing your workflow tasks.
  5. Design Nintex workflow: call SPO REST API to get JSON metadata and pass this JSON object to your Web API hook.

Below is a detailed description of each step.

We are looking here to export the metadata of a workflow task. We need to find the SPO list that holds all your workflow tasks and navigate there. You will need the name of the list to be able to start calling the SPO REST API. It is better to use a REST tool to perform Web API requests; many people use Fiddler or Postman (a Chrome extension) for this job. Request the SPO REST API to get a sample of the JSON data that you want to put into your database. The request will look similar to this example:

Picture 1
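In text form, the request is of this shape (tenant, site and list names are placeholders):

https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('Workflow Tasks')/items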

The key element in this request is getbytitle("list name"), where "list name" is the SPO list name of your workflow tasks. Please remember to add the header "Accept" with the value "application/json". It tells SPO to return JSON instead of HTML. As a result, you will get one JSON object that contains the JSON metadata of Task 1. This JSON object is an example of the data that you will need to put into your database. Not all fields are required in the data warehouse. We need to create a data model containing only the fields of our choice. For example, it can look like this one in C#, with all properties based on the model returned earlier:
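For illustration, a hypothetical model might look like this; the property names should mirror the SPO fields you select:

using System;

// Hypothetical data model; adjust the properties to the fields selected from the task list
public class WorkflowTaskModel
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Status { get; set; }
    public string AssignedTo { get; set; }
    public DateTime Modified { get; set; }
}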

The next step is to create a Web API that exposes a single method that accepts our model as a parameter from the body of the request. You can choose any REST Web API design. We have created a simple Web API 2.0 in Visual Studio 2015 using the general wizard for an MVC, Web API 2.0 project. Then, we added an empty controller and filled it with code that works with the Entity Framework to write the data model to the database. We also created a code-first EF database context that works with just the one entity described above.

The code of the controller:
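A minimal sketch of such a controller (the endpoint name matches the URL shown later; the model and context types are the hypothetical ones in this post):

using System.Threading.Tasks;
using System.Web.Http;

public class EmployeeFileReviewTasksWebHookController : ApiController
{
    // POST api/EmployeeFileReviewTasksWebHook
    public async Task<IHttpActionResult> Post([FromBody] WorkflowTaskModel model)
    {
        if (model == null)
        {
            return BadRequest("No workflow task metadata supplied.");
        }

        // Write the task metadata to the data warehouse via Entity Framework
        using (var db = new WorkflowTasksContext())
        {
            db.WorkflowTasks.Add(model);
            await db.SaveChangesAsync();
        }

        return Ok();
    }
}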

The code of the database context for Entity Framework:
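A matching code-first context, as a sketch (the connection string name is a placeholder):

using System.Data.Entity;

public class WorkflowTasksContext : DbContext
{
    public WorkflowTasksContext() : base("name=DefaultConnection") { }

    // The single entity described above
    public DbSet<WorkflowTaskModel> WorkflowTasks { get; set; }
}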

Once you have created the Web API, you should be able to call Web API method like this:
https://yoursite.azurewebsites.net/api/EmployeeFileReviewTasksWebHook

You will need to put your model data in the request body as a JSON object. Also don't forget to include the proper headers for your authentication, add the header "Accept" with "application/json", and set the type of the request to POST. Once you've tested the method, you can move on to the next steps. For example, below is how we tested it in our project.

Picture 4

Next, we will create a new Nintex Workflow in the SPO list with our workflow tasks. It is all straightforward. Click Nintex Workflows, then create a new workflow and start designing it.

Picture 5

Picture 6

Once you've created a new workflow, click on the Workflow Settings button. In the displayed form please set the parameters as shown on the screenshot below. We set "Start when items are created" and "Start when items are modified". In this scenario, any modification of our workflow task will start this workflow automatically. It also includes cases where the workflow task has been modified by other workflows.

Picture 7.1

Create 5 steps in this workflow as shown on the following screenshots, labelled 1 to 5. Please keep in mind that blocks 3 and 5 are there to assist in debugging only and are not required in production use.

Picture 7

Step 1. Create a Dictionary variable that contains the SPO REST API request headers. You can add any required headers, including authentication headers. It is essential here to include the Accept header with "application/json" in it to tell SPO that we want JSON in responses. We set the output variable to SPListRequestHeaders so we can use it later.

Picture 8

Step 2. Call HTTP Web Service. We call the SPO REST API here. It is important to make sure that the getbytitle parameter is correctly set to your Workflow Tasks list, as we discussed before. The list of fields that we want returned is defined in the "$select=…" parameter of the OData request. We need only the fields that are included in our data model. The other settings are straightforward: we supply our request headers created in Step 1 and create two more variables for the response. SPListResponseContent will hold the resulting JSON object that we are going to need at Step 4.

Picture 9

Step 3 is optional. We've added it to debug our workflow. It will send an email with the contents of our JSON response from the previous step, showing us what was returned by the SPO REST API.

Picture 10

Step 4. Here we are calling our custom API endpoint, passing the JSON object model that we got from the SPO REST API. We supply the full URL of our Web Hook, set the method to POST and inject SPListResponseContent from Step 2 into the request body. We're also capturing the response code to display later in the workflow history.

Picture 11

Step 5 is also optional. It writes a log message with the response code that we have received from our API Endpoint.

Picture 12

Once all five steps are completed, we publish this Nintex workflow. Now we are ready for testing.

To test the system, open the list of our workflow tasks. Click on any task, modify any of the task's properties and save the task. This will initiate our workflow automatically. You can monitor workflow execution in the workflow history. Once the workflow is completed, you should be able to see messages as displayed below. Notice that our workflow has also written the Web API response code at the end.

Picture 13

To make sure that everything went well, open your database and check the records updated by your Web API. After every Workflow Task modification you will see corresponding changes in the database. For example:

Picture 14

In this post we have shown that automatic copying of workflow task metadata into your data warehouse can be done with a simple Nintex workflow setup, performing only two REST Web API requests. The solution is quite flexible, as you can select the required properties from the SPO list and export them into the data warehouse. We can easily add more tables if there is more than one workflow tasks list. This solution enables the creation of a powerful reporting system using the data warehouse and also allows you to employ the BI data analytics tool of your choice.

Entity Framework 7 Data Migration through KUDU

From a DevOps perspective, everything to do with application setup and deployment needs to be automated, and database migration is no exception. If a database schema change occurs, it should be automatically applied before/after the application deployment. Unlike Entity Framework 6.x, which uses PowerShell cmdlets for database migration, Entity Framework 7 (EF7) uses DNX.

Applying Database Migration with EF7

In EF7, applying a database change can be done by running the following command:
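With the DNX tooling, run from the web application's project folder, that is:

dnx ef database update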

If your DbContext is located in another project and your web application has a reference to it, then you can run the following command:
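Something along these lines, with the -p switch pointing DNX at the project that contains the DbContext (the project name here is a placeholder):

dnx -p ..\MyApp.Data ef database update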

By running the command above within your build/deployment pipeline, your database change is easily applied to the existing database. Connection strings are defined in appsettings.json in your ASP.NET Core application.

Visit https://docs.efproject.net for more details.

In most cases, there's no issue accessing Azure SQL Database from your build/deployment server, as long as Azure SQL Database has a proper firewall setup. But what if your enterprise firewall doesn't allow connections to Azure SQL Database, for example by blocking TCP port 1433? Then we can't run this command from our build/deployment server.

REST API in KUDU

KUDU is basically a backend service engine for deployment, tied to your Azure Website. If you are running any Azure App Service, your KUDU instance is accessible via https://your-azure-website.scm.azurewebsites.net. It provides a REST API for website maintenance, and one of its endpoints is command. Therefore, we can write a script, say db-migration.cmd, deploy it at the same time as the application, and run it through this REST API. The db-migration.cmd might look like:
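For example, a minimal sketch (it assumes the application, and the DNX runtime it needs, are available under the site's wwwroot):

REM db-migration.cmd: run the EF7 migration from the deployed application folder
cd %HOME%\site\wwwroot
call dnx ef database update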

So, the application is ready for database migration. Let's write a PowerShell script to run the command. Make sure we are using the Azure Service Management (ASM) cmdlets.

NOTE: You should login to ASM with appropriate subscription first.

NOTE: The dir property is the path, relative to %HOME% in Azure App Service, where the actual command is run.
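Here's a sketch of that script. It uses the ASM Get-AzureWebsite cmdlet for the publishing (deployment) credentials and KUDU's /api/command endpoint; the site name is a placeholder.

# Get the site's publishing (deployment) credentials
$website = Get-AzureWebsite -Name 'your-azure-website'
$pair = "$($website.PublishingUsername):$($website.PublishingPassword)"
$base64Creds = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))

# Ask KUDU to run db-migration.cmd in the deployed application folder
$body = @{ command = 'db-migration.cmd'; dir = 'site\wwwroot' } | ConvertTo-Json
$result = Invoke-RestMethod -Uri 'https://your-azure-website.scm.azurewebsites.net/api/command' `
    -Method Post -Body $body -ContentType 'application/json' `
    -Headers @{ Authorization = "Basic $base64Creds" }

# KUDU returns Output, Error and ExitCode from the command
$result.ExitCode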

Running the PS script above completes the database migration within KUDU. Make sure that the $result object has an exit code of 0 by examining $result.ExitCode. If the exit code is anything other than 0, the database migration has failed.

So far, we have briefly looked at KUDU for Azure SQL Database migration. KUDU actually has many useful functions for monitoring, so it would be worth taking a look.

Creating Service Contract with AutoRest, Swagger and HAL

According to the Richardson Maturity Model, a level 3 REST API should provide hypermedia within responses as metadata, so that users can easily navigate to other resources. This is particularly important if a Microservices architecture (MSA) is being considered for your service or application. Because an MSA is a collection of small REST API services, each API server should provide a complete set of documents or manuals so others can consume it properly. For documentation automation, Swagger is one of the best choices, and HAL is an option for providing hypermedia within the response. Of course, consuming those REST APIs from client applications is also important, and here AutoRest can bring great benefits to your development. In this post, I'll discuss HAL, Swagger and AutoRest with some demo code.

Integrating HAL

HAL (Hypertext Application Language) was suggested by Mike Kelly. Its formal specification can be found in this IETF draft. Generally speaking, many REST API services are well structured. They don't just return data in the response, but also include various metadata for reference purposes, each in their own way. HAL offers a standardised way of including hyperlinks in those responses. Here's an example:
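Take a request to a hypothetical product resource:

GET https://my.api.service/api/products/1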

If you send a request to the URL above, the expected response will look like:
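Something like this (the fields are invented for the example):

{
  "productId": 1,
  "name": "Surface Pro 4",
  "price": 1499.00
}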

However, if HAL is applied, the response will look like:
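The same payload, now carrying a _links section:

{
  "_links": {
    "self": { "href": "/api/products/1" },
    "collection": { "href": "/api/products" }
  },
  "productId": 1,
  "name": "Surface Pro 4",
  "price": 1499.00
}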

Let’s see another example returning a collection.

The request will expect a response like:
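A plain array of items, for example:

[
  { "productId": 1, "name": "Surface Pro 4", "price": 1499.00 },
  { "productId": 2, "name": "Surface Book", "price": 2299.00 }
]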

If HAL is applied, the response will look like:
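The collection gains its own _links, and the items move into an _embedded section, each with their own links:

{
  "_links": {
    "self": { "href": "/api/products" }
  },
  "_embedded": {
    "products": [
      {
        "_links": { "self": { "href": "/api/products/1" } },
        "productId": 1,
        "name": "Surface Pro 4",
        "price": 1499.00
      },
      {
        "_links": { "self": { "href": "/api/products/2" } },
        "productId": 2,
        "name": "Surface Book",
        "price": 2299.00
      }
    ]
  }
}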

Can you see the difference? HAL-applied responses contain much richer metadata that lets clients easily navigate to other resources.

OK, theory is done. Let’s implement HAL on your API. First of all, assuming you’ve got an existing Web API app containing models of Product and ProductCollection looking like:
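For illustration, a hypothetical starting point:

using System.Collections.Generic;

public class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class ProductCollection : List<Product>
{
}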

There are many HAL implementations on NuGet. For now, I'm going to use WebApi.Hal. Once installed, you can modify those models in this way, minimising the impact on the existing application:
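A sketch of that change, using WebApi.Hal's Representation and SimpleListRepresentation base classes (check the package's documentation for exact signatures):

using System.Collections.Generic;
using WebApi.Hal;

// Inheriting from Representation adds the Links collection and the _links output
public class ProductRepresentation : Representation
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// SimpleListRepresentation wraps the items into an _embedded collection
public class ProductListRepresentation : SimpleListRepresentation<ProductRepresentation>
{
    public ProductListRepresentation(IList<ProductRepresentation> products) : base(products) { }
}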

Then, register HAL into your Global.asax.cs or Startup.cs:
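Registration is essentially a one-liner that adds WebApi.Hal's media type formatter, for example in Application_Start():

// Serialise responses as application/hal+json
GlobalConfiguration.Configuration.Formatters.Add(new JsonHalMediaTypeFormatter());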

That's it! Too simple? Here's the magic inside the Representation class: it has the public Links property that takes care of all hypertext links. All we need after the HAL implementation is to add appropriate links to each instance, manually or automatically. Here's a way to include hyperlinks manually at the controller level:
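For example, a sketch of a controller action (the data is hard-coded for brevity, and the routes are hypothetical):

public ProductRepresentation Get(int id)
{
    // Fetch the product however your application does it (hypothetical values here)
    var representation = new ProductRepresentation
    {
        ProductId = id,
        Name = "Surface Pro 4",
        Price = 1499.00m
    };

    // Add hyperlinks by hand at the controller level
    representation.Links.Add(new Link("self", "/api/products/" + id));
    representation.Links.Add(new Link("collection", "/api/products"));

    return representation;
}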

Once that's done, build it and check the API response. You'll see the result including HAL links like those above. Now, let's move on to the next part, Swagger.

Integrating Swagger

According to the Swagger website, Swagger provides a standard, language-independent interface to REST APIs so that both humans and computers can easily discover and understand an API's structure without reading source code or documents. There are several alternatives like RAML or API Blueprint, so it's up to you which one to choose.

Swashbuckle is one of the C# implementations of Swagger. It's easy to use, and virtually zero configuration is required for basic usage. Let's install Swashbuckle into your project. Once installed, it adds a file, SwaggerConfig.cs, under the App_Start directory. You can use it directly or modify it for your application. Let's have a look.

When you open the SwaggerConfig.cs file, it looks like the above. To have more control over it, comment out the first [assembly: ...] line and make the Register() method an extension method:
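The reshaped SwaggerConfig.cs ends up along these lines (the API title is a placeholder):

using System.Web.Http;
using Swashbuckle.Application;

public static class SwaggerConfig
{
    // Register() rewritten as an extension method on HttpConfiguration
    public static void RegisterSwagger(this HttpConfiguration config)
    {
        config.EnableSwagger(c => c.SingleApiVersion("v1", "HalSwaggerSample API"))
              .EnableSwaggerUi();
    }
}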

Then, put the following line into either your Global.asax.cs or Startup.cs for registration:
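For example, from Application_Start():

GlobalConfiguration.Configuration.RegisterSwagger();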

That's all. Build and run your app, browse to a URL like http://my.api.service/swagger and you will see the Swagger UI screen:

What you need to notice is the JSON schema URL in the picture above. When you request that URL from Postman, it will look like:

This is basically a JSON schema file, pretty much the equivalent of WSDL in SOAP. In WCF terms, it contains both the service contract and the data contract, so with this schema your client applications can access the API and consume it. There's no obstacle to it.

So far, we've implemented both HAL and Swagger on the REST API server. How can we then consume those APIs in clients like a web app or mobile app? Let's move on to the next section for this.

Deserialising Swagger JSON schema with AutoRest

Once the Swagger JSON schema has been generated, your client apps should be able to consume it with minimal effort. This is where AutoRest brings great benefits: it automatically generates both service contracts and data contracts pointing at your REST APIs from the Swagger JSON schema. Currently AutoRest supports C#, Java, Node.js and Ruby, and the number of supported languages is gradually expanding.

Here's an ASP.NET MVC app. Within the app, assume there's a folder called Proxies, and the Swagger JSON schema file is placed into that folder with the name swagger.json. Then install the AutoRest CLI app and runtime library from NuGet. To auto-generate the client libraries, try the following command at your Command Prompt:
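The command is along these lines (flag names are per the AutoRest release current at the time of writing, so check AutoRest.exe's help output):

AutoRest.exe -Input swagger.json -Namespace HalSwaggerSample.WebApp.Proxies -OutputDirectory Proxies -CodeGenerator CSharp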

What does the command line mean? Here’s the brief explanation:

  1. Run AutoRest.exe
  2. Input JSON schema name is swagger.json
  3. The namespace for auto-generated files is HalSwaggerSample.WebApp.Proxies
  4. Store auto-generated files into Proxies
  5. Set output file type as C#

After generation, you will see a bunch of files stored in the Proxies folder. The most important file is HalSwaggerSampleHalApiApp.cs. Of course, the file name (or class name) will be different for yours. It provides the service endpoint and service contract information. It also contains the corresponding interface, IHalSwaggerSampleHalApiApp, so this service contract is testable, mockable and injectable. Therefore, your controller/action can be written like:

Therefore, the result screen will look like:

Couldn't be easier, could it?

We've so far had a look at HAL, Swagger and AutoRest. Both HAL and Swagger help you design REST APIs that are more discoverable, readable and maintainable, and tools like AutoRest make it much easier for client apps to build their access points. How about implementing them in your Web API, if you're considering MSA? Kloud can help you.

The entire sample source code that I wrote for this post can be found here.