Versioning of iOS app (Dev, Stage, and Prod) using same code base – Xamarin Forms

Versioning an app from a single code base is one of the most frequently requested features from the clients I have worked with, for various reasons. Usually it is requested to tailor the app to different departments, or to support multi-tenancy. When creating a complex app, there are any number of teams working together to build it, including dev testers, beta testers, or even a client who wants to demo the product before actually purchasing it. A single app does not serve all of these purposes, as a client may want to cherry-pick different features to be available in different versions of the app.

In this blog I will share how to create different versions of an app in iOS; I will cover Android in a separate blog. To begin, we need to set up a configuration for each app version we want to support.

Set different configurations

On a Mac, double-clicking the solution node opens the Solution Options dialog. Go to Configurations in the Build section, then to the Configurations tab, as shown in the screen below.

Screen Shot 2018-04-27 at 1.34.45 pm

As you can see, a new project comes with two configurations, Debug and Release, for the various platforms. If you go to the General tab, you can find all the available configurations broken down, along with options to add, copy, remove and rename configurations. As per our requirements, we will add Dev, Staging and Production configurations. Usually we only need versions for iPhone, therefore create the configurations for iPhone only.

Dev Configuration setup

Screen Shot 2018-04-27 at 1.47.10 pm

Staging configuration setup

Screen Shot 2018-04-27 at 1.46.27 pm

Prod configuration setup

Screen Shot 2018-04-27 at 1.46.49 pm

The final configuration setup should look like the below. Now close the solution and open it again, so that Visual Studio can pick up the new configurations properly.

Screen Shot 2018-04-27 at 1.47.19 pm

Compiler symbol options

Symbols are used along with preprocessor directives to compile code conditionally, much like an if-else statement. Double-clicking on the iOS project opens the Project Options dialog. If you go to the Compiler tab under the Build section, you will notice the compiler symbol options. We need to set new compiler symbols for our newly created configurations. This is done by selecting the required configuration and adding the relevant symbol to it, as shown in the screenshots below.

For Dev, I have Dev as the compiler symbol.

Screen Shot 2018-04-27 at 2.01.25 pm

For Staging, I have Stage as the compiler symbol.

Screen Shot 2018-04-27 at 2.02.57 pm

For Production, I have Prod as the compiler symbol.

Screen Shot 2018-04-27 at 2.03.53 pm.png

Purpose

Now that we have set up the configurations, we can write code as shown below, and whenever a build is made with a given configuration, it will automatically pick the right settings for the right environment.

Screen Shot 2018-04-27 at 3.08.08 pm.png
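The code in the screenshot follows the usual conditional-compilation pattern. A minimal sketch, using the symbols configured above (the base URLs are placeholders of my own), looks like this:

```csharp
public static class AppConfig
{
    // Dev / Stage / Prod are the compiler symbols set per configuration above.
#if Dev
    public const string BaseUrl = "https://dev.example.com/api";
#elif Stage
    public const string BaseUrl = "https://stage.example.com/api";
#else // Prod
    public const string BaseUrl = "https://www.example.com/api";
#endif
}
```

At build time only one branch is compiled into the app, so there is no runtime cost to this switching.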

Info.plist

Info.plist is the file that dictates what configuration values are applied to the app concerned. The original Info.plist can be found set up as below.

Screen Shot 2018-04-27 at 2.06.30 pm.png

We will be creating three different Info.plist files for our three versions of the app, as shown below.

Screen Shot 2018-04-27 at 2.09.24 pm.png

To keep things simple, just change the CFBundleIdentifier to a relevant name; in this case I will append Dev, Stage, and Prod to the current identifier. Also, don't forget to remove the display name from each file.

Dev

Screen Shot 2018-04-27 at 2.58.32 pm.png

Screen Shot 2018-04-27 at 2.48.57 pm

Stage

Screen Shot 2018-04-27 at 2.59.03 pm.png

Screen Shot 2018-04-27 at 2.50.12 pm.png

Prod:

Screen Shot 2018-04-27 at 2.59.21 pm.png

Screen Shot 2018-04-27 at 2.50.55 pm.png

Edit iOS Proj file

Now edit the iOS project (.csproj) file; to open it for editing, do as shown in the below screenshot.

Screen Shot 2018-04-27 at 2.37.24 pm.png

After opening the project file, look for the Info.plist entries.

Screen Shot 2018-04-27 at 2.39.07 pm.png

Now add a condition to each entry according to its configuration, and move the newly created Info.plists above the original Info.plist; otherwise the build will pick the first available plist, as shown below.

Screen Shot 2018-04-27 at 2.42.52 pm.png
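The conditional entries are a sketch along these lines (file names as created above; the item type should match whatever your .csproj already uses for Info.plist):

```xml
<ItemGroup>
  <None Include="Info.Dev.plist" Condition=" '$(Configuration)' == 'Dev' " />
  <None Include="Info.Stage.plist" Condition=" '$(Configuration)' == 'Stage' " />
  <None Include="Info.Prod.plist" Condition=" '$(Configuration)' == 'Prod' " />
  <None Include="Info.plist" />
</ItemGroup>
```

Because MSBuild evaluates items in order, the conditional entries must come before the unconditional Info.plist.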

Output

When we run with the above configurations, different apps are created from the same code base.

IMG_0038 IMG_0039 IMG_0040 IMG_0041

 

Github link

https://github.com/divikiran/AppVersioning

How to make Property Bag Values indexed and searchable in SharePoint Online

In an earlier post here we have seen how we can set Property Bag values in Modern SharePoint sites. One of the major reasons for setting Property Bag values in SharePoint sites is to make the SharePoint sites searchable based on custom metadata.

However, property bag values are not crawled by the SharePoint Online search index directly. To make a property bag value searchable, we must explicitly set it to be indexed by the search crawler. To do this, we simply set the property bag value to indexed using SharePoint PnP PowerShell.
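A minimal example (the site URL, key and value here are placeholders):

```powershell
# Connect to the site (URL is a placeholder)
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/demo" -UseWebLogin

# Set a property bag value and mark it to be picked up by the search crawler
Set-PnPPropertyBagValue -Key "CustomSiteMetadata" -Value "HR" -Indexed
```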

The “-Indexed” parameter of the Set-PnPPropertyBagValue cmdlet makes an entry in a hidden system property bag key, vti_indexedpropertykeys, which then makes these property bag values searchable. A screenshot of the result is below.

SetPnPPropertyBagIndexed.png

After performing the above, we flag the site to be reindexed so that the SharePoint search crawler will pick up the new Property Bag values in the next crawl. This can be done from “Site Settings -> Search and offline availability” (in the Search group), then click the button “Reindex site” (screenshot below).

Note: Since SharePoint Online is a massive SaaS system, it can take up to 24 hours for the crawler to pick up this property. Unfortunately, there's no workaround for this delay; you simply must be patient.

Search and Offline Availability

Conclusion

In the above post we saw how we can enable Property Bag values to be searchable.

 

Report of All Taxonomy Fields containing a term in SharePoint Tenancy

Recently we had a request to find fields/columns in all lists across the tenancy which have a specific Taxonomy term because we needed to report on field usage across all site collections. However, we found that getting a report of all Taxonomy fields in your SharePoint tenancy that is linked to a specific Term Set can get quite daunting because there is no direct SharePoint Query to fetch the associations.

The technical challenge is that, using PnP PowerShell, Taxonomy fields are returned as the generic SP.Field type and not as SP.TaxonomyField. Hence the Taxonomy field metadata values, such as the Group ID and Termset ID, are absent. To work around this limitation, we used the field's SchemaXml to find the required values.

Note: Querying the Term store while searching for a specific termset by using Get-PnPTerm can add a lot of latency time. Hence to decrease the additional time we could download the entire term store to an Excel file and use that Excel file as the master data for matching. Below is the command to get an export of all Taxonomy values as a CSV file.

Export-PnPTaxonomy -Path "[path]\taxonomyreport.csv" -Delimiter "," -IncludeID

Steps:

The steps to retrieve and check for taxonomy fields can be found below.

1. Get all lists in a web site
2. Get the Taxonomy fields for a List
3. Read the schema.xml and search for a TermsetID and AnchorID (Thanks to @Colin Philips (http://itgroove.net/mmman/) for finding the correct xpath for parsing xml)
4. Match the data against the above taxonomy report, comparing the Group ID, Termset ID, and Anchor ID with the Term ID that the column is linked to
5. In case of a match, save the values into the report.

The below code uses PnP PowerShell. For a quick set up of PnP PowerShell, please refer to this blog.
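The listing did not survive here, so the following is a sketch of the approach in PnP PowerShell. The site URL is a placeholder, and the SchemaXml XPath may need adjusting for your tenancy:

```powershell
# Sketch only - connect to the target site (URL is a placeholder)
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/demo" -UseWebLogin

$report = foreach ($list in Get-PnPList) {
    # Taxonomy fields come back as generic SP.Field, so inspect SchemaXml
    $taxFields = Get-PnPField -List $list |
        Where-Object { $_.TypeAsString -like "TaxonomyField*" }
    foreach ($field in $taxFields) {
        $schema = [xml]$field.SchemaXml
        [pscustomobject]@{
            List      = $list.Title
            Field     = $field.Title
            TermSetId = $schema.SelectSingleNode("//ArrayOfProperty/Property[Name='TermSetId']/Value").InnerText
            AnchorId  = $schema.SelectSingleNode("//ArrayOfProperty/Property[Name='AnchorId']/Value").InnerText
        }
    }
}

# Compare against the exported taxonomy CSV above, then save the matches
$report | Export-Csv "[path]\taxonomyfieldreport.csv" -NoTypeInformation
```

This covers a single web; for all site collections, wrap it in a loop over `Get-PnPTenantSite` and reconnect per site.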

Conclusion:

Hence with the above approach, we can retrieve the taxonomy fields for all site collections.  Be aware that the above process can take about half a day or more to run depending on the number of site collections and taxonomy fields in your tenancy.  Be sure to give it enough time to run before prematurely cancelling it.

Happy Coding!! 😊

Promoting and Demoting Site pages to News in Modern SharePoint Sites using SPFx extension and Azure Function

The requirement that I will be addressing in this blog is how to promote and demote site pages to news articles in Modern SharePoint sites. This approach allows us to promote any site page to news, add approval steps, and demote news articles back to site pages if the news needs to be updated. The news article also shows in the modern News web part when the site page is promoted.

Solution Approach:

To start with, create a site page. For creating a Modern page using Azure Function, please refer to this blog. After the site page is created, we will be able to use a status column to track the news status and promote a site page to news status. The status column could have three values – draft, pending approval and published.

We will use a SPFx extension to set the values of the status column and call the Azure Function to promote the site page to news page using SharePoint Online CSOM.

Promoting a site page to news page

Below are the attributes that need to be set for a site page to be promoted to a news article.

1. Promoted State Column set to 2 – set through SPFx extension
2. First Published date value set to published date – set through SPFx extension
3. Promoted state tag in the news site page to be set to value 2 – done in Azure Function
4. Site page needs to be published – done in Azure Function

For a detailed walkthrough on how to create a custom site page with metadata values, please refer to this blog. In order to set the ‘Promoted State’ and ‘First Published Date’ metadata values, use the below code after the page is created.
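The original listing was not preserved here; a sketch of the metadata update using PnPjs is below. This is an assumption — the post may have used SPHttpClient instead — but the internal field names PromotedState and FirstPublishedDate are the standard ones for site pages:

```typescript
import { sp } from "@pnp/sp";

// Sketch: set the news-related metadata on a site page list item
async function setNewsMetadata(pageItemId: number, promote: boolean): Promise<void> {
  await sp.web.lists.getByTitle("Site Pages").items.getById(pageItemId).update({
    PromotedState: promote ? 2 : 0,                                  // 2 = news, 0 = page
    FirstPublishedDate: promote ? new Date().toISOString() : null    // cleared on demotion
  });
}
```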

Calling the Azure Function from the SPFx extension, which will promote the site page to news, can be done using the below method.
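Since the original method did not survive here, this is a sketch using the SPFx HttpClient; the Function URL, key and payload shape are my own assumptions:

```typescript
import { HttpClient, HttpClientResponse } from "@microsoft/sp-http";

// Sketch: call the Azure Function that promotes the page (URL/key are placeholders)
async function callPromoteFunction(httpClient: HttpClient, pageName: string): Promise<void> {
  const response: HttpClientResponse = await httpClient.post(
    "https://yourfunctionapp.azurewebsites.net/api/PromotePage?code=<function-key>",
    HttpClient.configurations.v1,
    {
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ pageName: pageName })
    }
  );
  if (!response.ok) {
    throw new Error(`Promote call failed with status ${response.status}`);
  }
}
```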

Inside the Azure Function, use the below to promote a site page to news article.
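The listing is missing here; a sketch of the CSOM logic is below. It assumes `ctx` is an authenticated ClientContext for the site and `pageName` is the file name of the site page (the server-relative path is shown without the site prefix for brevity):

```csharp
// Load the site page file and its list item
var page = ctx.Web.GetFileByServerRelativeUrl($"/SitePages/{pageName}");
ctx.Load(page, f => f.ListItemAllFields);
ctx.ExecuteQuery();

// PromotedState = 2 marks the page as a promoted news post
page.ListItemAllFields["PromotedState"] = 2;
page.ListItemAllFields.Update();

// The page must be published for the News web part to surface it
page.Publish("Promoted to news");
ctx.ExecuteQuery();
```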

Demoting a news to site page

Below are the attributes that need to be set for demoting a news article to a site page.

1. Promoted State Column set to 0 – set through SPFx extension
2. First Published date value set to blank – set through SPFx extension
3. Promoted state tag in the news site page to be set to value 0 – done in Azure Function
4. Site page needs to be published – done in Azure Function

For setting the metadata values, the same method calls used above when promoting the site page can be reused, this time with Promoted State set to 0 and First Published Date cleared. The Azure Function then follows the same approach to demote the site page, setting the promoted state tag to 0 before republishing.

Conclusion:

In this post we saw how we can use an SPFx extension and an Azure Function to promote and demote site pages to news articles in Modern SharePoint sites.

Protecting Application Credentials when implementing Modular Azure Functions with Microsoft Flow

This weekend I was attempting to rework some older Azure Automation tasks I wrote some time ago that were a combination of PowerShell scripts and Azure (PowerShell Functions). I was looking to leverage Microsoft Flow so that I could have them handy as ‘Buttons’ in the Microsoft Flow mobile app.

Quite quickly I realized that Microsoft Flow didn’t have the capability to perform some of the automation I required, so I handed that off to an Azure Function. The Azure Function then needed to leverage a registered AAD application, which required an Application ID and Secret (or a certificate). This wasn’t going the way I wanted, so I took a step back.

The Goals I was attempting to achieve were;

  • A set of Azure Functions that perform small repetitive tasks that can be re-used across multiple Flows
  • Separation of permissions associated with function/object orientated Azure Functions

The Constraints I encountered were;

  • Microsoft Flow doesn’t currently have Azure Key Vault Actions
  • The Flows I was looking to build required functionality that isn’t currently covered by available Actions within Flow

With my goal to have a series of Functions that can be re-used for multiple subscriptions I came up with the following workaround (until Flow has actions for Key Vault or Managed Service Identity).

Current working Workaround/Bodge;

  • I created an Azure Function that can be passed Key Vault URI’s for credential and subscription information
    • typically this is the Application ID, Application Secret, Azure Subscription. These are retrieved from Azure Key Vault using Managed Service Identity
    • returns to the Flow the parameters listed above
  • Flow calls another Azure Function to perform required tasks
    • that Azure Function can be leveraged for an AAD App in any Subscription as credentials are passed to it

RG VM Flow Integration 640px.png

Example Scenario (as shown above);

  1. Microsoft Flow triggered using a Flow Button in the mobile application to report on Azure Virtual Machines
  2. Flow calls Azure Function (Get-Creds) to get credentials associated with the Flow for the environment being reported on
  3. Managed Service Identity used from Azure Function to obtain credentials from Azure Key Vault
    • Application ID, Application Secret and Azure Subscription returned to Flow
  4. Flow calls Azure Function (Get-VM-Status) that authenticates to Azure AD based of credentials and subscription passed to it
  5. Azure Resource Group(s) and VM’s queried from the Function App with the details returned to Flow
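Steps 2 and 3 above can be sketched in the Get-Creds Function body like this. The secret URIs would be passed in from Flow; the variable names here are placeholders of my own:

```powershell
# App Service injects MSI_ENDPOINT and MSI_SECRET when MSI is enabled.
# First get an Azure AD token for Key Vault using the Function's MSI.
$tokenUri = "$($env:MSI_ENDPOINT)?resource=https://vault.azure.net&api-version=2017-09-01"
$token = (Invoke-RestMethod -Uri $tokenUri -Headers @{ Secret = $env:MSI_SECRET }).access_token
$headers = @{ Authorization = "Bearer $token" }

# Then retrieve each secret by the Key Vault secret URI passed in from Flow
$appId     = (Invoke-RestMethod -Uri "$($appIdSecretUri)?api-version=2016-10-01" -Headers $headers).value
$appSecret = (Invoke-RestMethod -Uri "$($appSecretSecretUri)?api-version=2016-10-01" -Headers $headers).value
```

The three values are then returned to Flow as the Function's response.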

Concerns/thoughts;

  1. Passing credentials between integration elements isn’t the best idea
    • obfuscation is the best that can be done for now
    • having the information stored in three different secrets means all information isn’t sent in one call
      • but three web requests are required to get the necessary creds
  2. A certificate for AAD App Authentication would reduce the Key Vault calls to one
    • would this be considered better or worse?
  3. At least the credentials aren’t at rest anywhere other than in the Key Vault.

Summary

We’ve come a long way in a year. Previously we just had Application Settings in Azure Functions, and we were obfuscating credentials stored there using encryption techniques. Now, with Managed Service Identity and Azure Key Vault, we have Functions sorted. Leveraging modular Azure Functions to perform actions not possible in Flow, though, still seems like a gap. How are you approaching such integration?

 

Global Azure Bootcamp 2018 – Creating the Internet of YOUR Things

Today is the 6th Global Azure Bootcamp and I presented at the Sydney Microsoft Office on the Creating the Internet of YOUR Things.

In my session I gave an overview of where IoT is going and some of the amazing things we can look forward to (maybe). I then covered a number of IoT devices that you can buy now to enrich your life.

I then moved on to building IoT devices and leveraging Azure, the focus of my presentation: how to get started quickly with devices, integration and automation. I provided a working example based off my previous posts: Integrating Azure IoT Devices with MongooseOS MQTT and PowerShell; Building a Teenager Notification Service using Azure IoT, an Azure Function, Microsoft Flow, Mongoose OS and a Micro Controller; and Adding a Display to the Teenager Notification Service Azure IoT Device.

I provided enough information and hopefully inspiration to get you started.

Here is my presentation.

 

How to quickly copy Azure Functions between Azure Tenants and implement ‘Run From Zip’

As mentioned in this post yesterday, I needed to copy a bunch of Azure WebApps and Functions from one tenant to another. As with anything cloud based, things move fast, and some of the methods I found were more onerous and complex than they needed to be. There is of course the Backup option for Azure Functions as well, but that requires a storage account associated with the Function App plan. My Functions didn’t need storage, the plan tier they were on meant it wasn’t a prerequisite, and I had no desire to add a storage account just to back up and then migrate.

Overview

In this post I show my method to quickly copy Azure Functions from one Azure Tenant to another. My approach is;

  • In the Source Tenant from the Azure Functions App
    • Using Kudu take a backup of the wwwroot folder (that will contain one or more Functions)
  • In the Target Tenant
    • Create an Azure Function App
    • Using Kudu locate the wwwroot archive in the new Azure Function App
    • Configure Azure Function Run From Zip

Backing up the Azure Functions in the Source Tenant

Using the Azure Portal in the Source Tenant, go to your Function App => Application Settings and select Advanced Tools. Select Debug Console – PowerShell and navigate to the site folder. Next to wwwroot, select the download icon to obtain an archive of your functions.

Download WWWRoot Folder 2.PNG

Copying the Azure Functions to the Target Tenant

In the Target Tenant first create a New Azure Function App. I did this as I wanted to change the naming, the plan and a few other configuration items. Then using the Azure Portal go to your new Function App, Application Settings and select Advanced Tools.

Function Advanced Tools

Create a folder under D:\home\data named SitePackages.

Create Site Packages Folder

Drag and drop your wwwroot.zip file into the SitePackages Folder.

Drag Drop wwwroot

In the same folder select the + icon to create a file named siteversion.txt

Site Packages

Inside the file, enter the name of your archive file, e.g. wwwroot.zip, then select Save.

Siteversion.txt.png

Back in your new Function App select Application Settings

Application Settings

Under Application Settings, add a new setting named WEBSITE_USE_ZIP with a value of ‘1’.

Website Use Zip.PNG

Refresh your Function App and you’ll notice it is now Read Only as it is running from Zip. All the Functions that were in the Zip are displayed.

Functions Migrated.PNG

Summary

This is a quick and easy method to get your Functions copied from one tenant to another. Keep in mind that if your Functions use Application Settings, Key Vaults, or Managed Service Identity type options, you’ll need to add those settings, certificates, and credentials in the target environment.

How to quickly copy an Azure Web App between Azure Tenants using ‘Zip Push Deploy’

In the last couple of weeks I’ve had to copy a bunch of Azure WebApps and Functions from one Azure tenant to another. I hadn’t had to do this for a while, and went looking for the quickest and easiest way to accomplish it. As with anything cloud based, things move fast, and some of the methods I found were more onerous and complex than they needed to be. There is of course the Backup option as well; however, for WebApps that is only available on a Standard or higher tier plan. Mine weren’t, and I didn’t have the desire to uplift just to get that feature.

Overview

In this post I show my method to quickly copy an Azure WebApp from one Azure Tenant to another. I cover copying Azure Functions in another post. My approach is;

  • In the Source Tenant from the WebApp
    • Download the Automation Scripts for the WebApp
    • Using Kudu take a backup of the wwwroot folder
  • In the Target Tenant
    • Create a new Resource from a Template
    • Import the Deployment Automation Scripts from above
    • Modify for any changes, Resource Group, Location etc
    • Use Zip Push Deploy to upload the wwwroot archive and deploy it

Backing up the WebApp in the Source Tenant

Open your WebApp in the Azure Portal. Select Automation Script

WebApp Deployment Script

Download the Automation Script

Save Deployment Script

Select Advanced Tools

Kudu Adv Tools

Select the Site Folder then on the right menu of wwwroot select the download icon and save the backup of the WebApp.

Download WWWRoot Folder 3.png

Expand the Deployment Script archive file from the first step above. The contents will look like those below.

Expand the Deploy Script Archive.PNG

Deploy the WebApp to another Tenant

In the Azure Portal select Create a Resource from the top of the menu list on the left hand side. Type Template in the search box and select Template Deployment then select Create. Select Build your own template in the editor. Select Load File and select the parameters.json file. Then select Load File again and select the template.json file. Select Save.

Load Parameters then Template JSON Files

Make any changes to naming, and provide an existing or new Resource Group for the WebApp. Select Purchase.

New Template Deployment - Change Parameters

The WebApp will be created. Once completed select it from the Resource Group you specified and select Advanced Tools. From the Tools menu select Zip Push Deploy.

Tools Zip Push Deploy

Drag and drop the Zip file with the archive of the wwwroot folder you created earlier.

Drop WebApp ZipFile Export via Kudu

The zip will be processed and the WebApp deployed.
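As an aside, the same drag-and-drop step could be done from the command line with the Kudu zip deploy API; the app name and deployment credentials below are placeholders:

```shell
# POST the wwwroot archive to the Kudu zip deploy endpoint of the new WebApp
curl -X POST \
     -u '<deployment-username>:<deployment-password>' \
     --data-binary @wwwroot.zip \
     https://<app-name>.scm.azurewebsites.net/api/zipdeploy
```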

Deployed WebApp

Selecting the App in the new Tenant we can see it is deployed and running.

App Running.PNG

Hitting the App URL, we can see that it is being served.

Deployed App.PNG

This WebApp is the Microsoft Identity Manager User Object Report that I detailed in this post here.

Summary

In less than 10 minutes the WebApp is copied. No modifying JSON files, no long command lines, no FTP clients. Pretty simple. In the next post I’ll detail how I copied Azure Functions using a similar process.

Keep in mind if your WebApp is using Application Settings, KeyVaults, Managed Service Identity type options you’ll need to add those settings, certificates/credentials in the target environment.

Demystifying Managed Service Identities on Azure

Managed service identities (MSIs) are a great feature of Azure that are being gradually enabled on a number of different resource types. But when I’m talking to developers, operations engineers, and other Azure customers, I often find that there is some confusion and uncertainty about what they do. In this post I will explain what MSIs are and are not, where they make sense to use, and give some general advice on how to work with them.

What Do Managed Service Identities Do?

A managed service identity allows an Azure resource to identify itself to Azure Active Directory without needing to present any explicit credentials. Let’s explain that a little more.

In many situations, you may have Azure resources that need to securely communicate with other resources. For example, you may have an application running on Azure App Service that needs to retrieve some secrets from a Key Vault. Before MSIs existed, you would need to create an identity for the application in Azure AD, set up credentials for that application (also known as creating a service principal), configure the application to know these credentials, and then communicate with Azure AD to exchange the credentials for a short-lived token that Key Vault will accept. This requires quite a lot of upfront setup, and can be difficult to achieve within a fully automated deployment pipeline. Additionally, to maintain a high level of security, the credentials should be changed (rotated) regularly, and this requires even more manual effort.

With an MSI, in contrast, the App Service automatically gets its own identity in Azure AD, and there is a built-in way that the app can use its identity to retrieve a token. We don’t need to maintain any AD applications, create any credentials, or handle the rotation of these credentials ourselves. Azure takes care of it for us.

It can do this because Azure can identify the resource – it already knows where a given App Service or virtual machine ‘lives’ inside the Azure environment, so it can use this information to allow the application to identify itself to Azure AD without the need for exchanging credentials.

What Do Managed Service Identities Not Do?

Inbound requests: One of the biggest points of confusion about MSIs is whether they are used for inbound requests to the resource or for outbound requests from the resource. MSIs are for the latter – when a resource needs to make an outbound request, it can identify itself with an MSI and pass its identity along to the resource it’s requesting access to.

MSIs pair nicely with other features of Azure resources that allow for Azure AD tokens to be used for their own inbound requests. For example, Azure Key Vault accepts requests with an Azure AD token attached, and it evaluates which parts of Key Vault can be accessed based on the identity of the caller. An MSI can be used in conjunction with this feature to allow an Azure resource to directly access a Key Vault-managed secret.

Authorization: Another important point is that MSIs are only directly involved in authentication, not in authorization. In other words, an MSI allows Azure AD to determine what the resource or application is, but that by itself says nothing about what the resource can do. For most Azure resources, authorisation is handled by Azure’s own Identity and Access Management (IAM) system. Key Vault is one exception – it maintains its own access control system, managed outside of Azure’s IAM. For non-Azure resources, we could communicate with any authorisation system that understands Azure AD tokens; an MSI will then just be another way of getting a valid token that an authorisation system can accept.

Another important point to be aware of is that the target resource doesn’t need to run within the same Azure subscription, or even within Azure at all. Any service that understands Azure Active Directory tokens should work with tokens for MSIs.

How to Use MSIs

Now that we know what MSIs can do, let’s have a look at how to use them. Generally there will be three main parts to working with an MSI: enabling the MSI; granting it rights to a target resource; and using it.

  1. Enabling an MSI on a resource. Before a resource can identify itself to Azure AD, it needs to be configured to expose an MSI. The way that you do this will depend on the specific resource type you’re enabling the MSI on. In App Services, an MSI can be enabled through the Azure Portal, through an ARM template, or through the Azure CLI, as documented here. For virtual machines, an MSI can be enabled through the Azure Portal or through an ARM template. Other MSI-enabled services have their own ways of doing this.

  2. Granting rights to the target resource. Once the resource has an MSI enabled, we can grant it rights to do something. The way that we do this is different depending on the type of target resource. For example, Key Vault requires that you configure its Access Policies, while to use the Event Hubs or the Azure Resource Manager APIs you need to use Azure’s IAM system. Other target resource types will have their own way of handling access control.

  3. Using the MSI to issue tokens. Finally, now that the resource’s MSI is enabled and has been granted rights to a target resource, it can be used to actually issue tokens so that a target resource request can be issued. Once again, the approach will be different depending on the resource type. For App Services, there is an HTTP endpoint within the App Service’s private environment that can be used to get a token, and there is also a .NET library that will handle the API calls if you’re using a supported platform. For virtual machines, there is also an HTTP endpoint that can similarly be used to obtain a token. Of course, you don’t need to specify any credentials when you call these endpoints – they’re only available within that App Service or virtual machine, and Azure handles all of the credentials for you.
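For example, from inside a VM with an MSI enabled, a token can be requested from the instance metadata endpoint; the endpoint and api-version below are as documented at the time of writing, and the target resource (Key Vault here) is just an illustration:

```powershell
# No credentials are needed - this endpoint is only reachable from inside the VM
$uri = "http://169.254.169.254/metadata/identity/oauth2/token" +
       "?api-version=2018-02-01&resource=https://vault.azure.net"
$token = (Invoke-RestMethod -Uri $uri -Headers @{ Metadata = "true" }).access_token
```

The returned bearer token can then be attached to requests against the target resource.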

Finding an MSI’s Details and Listing MSIs

There may be situations where we need to find our MSI’s details, such as the principal ID used to represent the application in Azure AD. For example, we may need to manually configure an external service to authorise our application to access it. As of April 2018, the Azure Portal shows MSIs when adding role assignments, but the Azure AD blade doesn’t seem to provide any way to view a list of MSIs. They are effectively hidden from the list of Azure AD applications. However, there are a couple of other ways we can find an MSI.

If we want to find a specific resource’s MSI details then we can go to the Azure Resource Explorer and find our resource. The JSON details for the resource will generally include an identity property, which in turn includes a principalId:

Screenshot 1

That principalId is the object ID of the service principal, and can be used for role assignments.

Another way to find and list MSIs is to use the Azure AD PowerShell cmdlets. The Get-AzureRmADServicePrincipal cmdlet will return back a complete list of service principals in your Azure AD directory, including any MSIs. MSIs have service principal names starting with https://identity.azure.net, and the ApplicationId is the client ID of the service principal:
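For example, the MSIs can be filtered out of the full list like this:

```powershell
# List service principals whose SPN marks them as an MSI
Get-AzureRmADServicePrincipal |
    Where-Object { $_.ServicePrincipalNames -like "https://identity.azure.net*" } |
    Select-Object DisplayName, ApplicationId
```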

Screenshot 2

Now that we’ve seen how to work with an MSI, let’s look at which Azure resources actually support creating and using them.

Resource Types with MSI and AAD Support

As of April 2018, there are only a small number of Azure services with support for creating MSIs, and of these, currently all of them are in preview. Additionally, while it’s not yet listed on that page, Azure API Management also supports MSIs – this is primarily for handling Key Vault integration for SSL certificates.

One important note is that for App Services, MSIs are currently incompatible with deployment slots – only the production slot gets assigned an MSI. Hopefully this will be resolved before MSIs become fully available and supported.

As I mentioned above, MSIs are really just a feature that allows a resource to assume an identity that Azure AD will accept. However, in order to actually use MSIs within Azure, it’s also helpful to look at which resource types support receiving requests with Azure AD authentication, and therefore support receiving MSIs on incoming requests. Microsoft maintain a list of these resource types here.

Example Scenarios

Now that we understand what MSIs are and how they can be used with AAD-enabled services, let’s look at a few example real-world scenarios where they can be used.

Virtual Machines and Key Vault

Azure Key Vault is a secure data store for secrets, keys, and certificates. Key Vault requires that every request is authenticated with Azure AD. As an example of how this might be used with an MSI, imagine we have an application running on a virtual machine that needs to retrieve a database connection string from Key Vault. Once the VM is configured with an MSI and the MSI is granted Key Vault access rights, the application can request a token and can then get the connection string without needing to maintain any credentials to access Key Vault.

API Management and Key Vault

Another great example of an MSI being used with Key Vault is Azure API Management. API Management creates a public domain name for the API gateway, to which we can assign a custom domain name and SSL certificate. We can store the SSL certificate inside Key Vault, and then give Azure API Management an MSI and access to that Key Vault secret. Once it has this, API Management can automatically retrieve the SSL certificate for the custom domain name straight from Key Vault, simplifying the certificate installation process and improving security by ensuring that the certificate is not directly passed around.

Azure Functions and Azure Resource Manager

Azure Resource Manager (ARM) is the deployment and resource management system used by Azure. ARM itself supports AAD authentication. Imagine we have an Azure Function that needs to scan our Azure subscription to find resources that have recently been created. In order to do this, the function needs to log into ARM and get a list of resources. Our Azure Functions app can expose an MSI, and so once that MSI has been granted reader rights on the resource group, the function can get a token to make ARM requests and get the list without needing to maintain any credentials.

App Services and Event Hubs/Service Bus

Event Hubs is a managed event stream. Communication to both publish onto, and subscribe to events from, the stream can be secured using Azure AD. An example scenario where MSIs would help here is when an application running on Azure App Service needs to publish events to an Event Hub. Once the App Service has been configured with an MSI, and Event Hubs has been configured to grant that MSI publishing permissions, the application can retrieve an Azure AD token and use it to post messages without having to maintain keys.

Service Bus provides a number of features related to messaging and queuing, including queues and topics (similar to queues but with multiple subscribers). As with Event Hubs, an application could use its MSI to post messages to a queue or to read messages from a topic subscription, without having to maintain keys.

App Services and Azure SQL

Azure SQL is a managed relational database, and it supports Azure AD authentication for incoming connections. A database can be configured to allow Azure AD users and applications to read or write specific types of data, to execute stored procedures, and to manage the database itself. When coupled with an App Service with an MSI, Azure SQL’s AAD support is very powerful – it reduces the need to provision and manage database credentials, and ensures that only a given application can log into a database with a given user account. Tomas Restrepo has written a great blog post explaining how to use Azure SQL with App Services and MSIs.
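On the database side, granting access to the application's MSI is done with T-SQL against the database, using the resource's name as the user name. A sketch, with an illustrative App Service name:

```sql
-- Create a contained database user backed by the app's Azure AD identity
CREATE USER [example-app-service] FROM EXTERNAL PROVIDER;

-- Grant only the rights the application actually needs
ALTER ROLE db_datareader ADD MEMBER [example-app-service];
ALTER ROLE db_datawriter ADD MEMBER [example-app-service];
```

No password is involved at any point; the database trusts Azure AD to have authenticated the application.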

Summary

In this post we’ve looked into the details of managed service identities (MSIs) in Azure. MSIs provide some great security and management benefits for applications and systems hosted on Azure, and enable high levels of automation in our deployments. While they aren’t particularly complicated to understand, there are a few subtleties to be aware of. As long as you understand that MSIs are for authentication of a resource making an outbound request, and that authorisation is a separate thing that needs to be managed independently, you will be able to take advantage of MSIs with the services that already support them, as well as the services that may soon get MSI and AAD support.

Xamarin Forms – Platform Specifics (iOS) : Blur Effect

As a Xamarin mobile developer, have you ever wondered why we need to write so much code across the PCL and iOS projects to achieve a simple native feature, many of which are one-liners natively?

Xamarin has now introduced a nifty feature, Platform Specifics, that lets us write this kind of code directly in Xamarin Forms.

In short, Platform Specifics lets us consume features or functionality that is only available on iOS, without needing to implement custom renderers or effects in the platform project.

One of the best examples to understand this feature is the blur effect. Platform specifics are baked into Xamarin Forms and ready to use.

Below are the steps to test this feature.

Create a Xamarin Forms project.

Screen Shot 2018-04-12 at 22.28.54

Namespaces: It is important to understand XAML namespaces in order to use platform specifics. Below is the namespace declaration required on each page where they are used.

xmlns:ios="clr-namespace:Xamarin.Forms.PlatformConfiguration.iOSSpecific;assembly=Xamarin.Forms.Core"

Blur Effect

Below is how we can define the blur effect on a BoxView; the effect can be applied to any visual element in Xamarin Forms.

<BoxView x:Name="boxView" ios:VisualElement.BlurEffect="Dark" HeightRequest="50" WidthRequest="50" />

The blur effect is set using an enumeration with the following values:

  1. Dark – applies a dark blur effect
  2. Light – applies a light blur effect
  3. ExtraLight – applies an extra light blur effect
  4. None – applies no blur effect

Below is sample code showing the various blur effects:

<StackLayout>
    <Image Source="Aus.png" HeightRequest="50" WidthRequest="50" />
    <BoxView x:Name="boxView" ios:VisualElement.BlurEffect="Dark" HeightRequest="50" WidthRequest="50" />
</StackLayout>
<StackLayout>
    <Image Source="Aus.png" HeightRequest="50" WidthRequest="50" />
    <BoxView x:Name="boxView1" ios:VisualElement.BlurEffect="Light" HeightRequest="50" WidthRequest="50" />
</StackLayout>
<StackLayout>
    <Image Source="Aus.png" HeightRequest="50" WidthRequest="50" />
    <BoxView x:Name="boxView2" ios:VisualElement.BlurEffect="ExtraLight" HeightRequest="50" WidthRequest="50" />
</StackLayout>
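The same platform specific can also be applied from code-behind using the fluent API, which is handy when the effect needs to change at runtime. A sketch, assuming `boxView` is the BoxView named in the XAML above:

```csharp
using Xamarin.Forms.PlatformConfiguration;
using Xamarin.Forms.PlatformConfiguration.iOSSpecific;

// Apply the iOS-only blur effect from code instead of XAML.
// On Android or UWP this call is simply a no-op.
boxView.On<iOS>().UseBlurEffect(BlurEffectStyle.Dark);
```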

Below is the sample output on iOS

Screen Shot 2018-04-12 at 22.50.55

The sample is available on GitHub:

https://github.com/divikiran/PlatformSpecficBlurEffect.git