Kloud delivers infrastructure reforms for one of SA’s largest privately-owned companies

Customer Overview

Cavpower is one of South Australia’s largest privately-owned companies and has been the dealer for the supply, service and maintenance of Caterpillar equipment in SA and Broken Hill since 1972. They provide equipment sales and product support to the mining, quarry, local government, building/heavy construction, power generation, industrial services, petroleum, road transport, waste management, forestry and marine industries.

Business Situation

Cavpower manages the majority of its ICT in-house. The company’s infrastructure is largely centralised, with core servers and infrastructure hosted on-premises at its head office. As the first stage of readiness for a replacement ERP, Cavpower commenced a strategy to simplify its infrastructure and user environment. This included addressing ageing infrastructure and a desktop fleet operating on Windows XP, which was approaching end of life and with it the eventual loss of desktop application support and security updates. With end of support earmarked for April 2014, Cavpower needed to upgrade the entire desktop environment to mitigate risk going forward.

Solution

Cavpower’s MIS team presented a strategy for modernising infrastructure in the organisation that would provide a stable platform needed for future growth of the business. Kloud assisted Cavpower with the delivery of this strategy, specifically around the reforms of desktop and server infrastructure.

Kloud identified an opportunity for Cavpower to leverage a mix of cloud-based solutions, freeing time and resources for the organisation to focus on activities core to its business. A prerequisite to any of this modernisation work was a review of Cavpower’s Active Directory domain. A comprehensive assessment was conducted to determine suitability and identify any immediate concerns.

Part of simplifying Cavpower’s current infrastructure included a clean-up of their Active Directory (AD) environment, including the hardware used and the removal of virtual machines which were no longer in use. Cavpower’s desktop fleet SOE was also upgraded to improve configuration and server management. A Windows 7 based SOE was developed and deployed using Microsoft System Center Configuration Manager 2012 R2.

Moving to Office 365 will free up existing storage, relieving capacity constraints and reducing the need to administer on-premises Exchange servers. Office 365 will also enable greater collaboration amongst Cavpower’s distributed workforce via SharePoint and Lync Online.

Benefits

Throughout the engagement, Kloud identified opportunities to:

  • Bolster security
  • Improve efficiencies
  • Automate delivery of applications
  • Minimise administrative overheads
  • Reduce complexities associated with ageing infrastructure

Cavpower is now able to reap the benefits of migrating to Office 365, which will free up storage and reduce the overall complexity of server configuration and email for users moving forward.

Kloud’s detailed approach allowed us to identify all of the potential issues before embarking on a high risk project that would impact every user in the business. Their collaborative style and flexible approach to project management allowed for a seamless integration of the project into our day to day operations with minimal disruption. We forecast a long and successful partnership with Kloud. – Joanne Jones, Manager Operations Support, Cavpower


Department uses cloud-based technologies to enable its ‘ICT as a service’ strategy

Customer Overview

The Department of State Development and Infrastructure Planning (DSDIP) plays a critical role in leading a state-wide, coordinated approach to infrastructure, planning and development whilst ensuring a sustainable future for Queensland communities.

The Department of Local Government, Community Recovery and Resilience (DLGCRR) is responsible for overseeing the legislative framework in which local governments operate, and for enhancing community recovery and future resilience.

Business Situation

The Department of State Development and Infrastructure Planning was using an externally hosted Microsoft Exchange 2003 environment for its email. This environment was built on ageing physical hardware and was approaching the end of Microsoft’s support lifecycle.

Aside from the need to provide a supported environment for departmental email, strategies defined in the Queensland Government ICT Strategy 2013-17 and the Queensland Commission of Audit (2013) required the department to adopt a new approach to ICT, with key examples including cloud computing, cloud email and ICT-as-a-service.

As a result, the adoption of Microsoft Exchange Online as the new email platform addressed several of these agency and whole-of-government programs and strategies.

Solution

Kloud provided the department with a hybrid Exchange 2010/Office 365 environment, which enabled mailboxes to be moved from Exchange 2003, via Exchange 2010, onto Office 365. Careful planning of mailbox moves allowed for minimal interruption to the business and end users.

Using the hybrid setup with Exchange 2010 also enabled Kloud to implement a process for merging existing PSTs into the centralised email solution. Not only was the organisation’s PST footprint significantly reduced, but the content of the PSTs was now available to new features like eDiscovery. Any additional remediation activities were also performed or coordinated by Kloud.

Benefits

  • A full-featured enterprise messaging platform.
  • Delivery of email services in the cloud, creating an easier-to-manage environment with improved reliability and availability.
  • New features and capabilities can be added without the need to physically upgrade on-premises infrastructure.
  • Easy email delivery across PC, phone and browser.
  • Reduced IT administration burden, as there is no longer a requirement for server maintenance.
  • PST migrations freed up disk space and eliminated support calls for PST-related issues.

I was very impressed with the professionalism, expertise and responsiveness which Kloud provided during our complex yet successful transition from Exchange 2003 to Office 365.  The overall transition was delivered with minimal impact to daily business operations, which is a credit to all involved on the email migration project. – Mark Cushing, Chief Information Officer,  Department of State Development and Infrastructure Planning

How to create custom images for use in Microsoft Azure

In this post I will discuss how we can create custom virtual machine images and deploy them to the Microsoft Azure platform. To complete this process you will need an Azure Subscription, the Azure PowerShell module installed and a pre-prepared VHD which you would like to use (VHDX is not supported at present.)

You can sign up for a free trial of Microsoft Azure here if you don’t currently hold a subscription.

Completing this process will allow you to take advantage of platforms which aren’t offered “out of the box” on Microsoft Azure, e.g. Server 2003 and Server 2008, for testing and development. Currently Microsoft offers Server 2008 R2 as the minimum level from the Azure Image Gallery.

What do I need to do to prepare my image?

To complete this process, I built a volume license copy of Windows Server 2008 Standard inside a generation one Hyper-V guest virtual machine. Once the installation of the operating system completed I installed Adobe Acrobat Reader. I then ran sysprep.exe to generalise the image. This is important: if you don’t generalise your images, they will fail to deploy on the Azure platform.

I will detail the steps carried out after the operating system install below.

  1. Log into the newly created virtual machine
  2. Install the Hyper-V virtual machine additions (if your guest doesn’t already have it installed)
  3. Install any software that is required in your image (I installed Acrobat Reader)
  4. From an Administrative command prompt, navigate to %windir%\system32\sysprep and then execute the command “sysprep.exe”

  5. Once the SysPrep window has opened, select Enter System Out of Box Experience (OOBE) and tick the Generalize check box. The shutdown action should be set to Shutdown; this will shut down the machine gracefully once the sysprep process has completed.
  6. Once you are ready, select OK and wait for the process to complete (a command-line equivalent is shown below).
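The same generalisation can be triggered without clicking through the GUI; the switches below are equivalent to the selections made in the SysPrep window:

%windir%\system32\sysprep\sysprep.exe /generalize /oobe /shutdown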

I built my machine inside a dynamically expanding VHD, the main reason for doing so was to avoid having to upload a file size which was larger than necessary. As a result of this, I chose to compact the VHD before moving on to the next step by using the disk wizard inside the Hyper-V management console. To complete this process, follow the steps below.

  1. From the Hyper-V Host pane, select Edit Disk
  2. Browse to the path of the VHD we were working on, in my case "C:\VHDs\Server2008.vhd", and select Next
  3. Select Compact and Finish.
  4. Wait for the process to complete. Your VHD file is now ready to upload.
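If you’d rather not click through the Edit Disk wizard, the Hyper-V PowerShell module (available on Windows 8 / Server 2012 and later hosts) can compact the disk as well; a quick sketch, assuming the VM is shut down and the VHD is no longer attached:

# Mount the VHD read-only, reclaim unused space, then dismount it
Mount-VHD -Path "C:\VHDs\Server2008.vhd" -ReadOnly
Optimize-VHD -Path "C:\VHDs\Server2008.vhd" -Mode Full
Dismount-VHD -Path "C:\VHDs\Server2008.vhd"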

What’s next?

We are now ready to upload the virtual machine image. To complete this process you will need access to the Azure PowerShell cmdlets and a storage account for the source VHD. If you do not already have a storage account created, you can follow the documentation provided by Microsoft here.

IMPORTANT: Once you have a storage account in Azure, ensure that you have a container called vhds. If you don’t have one, you can create it by selecting Add from the bottom toolbar; name it vhds and ensure the access is set to Private (container shown below).
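The container can also be created from the Azure PowerShell module rather than the portal; a rough sketch (the storage account name below is a placeholder for your own):

# Build a storage context from the account name and primary key, then create the private 'vhds' container
$accountName = "mystorageaccount"   # placeholder - substitute your storage account name
$accountKey  = (Get-AzureStorageKey -StorageAccountName $accountName).Primary
$context     = New-AzureStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
New-AzureStorageContainer -Name "vhds" -Permission Off -Context $context   # Off = private access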


We are now ready to connect to the Azure account to kick off the upload process. To do so, launch an administrative Azure PowerShell console and follow the steps below.

  1. Run the cmdlet Add-AzureAccount; this will present a window which will allow you to authenticate to Azure.

  2. On the next screen, enter your password. The PowerShell session is now connected.
  3. To verify that the session connected successfully, run the cmdlet Get-AzureAccount; you should see your account listed (the connection steps are summarised in the sketch below).
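In script form, the connection steps look roughly like this (the subscription name is a placeholder, and Select-AzureSubscription is only needed if your account holds more than one subscription):

Add-AzureAccount                                              # opens the sign-in window
Get-AzureAccount                                              # confirms the account is registered
Select-AzureSubscription -SubscriptionName "My Subscription"  # placeholder - pick the subscription to work in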

We are now ready to commence the upload process. You will need your storage blob URL. You can find this on the container page we visited previously to create the vhds container.

The complete command is as follows.

Add-AzureVhd -Destination "<StorageBlobURL>/vhds/Server2008.vhd" -LocalFilePath "C:\VHDs\Server2008.vhd"

Once you have executed the command, two things happen:

  1. The VHD file is indexed by calculating the MD5 hash

  2. Once the indexing process is completed, the upload starts.


This is very neat, as the demo gods often fail us… (my upload actually failed part way through.) Thankfully I was able to re-execute the command, which resumed the upload process where the first pass left off (see below.)

  3. Wait for the upload process to complete.

Creating the Image in the Azure console.

Now that our upload has completed, we are ready to create an image in the Azure console. This will allow us to easily spawn virtual machines based on the image we uploaded earlier. To complete this process you will need access to the Azure console and your freshly uploaded image.

  1. Select Virtual Machines from the management portal.
  2. Select Images from the virtual machines portal.
  3. Select Create an Image

  4. A new window titled Create an image from a VHD will pop up. Enter the following details (as shown below):
  • Name
  • Description
  • VHD URL (from your storage blob)
  • Operating System Family


Ensure you have ticked I have run Sysprep on the virtual machine or you will not be able to proceed.

  5. The Image will now appear under MY IMAGES in the image gallery.
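As an aside, the same image registration can be performed from the Azure PowerShell module instead of the portal; a sketch, reusing the blob URL placeholder from the upload step:

# Register the uploaded VHD as an OS image in the subscription's image gallery
Add-AzureVMImage -ImageName "Server2008" -MediaLocation "<StorageBlobURL>/vhds/Server2008.vhd" -OS "Windows"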

Deploying the image!

All the work we have completed so far won’t be much use if the deployment phase fails. In this part of the process we will deploy the image to ensure it will work as expected.

  1. Select Virtual Machines from the management portal.
  2. Select New > Compute > Virtual Machine > From Gallery
  3. From the Choose an Image screen, select MY IMAGES. You should see the image that we just created in the gallery (shown below.)

  4. Select the Image and click Next.
  5. Complete the Virtual Machine Configuration with your desired settings.
  6. Wait for the virtual machine to complete provisioning (a scripted alternative is sketched below).
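If you prefer to script the deployment rather than use the gallery wizard, something along these lines should work with the Azure PowerShell module (the cloud service name, credentials, location and size are all placeholders):

# Provision a new VM from the custom image registered earlier
New-AzureQuickVM -Windows -ServiceName "myserver2008svc" -Name "Server2008-01" `
    -ImageName "Server2008" -AdminUsername "azureadmin" -Password "<StrongPassword>" `
    -Location "Southeast Asia" -InstanceSize "Small"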

Connecting to the virtual machine.

The hard work is done! We are now ready to connect to our newly deployed machine to ensure it is functioning as expected.

  1. Select Virtual Machines from the management portal.
  2. Select the Virtual Machine and then click Connect from the toolbar at the bottom. This will kick off a download of the RDP file which will allow you to connect to the virtual machine.
  3. Launch the RDP file; you will be asked to authenticate. Enter the credentials you specified during the deployment phase and click OK.


  4. You will now be presented with your remote desktop session, connected to your custom image deployed on Microsoft Azure.

I went ahead and activated my virtual machine. To prove there is no funny business involved, I have provided one final screenshot showing the machine activation status (which details the Windows version) and a snip showing the results of the ipconfig command. This lists the internal.cloudapp.net addresses, showing that the machine is running on Microsoft Azure.

Enjoy!

How to fix 403 errors when managing Azure SQL Database from Visual Studio

I was recently trying to manage Azure SQL Databases via Visual Studio in a new Azure subscription, but was unable to open the SQL Databases node at all and received the following error message.

Screenshot of Visual Studio error dialog.

The text reads:

Error 0: Failed to retrieve all server data for subscription ‘GUID’ due to error ‘Error code: 403 Message: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.’.

and my Server Explorer window looked like this:

How Server Explorer Looked

I must admit that I don’t often manage my Azure assets via Visual Studio so it had been a while since I’d used this tooling. I tried a few ways to get this to work and double checked that I had the right details for the subscription registered on my local machine. Storage worked fine, Virtual Machines worked fine… everything looked good except SQL Databases!

(At this point I’d say… hint: I should have read the error message more closely!)

After some additional troubleshooting it turns out that unlike many other Azure offerings, Azure SQL Database does not support OAuth-based connections and instead uses certificates (you know, like the error message says…).

Unfortunately, it turns out that if you have an expired or otherwise invalid certificate for any Azure subscription registered then Visual Studio will fail to enumerate SQL Database instances in the subscription you are currently using even if its certificate is fine.

The use of a subscription GUID in the error isn’t that helpful when troubleshooting: I completely missed that the problematic certificate wasn’t even from the subscription I was currently trying to use!

You can fix this issue by managing your registered Azure subscriptions from within Visual Studio as follows.

Manage Certificates in Visual Studio

  • Right-click on the top Azure node in Server Explorer.
  • Select Manage Subscriptions… from the menu.
  • Click on the “Certificates” tab and find the Subscription with the GUID matching the error.
  • Click “Remove…” and then close the dialog.

You should now be able to open the Azure SQL Database node in Server Explorer and manage them as you expect!

HTH.

Kloud Solutions — MAPA 2014 Winner

Kloud is thrilled to announce we took home two gongs at the 2014 Microsoft Australia Partner Awards (MAPA). We were named both Microsoft’s Cloud Solutions Partner of the Year and Collaboration and Content Partner of the Year. The awards, won in the face of tough competition, recognised a range of customer solutions, including the Coles intranet portal, mycoles, and the Spotless mobility app, MyWork. Both innovative solutions empower workforces by enabling anytime, anywhere access and closely align to Microsoft’s mobile-first, cloud-first strategy.

Kloud were selected from a national field of top Microsoft Partners for delivering market-leading customer solutions built on the Microsoft platform. The 21 categories of the MAPA programme recognise Microsoft Partners that have developed and delivered exceptional Microsoft-based solutions during the year.

You can find out more information on the solutions that were featured as part of our winning submissions by visiting our Spotless and Coles case studies on kloud.com.au.

Forget about the Internet and Sync Locally

If you missed our talk at Melbourne Mobile, you now have the chance to flip through the slides and have a look at the code/demo.

Talk Details

Cloud technologies have changed the way users interact with their devices and the way they keep their data. Users now expect their data to be synced to all devices in real time, and this has been facilitated by many cloud providers such as iCloud, Google services, and Azure. Most current sync models rely on a server to facilitate data sync between devices. This requires internet connectivity, transferring all the data to the server, and handling multiple aspects of data security. It can be avoided by syncing the data locally. Most smartphones support peer-to-peer connections, which can be used to sync data in offline mode (when no internet connection is available), saving data bandwidth and keeping the data more secure, as it does not need to go over the wire to a server. In this talk, I review the current state of peer-to-peer, why we would use it, how we would use it, and what it could be used for. I also show a demo of peer-to-peer connections and messaging in action on a few smartphones.

The Slides

Demo Code

At this stage, the demo is very basic. It establishes a peer-to-peer connection between two mobile devices and sends basic messages across (a typical chat app). We are working on building a better framework that would facilitate cross-platform P2P sync with flexible conflict resolution and robust sync mechanisms. The demo source can be found on GitHub.

Mobile platform increases productivity for integrated services company

Customer Overview
Spotless Group is an Australian owned, managed and operated provider of integrated facility management services. With operations across Australia and New Zealand, the Group’s 33,000 employees deliver millions of service hours a year across hundreds of specialist services to industry sectors including:

  • Health
  • Education
  • Leisure, Sport and Entertainment
  • Defence
  • Government
  • Resources
  • Business and Industry (AU) (NZ)
  • Laundries

Business Situation
Spotless service workers and supervisors (employees and sub-contractors) generally work at customer locations rather than offices. Spotless has an ongoing need to supervise the work, capture work-related information, and disseminate information to, and receive information from, these workers.

With the distributed nature of Spotless’ workforce and interaction predominantly paper-based or verbal, the organisation was experiencing inefficiencies due to the number of manual processes involved. Continuing along this path of manual operation was resulting in time delays, inefficient use of support staff, and an ongoing lack of visibility and effective oversight of service workers and their activities.

In response to the business situation, Kloud proposed a secure mobile computing platform for Spotless’ customers, service managers, and service workers named “MyWork”.

Solution
MyWork is a cloud based mobile platform that comprises the following components:

  • Customer specific web portal for customers to raise, query, and track service requests, query asset registers, query maintenance schedules, etc. It was developed as a set of single-page applications (SPAs) in HTML, JavaScript, and CSS and is hosted on SharePoint Online.
  • Service team web portal for managing timesheets, jobs, audits, etc. It was developed as a set of single-page applications (SPAs) in HTML, JavaScript, and CSS and it is hosted on SharePoint Online.
  • Service team mobile phone apps for managing time sheets, jobs and audits. They were developed as Android and iOS apps using Xamarin’s cross-platform runtime and are distributed via AirWatch mobile device management solution.
  • Platform services that implement a range of HTTP-based services, including integration services with Spotless’ on-premises systems, for the customer web portals, the service team web portal, and the mobile phone apps. They were developed in ASP.NET Web API and are hosted in Microsoft Azure.

Benefits
The MyWork solution was designed with Microsoft’s cloud services, which leveraged Spotless’ existing investments in the Microsoft Azure Platform and SharePoint Online (Office 365).

MyWork supports Spotless’ value that if a job is worth doing, it is worth doing well by:

  • Putting people first: Engaging customers, service managers, and service workers by providing systems they can access in the office, out of the office, and at remote work sites.
  • Rolling up our sleeves: Ability to assign service requests directly to service workers thus turning around jobs in a more timely and efficient manner.
  • Leading not following: Providing leading-edge technology and mobile systems that make the workforce more closely connected to its customers and service workers.
  • Make every dollar count: MyWork provides customers, staff and subcontractors a new way of doing business by moving to a more automated, online and mobile focused platform.

Kloud understood Spotless’ requirement and delivered to this requirement on time and on budget. Their knowledge of the subject matter and technologies combined with their track record of other mobility projects resulted in a positive project outcome. – Peter Lotz, Chief Information Officer, Spotless Group

Purchasing Additional SharePoint Online Storage for Office 365

There are a number of different options for customers to purchase Office 365.  In the U.S.A. and the majority of markets, customers can purchase Office 365 directly from Microsoft via MOSP (Microsoft Online Subscription Program).  This is the most common way for small businesses to purchase Office 365.  Customers can purchase licenses using a credit card.  There is no minimum license quantity for MOSP.  Customers pay for Office 365 via an automatic monthly subscription.

In Australia, Telstra has a syndication agreement with Microsoft.  This means that customers who want to purchase Office 365 in Australia transact the purchase with Telstra.  This service is known as T-Suite.  Billing for T-Suite can be via a monthly credit card payment or the customer’s existing Telstra account.  After purchasing the licenses from Telstra, customers are provided with an Office 365 Org ID and password to access the new tenant.

Another option for customers to purchase Office 365 is via a volume license (VL) agreement.  For large enterprises that require 250 licenses and above, customers can purchase via an Enterprise Agreement (EA) or Enterprise Subscription Agreement (EAS).  Smaller customers that require between 5 – 249 licenses can purchase Office 365 via an Open Agreement.  VL agreements require a commitment of 1 – 3 years, depending on the agreement.  VL agreements are billed annually.  Customers who are based in Australia and wish to buy Office 365 directly from Microsoft can do so with a VL agreement.

There are many differences between Office 365 purchases via MOSP vs. VL.  The differences include:

1) The prices of the licenses

2) The frequency of the payments

3) The length of commitment

4) The types of SKUs which are available

It is important to consider all of these factors before making a decision on the best way to purchase Office 365 for your organization.

This blog will focus on one of the major differences between the Office 365 SKUs offered via MOSP vs. an Open agreement.

When customers purchase Office 365 and SharePoint Online, they are provided with 10 GB of storage by default.  This storage can be used to provision a number of different SharePoint Online websites including public and internal websites.  For each Office 365 and SharePoint Online user license purchased, the tenant is provided with an additional 500 MB of storage.  For example, a customer who purchases 10 E3 licenses will receive 10 GB + (10 users) * (500 MB) = 10 GB + 5 GB = 15 GB.  Please note that this pool of SharePoint Online storage is separate from the storage used by OneDrive for Business. Each user who uses OneDrive for Business is now given 1 TB of storage for personal files.

In some instances, customers may want to increase the amount of storage available for SharePoint Online.  Kloud Solutions works with many customers who would like to move their corporate file shares from an on-premises server to SharePoint Online.  The storage required for your file shares may exceed the default storage allocation in SharePoint Online.  Therefore, Microsoft has introduced the option for customers to purchase additional SharePoint storage on a per GB basis.
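If you want to check how much of your tenant’s current allocation is actually in use before buying more, the SharePoint Online Management Shell can report it; a quick sketch (the admin URL is a placeholder for your own tenant):

# Connect to the tenant admin site and report tenant-level storage quota usage (values are in MB)
Connect-SPOService -Url "https://yourtenant-admin.sharepoint.com"
Get-SPOTenant | Select-Object StorageQuota, StorageQuotaAllocated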

There are many different types of Office 365 plans that can be purchased.  You will first need to determine if your existing Office 365 subscription is eligible for additional storage.  SharePoint Online storage is available for the following subscriptions:

  • Office 365 Enterprise E1
  • Office 365 Enterprise E2
  • Office 365 Enterprise E3
  • Office 365 Enterprise E3 for Symphony
  • Office 365 Enterprise E4
  • Office 365 Midsize Business
  • Office Online with SharePoint Plan 1
  • Office Online with SharePoint Plan 2
  • SharePoint Online (Plan 1)
  • SharePoint Online (Plan 2)

SharePoint Online Storage for Small Business is available for the following subscriptions:

  • Office 365 (Plan P1)
  • Office 365 Small Business Premium
  • Office 365 Small Business

If your subscription is one of the above eligible plans, you can purchase Office 365 via MOSP or the T-Suite portal for customers in Australia.

One of the key limitations to consider is that Microsoft does NOT offer the option to purchase additional SharePoint Online storage via an Open Agreement for small and medium businesses.  For instance, you can purchase 10 E3 licenses via an Open Agreement. This would provide 15 GB of SharePoint Online storage using the example above.  However, you would NOT be able to purchase additional GB of storage as the SKU is not available on the Open price list.

You can mix Open and MOSP licensing in the same Office 365 tenant.  For example, you could buy 10 E3 licenses via an Open agreement and then apply them to a tenant using an Office 365 product key.  If you wanted to buy an additional 3 GB of storage, you could do so via a credit card in the same tenant.  However, SharePoint Online storage must be tied to another license; it cannot be purchased by itself.  So you would have to buy at least 1 additional E3 license via MOSP in order to add the additional 3 GB of storage.  This is something to consider when you are pricing an Office 365 solution.

For reasons of both simplicity and flexibility, Kloud Solutions recommends purchasing Office 365 via MOSP or T-Suite if you need additional SharePoint Online storage today, or if you think you may need it in the future.  Purchasing via MOSP or T-Suite allows you to keep your options open and plan for future storage growth.  Buying Office 365 via Open means that you are locked in to a certain storage allocation as determined by Microsoft.   There is no guarantee that Microsoft’s default storage allocation will meet your requirements.

It is very likely that Microsoft will increase the default storage allocation for SharePoint Online in the future.  The cost of storage is always declining according to Moore’s Law.  For example, Microsoft recently increased the amount of storage available in OneDrive from 25 GB to 1 TB.  Here is a blog post which references this change:

http://blog.kloud.com.au/2014/05/04/sharepoint-online-storage-improvements-in-office-365/

However, there have been no announcements from Microsoft to date indicating that they plan to increase the default storage for SharePoint Online beyond 10 GB per tenant or 500 MB per user.  There will be future posts to this blog about this topic if there are any relevant updates in the future.

If you have any questions about the different options for purchasing Office 365 from Microsoft or Telstra, please contact Kloud  Solutions using the following URL:

http://www.kloud.com.au/

Mobile Test-Driven Development Part (3) – Running your unit tests from your IDE

TDD in Mobile Development – Part 3
1. Unit Testing of Platform-Specific Code in Mobile Development.
2. Portable IoC (Portable.TinyIoC) for Mobile Development
3. Mobile Test-Driven Development – Running your unit tests from your IDE

This is the third post in my TDD for Mobile Development series. It shows how to practise test-driven development for mobile. We will look at options for running our tests from within our IDE and finding the right test runner for our development environment, without the need to launch an emulator or deploy to a device every time we want to run the tests.

In a previous post I showed how to use NUnitLite to write unit/integration tests on Android and iOS. This post shows how you can write your unit tests with the NUnit framework and run them from your IDE.

Problems with NUnitLite

NUnitLite does not have a test runner that can be used outside of the mobile OS. This holds true for both Android and iOS. That’s why, every time we need to run the tests, we have to deploy to a real device or a simulator/emulator.
Now this could be OK, and even necessary, for some platform-specific logic. However, in most cases we do not have to test the code on the exact platform. Take the example that we had in the previous post:

    public int GetTotal(int first, int second)
    {
        return first + second;
    }

This code is just plain C# that could be placed outside of the platform-specific code, shared across multiple platforms, and then tested conveniently using NUnit.

Portable Class Library (PCL)

This brings us to using PCLs (Portable Class Libraries). The beauty of using PCLs is not only in sharing code across multiple platforms; it also enables us to test our code using full frameworks like NUnit or Microsoft Test (although I would really stick with NUnit :) ).
Bear in mind that PCLs are evolving, and new PCL-compatible packages are appearing every day.

Some developers might argue that it is troublesome to write your code in PCLs since it adds restrictions and only allows you to use a subset of .NET that is supported on all configured platforms.

This could be true, but you can get around it in three ways:

1. Only support the platforms that you really need.
I normally use PCL profile 78 or 158. This gives me the two main platforms that I am working on, Android and iOS, plus some later versions of Windows Phone (8.1) and Silverlight. You do not have to use a profile that tries to support older versions, and you will have fewer limitations by following this approach.

2. Make use of NuGet packages.
Installing NuGet packages is a great way of going PCL. Whenever I am trying to do something that is not supported in the .NET subset, I look in the NuGet store, and most of the time I find that somebody has already developed a package I can use directly. The other nice thing about NuGet packages is that NuGet supports distributing multi-platform libraries. This means that sometimes you get a package that supports both Android and iOS; in this case you will find two separate folders under /lib (inside the NuGet package), one for each platform. In other cases, NuGet gives you a portable library, where you will find folders (under /lib) named for profiles like portable-win81+net45, etc. The DLLs inside such a folder can be used and referenced from projects targeting those platforms. This is great news because you can just use the code without worrying about changing anything. Examples of such packages are:

a. SQLite.NET-PCL
b. PCLWebUtility
c. Microsoft.Bcl
d. Microsoft.Bcl.Build
e. Microsoft.Bcl.Async
f. Newtonsoft.Json

3. Abstract your platform-specific logic and use a platform-specific implementation.
Sometimes your logic has to have a platform-specific version; say you are doing something with animation or cryptography, where you need to use the platform-specific libraries.
The best way to go about this is to have an abstraction that gets injected into the libraries/classes that depend on these (platform-specific) components. This means that your classes/libraries do not have any dependency on the platform-specific code; they depend only on the abstraction. At run time, you can inject your platform-specific implementation via any IoC container, or even manually. I have a full post on IoC in cross-platform development here. It is also worth looking at the SQLite.NET-PCL implementation, as it follows exactly this approach.

MVVM

MVVM is a great approach for developing software because it ensures that your business logic is not coupled to any presentation layer or component.
There is even MVVMCross, which allows you to build apps in a cross-platform fashion. However, I prefer not to go with MVVMCross because it adds much more complexity than I need, and if I had to develop or change something outside the framework, I would need to invest a lot in learning it and building workarounds. Therefore, I just stick with my ViewModels.
This means I take advantage of the MVVM pattern by having my ViewModels hold all my business logic and injecting these ViewModels into my controllers/presenters.
The ViewModels can also have other services, factories, and repositories injected into them (using an IoC container or manually), and that way our code is all cross-platform and very testable.

        public class CalculatorViewModel : ViewModelBase 
	{
		public int GetTotal(int first, int second)
		{
			return first + second;
		}
	}

        //iOS Controller
	public class CalculatorController : UIViewController
	{
		private readonly CalculatorViewModel _viewModel;

		public CalculatorController (CalculatorViewModel viewModel)
		{
			_viewModel = viewModel;
		}
	}

        //android Controller
        public class CalculatorController : Fragment
	{
		private readonly CalculatorViewModel _viewModel;

		public CalculatorController (CalculatorViewModel viewModel)
		{
			_viewModel = viewModel;
		}
	}

Writing Tests

As you can see from above, our logic now sits in the ViewModel and is all testable regardless of the platform. This also makes it easy for us to use any test framework and test runner, including NUnit or Microsoft Test. It gets even better: we can have our test libraries target .NET 4.0 or 4.5, which means we can use all the goodness of .NET in writing our tests, including FakeItEasy and Rhino Mocks.

Running the Tests

Now that we have all this great setup, we can look at running our tests. If you use Microsoft Test, this comes out of the box, so there is no need to install anything extra. If you prefer using NUnit like me, you could install the latest version of NUnit (this includes the adapter and the runner). However, there is an even better way: just install the NUnit adapter (with runner) from the NuGet store. This makes the NUnit adapter and runner part of your solution, so you do not need to install the framework on every developer’s machine or on your build server (as we will see in the continuous integration server setup later).
To start writing your tests, you can create a class library that targets .NET 4.0 or .NET 4.5, install the NUnit adapter NuGet package, and start writing your tests as shown in the screenshots and the sketch below:
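As an illustration, a minimal NUnit fixture for the CalculatorViewModel shown earlier might look like this (assuming CalculatorViewModel and its ViewModelBase base class have parameterless constructors):

    using NUnit.Framework;

    [TestFixture]
    public class CalculatorViewModelTests
    {
        [Test]
        public void GetTotal_ReturnsSumOfTwoNumbers()
        {
            // The ViewModel is plain, cross-platform C#, so no device or emulator is needed
            var viewModel = new CalculatorViewModel();

            var result = viewModel.GetTotal(2, 3);

            Assert.AreEqual(5, result);
        }
    }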

TDD Mobile Common Tests in Visual Studio

Running Mobile TDD Tests in Visual Studio

TDD Mobile Common Tests in Xamarin Studio

Conclusions

In conclusion, I have demonstrated over the last three posts (1, 2, and 3) how to apply test-driven development to mobile projects. I hope this motivates you to start looking at improving your code quality and employing some of the tactics we talked about here. If you have any comments or questions, I would love to hear them, so get in touch.

TDD in Mobile Development – Part 3
1. Unit Testing of Platform-Specific Code in Mobile Development.
2. Portable IoC (Portable.TinyIoC) for Mobile Development
3. Mobile Test-Driven Development – Running your unit tests from your IDE

PowerShell Detection Method for SCCM 2012 Application Compliance management

Microsoft System Center Configuration Manager (SCCM) 2012 has a very powerful Application Detection and Delivery model, separate from the existing ‘package and program delivery model’ of previous versions of SCCM & SMS.

The power of this new model is not having to ‘daisy chain’ packages and executables together to achieve a desired outcome. Using SCCM’s Detection Model reduces the burden of managing a Windows client base, keeping its baseline configuration consistent across every client in the organisation.

I recently assisted a Kloud customer to configure a script delivery application that used the Application delivery model and a ‘Detection Method’ to ensure files reached their local Windows 8 folder destinations successfully.  The script simply copies the files where they need to go, and the Detection Method then determines the success of that script. If SCCM does not detect the files in their correct destination locations, it attempts to execute the script again.

Benefits in using SCCM 2012 Application and Detection Method Delivery

Using this Application and Detection method provided Kloud’s customer with the following business benefits:

  • Increased reliability of delivering Office template files to a Windows 8 machine and therefore reduced TCO in delivering software to authorised workstations.  If the application files were corrupted or deleted during installation or post-installation (for example a user turning their workstation off during an install), then SCCM detects these files are missing and re-runs the installation
  • Upgrades are made easier, as it does not depend on any Windows 8 workstation having to run a previous installation or ‘package’.  The ‘Detection Method’ of the Application object determines if the correct file version is there (or not) and if necessary re-runs the script to deliver the files.  The ‘Detection Method’ also runs after every install, to guarantee that a client is 100% compliant with that application delivery.
  • Uses SCCM client agent behaviour including BITS, restart handling, use of the ‘Software Center’ application for user-initiated installs and Application package version handling – for example, if a single file is updated in the Application source and re-delivered to the Distribution Point, the SCCM client detects that a single file has changed and only downloads the changed file, saving bandwidth (and download charges) from the Distribution Point

Customer Technical Requirements

Kloud’s customer had the following technical requirements:

1. My customer wanted to use an SCCM Application and Detection Rule to distribute ten Office 2010 template files to Windows 8 workstations (managed with the SCCM client)

2. They wanted to be able to drop new Office 2010 template files into the SCCM source application folder at any stage, distribute the application, and have the SCCM clients download and install those new templates with minimal interference to end users.

3. They also wanted the minimum number of objects in SCCM to manage the application, and wanted the application to ‘self heal’ if a user deleted any of the template files.

4. All code had to be written in PowerShell for ease of support.

Limitations of Native Detection Methods

SCCM 2012 has a great native Detection Rules method for MSI files and file system executables (see the native Detection Rule image below).

NativeDetectionRules

However, we quickly worked out the limitations of this native Detection Rule model, namely for the ‘File System’ setting type:

1. Environment variables for user accounts, such as %username% and %userprofile% are not supported

2. File versioning can only work with Windows executables (i.e. .EXE files) and not metadata embedded in files, for example Word files.

SCCM comes with the ability to run PowerShell, VBScript or JScript as part of its Detection Model, and it is documented with VBScript examples at this location:

TechNet Link

Taking these examples, the critical table to follow to get the Detection Model working correctly (and to improve your understanding of how your script works in terms of exit code, STDOUT and STDERR) is the following, reproduced from the TechNet link above:

Script exit code | Data read from STDOUT | Data read from STDERR | Script result | Application detection state
0 | Empty | Empty | Success | Not installed
0 | Empty | Not empty | Failure | Unknown
0 | Not empty | Empty | Success | Installed
0 | Not empty | Not empty | Success | Installed
Non-zero value | Empty | Empty | Failure | Unknown
Non-zero value | Empty | Not empty | Failure | Unknown
Non-zero value | Not empty | Empty | Failure | Unknown
Non-zero value | Not empty | Not empty | Failure | Unknown

This table tells us that the key to achieving an application detection result of ‘Installed’ or ‘Not installed’ with our PowerShell detection script boils down to hitting one of the two rows with an exit code of 0 and nothing written to STDERR – any other result (i.e. ‘Unknown’ for the application detection state) will simply result in the application not delivering to the client.

The critical part of any Detection Method script is to ensure an exit code of 0 is always returned, regardless of whether the application has installed successfully or has failed. The next critical step is the PowerShell equivalent of populating the STDOUT stream. Other script authors may choose to test STDERR as well in their scripts, but I found it unnecessary and preferred to keep it simple.

After ensuring my script achieved an exit code of 0, I then concentrated on my script either populating STDOUT or not populating it – I essentially ignored STDERR completely and ensured my script ran error free. At all times, for example, I used ‘Test-Path’ to first check that a file or folder exists before attempting to grab its metadata properties. If I didn’t use ‘Test-Path’, the script would throw an error if a file or folder was not found, and the detection would end up in an ‘Unknown’ state.

I therefore concentrated solely on my script achieving one of those two rows of the table above.

Microsoft provides example VBScript code to populate STDOUT (and STDERR) in the TechNet link above – however, my method involves just piping a single PowerShell ‘Write-Host’ command if the detection script determines the application has been delivered successfully.  This satisfies populating STDOUT and therefore registers detection success.
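To illustrate the pattern before diving into the full script, a stripped-down detection script might look like this (the file path is a made-up example, not the customer’s):

# Minimal detection pattern: always exit with code 0; write to STDOUT only when the expected file is present
$expectedFile = "C:\Program Files (x86)\Contoso\Templates\Report.dotx"   # hypothetical path
if (Test-Path $expectedFile) {
    Write-Host "Installed"   # exit code 0 + non-empty STDOUT = Installed
}
# exit code 0 + empty STDOUT = Not installed, so SCCM (re)runs the installation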

Limitations in using Scripts for Detection

There were two issues in getting a Detection Method working properly: an issue related to the way SCCM delivers files to the local client (specifically upgrades) and an issue with the way Office template files are used.

One of the issues we have is that Word and Excel typically change a template file (however small the change!) when either application is loaded, by changing its ‘Date Modified’ timestamp, modifying the file length in bytes, or both. Therefore, a detection method that determines whether a file has been delivered successfully to the workstation should avoid using a file’s length in bytes or its modified timestamp.

The other issue we found is that SCCM has a habit of changing the ‘Date Modified’ timestamp of all files it delivers when it detects an ‘upgrade’ of the source files for that application. It typically does not touch the timestamp of the source files if it delivers a brand new install to a client that has never received the software; however, if a single file in the source folder is changed for that application, SCCM re-uses the previous version of the application in its cache (C:\windows\ccmcache) and only downloads the file that has changed. This results in all files having their ‘Date Modified’ timestamp changed (except for the brand new file). Therefore, determining if that application has been delivered successfully using ‘Date Modified’ timestamps is not recommended. The key to seeing this process in action is looking at the file properties in the C:\windows\ccmcache\<sccm code> folder for that application, particularly before and after a file is updated in the original source SCCM application folder.

Ultimately, for Kloud’s customer, we used a file’s Metadata to determine the file version and whether the application has been delivered successfully or not. In this example, we used the ‘Company’ metadata field of the Word and Excel template file (found under a file’s ‘Properties’):

Metadata1

I used this Scripting Guy’s TechNet Blog to form the basis of retrieving a file’s metadata using a PowerShell function, and then using that information pulled from the file to determine a good attribute to scan for, in terms of file version control.

One of the limitations I found was that this function (through no fault of its author: Ed Wilson!) does not return ‘Version number’, so we used the ‘Company’ field instead. If someone has worked out a different PowerShell method to retrieve that ‘Version number’ metadata attribute, then feel free to tell me in the comments section below!

The next step in getting this PowerShell script to work correctly is ensuring that an exit code of 0 is always returned when the script is executed.  Any other exit code will break the delivery of the application to the client. After that, a ‘write-host’ should only be executed if the script detects that all expected files are installed – in this example, only when all 10 files in my ‘Path’ array are detected does the script send a ‘write-host’ to the SCCM client, telling SCCM that the application has been successfully delivered. If I were to copy the PowerShell script locally, run it and not detect all files on that machine, the script would display nothing in the PowerShell window; this tells the SCCM client that the delivery has failed.  If the script ran locally and displayed a single ‘all files accounted for!’ line, that tells me the detection is working.

The sample code for our Detection Method can be found below (all filenames and paths have been changed from my customer’s script for example purposes):


# Authors: Michael Pearn & Ed Wilson [MSFT]
Function Get-FileMetaData
{
  <#
   .Synopsis
    This function gets file metadata and returns it as a custom PS Object
 #Requires -Version 2.0
 #>
 Param([string[]]$folder)
 foreach($sFolder in $folder)
  {
   $a = 0
   $objShell = New-Object -ComObject Shell.Application
   $objFolder = $objShell.namespace($sFolder) 

   foreach ($File in $objFolder.items())
    {
     $FileMetaData = New-Object PSOBJECT
      for ($a ; $a  -le 266; $a++)
       {
         if($objFolder.getDetailsOf($File, $a))
           {
             $hash += @{$($objFolder.getDetailsOf($objFolder.items, $a))  =
                   $($objFolder.getDetailsOf($File, $a)) }
            $FileMetaData | Add-Member $hash
            $hash.clear()
           } #end if
       } #end for
     $a=0
     $FileMetaData
    } #end foreach $file
  } #end foreach $sfolder
} #end Get-FileMetaData

# Version string expected in the 'Company' metadata field of each delivered file
$TemplateVersions = "5.0.2"

# Destination folders for the templates and add-ins
$wordStandards = "C:\Program Files (x86)\Customer\Customer Word Standards"
$wordTemplates = "C:\Program Files (x86)\Microsoft Office\Templates"
$wordTheme = "C:\Program Files (x86)\Microsoft Office\Document Themes 14\Theme Colors"
$excelAddins = "C:\Program Files (x86)\Customer\Customer Excel Addins"
$xlRibbon = "C:\Program Files (x86)\Microsoft Office\Office14\ADDINS"
$PPTribbon = "C:\Program Files (x86)\Customer\PowerPoint Templates"
$PPTtemplates = "C:\Program Files (x86)\Microsoft Office\Templates\Customer"

# File names of the ten files the application must deliver
$strFile1 = "Bridge Template.xlsm"
$strFile2 = "Excel Ribbon.xlam"
$strFile3 = "NormalEmail.dotm"
$strFile4 = "PPT Ribbon.ppam"
$strFile5 = "Client Pitch.potx"
$strFile6 = "Client Presentation.potx"
$strFile7 = "Client Report.potx"
$strFile8 = "Blank.potx"
$strFile9 = "Blocks.dotx"
$strFile10 = "Normal.dotm"

# Build the list of full paths to check, and a collection to hold the gathered metadata
$Path = @()
$Collection = @()

$Path += "$excelAddins\$strfile1"
$Path += "$xlRibbon\$strfile2"
$Path += "$PPTribbon\$strfile3"
$Path += "$PPTtemplates\$strfile4"
$Path += "$PPTtemplates\$strfile5"
$Path += "$PPTtemplates\$strfile6"
$Path += "$wordStandards\$strfile7"
$Path += "$excelAddins\$strfile8"
$Path += "$xlRibbon\$strfile9"
$Path += "$PPTribbon\$strfile10"

# Gather the path and 'Company' metadata for every file in each destination folder that exists
if (Test-Path $wordStandards) {
$fileMD = Get-FileMetaData -folder $wordStandards
$collection += $fileMD | select path, company
}
if (Test-Path $wordTemplates) {
$fileMD = Get-FileMetaData -folder $wordTemplates
$collection += $fileMD | select path, company
}
if (Test-Path $wordTheme) {
$fileMD = Get-FileMetaData -folder $wordTheme
$collection += $fileMD | select path, company
}
if (Test-Path $excelAddins) {
$fileMD = Get-FileMetaData -folder $excelAddins
$collection += $fileMD | select path, company
}
if (Test-Path $xlRibbon) {
$fileMD = Get-FileMetaData -folder $xlRibbon
$collection += $fileMD | select path, company
}
if (Test-Path $PPTribbon) {
$fileMD = Get-FileMetaData -folder $PPTribbon
$collection += $fileMD | select path, company
}
if (Test-Path $PPTtemplates) {
$fileMD = Get-FileMetaData -folder $PPTtemplates
$collection += $fileMD | select path, company
}
# Count how many of the expected files are present with the correct 'Company' version stamp
$OKCounter = 0
for ($i=0; $i -lt $Path.length; $i++) {
     foreach ($obj in $collection) {
     If ($Path[$i] -eq $obj.path -and $obj.company -eq $TemplateVersions) {$OKCounter++}
     }
}
# Write to STDOUT only if every expected file matched - with exit code 0 this tells SCCM the application is installed
if ($OKCounter -eq $path.length) {
write-host "all files accounted for!"
}


I then posted this code into the Detection Model of the application resulting in something similar to the following image:

DetectionModel

If the application has delivered successfully (the script returns an exit code of 0 and ‘write-host “all files accounted for!”’ pipes to the STDOUT stream), then entries similar to the following should appear in the local SCCM client log, C:\Windows\CCM\Logs\AppEnforce.log (note the ‘Matched exit code 0’ and ‘Discovered application’ lines):


<![LOG[    Looking for exit code 0 in exit codes table...]LOG]!><time="12:29:13.852-600" date="08-08-2014" component="AppEnforce" context="" type="1" thread="2144" file="appexcnlib.cpp:505">
<![LOG[    Matched exit code 0 to a Success entry in exit codes table.]LOG]!><time="12:29:13.853-600" date="08-08-2014" component="AppEnforce" context="" type="1" thread="2144" file="appexcnlib.cpp:584">
<![LOG[    Performing detection of app deployment type User Install - Prod - Office 2010 Templates 5.0.2(ScopeId_92919E2B-F457-4BBD-82FF-0765C1E1E696/DeploymentType_0f69fa14-549d-4397-8a0b-004f0d0e85e7, revision 4) for user.]LOG]!><time="12:29:13.861-600" date="08-08-2014" component="AppEnforce" context="" type="1" thread="2144" file="appprovider.cpp:2079">
<![LOG[+++ Discovered application [AppDT Id: ScopeId_92919E2B-F457-4BBD-82FF-0765C1E1E696/DeploymentType_0f69fa14-549d-4397-8a0b-004f0d0e85e7, Revision: 4]]LOG]!><time="12:29:16.977-600" type="1" date="08-08-2014" file="scripthandler.cpp:491" thread="2144" context="" component="AppEnforce">
<![LOG[++++++ App enforcement completed (10 seconds) for App DT "User Install - Prod - Office 2010 Templates 5.0.2" [ScopeId_92919E2B-F457-4BBD-82FF-0765C1E1E696/DeploymentType_0f69fa14-549d-4397-8a0b-004f0d0e85e7], Revision: 4, User SID: S-1-5-21-1938088289-184369731-1547471778-5113] ++++++]LOG]!><time="12:29:16.977-600" date="08-08-2014" component="AppEnforce" context="" type="1" thread="2144" file="appprovider.cpp:2366">


We should also see a status of ‘Installed’ in the ‘Software Center’ application (part of the SCCM client):

SoftwareCenter

Hope this helps with using SCCM application and Detection Method scripting! Any questions, please comment on my post below and I’ll endeavour to get back to you.