Creating your own PowerShell modules for Azure Automation – Part 2

In my previous article I explained the basic steps for creating a PowerShell module that you can upload to Azure Automation. Being pedantic by nature, I see a lot of scripts of really poor quality, and since I write modules quite regularly, I want to share some useful tips to turn a rubbish module into something significantly better.

Previously, I wrote a basic module that looks like this:
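The original code sample isn't reproduced here; a minimal reconstruction of the basic module, assuming the New-Compliment function described below (the exact compliment text is an assumption), might look like:

```powershell
# Reconstruction of the basic module: one function, one parameter,
# one line of output. The wording of the compliment is assumed.
Function New-Compliment {
    param(
        $name
    )
    Write-Host "$name, you look wonderful today!"
}
```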

You can run the module by using the command New-Compliment -name "someone". However, the power of PowerShell is in its object-oriented nature and pipelining, so we really need to spruce this up into a better quality module. Note that it'll still be an absurdly useless module, but it'll give you an idea of how to write them.

The parameters section

As you can see at the top, we have a parameter block. That should be standard, but we are not providing any additional information about what the parameter itself is, nor do we specify the rules. That's no good. Right now, you cannot use pipelining, which limits the usability. Always write your modules with pipelining in mind. It's simply good practice and makes for much more powerful use later on.

So, decorate the variable $name with a parameter block: [Parameter()]$name

This doesn’t do anything and won’t make a difference yet, but now we can start providing some additional settings for the parameters and that’s where it becomes more useful.

So make the following change: [Parameter(Mandatory=$true)]$name. Now we have made this a mandatory parameter. If it is not provided, an error is shown. Good, but still no pipelining.

So, do the following: [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]$name. The two pipeline options do different things. The first, ValueFromPipeline, essentially states that whatever is sent through the pipeline goes into this parameter. Logically, you can really only do this once per function. If the pipeline contains something with multiple values (like an object), this won't work because it won't match a simple string parameter. That's what the second option, ValueFromPipelineByPropertyName, is for. If you send an object through the pipeline, and that object has a property called "name", then it will be automatically mapped to this $name parameter. You want to set this option on almost all your parameters. If you do that, you can run your command many times based on the output of an Import-Csv (where you need a column called 'name').
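Putting the attributes together, a sketch of the decorated function looks like this (property-name binding demonstrated with a plain object standing in for an Import-Csv row):

```powershell
# Sketch: parameter block with both pipeline options enabled.
Function New-Compliment {
    param(
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [string]$name
    )
    Write-Host "$name, you look wonderful today!"
}

# An object with a 'name' property (as Import-Csv would produce)
# binds to -name by property name:
[pscustomobject]@{ name = 'Ward' } | New-Compliment
```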

Pipelining arrays

With the above changes you can run the following

$name = "Ward"
$name | New-Compliment

However, what happens if you create an array of names and pipeline that:

$names = @("Ward","James","Chris")
$names | New-Compliment

As you'll see, only the last name in the array is used. That's not what we want: we want each entry in the array to be processed. The best way of doing that is to use Begin, Process and End blocks in the function. I rarely see people using these, which is unfortunate, as this is really useful functionality.

So, in the main section of the function, create the following three sections:

Begin { }
Process { }
End { }

Now, move the Write-Host line into the curly braces of the Process section. Test, and suddenly our command runs for each item in the array.

What happened? Quite simple: the Begin section contains code that you want to run only once. This isn't always used, but can be useful for checking certain things before proceeding, such as whether or not there is connectivity. The Process section runs for each item in the pipeline. As our array contains 3 items, this runs 3 times. The End section is like the Begin section in that it runs only once, after the Process section has completed.
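The three sections combined give something like this sketch (comments mark what runs when):

```powershell
Function New-Compliment {
    param(
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [string]$name
    )
    Begin {
        # Runs once, before the first pipeline item,
        # e.g. for connectivity checks.
    }
    Process {
        # Runs once for every item in the pipeline.
        Write-Host "$name, you look wonderful today!"
    }
    End {
        # Runs once, after the last pipeline item.
    }
}

@("Ward","James","Chris") | New-Compliment   # runs three times
```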

Our module is already starting to look a bit more professional.

Adding help

The next overlooked thing is adding help information. It's easy to skip. Don't! Six months from now, you will have forgotten how to use your module, and this help information is going to be really useful. Adding help is easy: add a comment section right before the function in the module file. If you have multiple functions, add this separately for each function.
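The original help comment isn't shown here; a sketch using PowerShell's comment-based help keywords (the wording is an assumption) could be:

```powershell
<#
.SYNOPSIS
Gives someone a compliment.

.DESCRIPTION
Writes a cheap compliment for the given name. Accepts pipeline
input, including objects with a 'name' property.

.PARAMETER name
The name of the person to compliment.

.EXAMPLE
New-Compliment -name "Ward"
#>
Function New-Compliment {
    param(
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [string]$name
    )
    Process {
        Write-Host "$name, you look wonderful today!"
    }
}
```

Once this is in place, Get-Help New-Compliment shows the synopsis, parameter description and example.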

Avoiding disasters

What if, instead of a compliment function, we created an insult function? Perhaps we want to make sure that before you run it, you are really sure, so we want confirmation before proceeding. You'd do this for any command that does something destructive.

Fortunately, PowerShell contains all we need; you just need to enable it. So, add the following line right before your param block:

[CmdletBinding(SupportsShouldProcess=$true,ConfirmImpact="High")]

Next, in your Process block, you want to put an if statement around the actual actions so that they are only carried out when there is a positive confirmation:

If you run the function now, you'll be asked to confirm. If you include this in a script, you can override the confirmation with -Confirm:$false.
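A sketch of the hypothetical insult function (New-Insult and its wording are made up for illustration) shows the $PSCmdlet.ShouldProcess check inside the Process block:

```powershell
Function New-Insult {
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact="High")]
    param(
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [string]$name
    )
    Process {
        # Only act on a positive confirmation:
        if ($PSCmdlet.ShouldProcess($name, "Insult")) {
            Write-Host "$name, you look dreadful today!"
        }
    }
}

# Interactively you'd be prompted; in a script, suppress the prompt:
New-Insult -name "Ward" -Confirm:$false
```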

Lastly, exporting functions

In a larger module, you will have functions that are only used by the module itself. These functions are not for the end users; they shouldn't even be able to run them. In a module, we can specify which functions are available through the Export-ModuleMember command. Add the following to the end of the module file:

Export-ModuleMember -Function New-Compliment

The complete module now looks like this:
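The finished module isn't reproduced in this copy of the article; a reconstruction combining all the steps above (compliment wording assumed; the guard around Export-ModuleMember is an addition so the sketch also runs as a plain script, since that cmdlet is only valid inside a module) might be:

```powershell
<#
.SYNOPSIS
Gives someone a compliment.

.PARAMETER name
The name of the person to compliment.

.EXAMPLE
New-Compliment -name "Ward"
#>
Function New-Compliment {
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact="High")]
    param(
        [Parameter(Mandatory=$true,
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true)]
        [string]$name
    )
    Begin { }
    Process {
        if ($PSCmdlet.ShouldProcess($name, "Compliment")) {
            # The one line that actually does something:
            Write-Host "$name, you look wonderful today!"
        }
    }
    End { }
}

# Only valid inside a .psm1; the guard lets this sketch also run
# as a plain script:
if ($ExecutionContext.SessionState.Module) {
    Export-ModuleMember -Function New-Compliment
}
```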

You can use this as a template for a PowerShell module, as there's only one line that actually does something: change that instruction and the parameters to suit. I know that some of this stuff isn't needed in Azure, but it's better to build good habits.

Creating your own PowerShell modules for Azure Automation – Part 1

Creating a PowerShell module is an easy way to create scripts you can use over and over again. If you Google this, you'll find that creating a module is as simple as creating a PowerShell script with the .psm1 extension. However, that won't work for Azure. Azure loads modules automatically, so you need to write your module so it can be loaded automatically as well. To ensure a module loads correctly, you'll need to create a module manifest file.

Here’s how you do it.

Step1. Decide on a name.

You can name your modules whatever you want. However, you cannot have the same module loaded twice, and should you want to list your module in the module gallery, you'll need to ensure the name is unique. What convention you use is up to you, but I typically use the company domain name backwards and add the purpose of the module. So, here I am creating a module called au.com.kloud.demo.

Step 2. Create the necessary files

Following my example, I create a folder called au.com.kloud.demo, and inside I need two files, the module and the manifest: au.com.kloud.demo.psm1 and au.com.kloud.demo.psd1.

The easiest way to create the manifest is to run the following command in PowerShell: New-ModuleManifest -Path .\au.com.kloud.demo.psd1

Step 3. Update the manifest

The New-ModuleManifest command always creates the same file, apart from a unique GUID, so you'll need to edit it before you can use it.

The first thing you need to do with the manifest is adjust the content for your module. Specifically do the following:

  • Uncomment RootModule and change it to RootModule = 'au.com.kloud.demo.psm1'
  • Change FunctionsToExport to FunctionsToExport = '*'
  • Update the version number

All three steps are important: without the first, no module will actually be loaded; the second makes sure the actual functions from the module are available; and the last ensures that you actually track when the module has been updated.

Here is an example of a module manifest.
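The manifest from the original post isn't included here; a trimmed sketch of what such a .psd1 might contain (GUID shown as a placeholder, keep the one New-ModuleManifest generated; the RequiredAssemblies entry is illustrative):

```powershell
@{
    # Uncommented and pointed at the module file:
    RootModule        = 'au.com.kloud.demo.psm1'

    # Increment this on every upload:
    ModuleVersion     = '1.0.1'

    # Keep the GUID that New-ModuleManifest generated:
    GUID              = '00000000-0000-0000-0000-000000000000'

    # Export all functions defined in the .psm1:
    FunctionsToExport = '*'

    # Used for binary (C#) modules, not script modules:
    CmdletsToExport   = @()

    # Optional: DLLs to load with the module:
    # RequiredAssemblies = @('SomeLibrary.dll')
}
```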

When creating a module in PowerShell, you specify the functions to include using the FunctionsToExport statement. The CmdletsToExport statement is what you would use for binary modules created in C#. As you can see, there is also a RequiredAssemblies section. You can use that to load DLLs that you use in your script, and it is one way of getting around some of the limitations of using assemblies in Azure Automation.

Step 4. Create the module

Now, a module is a basic PowerShell script written as a function. Here is an absurdly useless (and badly written) module:
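The original listing is missing from this copy; a reconstruction of the "bad" version described here and improved in Part 2 (no parameter attributes, no pipeline support, no help; wording assumed):

```powershell
# Deliberately bare-bones: a single function with an undecorated
# parameter and a hard-coded compliment.
Function New-Compliment {
    param($name)
    Write-Host "$name, you look wonderful today!"
}
```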

Step 5. Test

The following tests must be successful, or your module is not going to work when uploaded to Azure:

  • from PowerShell, run Test-ModuleManifest au.com.kloud.demo\au.com.kloud.demo.psd1 (the command in the module should be listed)
  • copy the au.com.kloud.demo folder with the 2 files into one of PowerShell's module folders. There are several locations, such as C:\Windows\System32\WindowsPowerShell\v1.0\Modules. Now close all PowerShell shells; when you re-open PowerShell, the module should be loaded automatically, which you can confirm by running New-Compliment -name "something".
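The first test can be sketched end-to-end like this, building the module folder in a temp location instead of the modules directory (paths and content are illustrative):

```powershell
# Build a throwaway copy of the module and validate its manifest.
$root = Join-Path ([IO.Path]::GetTempPath()) 'au.com.kloud.demo'
New-Item -ItemType Directory -Path $root -Force | Out-Null
Remove-Item (Join-Path $root 'au.com.kloud.demo.psd1') -ErrorAction SilentlyContinue

# Minimal module file:
Set-Content -Path (Join-Path $root 'au.com.kloud.demo.psm1') -Value @'
Function New-Compliment {
    param([Parameter(Mandatory=$true)]$name)
    Write-Host "$name, you look wonderful today!"
}
'@

# Manifest pointing at the module file:
New-ModuleManifest -Path (Join-Path $root 'au.com.kloud.demo.psd1') `
    -RootModule 'au.com.kloud.demo.psm1' `
    -FunctionsToExport '*' `
    -ModuleVersion '1.0.0'

# Validate; the exported command should be listed:
$manifest = Test-ModuleManifest (Join-Path $root 'au.com.kloud.demo.psd1')
$manifest.ExportedFunctions.Keys
```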

If the above doesn’t work, then the module won’t work in Azure either.

Step 6. Upload

  1. Increment the version number!
  2. Select the au.com.kloud.demo folder, and compress it to au.com.kloud.demo.zip.
  3. In the Automation account, go to Modules, click add new module and point to the zip file.
  4. Click OK etc., and after a couple of minutes the module should be available; if you click it, you should see the command you've made available.

The module itself is pretty rubbish. We all love cheap flattery, but writing a good PowerShell module requires more than that. See Part 2 on how to make a better module.