Authoring Integration Modules for SMA
First published on TECHNET on Jun 11, 2014

Introduction


If you’re familiar with Service Management Automation (SMA), you’re probably aware that PowerShell is the fundamental technology that makes SMA so great. But PowerShell itself is great due to its extensibility through PowerShell modules. Since SMA is built on PowerShell, PowerShell modules are of course key to the SMA extensibility story as well. This blog post will guide you through the specifics of SMA’s “spin” on PowerShell modules, called “Integration Modules,” and the best practices to incorporate in your own PowerShell modules to make sure they work great as Integration Modules within SMA.



What’s a PowerShell Module?


So, before we get into Integration Modules specifically, what are PowerShell modules? Well, if you’ve ever called a cmdlet in PowerShell, such as Get-Date or Copy-Item, you’ve used a PowerShell module. A PowerShell module is a group of PowerShell cmdlets that can be used from the PowerShell console, as well as from PowerShell scripts, workflows, and runbooks. All of the functionality of PowerShell is exposed through cmdlets, and every cmdlet is backed by a PowerShell module, many of which ship with PowerShell itself. For example, the Get-Date cmdlet is part of the Microsoft.PowerShell.Utility PowerShell module, and the Copy-Item cmdlet is part of the Microsoft.PowerShell.Management PowerShell module. Both of these modules ship with PowerShell. But many PowerShell modules do not ship as part of PowerShell, and either ship with products (such as the Virtual Machine Manager PowerShell module), or are distributed by the vast PowerShell community to make complex tasks simpler through encapsulated functionality (such as the Windows Update PowerShell module). You can learn more about PowerShell modules on MSDN.




What’s an SMA Integration Module?


Ok, so now that we’re familiar with regular old PowerShell modules, what are these SMA “Integration Modules”, and how are they different from a standard PowerShell module? Turns out, there really isn’t much difference! An Integration Module is just a PowerShell module that optionally contains one additional file – a metadata file specifying an SMA connection type to be used with this module’s cmdlets in runbooks. Optional file or not, these PowerShell modules can be imported into SMA to make their cmdlets available for use within runbooks. Behind the scenes, SMA stores these modules in the database, and at runtime loads them into the Runbook Worker sandboxes that run runbooks.


Another small difference between plain PowerShell modules and Integration Modules – in order to import one of these modules into SMA, you need to zip up the module folder so the module can be imported as a single file. The zip file should have the same name as the module folder it contains. The module folder in the zip needs to contain at least a .psd1, .psm1, or PowerShell module .dll file with the same name as the module folder. For example, for the Twilio module discussed below, the proper structure is:



  • Twilio.zip

    • Twilio folder

      • Twilio.psd1









You can find more information on how to package an Integration Module for import into SMA on TechNet, under the “Building an Integration Module” section.


Fun fact: We SMA folks often refer to Integration Modules as just “modules” for short, since they are basically just PowerShell modules with an optional extra metadata file. Together, a PowerShell module + an Integration Module metadata file is conceptually equivalent to the Integration Pack concept of Orchestrator. In fact, the term “Integration Module” comes from a combination of the term “Integration Pack” from Orchestrator, and “PowerShell module,” from PowerShell.



The SMA Integration Module Metadata File


So what are the specifics of this extra, optional file, which holds an SMA connection type for the module it is a part of? The file is named based on the name of the module, in the form <ModuleName>-Automation.json, and should be placed within the module folder (within the module zip file).


For an example of a PowerShell module that has the Integration Module metadata file, check out the Twilio PowerShell module on Script Center. Alongside the module files, it contains the Integration Module metadata file, Twilio-Automation.json.



The content of this file is a JSON-formatted object. This object describes the fields of a “connection” that is required to connect to the system or service the module represents, and importing the module creates a corresponding connection type in SMA. Using this file you can set the field names and types for the module’s connection type, and whether each field should be encrypted and/or optional. Valid field types are "System.String", "System.Int32", and "System.Boolean". Connection types are not updateable. If, after importing a module with a connection type, you want to change the fields of the connection type (add more fields, and so on), then in addition to changing the ConnectionFields property of the file to contain the fields you want, you also need to change the ConnectionTypeName property to something new, such as "<ModuleName>V2", in order for module import to succeed. The "IntegrationModuleName" property of this file currently has no effect.


Below is an example of the format and contents of this file for the Twilio module mentioned above. Requests to Twilio require passing a Twilio AccountSid and authentication token (AuthToken) in order to authenticate, so the connection type for Twilio contains these fields:
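A sketch of what Twilio-Automation.json might look like, based on the properties described above (the exact layout in the Script Center download may differ slightly):

{
    "ConnectionFields": [
        {
            "IsEncrypted": false,
            "IsOptional": false,
            "Name": "AccountSid",
            "TypeName": "System.String"
        },
        {
            "IsEncrypted": true,
            "IsOptional": false,
            "Name": "AuthToken",
            "TypeName": "System.String"
        }
    ],
    "ConnectionTypeName": "Twilio",
    "IntegrationModuleName": "Twilio"
}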



After importing the Twilio Integration Module into SMA, when you create a connection, a new connection type will be present – the Twilio connection type. Creating an instance of this connection type (also known as a connection asset) lets you specify the fields needed to connect to Twilio, in this case the AccountSid and AuthToken.



Of course, if you have multiple Twilio accounts, you can create a connection asset for each Twilio account so that you can connect to Twilio as any of these accounts (depending on which connection you choose in your runbook for the Twilio cmdlets).



Integration Module Authoring - Best Practices


Now, just because Integration Modules are essentially just PowerShell modules, that doesn’t mean we don’t have a set of best practices around authoring them. There are still a handful of things we recommend you think about while authoring a PowerShell module, to make it as usable as possible in Service Management Automation. Some of these are SMA specific, and some of them are useful just to make your modules work well in PowerShell Workflow, regardless of whether or not you’re using SMA.


1. Include a synopsis, description, and help URI for every cmdlet in the module


In PowerShell, you can define help information for cmdlets so that users can get help on them with the Get-Help cmdlet. For example, here’s how you can define a synopsis, description, and help URI for a PowerShell module written in a .psm1 file:
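A minimal sketch, using a hypothetical Get-ContosoUser function (comment-based help supplies the synopsis and description, and CmdletBinding supplies the help URI):

function Get-ContosoUser {
    <#
    .SYNOPSIS
       Gets a user from the Contoso service.

    .DESCRIPTION
       Retrieves detailed information about the specified user account
       from the Contoso web service.
    #>
    [CmdletBinding(HelpUri = 'http://contoso.com/help/Get-ContosoUser')]
    param(
        [Parameter(Mandatory = $true)]
        [string] $UserName
    )

    # ... implementation goes here ...
}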



Providing this info will not only show this help using the Get-Help cmdlet in the PowerShell console, it will also expose this help functionality within SMA, for example when inserting activities during runbook authoring.



Clicking “View detailed help” will actually open the help URI in another tab of the web browser you’re using to access SMA.



2. If the module works against a remote system:



a. It should contain an Integration Module metadata file that defines the information needed to connect to that remote system, aka the connection type




You’re an expert on this one already.




b. Each cmdlet in the module should be able to take in a connection object as a parameter




Cmdlets in the module become easiest to use in SMA if you allow passing an object with the fields of the connection type as a parameter. This way users don’t have to map fields of the connection asset to the corresponding cmdlet parameters each time they call a cmdlet.




As an example of what I’m talking about, see below. This runbook uses a Twilio connection asset called joeTwilio to access Twilio and return all my Twilio phone numbers. See how I have to map the fields of the connection to the parameters of the cmdlet?
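A sketch of that pattern (Get-AutomationConnection is the SMA activity that retrieves the connection asset; the Get-TwilioPhoneNumber cmdlet name here is illustrative):

workflow Get-AllTwilioPhoneNumbers
{
    # Retrieve the joeTwilio connection asset
    $twilioConnection = Get-AutomationConnection -Name 'joeTwilio'

    # Each field of the connection has to be mapped to the matching cmdlet parameter
    Get-TwilioPhoneNumber -AccountSid $twilioConnection.AccountSid `
                          -AuthToken $twilioConnection.AuthToken
}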





Now compare that with the below, better way of calling Twilio. In this case I am directly passing the connection object to the cmdlet, which is easier:
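Again as a sketch, with the same hypothetical cmdlet name:

workflow Get-AllTwilioPhoneNumbers
{
    $twilioConnection = Get-AutomationConnection -Name 'joeTwilio'

    # Pass the whole connection object; no field-by-field mapping needed
    Get-TwilioPhoneNumber -Connection $twilioConnection
}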





You can enable behavior like this for your cmdlets by allowing them to take a connection object directly as a parameter, instead of just connection fields for parameters. Usually you’ll want a parameter set for each, so that a user not using SMA can call your cmdlets without constructing a hashtable to act as the connection object. The “SpecifyConnectionFields” parameter set below is used to pass the connection field properties one by one. “UseConnectionObject” lets you pass the connection straight through. As you can see, the Send-TwilioSMS cmdlet allows passing either way:
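A sketch of how such a cmdlet might be declared (only the two parameter set names and the cmdlet name come from the module itself; the SMS-specific parameters are illustrative):

function Send-TwilioSMS {
    [CmdletBinding(DefaultParameterSetName = 'SpecifyConnectionFields')]
    param(
        # Pass the connection field values one by one...
        [Parameter(ParameterSetName = 'SpecifyConnectionFields', Mandatory = $true)]
        [string] $AccountSid,

        [Parameter(ParameterSetName = 'SpecifyConnectionFields', Mandatory = $true)]
        [string] $AuthToken,

        # ...or pass the connection object straight through
        [Parameter(ParameterSetName = 'UseConnectionObject', Mandatory = $true)]
        [object] $Connection,

        [Parameter(Mandatory = $true)]
        [string] $To,

        [Parameter(Mandatory = $true)]
        [string] $Message
    )

    if ($PSCmdlet.ParameterSetName -eq 'UseConnectionObject') {
        $AccountSid = $Connection.AccountSid
        $AuthToken  = $Connection.AuthToken
    }

    # ... call the Twilio REST API using $AccountSid and $AuthToken ...
}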






3. Define output type for all cmdlets in the module


Defining an output type for a cmdlet allows design-time IntelliSense to help you determine the output properties of the cmdlet, for use during authoring. As you can see below, the OutputType cmdlet attribute allows you to get “type ahead” functionality on a cmdlet’s output, without having to run it.
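For example, a hypothetical cmdlet that returns process objects might declare its output type like this:

function Get-ProcessByName {
    # Declaring the output type lets the authoring environment offer
    # property IntelliSense on the output without executing the cmdlet.
    [OutputType([System.Diagnostics.Process])]
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)]
        [string] $Name
    )

    Get-Process -Name $Name
}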




While SMA doesn’t use this data today, in the future we hope to enable it to help you construct runbooks more easily.



4. Cmdlets in the module should not take complex object types for parameters


PowerShell Workflow is different from PowerShell in that it stores complex types in deserialized form. Primitive types stay as primitives, but complex types are converted to their deserialized versions, which are essentially property bags. For example, if you used the Get-Process cmdlet in a runbook (or just PowerShell Workflow for that matter), it would return an object of type [Deserialized.System.Diagnostics.Process], not the expected [System.Diagnostics.Process] type. This type has all the same properties as the non-deserialized type, but none of the methods. And if you try to pass this value as a parameter to a cmdlet, where the cmdlet expects a [System.Diagnostics.Process] value for this parameter, you’ll get a nasty error:


Cannot process argument transformation on parameter 'process'. Error: "Cannot convert the "System.Diagnostics.Process (CcmExec)" value of type "Deserialized.System.Diagnostics.Process" to type "System.Diagnostics.Process"."


This is of course because there is a type mismatch between the expected [System.Diagnostics.Process] type and the given [Deserialized.System.Diagnostics.Process] type. The way around this issue is to ensure the cmdlets of your module do not take complex types for parameters. Here’s the wrong way to do it -- as you can see, the cmdlet takes in a complex type as a parameter:
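A sketch of the anti-pattern, using a hypothetical Stop-MonitoredProcess cmdlet:

# Wrong: the parameter is a complex type, which arrives deserialized from a workflow
function Stop-MonitoredProcess {
    param(
        [Parameter(Mandatory = $true)]
        [System.Diagnostics.Process] $Process
    )

    $Process.Kill()
}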



And here’s the right way -- taking in a primitive that can be used internally by the cmdlet to grab the complex object and use it. Since cmdlets execute in the context of PowerShell, not PowerShell Workflow, inside the cmdlet $process becomes the correct [System.Diagnostics.Process] type.
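A sketch of the same hypothetical cmdlet reworked this way:

# Better: take a primitive (the process id) and resolve the complex object inside the cmdlet
function Stop-MonitoredProcess {
    param(
        [Parameter(Mandatory = $true)]
        [int] $ProcessId
    )

    # Inside the cmdlet we are back in PowerShell context, so this is a real
    # [System.Diagnostics.Process] object, not a deserialized property bag.
    $process = Get-Process -Id $ProcessId
    $process.Kill()
}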




Runbook Authoring tip: If for some reason your cmdlets need to take a complex type parameter, or you are using someone else’s module that requires a complex type parameter, the workaround in runbooks / PowerShell Workflow is to wrap the cmdlet that generates the complex type, and the cmdlet that consumes the complex type, in the same InlineScript activity. Since InlineScript executes its contents as PowerShell rather than PowerShell Workflow, the cmdlet generating the complex type produces the correct type, not the deserialized complex type.
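As a sketch, with Get-Process and Stop-Process standing in for the producer and consumer of a complex type:

workflow Stop-NotepadProcess
{
    # Both the producer (Get-Process) and the consumer (Stop-Process) of the
    # complex [System.Diagnostics.Process] object run inside the same
    # InlineScript, so the object is never serialized and deserialized.
    InlineScript {
        $process = Get-Process -Name 'notepad'
        Stop-Process -InputObject $process
    }
}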




5. Make all cmdlets in the module stateless


PowerShell Workflow runs every cmdlet called in the workflow in a different session. This means any cmdlets that depend on session state created / modified by other cmdlets in the same module will not work in PowerShell Workflow or runbooks. Here’s an example of what not to do:
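A sketch of such a module (the two function names match the description that follows; the bodies are illustrative):

# Module-level session state shared between two cmdlets -- avoid this
$script:globalNum = 0

function Set-GlobalNum {
    param(
        [Parameter(Mandatory = $true)]
        [int] $Num
    )

    # Stores the value in session state for later use
    $script:globalNum = $Num
}

function Get-GlobalNumTimesTwo {
    # Depends on Set-GlobalNum having run in the same session
    return $script:globalNum * 2
}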



As you can see, Get-GlobalNumTimesTwo depends on the session variable $globalNum being set by Set-GlobalNum. This won’t work in a workflow, and $globalNum will always be 0. Get-GlobalNumTimesTwo should either take in the number as a parameter, so that it doesn’t depend on session state, or else Set-GlobalNum and Get-GlobalNumTimesTwo should be wrapped in the same InlineScript activity in order to run both cmdlets in the same session, in PowerShell context.



6. The module should be fully contained in an Xcopy-able package


Because SMA modules are distributed to all Runbook Worker hosts in order to run runbooks, they need to work independently of the host they are running on. What this means is that you should be able to zip up the module package, move it to any other host with the same or newer PowerShell version, and have it function as normal when imported into that host’s PowerShell environment. In order for that to happen, the module should not depend on any files outside the module folder (the folder that gets zipped up when importing into SMA), or on any unique registry settings on a host, such as those set by the install of a product. If this best practice is not followed, the module will not be usable directly in SMA, and you will need to generate a portable module.



Conclusion


By now you’re fully up to speed on what an SMA Integration Module is, how you’d write one, and what best practices to follow to make your Integration Modules truly shine in SMA. Integration is core to a successful orchestration strategy, and you should now have everything you need to build SMA integrations into any system.


Until next time, Keep Calm and Automate On.
