Querying vRO Workflows with the REST API and Multiple Conditions

Querying vRO workflows using the vRO REST API is a relatively straightforward task. Use the below URL with a GET request and specify a condition to search for them, e.g. to find one by its name use:
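The URL itself is missing from the post as captured. As a minimal sketch, assuming the vRO server name used in the response below and basic-auth credentials that are pure placeholders, the request might look like this; the filter goes in the `conditions` query parameter:

```shell
# Build the query URL; to find a workflow by name, filter on the "name" attribute
VRO="https://vroserver.domain.local:8281"
URL="$VRO/vco/api/workflows?conditions=name=Test01"
echo "$URL"

# The request itself (not run here) would be along the lines of:
# curl -k -u 'vrouser:password' -H 'Accept: application/json' "$URL"
```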


You should get a response similar to the following if a workflow of that name exists:

{
  "link": [
    {
      "attributes": [
        {
          "value": "https://vroserver.domain.local:8281/vco/api/workflows/cb9264b1-c832-4ee3-b95a-db73f7a2fedf/",
          "name": "itemHref"
        },
        {
          "value": "cb9264b1-c832-4ee3-b95a-db73f7a2fedf",
          "name": "id"
        },
        {
          "value": "Test",
          "name": "categoryName"
        },
        {
          "value": "true",
          "name": "canExecute"
        },
        {
          "value": "https://vroserver.domain.local:8281/vco/api/catalog/System/WorkflowCategory/40281e864cebecc5014cebfdd3ab0f25/",
          "name": "categoryHref"
        },
        {
          "value": "",
          "name": "description"
        },
        {
          "value": "Test01",
          "name": "name"
        },
        {
          "value": "false",
          "name": "customIcon"
        },
        {
          "value": "Workflow",
          "name": "type"
        },
        {
          "value": "true",
          "name": "canEdit"
        },
        {
          "value": "0.0.1",
          "name": "version"
        }
      ],
      "href": "https://vroserver.domain.local:8281/vco/api/workflows/cb9264b1-c832-4ee3-b95a-db73f7a2fedf/",
      "rel": "down"
    }
  ],
  "total": 1
}

I wanted to query with multiple conditions though, e.g. name and categoryName or version and categoryName. Possibly I’m not looking in the right place, but I couldn’t find any documentation or examples for the syntax of using multiple conditions:



A few attempts of trial and error revealed separating conditions with a comma is the way to go, for example:

1) Look for workflows named New-Array in the vRO folder (categoryName) Test


2) Look for workflows in the vRO folder (categoryName) Test with version number 0.0.1
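The example URLs did not survive in this copy of the post; hedged reconstructions of both queries, using the comma-separated syntax just described (the server name is the same placeholder as earlier):

```shell
VRO="https://vroserver.domain.local:8281"

# 1) Workflows named New-Array in the folder (categoryName) Test
URL1="$VRO/vco/api/workflows?conditions=name=New-Array,categoryName=Test"

# 2) Workflows in the folder Test with version number 0.0.1
URL2="$VRO/vco/api/workflows?conditions=categoryName=Test,version=0.0.1"

echo "$URL1"
echo "$URL2"
```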


Note: it’s worth bearing in mind that condition names are case sensitive (categoryName, not categoryname), otherwise the query will fail. Their associated values, e.g. Test or New-Array, are not case sensitive.

Setting up a Minecraft Spigot Server in Windows Azure

I needed to set up a Minecraft server so that one of my kids could play online against a friend who had moved to another continent, as they wanted a few different ways to stay in touch. Since one half of the friendship doesn’t have Xbox Live, but both have the PC / Mac version, I figured I could sort out a hosted server for them to play on. There are plenty of places around that will host one for you for a small fee, but since I had some monthly Windows Azure credits via my MSDN subscription I figured I’d have a go at setting up my own and see how that went.

Initially I looked at deploying a pre-packaged Minecraft server from the Azure Marketplace, but the first two attempts failed to deploy so I looked at other possible options.

It seemed a popular choice when people were setting up their own server was a Spigot version, partly because it looks like there are loads of different plugins which can be added at a later date. So I went down that route.

Deploying an Ubuntu VM

From the Azure portal I selected to deploy a new VM from the Gallery:


Then picked the latest version of Ubuntu available in there:


Fill out the Virtual Machine configuration dialogue. Note: to use SSH key authentication you need to supply an X.509 certificate.


Enter some more configuration details on the next page, including which datacenter region to host the VM in. Also be sure to add an additional endpoint for the port the Minecraft game needs, 25565.


On the last page, check you are happy with the selections and then tick to go!


Sit back in comfort and wait for the VM to deploy:



Once complete, you’ll have a nice VM ready for you:



I hadn’t used Azure before and found it to be quite a nice experience overall.

Install Java

Connect with SSH to the VM and check whether Java is installed:


Install Java:

sudo apt-get install openjdk-7-jre-headless

then confirm it is installed:

java -version


Install Minecraft Spigot

*Full details can be found here*

Make a directory for downloading the Spigot build tools to and download the file:

sudo mkdir -p /opt/Minecraft/build

cd /opt/Minecraft/build/

sudo wget https://hub.spigotmc.org/jenkins/job/BuildTools/lastSuccessfulBuild/artifact/target/BuildTools.jar

Run BuildTools.jar from the terminal – this took about 10+ mins for me:

sudo git config --global --unset core.autocrlf

sudo java -jar BuildTools.jar



Eventually you should get something like this for a successful completion:


Create a new directory to host the compiled jar file and copy it there:

sudo mkdir /opt/Minecraft/play

sudo cp /opt/Minecraft/build/spigot-1.8.7.jar /opt/Minecraft/play/spigot.jar

Create a new startup script (start.sh) in the directory to launch the JAR:

sudo vi /opt/Minecraft/play/start.sh

paste the following code into the start.sh file:


java -Xms512M -Xmx1024M -XX:MaxPermSize=128M -jar spigot.jar

Make the start.sh script executable, then run it to perform the initial run of the server; you’ll be prompted that the EULA needs agreeing to:

sudo chmod +x start.sh

sudo ./start.sh


Edit eula.txt and set eula=true

sudo vi eula.txt



Run the startup script again and this time the server will start fully:

sudo ./start.sh


Typing help at this point will give a list of commands that can be used interactively. You can also set the configuration of the game by issuing a stop then editing the server.properties file.
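The server.properties contents aren’t shown in the post; a few commonly edited vanilla entries, shown purely as an illustration (these values are my examples, not the author’s):

```ini
# /opt/Minecraft/play/server.properties (sample entries)
server-port=25565
gamemode=0
difficulty=1
max-players=10
motd=A Minecraft Server
```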


Now would be a good time to test you have set things up correctly. Fire up the Minecraft game and head into Multiplayer. Add a server and enter the details:


Once complete, the server should appear as available to connect to:


Oh no, it’s night time already!


Run Spigot as a service

Now all we need to do is run the Spigot server as a service, rather than interactively, otherwise the game dies when we drop the SSH session.

(Note: I expect there is a better way to do this than what I came up with, but I’m by no means a Linux expert, so feel free to leave a comment if you have a better way)

Create a minecraft.service file

sudo vi /etc/systemd/system/minecraft.service

Paste the following into that file:

Description=Minecraft Server
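Only the Description line of the unit file survives in this copy of the post. A minimal sketch of a complete unit, assuming the paths created earlier (working directory, restart policy and install target are my assumptions, not necessarily what the author used):

```ini
[Unit]
Description=Minecraft Server
After=network.target

[Service]
WorkingDirectory=/opt/Minecraft/play
ExecStart=/opt/Minecraft/play/start.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
```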

Start the service and check the status:

sudo systemctl start minecraft.service

sudo systemctl status minecraft.service

All being well you should see the service begin to start up:




Onyx for the vSphere Web Client

The original Project Onyx (also here) was a VMware Fling which intercepted calls between the C# vSphere client and vCenter and could translate things you did in the GUI client into code in your choice of language: SOAP, C#, PowerCLI, or JavaScript. This was really useful if, say, something you needed to do was not yet covered by a PowerCLI cmdlet, or you wanted to try to make it faster by using Get-View, or you didn’t know how to code it in JavaScript for vRO.

An issue arose however with the release of the vSphere Web Client, since Project Onyx did not support that method of intercepting calls. When VMware started moving new functionality only into the Web Client, that meant you could not use Project Onyx to figure out the code for those new features.

Enter a new VMware Fling – Onyx for the vSphere Web Client!

Some smart guys have now brought this same functionality to the vSphere Web Client. Let’s take a look at an example of how it works.

First of all follow the instructions on the Fling site to get it installed. Currently it supports vCenter 6.0. Note the warning about only using this Fling in your test environment (that’s where you write and test all of your code anyway right? 😉 )
WARNING: This fling replaces core Web Client files and may cause issues with stability and
patching of future versions of the web client, please only continue with this installation if you are
using a test or dev environment.

Once installed and the vSphere Web Client service has restarted, you will see a new menu item for Onyx.


Currently there is only PowerCLI.NET available as a code output choice. I’m hoping that since it is a drop down menu, there are more to come. I’ve already put in a Feature Request for JavaScript for handy use with vRO 😉


You can either Start and Stop recording from this area or very handily they have added the same buttons in the top right hand corner so that you can Start and Stop recording from anywhere in the Web Client.


So start the recording and navigate to the thing that you want to do. For this example we’ll turn on HA for a cluster.


Once complete, stop the recording.


Now head back to the Onyx section of the Web Client and observe the code required to make that change.


$spec = New-Object VMware.Vim.ClusterConfigSpecEx
$spec.dasConfig = New-Object VMware.Vim.ClusterDasConfigInfo
$spec.dasConfig.vmComponentProtecting = 'disabled'
$spec.dasConfig.enabled = $true
$spec.dasConfig.admissionControlEnabled = $true
$spec.dasConfig.vmMonitoring = 'vmMonitoringDisabled'
$spec.dasConfig.hostMonitoring = 'enabled'
$spec.dasConfig.HBDatastoreCandidatePolicy = 'allFeasibleDsWithUserPreference'
$spec.dasConfig.admissionControlPolicy = New-Object VMware.Vim.ClusterFailoverLevelAdmissionControlPolicy
$spec.dasConfig.admissionControlPolicy.failoverLevel = 1
$spec.dasConfig.defaultVmSettings = New-Object VMware.Vim.ClusterDasVmSettings
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings = New-Object VMware.Vim.ClusterVmComponentProtectionSettings
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings.vmReactionOnAPDCleared = 'none'
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings.enableAPDTimeoutForHosts = $true
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings.vmStorageProtectionForAPD = 'disabled'
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings.vmTerminateDelayForAPDSec = 180
$spec.dasConfig.defaultVmSettings.vmComponentProtectionSettings.vmStorageProtectionForPDL = 'disabled'
$spec.dasConfig.defaultVmSettings.vmToolsMonitoringSettings = New-Object VMware.Vim.ClusterVmToolsMonitoringSettings
$spec.dasConfig.defaultVmSettings.vmToolsMonitoringSettings.failureInterval = 30
$spec.dasConfig.defaultVmSettings.vmToolsMonitoringSettings.maxFailures = 3
$spec.dasConfig.defaultVmSettings.vmToolsMonitoringSettings.maxFailureWindow = 3600
$spec.dasConfig.defaultVmSettings.vmToolsMonitoringSettings.minUpTime = 120
$spec.dasConfig.defaultVmSettings.restartPriority = 'medium'
$spec.dasConfig.defaultVmSettings.isolationResponse = 'none'
$spec.dasConfig.option = New-Object VMware.Vim.OptionValue[] (0)
$spec.dasConfig.heartbeatDatastore = New-Object VMware.Vim.ManagedObjectReference[] (0)
$spec.dasConfig.hBDatastoreCandidatePolicy = 'allFeasibleDsWithUserPreference'
$modify = $true
$_this = Get-View -Id 'ClusterComputeResource-domain-c22'
$_this.ReconfigureComputeResource_Task($spec, $modify)

This is super cool. I really like the way it’s been implemented and looking forward to further development of this Fling :-)


PowerShell Module for the Brickset API

Brickset.com is one of my favourite sites for finding info about Lego sets and keeping up-to-date with Lego news. I noticed recently that they had an API so I thought I would check it out. I posted a while back on how to do some similar stuff with other websites, but some of the functionality on those websites is no longer available.

This module for Brickset data contains functions for finding information about Lego sets and also makes it possible to download instruction manuals which can be pretty helpful if you have lost one of them or buy a 2nd hand set without the manuals.

1) Installation

Download the module from GitHub, ensure the files are unblocked and copy it to one of your PowerShell module folder paths. For me that is C:\Users\jmedd\Documents\WindowsPowerShell\Modules.


If you’ve done it successfully then you should be able to see details of the module with Get-Module:

Get-Module Brickset -ListAvailable


You can take a look at the commands available by importing the module and then using Get-Command:

Import-Module Brickset

Get-Command -Module Brickset



2) Get a Brickset API Key

To use the Brickset API you need to first of all register for an API key. Fill out the form here and they will send you one. Currently they are free.

All of the Get-* functions in the module require you to use the API key; you won’t get very far without one.

3) Set a Global Variable for the API Key

Each of the Get-* functions has an APIKey parameter. However, to make things easier to use I’ve also added the Set-BricksetAPIKey function which will create a global variable for the API key that will be valid for the duration of that PowerShell session. If you don’t supply an API key to the Get-* functions then they will look for the global key having been set.

So the best thing to do is run this function first, with your API key, to set the API key variable:

Set-BricksetAPIKey -APIKey 'ed4w-KQA2-1Ps2' -Confirm:$false



4) Getting All Lego Sets by Theme

The first example uses two of the Get-* functions, Get-BricksetTheme and Get-BricksetSet. Together they can be used to retrieve a list of Brickset themes and then the sets in a theme. Let’s take a look:

First of all, use Get-BricksetTheme to retrieve all available Themes from the Brickset site (remember you don’t need to specify the API key if it has already been set):

Get-BricksetTheme | Format-Table Theme,SetCount -AutoSize


Now say we want to retrieve all of the sets for the Indiana Jones theme we can use Get-BricksetSet with the Theme parameter:

Tip: You can use the OrderBy parameter to sort by a particular value. This may be more efficient than piping it through to Sort-Object afterwards:

Note: there is a lot more information available for each set than displayed here. I’ve restricted the output to a few key properties.

Get-BricksetSet -Theme 'Indiana Jones' -OrderBy Pieces | Format-Table Name,Number,Theme,SubTheme,Year,Pieces -AutoSize


5) Getting All Lego Sets by Theme and Year

If you want the search to be more specific then you can specify multiple search criteria. For instance one of the largest ranges is the Star Wars theme. Let’s have a look and see what sets have been released this year so far:

Get-BricksetSet -Theme 'Star Wars' -Year 2015 -OrderBy Pieces | Format-Table Name,Number,Theme,SubTheme,Year,Pieces -AutoSize



6) Get a Specific Set by Set Number

All Lego sets are assigned a number by Lego. Use the SetNumber parameter of Get-BricksetSet to retrieve details of a specific set.

Note: the API requires that you supply the set number in the format {number}-{variant}, e.g. 7199-1. If you see the number on the front of the box as say 7199, typically you will need to use 7199-1 for the first variant of that set. However, be careful as this will not always be the case.


Get-BricksetSet -SetNumber '7199-1'



7) Download Lego Set Instructions

The function Get-BricksetSetInstructions will return URLs for links to the set’s instructions on the Lego.com website.

Note: the parameter to use is SetId not SetNumber. SetId is Brickset’s own reference number for the set and not the Lego Set Number we used in the previous example. Use Get-BricksetSet to find this info first. I’ve used PowerShell pipeline parameter binding in the functions so that you can do this easily:

Get-BricksetSet -SetNumber '7199-1' | Get-BricksetSetInstructions


Note: this example returned two pdfs, since it is a large set and had two printed manuals in the box.

If you want to open them straight into your web browser of choice you could do this:

Get-BricksetSet -SetNumber '7199-1' | Get-BricksetSetInstructions | Select-Object -ExpandProperty url | ForEach-Object {Invoke-Expression "cmd.exe /C start $_"}

or if you wanted to say download all of the instructions for a particular theme you could do this using the built-in Windows cmdlet Start-BitsTransfer:

Get-BricksetSet -Theme 'Indiana Jones' | Get-BricksetSetInstructions | Select-Object -ExpandProperty url | ForEach-Object {$file = ($_.split('/'))[-1]; Start-BitsTransfer -Source $_ -Destination "C:\Users\jmedd\Documents\$file"}


8) Other Module Functions

There are a few other functions in the module for finding information around Images, Set Reviews and Recently Updated Sets.

I hope you have fun with this module :-)


PowerShell: Default Value of [Int]$a is 0

I didn’t realise until yesterday that a variable declared with the [int] type has a default value of 0, e.g.:



So what’s the big deal? Say you have a function as follows:

function Test-DefaultInt {

    param (
        [int]$a,
        [int]$b,
        [int]$c
    )

    "a is $a"
    "b is $b"
    "c is $c"
}

and run it like so

Test-DefaultInt -a 5 -b 6 -c 7

Test-DefaultInt -a 5 -b 6

You get the following results:


If you change the function and take away the [int] from $c:

function Test-DefaultInt {

    param (
        [int]$a,
        [int]$b,
        $c
    )

    "a is $a"
    "b is $b"
    "c is $c"
}

and run the same tests you get the following:


with no value for c. OK, it’s a contrived example, but I had a reasonably complex situation to deal with, and the difference between a variable holding the value 0 or not being present at all inside the workings of the function made a big difference to the output. Having wasted about an hour last night on it, I hope this saves someone else a headache :-)


Can’t Clone Git Repository in SourceTree: Failed to connect….No error

While attempting to clone a Git based repository in Stash via the SourceTree client, I received the following error:

fatal: unable to access 'http://[email protected]:443/repo.git/': Failed connect to company.local:443; No error

No error obviously doesn’t give many clues, however this post suggested something to do with proxy settings.

My web browsers were configured to use an internal web proxy via an automatic configuration script and I could successfully navigate to the repository via a web browser. My SourceTree client appeared to be configured correctly since it was set to use the same proxy settings:


The aforementioned post suggested that it was actually the Git command line tools which also need to be configured to use a proxy and the handy checkbox Add proxy server configuration to Git / Mercurial does that for you.
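Under the hood that checkbox writes an http.proxy entry into the global Git configuration. The resulting .gitconfig fragment looks roughly like this (the proxy address is a made-up placeholder):

```shell
# Recreate the [http] proxy entry by hand in a scratch file, so nothing
# global is touched; the proxy host and port are placeholders
cat > gitconfig.example <<'EOF'
[http]
	proxy = http://proxy.company.local:8080
EOF

grep 'proxy' gitconfig.example
```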


I configured that setting and lo and behold could then clone the repository.


vCloud Automation Center Designer: Setup was Interrupted

If you have the (mis)fortune to work with the vCloud / vRA Automation Center Designer, which is still used for, and indeed required by, some elements of automation within vCAC / vRA, then you may experience issues even installing it.

While attempting to install the Designer client on two different management servers (Windows 2008 R2 and 2012 R2), I received the same error:

vCloud Automation Center Designer: Setup was Interrupted

with no real indication of what the problem was. (Familiar installation story vCAC fans? :-) )

A log file is created in C:\vcacLog from which I got:

MSI (c) (10:BC) [16:39:50:721]: Product: VMware vCloud Automation Center Designer — Installation failed.

MSI (c) (10:BC) [16:39:50:721]: Windows Installer installed the product. Product Name: VMware vCloud Automation Center Designer. Product Version: Product Language: 1033. Manufacturer: VMware, Inc.. Installation success or error status: 1602.

Not too much info around for that error, other than this communities post. I had tried installing it on different OS versions, also made sure .NET 4.5 was installed (4.5.2 in the end), had local admin rights and UAC was turned off.

Eventually I stumbled on this VMware KB with a similar issue. So I added the certificate from the IaaS server (https://iaas.fqdn/Repository) to the Trusted Root Certification Authorities store on the machine Designer was being installed onto.

Note: this was in a lab environment so I was only using self-generated certs throughout vRA.

A subsequent reattempt of the Designer install was successful.


Curiously, a subsequent reattempt on the other management server, without importing the certificate, was also then successful. Possibly it is only needed for the first time any client is setup to use it.




PowerShell Requires -Modules: ModuleVersion “Argument must be constant”

I was looking to make use of a PowerShell feature which prevents a script from running without the required elements. Detailed here (about_Requires) there are examples for requiring specific versions of PowerShell, snap-ins and modules, and administrative rights.

In particular I was looking at the modules example given in the documentation:

#Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion=1.0.0.0}

Unfortunately, using this example as is given generates an error Argument must be constant:

C:\Scripts\Scratch> .\Test-Requires.ps1
At C:\Scripts\Scratch\Test-Requires.ps1:1 char:20
+ #Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion=1.0.0 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Argument must be constant.
+ CategoryInfo : ParserError: (:) [], ParseException
+ FullyQualifiedErrorId : RequiresArgumentMustBeConstant


The correct syntax for the example should read:

#Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion="1.0.0.0"}

i.e. quotes around the ModuleVersion number. So in a contrived example where I bump the required version number up, running the script now gives me the response I am looking for:


I logged a documentation bug here.

Publish IaaS Blueprint in vRO via the vRA REST API

There is an excellent post over at Automate-IT.today which details how to create a vRA IaaS Blueprint from vRO. Once you have used the workflow from that site to create a Blueprint it still needs to be published before it can be used as a vRA Catalog Item, added to a Service etc.


Note that even updating Christiaan Roeleveld’s code to set the property IsPublished to true doesn’t actually publish the Blueprint. Although in the screenshot above it appears to be published, it doesn’t actually show up in Administration / Catalog Items yet.

I needed to be able to do this and found that it is possible via the vRA REST API. Check out the PUT request to “Register a ProviderCatalogItem or update an already registered one.”

1) Get an authentication token

To achieve this in vRO you will first of all need to obtain an authentication token; I detailed how to do that in a previous post.

2) Create a REST operation for Register a ProviderCatalogItem

Run the Add a REST operation workflow and populate with the REST host and PUT request details, using the URL from the above documentation: /catalog-service/api/provider/providers/{providerId}/catalogItems/{bindingId}


3) Generate a workflow for Register a ProviderCatalogItem

Run the Generate a new workflow from a REST operation workflow. Populate with the REST operation created above and set the Content type to application/json .


Give it a name and select a folder to store it in.


4) Update the generated workflow with a token input and additional headers

You’ll need to edit the Scriptable Task in the generated workflow. Add an extra input parameter of token, type string.



On the Scripting tab, add the following lines of code to include the authentication token and an Accept header:

var authorizationToken = "Bearer " + token;

request.setHeader("Accept", "application/json");
request.setHeader("Authorization", authorizationToken);

5) Get the IaaS Provider ID

The observant among you will have noticed that from the PUT URL, /catalog-service/api/provider/providers/{providerId}/catalogItems/{bindingId}, we need to supply a providerId and a bindingId. The providerId for our example is the ID of the IaaS provider. This can be determined via a separate REST call.

6) Create a REST operation for Get Providers

Run the Add a REST operation workflow and populate with the REST host and GET request details for this URL: /catalog-service/api/providers


7) Generate a workflow for Get Providers

Run the Generate a new workflow from a REST operation workflow. Populate with the REST operation created above.


Give it a name and select a folder to store it in.



8) Update the generated workflow with a token input and additional headers

You’ll need to edit the Scriptable Task in the generated workflow. Add an extra input parameter of token, type string.


On the Scripting tab, add the following lines of code to include the authentication token and an Accept header:

var authorizationToken = "Bearer " + token;

request.setHeader("Accept", "application/json");
request.setHeader("Authorization", authorizationToken);

Note: the response you will receive to GET providers will be something like the following. We are interested in the id property of the iaas-service provider:

{
  "links": [],
  "content": [
    {
      "@type": "Provider",
      "id": "934c88ec-607e-415b-ad38-1290c30d8610",
      "name": "iaas-service",
      "providerTypeId": "com.vmware.csp.iaas.blueprint.service"
    },
    {
      "@type": "Provider",
      "id": "0dd55fd2-2b1a-432e-89af-b77cb6f41b15",
      "name": "Advanced Designer Service",
      "providerTypeId": "com.vmware.csp.core.designer.service"
    }
  ],
  "metadata": {
    "size": 20,
    "totalElements": 2,
    "totalPages": 1,
    "number": 1,
    "offset": 0
  }
}

9) Get the Binding ID

The other item we will need to provide is the Binding ID. Thanks to Christiaan for pointing out that this is the virtualMachineTemplateId property of the IaaS Blueprint.


10) Publishing the Blueprint

Armed with the following info, we are now able to run the New-ProviderCatalogItem workflow:

  • Authentication token
  • ProviderId
  • BindingID

We also need to send the following JSON text with the PUT request as detailed here. Replace the following values:

  • id – virtualMachineTemplateId / BindingID
  • name – Catalog Item name
  • description – Catalog Item description
  • tenantRef – vRA Tenant name
  • tenantLabel – vRA Tenant name

{
  "id": "357b9240-62bd-4201-9be0-b3c6180643b9",
  "name": "Centos-Small",
  "description": "Centos-Small",
  "iconId": "cafe_default_icon_genericCatalogItem",
  "catalogItemTypeId": "Infrastructure.Virtual",
  "organization": {
    "tenantRef": "Tenant10",
    "tenantLabel": "Tenant10",
    "subtenantRef": null,
    "subtenantLabel": null
  },
  "outputResourceTypeId": "Infrastructure.Virtual",
  "forms": null
}

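Pulling the pieces together, the PUT URL can be assembled from the two IDs gathered above. A sketch of doing this outside vRO with curl, where the host name is a placeholder and the IDs are the sample values from this post:

```shell
# Assemble the PUT URL; providerId is the iaas-service id from the providers
# response, bindingId is the Blueprint's virtualMachineTemplateId
VRA="https://vra.domain.local"
PROVIDER_ID="934c88ec-607e-415b-ad38-1290c30d8610"
BINDING_ID="357b9240-62bd-4201-9be0-b3c6180643b9"
URL="$VRA/catalog-service/api/provider/providers/$PROVIDER_ID/catalogItems/$BINDING_ID"
echo "$URL"

# The call itself (not run here) would be along the lines of:
# curl -k -X PUT "$URL" \
#      -H "Authorization: Bearer $TOKEN" \
#      -H "Content-Type: application/json" \
#      -H "Accept: application/json" \
#      -d @catalogitem.json
```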
Run the workflow and enter these details:




A successful workflow run will see the blueprint published to a catalog item.



vRA: Returning a Catalog Item from a Blueprint ID in vRO

After creating a Blueprint in vRA it is necessary to publish the Blueprint into the Catalog so that it can be consumed by the appropriate set of users. This creates a link between the two different items since the Catalog Item is part of the vRA appliance and the Blueprint can be found in the Windows appliance.

Here’s the Blueprint details from the Inventory tab of vRO, with the virtualMachineTemplateID, a.k.a the Blueprint ID.



Now look at the Binding id on the Catalog Item in vRO to see matching IDs.


Say you now have the BlueprintID and need to return the Catalog Item, how would you do it? The following JavaScript code from a vRO Action I created for this shows you how:

var vCACAdminCatalogItems = Server.findAllForType("vCACCAFE:AdminCatalogItem");

for each (var vCACAdminCatalogItem in vCACAdminCatalogItems) {

    try {
        var providerBinding = vCACAdminCatalogItem.providerBinding;
        var bindingId = providerBinding.getBindingId().toString();

        if (bindingId == blueprintId) {
            var adminCatalogItem = vCACAdminCatalogItem;
        }
    } catch (ex) {
        // ignore catalog items without a provider binding
    }
}

return adminCatalogItem;

The getCatalogItemByBlueprintID Action can be downloaded from GitHub.

I put the action into a Test Workflow to demonstrate how it works. This is the Scripting Tab of the action.


Schema of the test workflow:


The Action will output an item of type vCACCAFE:AdminCatalogItem:


The Scriptable Task in the workflow will simply write the ID of the Catalog Item to the system log. We know we should be expecting to see 4e5fe6ee-e8c6-4fcb-8458-9cdbf2cfd465, the ID of the Catalog Item:




Success :-)