Can’t Clone Git Repository in SourceTree: Failed to connect….No error

While attempting to clone a Git repository hosted in Stash via the SourceTree client, I received the following error:

fatal: unable to access ‘http://[email protected]:443/repo.git/': Failed connect to company.local:443; No error

“No error” obviously doesn’t give many clues; however, this post suggested the problem was something to do with proxy settings.

My web browsers were configured to use an internal web proxy via an automatic configuration script and I could successfully navigate to the repository via a web browser. My SourceTree client appeared to be configured correctly since it was set to use the same proxy settings:


The aforementioned post suggested that it is actually the Git command-line tools which also need to be configured to use a proxy, and the handy checkbox Add proxy server configuration to Git / Mercurial does that for you.


I configured that setting and, lo and behold, could then clone the repository.


vCloud Automation Center Designer: Setup was Interrupted

If you have the misfortune (or good fortune) to work with the vCloud Automation Center / vRA Designer, which can still be used for, and is indeed required by, some elements of automation within vCAC / vRA, then you may experience issues even installing it.

While attempting to install the Designer client on two different management servers (Windows 2008 R2 and 2012 R2), I received the same error:

vCloud Automation Center Designer: Setup was Interrupted

with no real indication of what the problem was. (Familiar installation story vCAC fans? :-) )

A log file is created in C:\vcacLog from which I got:

MSI (c) (10:BC) [16:39:50:721]: Product: VMware vCloud Automation Center Designer — Installation failed.

MSI (c) (10:BC) [16:39:50:721]: Windows Installer installed the product. Product Name: VMware vCloud Automation Center Designer. Product Version: Product Language: 1033. Manufacturer: VMware, Inc.. Installation success or error status: 1602.

Not too much info around for that error, other than this Communities post. I had tried installing it on different OS versions, made sure .NET 4.5 was installed (4.5.2 in the end), had local admin rights, and UAC was turned off.

Eventually I stumbled on this VMware KB with a similar issue. So I added the certificate from the IaaS server (https://iaas.fqdn/Repository) to the Trusted Root Certification Authorities store on the machine Designer was being installed onto.

Note: this was in a lab environment so I was only using self-generated certs throughout vRA.

A subsequent reattempt of the Designer install was successful.


Curiously, a subsequent reattempt on the other management server, without importing the certificate, was also then successful. Possibly it is only needed the first time any client is set up to use it.




PowerShell Requires -Modules: ModuleVersion “Argument must be constant”

I was looking to make use of a PowerShell feature which prevents a script from running without the required elements. Detailed here (about_Requires) there are examples for requiring specific versions of PowerShell, snap-ins and modules, and administrative rights.

In particular I was looking at the modules example given in the documentation:

#Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion=1.0.0.0}

Unfortunately, using this example as given generates an error, Argument must be constant:

C:\Scripts\Scratch> .\Test-Requires.ps1
At C:\Scripts\Scratch\Test-Requires.ps1:1 char:20
+ #Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion=1.0.0 ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Argument must be constant.
+ CategoryInfo : ParserError: (:) [], ParseException
+ FullyQualifiedErrorId : RequiresArgumentMustBeConstant


The correct syntax for the example should read:

#Requires -Modules PSWorkflow, @{ModuleName="PSScheduledJob";ModuleVersion="1.0.0.0"}

i.e. quotes around the ModuleVersion number. So in a contrived example where I bump the required version number above the installed one, running the script now gives me the response I am looking for:


I logged a documentation bug here.

Publish IaaS Blueprint in vRO via the vRA REST API

There is an excellent post which details how to create a vRA IaaS Blueprint from vRO. Once you have used the workflow from that site to create a Blueprint it still needs to be published before it can be used as a vRA Catalog Item, added to a Service etc.


Note that even updating Christiaan Roeleveld‘s code to set the property IsPublished to true, doesn’t actually publish the Blueprint. Although in the screenshot above it appears to be published, it doesn’t actually show up in Administration / Catalog Items yet.

I needed to be able to do this and found that it is possible via the vRA REST API. Check out the PUT request to “Register a ProviderCatalogItem or update an already registered one.”

1) Get an authentication token

To achieve this in vRO you will first of all need to obtain an authentication token, I detailed how to do that in a previous post.

2) Create a REST operation for Register a ProviderCatalogItem

Run the Add a REST operation workflow and populate with the REST host and PUT request details, using the URL from the above documentation: /catalog-service/api/provider/providers/{providerId}/catalogItems/{bindingId}


3) Generate a workflow for Register a ProviderCatalogItem

Run the Generate a new workflow from a REST operation workflow. Populate with the REST operation created above and set the Content type to application/json.


Give it a name and select a folder to store it in.


4) Update the generated workflow with a token input and additional headers

You’ll need to edit the Scriptable Task in the generated workflow. Add an extra input parameter of token, type string.



On the Scripting tab, add the following lines of code to include the authentication token and an Accept header:

var authorizationToken = "Bearer " + token;

request.setHeader("Accept", "application/json");
request.setHeader("Authorization", authorizationToken);

5) Get the IaaS Provider ID

The observant among you will have noticed that from the PUT URL, /catalog-service/api/provider/providers/{providerId}/catalogItems/{bindingId}, we need to supply a providerId and a bindingId. The providerId for our example is the ID of the IaaS provider. This can be determined via a separate REST call.
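Putting those two IDs together, the operation URL is just string substitution on the template. A minimal sketch (the IDs below are placeholders of the same shape as the examples elsewhere in this post):

```javascript
// Template URL for the Register a ProviderCatalogItem operation
var urlTemplate = "/catalog-service/api/provider/providers/{providerId}/catalogItems/{bindingId}";

// Placeholder IDs: providerId comes from GET /catalog-service/api/providers,
// bindingId is the Blueprint's virtualMachineTemplateId
var providerId = "934c88ec-607e-415b-ad38-1290c30d8610";
var bindingId = "357b9240-62bd-4201-9be0-b3c6180643b9";

// Substitute both placeholders into the template
var operationUrl = urlTemplate
    .replace("{providerId}", providerId)
    .replace("{bindingId}", bindingId);
```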

6) Create a REST operation for Get Providers

Run the Add a REST operation workflow and populate with the REST host and GET request details for this URL: /catalog-service/api/providers


7) Generate a workflow for Get Providers

Run the Generate a new workflow from a REST operation workflow. Populate with the REST operation created above.


Give it a name and select a folder to store it in.



8) Update the generated workflow with a token input and additional headers

You’ll need to edit the Scriptable Task in the generated workflow. Add an extra input parameter of token, type string.


On the Scripting tab, add the following lines of code to include the authentication token and an Accept header:

var authorizationToken = "Bearer " + token;

request.setHeader("Accept", "application/json");
request.setHeader("Authorization", authorizationToken);

Note: the response you will receive from GET providers will be something like the following. We are interested in the id property of the iaas-service entry:

 "links": [],
 "content": [
 "@type": "Provider",
 "id": "934c88ec-607e-415b-ad38-1290c30d8610",
 "name": "iaas-service",
 "providerTypeId": "com.vmware.csp.iaas.blueprint.service"
 "@type": "Provider",
 "id": "0dd55fd2-2b1a-432e-89af-b77cb6f41b15",
 "name": "Advanced Designer Service",
 "providerTypeId": "com.vmware.csp.core.designer.service"
 "metadata": {
 "size": 20,
 "totalElements": 2,
 "totalPages": 1,
 "number": 1,
 "offset": 0

9) Get the Binding ID

The other item we will need to provide is the Binding ID. Thanks to Christiaan for pointing out that this is the virtualMachineTemplateId property of the IaaS Blueprint.


10) Publishing the Blueprint

Armed with the following info, we are now able to run the New-ProviderCatalogItem workflow:

  • Authentication token
  • ProviderId
  • BindingID

We also need to send the following JSON text with the PUT request as detailed here. Replace the following values:

  • id – virtualMachineTemplateId / BindingID
  • name – Catalog Item name
  • description – Catalog Item description
  • tenantRef – vRA Tenant name
  • tenantLabel – vRA Tenant name

 "id": "357b9240-62bd-4201-9be0-b3c6180643b9",
 "name": "Centos-Small",
 "description": "Centos-Small",
 "iconId": "cafe_default_icon_genericCatalogItem",
 "catalogItemTypeId": "Infrastructure.Virtual",
 "organization": {
 "tenantRef": "Tenant10",
 "tenantLabel": "Tenant10",
 "subtenantRef": null,
 "subtenantLabel": null
 "outputResourceTypeId": "Infrastructure.Virtual",
 "forms": null

Run the workflow and enter these details:




A successful workflow run will see the blueprint published to a catalog item.



vRA: Returning a Catalog Item from a Blueprint ID in vRO

After creating a Blueprint in vRA it is necessary to publish the Blueprint into the Catalog so that it can be consumed by the appropriate set of users. This creates a link between the two different items, since the Catalog Item is part of the vRA appliance while the Blueprint is found on the Windows (IaaS) side.

Here’s the Blueprint details from the Inventory tab of vRO, with the virtualMachineTemplateID, a.k.a the Blueprint ID.



Now look at the Binding id on the Catalog Item in vRO to see matching IDs.


Say you now have the BlueprintID and need to return the Catalog Item, how would you do it? The following JavaScript code from a vRO Action I created for this shows you how:

var vCACAdminCatalogItems = Server.findAllForType("vCACCAFE:AdminCatalogItem");
var adminCatalogItem;

for each (var vCACAdminCatalogItem in vCACAdminCatalogItems) {

    try {
        var providerBinding = vCACAdminCatalogItem.providerBinding;
        var bindingId = providerBinding.getBindingId().toString();

        if (bindingId == blueprintId) {
            adminCatalogItem = vCACAdminCatalogItem;
        }
    } catch (ex) {
        // Some catalog items have no provider binding; skip those
    }
}

return adminCatalogItem;

The getCatalogItemByBlueprintID Action can be downloaded from GitHub.

I put the action into a Test Workflow to demonstrate how it works. This is the Scripting Tab of the action.


Schema of the test workflow:


The Action will output an item of type vCACCAFE:AdminCatalogItem:


The Scriptable Task in the workflow will simply write the ID of the Catalog Item to the system log. We know we should be expecting to see 4e5fe6ee-e8c6-4fcb-8458-9cdbf2cfd465, the ID of the Catalog Item:




Success :-)


Working with the vRealize Automation REST API via vRealize Orchestrator

As of vRealize Automation version 6.2.1 there are a few different approaches to automating elements of the product itself, as opposed to using it for the automation tasks it is designed to help you with. This is along the lines of configuring elements within vRA, some of which I have covered previously within this blog post series. That series focused on using the vRA plugin for vRealize Orchestrator. However, the plugin doesn’t cover everything that you might need to automate within the product. Things are also not helped by the fact that vRA itself at this time is in a split-brain state making some parts of it hard to automate.

The good news is that elements which belong to the vRA Appliance side of the split-brain and are not in the vRO plugin, may well be covered by the vRA REST API. This blogpost from Kris Thieler is a really useful guide to getting started with the vRA REST API.

Taking elements from that post, I have applied them for use in vRO, i.e. I want to be able to run workflows in vRO to use the vRA REST API.

Getting an Authentication Token

The Getting Started blogpost demonstrates that to authenticate with the vRA REST API requires first of all generating an authentication token which can then be used for all subsequent REST requests for up to a 24 hour period.

My previous experience with using REST within vRO had been straightforward cases of adding a REST Host via the Add a REST host configuration workflow and supplying a set of credentials at that point which would then be used for each request. This approach was obviously not going to work in this instance.

The following is the procedure I came up with to work with authentication tokens; more than happy for comments on this post for easier or better ways to do it :-)

First of all run the Add a REST host configuration workflow with the vRA appliance set as the target URL and set the authentication method to None.






Next step is to add a REST operation with the query to generate a token. It’s a POST request to the URL /identity/api/tokens.


This will create an operation which is viewable from the Inventory view:


Now we need to create our own Workflow based on that REST operation. Run the Library workflow Generate a new workflow from a REST operation and select the REST operation just created:


I’ve named it Request-Token and am storing it in the Test vRO folder.


We need to modify this workflow to add an extra header required by the API. The Getting Started blogpost shows that we need an Accept header, Accept: application/json. (In this previous post I demonstrate how to add headers.) On the Scripting tab add the following code:

request.setHeader("Accept", "application/json");


Once successfully complete, we can make use of it to generate a token. Create a new workflow Request-vRAToken which will take inputs of the info we need to generate a token (vRA username, password and Tenant name) and use the Request-Token workflow to send the request to generate it.


Set inputs for Request-vRAToken to be:

  • username – String
  • password – SecureString
  • tenant – String


Add a scriptable task to the schema, Create POST text, and set the inputs to be the parameters just created. This task will generate the text we need to send as part of the POST request.


Set an attribute output as:

  • postText – String


On the Scripting tab add the following code:

var postText = "{\"username\":\"" + username + "\",\"password\":\"" + password + "\",\"tenant\":\"" + tenant + "\"}";

System.log("PostText is: " + postText);

Note: once you are happy this is working, it would be worth removing the System.log line so that the password is not echoed into the logs.
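As an aside, the same POST text can be built with JSON.stringify, which escapes any quotes or backslashes in the password for you. A sketch with placeholder credentials:

```javascript
// Placeholder inputs; in the workflow these come from the
// username, password and tenant input parameters
var username = "tenantadmin@domain.local";
var password = "P@ssword";
var tenant = "Tenant01";

// JSON.stringify quotes and escapes the values automatically
var postText = JSON.stringify({
    username: username,
    password: password,
    tenant: tenant
});
```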


Close the scriptable task and add a Workflow element next in the schema, selecting the Request-Token workflow previously created.  Set the input as the postText attribute:


Set output attributes to match the standard REST output names:


Close the workflow settings and add a final scriptable task, Output Token. For inputs select contentAsString :


Create an output parameter token, which we will use to get the token out of the workflow:


On the Scripting tab add the following code to parse the JSON response from the vRA API and pick out the token:

var jsonResponse = JSON.parse(contentAsString);

var token =;

System.log("Token is: " + token);


Close the scriptable task and the schema will look like this:


Save and close the workflow. Then run it, supplying credentials and a tenant name:


All being well, we’ll get a successful run of the workflow and a generated token:

[2015-05-15 14:50:15.557] [I] PostText is: {"username":"[email protected]","password":"P@ssword","tenant":"Tenant01"}
[2015-05-15 14:50:15.609] [I] Request: DynamicWrapper (Instance) : [RESTRequest]-[class] — VALUE :
[2015-05-15 14:50:15.609] [I] Request URL: https://vraap01.vrademo.local/identity/api/tokens
[2015-05-15 14:50:16.030] [I] Response: DynamicWrapper (Instance) : [RESTResponse]-[class] — VALUE :
[2015-05-15 14:50:16.031] [I] Status code: 200
[2015-05-15 14:50:16.031] [I] Content as string: {"expires":"2015-05-16T13:51:55.456Z","id":"MTQzMTY5NzkxNTQ1NDowMGZiNWUyMmNlZjI2ZTI1MTAzYTp0ZW5hbnQ6VGVuYW50MDF1c2VybmFtZTp0ZW5hbnRhZG1pbjAxQHZyYWRlbW8ubG9jYWw6ODVmZDE4MGM2ZTkzZjBkOGRlMzk3MzhkNTQ0NWRlNTU2YjI0ZjFmZmI2OThlNmZjZjI2ZDExZThhNjI0MzY5YzBmMTUzY2Q4M2QwY2JhMjE0ZmRlYjYzNzJjZWEzNTY2YzAzNDFhZGJjOTdkMmI3ZGVmMTY0NjY1OGM2MjE4NmE=","tenant":"Tenant01"}
[2015-05-15 14:50:16.113] [I] Token is: MTQzMTY5NzkxNTQ1NDowMGZiNWUyMmNlZjI2ZTI1MTAzYTp0ZW5hbnQ6VGVuYW50MDF1c2VybmFtZTp0ZW5hbnRhZG1pbjAxQHZyYWRlbW8ubG9jYWw6ODVmZDE4MGM2ZTkzZjBkOGRlMzk3MzhkNTQ0NWRlNTU2YjI0ZjFmZmI2OThlNmZjZjI2ZDExZThhNjI0MzY5YzBmMTUzY2Q4M2QwY2JhMjE0ZmRlYjYzNzJjZWEzNTY2YzAzNDFhZGJjOTdkMmI3ZGVmMTY0NjY1OGM2MjE4NmE=

Using the Authentication Token in other API Requests

Now that we have a mechanism for generating a token, let’s look at an example using the token. The vRA API details a GET request for retrieving all custom groups and SSO groups that correspond to a specified search criteria. For a simple example we can run a GET request against the URL /identity/api/tenants/{tenantId}/groups using tenantId as a parameter.

Firstly we need a REST operation for that URL. Run the Add a REST operation workflow to create an operation Get-Groups:


We now have an additional operation available:


We need a workflow for it, so run the Generate a new workflow from a REST operation workflow:


Give it a name Get-TenantGroups and again put it in the Test folder:


We need to modify this workflow to use the same Accept header added previously and also the authentication token. Add an extra input:

  • token – String



Add the token parameter as an input to the existing scriptable task:


Modify that scriptable task and set the contentType to application/json:

request.contentType = "application/json";

Then add the following code for the Accept and Authorization headers:

var authorizationToken = "Bearer " + token;

request.setHeader("Accept", "application/json");
request.setHeader("Authorization", authorizationToken);


Save and close the workflow changes. Now we can create a workflow Get-vRATenantGroups to put all of the component pieces in place:


Create inputs for username, password and tenant – for future use outside of this example, you might want to think about storing these as vRO Configuration Items instead.

  • username – String
  • password – SecureString
  • tenant – String


In the schema add the Request-vRAToken workflow. Set inputs to match the input parameters:


Set the token output to be an attribute token in this workflow:


Close the tab. Add the Get-TenantGroups workflow to the schema. Set the inputs to be the tenant parameter and the token attribute:


Set the outputs to be the standard REST attribute outputs:


Close the tab. Finally, add a scriptable task to parse the results of the JSON response. For this example we will just output the names of the groups. For the inputs select contentAsString:


On the Scripting tab add the following code:

var jsonResponse = JSON.parse(contentAsString);

var groups = jsonResponse.content;

for each (var group in groups) {

    var name =;
    System.log("Name is: " + name);
}



Save and close the workflow. Then run it with suitable parameters:


A successful workflow run will see something similar output to the logs:

[2015-05-15 16:35:54.485] [I] Name is: ExternalIDPUsers
[2015-05-15 16:35:54.485] [I] Name is: ActAsUsers
[2015-05-15 16:35:54.485] [I] Name is: SolutionUsers
[2015-05-15 16:35:54.486] [I] Name is: TenantAdmins01
[2015-05-15 16:35:54.486] [I] Name is: Users
[2015-05-15 16:35:54.486] [I] Name is: Tenant01_Approvers
[2015-05-15 16:35:54.486] [I] Name is: Administrators
[2015-05-15 16:35:54.486] [I] Name is: TenantUsers01
[2015-05-15 16:35:54.486] [I] Name is: TestCustom01
[2015-05-15 16:35:54.486] [I] Name is: TestCustom03
[2015-05-15 16:35:54.486] [I] Name is: TestCustom02
[2015-05-15 16:35:54.486] [I] Name is: TenantInfraAdmins01


Using the vRO 2.0 Plugin for Active Directory to Work with Multiple Domains

When working with vRealize Orchestrator and Active Directory it has been possible for a long time to use the built-in Active Directory plugin for many tasks. One of the drawbacks with the various iterations of the 1.0.x version of the plugin, however, was the lack of support for multiple domains and multiple domain controllers. This was naturally quite restrictive in environments with more than a single domain, which is pretty common for many reasons, such as distributed management, mergers & takeovers, and poor planning 😉

These issues are addressed in version 2.0 of the plugin, which also supports the latest release of vRO, 6.0.1.

Getting Started

Version 2.0 of the AD plugin did not ship as part of the 6.0.1 vRO release, so it needs to be downloaded and upgraded. In vRO 6.0.1 the version of the AD plugin is




So, firstly download the 2.0 version of the AD plugin and copy the file to somewhere accessible from the vRO Configuration Website. From within the Configuration Website navigate to the Plug-ins page and the Install new plug-in section. Select the downloaded plugin file and choose Upload and install.


Accept the License Agreement


All being well you will be informed that the existing plugin was overwritten and the plugin will be installed at next server startup.


Restart the vRO service to complete the installation.


Once complete, the version of the plugin should show as 2.0:



Log in to vRO with the Client and navigate to Library / Microsoft / Active Directory / Configuration. If you used previous versions of the plugin, you will notice some changes in this folder:

Version 1.0.x




Run the Add an Active Directory server workflow and configure it for a domain controller in the first domain.



Use a shared session and ideally a dedicated service account with permissions in that AD domain to do what it needs to do:


If everything supplied is correct, then you should receive a successful workflow run:


and then be able to browse through the domain on the Inventory tab:


To add a domain controller from a second domain, run the Add an Active Directory server workflow again. I’m using a DC from a child domain:


Again, with a successful workflow run you should see the green tick:


and on the Inventory tab it is now possible to browse multiple domains! (Woo hoo – you should be saying at this point, it’s quite a big deal if you’ve been waiting for this functionality :-) )


Use Case

Consider an example where you need to create an Organizational Unit in both AD domains. Prior to version 2.0 of the AD plugin you would have needed to either use multiple vRO servers or, more likely, some PowerShell scripting instead.

Create a top-level workflow, New-ADOUinMultipleDomains:


On the Inputs tab create an input ouName:

On the Schema tab drag in the Create an organizational unit Library workflow:


On the In tab of the Create an organizational unit Library workflow ouName should be automatically populated with the Input parameter of the same name; if not, make it so:


For ouContainer create an Input Parameter of the workflow parentDomainContainer :




On the Out tab set newOU to be an attribute parentDomainOU:




Repeat the above process with an extra workflow item on the schema for the child domain using Input parameter childDomainContainer and attribute childDomainOU.




Update the Presentation for the Domain Container inputs to provide more friendly text when the workflow runs:


So now our top-level workflow looks like this for Inputs:



and the schema looks like this:


Save and close the workflow. Now run the workflow and populate the fields with a name for the new OU and locations in the parent and child domains to create the OUs in. Note that you are able to browse through both domains, similar to the Inventory view – yay :-) :





We are ready to roll, so hit Submit. All being well we will have a successful workflow run and OUs named Multiple created in both domains in the correct locations.




Final thoughts

When talking with people about vRO I often caution them that just because there is a VMware supplied plugin or one from a third-party, it does not necessarily mean that it will do everything that you need it to do. The AD plugin was a case in point, so the 2.0 version is a welcome and long awaited improvement and reduces the need to fall back to using some form of scripting to achieve AD automation in vRO.

vRO: Missing Line Breaks in SOAP Request

While working in vRealize Orchestrator with an external SOAP based system I was having issues with line breaks being removed from text sent across as part of a reasonably large SOAP request containing multiple items.

Say we have the following text strings and want to pass them into the SOAP request with line breaks in-between each one:

text1 = 'This is text1';
text2 = 'This is text2';
text3 = 'This is text3';

textToSend = '\n' + text1 + '\n' + text2 + '\n' + text3;

Place that code into a scriptable task in a workflow, output textToSend to the vRO System Log, and you will observe the text with line breaks, each string placed onto its own line:


However, when textToSend is sent through to the SOAP request, the line breaks have been removed and the text appears in the interface all on one line, displaying it like so:


It turns out that in this instance the receiving system supported HTML tags in the text, so using '<br />' instead of '\n' gave the line break.

text1 = 'This is text1';
text2 = 'This is text2';
text3 = 'This is text3';

textToSend = '<br />' + text1 + '<br />' + text2 + '<br />' + text3;

The SystemLog now looks like this:


However, we don’t really care what it looks like in there, the important thing is how it translates through in the SOAP request. It is now displayed as desired:


This also means that any HTML formatting tag could potentially be used if, say, the text needed to be made bold or a different size.
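Where the newline-delimited text already exists, a small helper can swap the \n characters for HTML tags before the request is built. A sketch, assuming (as here) that the target system renders HTML in the field:

```javascript
// Replace every newline with an HTML line break tag
function newlinesToHtml(text) {
    return text.split("\n").join("<br />");
}

var textToSend = "This is text1\nThis is text2\nThis is text3";
var htmlText = newlinesToHtml(textToSend);
```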



vRO, an External SQL Database, and the case of the Missing Plugins

After setting up a fresh deployment of the vRO appliance and configuring it to use an external SQL database I noticed that many of the default plugins appeared to be missing in the Workflow library folder:

(there should be a lot more than listed here)


Logging into the vRO configuration page showed that the below list of plugins (and more, going off the screen) appeared to exist and be installed correctly.



Having mostly worked with Windows-based vRO servers before and not seen this issue, I got a few clues from this blogpost and this Communities post, which suggest it is a bug relating to configuring vRO to work with a different database.

The workaround is to navigate to the Troubleshooting section of the configuration page and select Reset current version


All being well you will receive the below green message:


I then restarted the vRO appliance, logged back in with the vRO client and lo and behold all of the plugins were then present.


I checked this against a default deployment of the vRO appliance with the embedded database and the issue is not present.

PowerCLITools Community Module: Now on GitHub

Over the last few years I have built up a number of functions to use alongside the out of the box functionality in PowerCLI. I’ve posted some of the content before on this blog, but have at last got round to publishing all of the content that I am able to share, in a module available on GitHub – I’ve named it the PowerCLITools Community Module in the hope that some others might want to contribute content to it or improve what I have already put together.


This took a fair amount of effort since it is not possible for me to share everything I have in my locally stored version of this toolkit. Some of it was developed by others I was working on projects with (and they are not necessarily so keen to share certain parts of their work) and some can’t be shared for commercial reasons. However, I found some time recently to split out everything that could be shared into a new module and also updated some of the code – typically to add some nice features from PowerShell v3 and later which weren’t available when a lot of the code was developed during the PowerShell v2 days.

Since the content has been developed over a few years, consistency and standardisation of approach may not be 100% there. A quick look back over them showed some looking a bit dated – I have spent a bit of time tidying them up, but part of the reason for sharing them was to take feedback and some prompting on where they could be improved. If I left them until I thought they were just right, I’d probably never end up publishing them. So your feedback is the impetus I need to go and improve them :-)

A lot of the functions are there to fill in gaps in cmdlet coverage with PowerCLI, and there are a few which I made more for convenience, where I have bundled together a few existing cmdlets into one function. These don’t particularly add a lot of value, but maybe demonstrate how you can tighten up your scripts a bit.


Ensure that VMware PowerCLI is installed. Functions have been tested against v5.8 R1.


1) Download all files comprising the PowerCLITools module. Ensure the files are unblocked and unzip them.
2) Create a folder for the module in your module folder path, e.g. C:\Users\username\Documents\WindowsPowerShell\Modules\PowerCLITools
3) Place the module files in the above folder

So it should look something like this:


The below command will make all of the functions in the module available:

Import-Module PowerCLITools

To see a list of available functions:

Get-Command -Module PowerCLITools


Nested Modules

You will note that each function is itself a nested module of the PowerCLITools module. In this blog post I describe why I make my modules like this.

VI Properties

If you take a look inside the PowerCLITools.Initialise.ps1 file you’ll notice a number of VI Properties. Some of these are required by some of the functions in the module and some are just there for my convenience, making my PowerCLI session simpler to use. You can add and remove VI Properties according to your own preference, but watch out: some are actually needed. You can find out more about VI Properties here.


I really hope people find these functions useful. I have a number of ideas on where some can be improved, but please provide your own feedback as it’ll be the nudge I need to actually go and make the changes :-)