London VMUG January 2015

The first London VMUG of 2015 is almost upon us and, as usual, it looks like a great line-up of activities. My employer Xtravirt is sponsoring the labs and has a tech preview of some software that you may be interested to check out. Plus, one of my colleagues, Michael Poore, will be talking about a real-world automation project.

VMUG

Make sure you register and get along to the event.

Rooms: Capital A | Capital B | Central Room

0830 - 0945  Event Registration
1000 - 1015  Welcome
1015 - 1100  Frank Denneman, PernixData & James Leavers, Cloudhelix - FVP Software in a real-world environment
1100 - 1145  vFactor Lightning Talks - Philip Coakes, Alec Dunn, Dave Simpson, Gareth Edwards, Chris Porter
1145 - 1215  Break in Thames Suite
1215 - 1300  Capital A: Robbie Jerrom, VMware - What is Docker and Where Does VMware Fit?
             Capital B: VMware GSS
             Central Room: Xtravirt Lab - SONAR Tech Preview - an easy-to-use SaaS service providing on-demand vSphere automated analytics and reporting
1300 - 1400  Lunch
1400 - 1450  Capital A: Simplivity - Stuart Gilks, Making Sense of Converged Infrastructure
             Capital B: Unitrends - Ian Jones, Datacentre Failover
1500 - 1550  Capital A: Phil Monk, VMware - Bringing SDDC to Life - A Real World Deployment with Michael Poore
             Capital B: Andy Jenkins, VMware - Cloud Native @ VMware - Give your developers & ops teams everything they want without losing control
             Central Room: Xtravirt Lab - SONAR Tech Preview
1550 - 1600  Break in Thames Suite
1600 - 1650  Capital A: VMware GSS
             Capital B: Dave Hill, VMware - 5 Starting Points for Cloud Adoption
             Central Room: Xtravirt Lab - SONAR Tech Preview
1700 - 1715  Close
1715         vBeers - Pavilion End, sponsored by 10ZIG

How To Make Use Of Functions in PowerShell

Over the last few weeks I’ve had a number of comments on posts essentially asking the same question: “How do I use the functions that you publish on your blog?”. So I thought it worth writing a post to refer people to, rather than trying to respond individually to each comment. There are a number of ways to do it depending on your requirements, and they are listed below.

First of all, let’s create a simple function to use for testing:


function Get-TimesResult {
    Param ([int]$a, [int]$b)
    $c = $a * $b
    Write-Output $c
}

1) Paste Into Existing PowerShell Session

If you are working interactively in the console then the function can be copied and pasted into that session, where it remains available for the duration of that session. I find this easier to do via the PowerShell ISE than the standard console.

Copy the function into the script pane:

Functions01

Click the Green Run Script button or hit F5 and the code will appear in the console pane:

Functions02

The function is now available for use and if using the ISE will appear interactively when you start typing the name:

Functions03

Functions04
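Once loaded, it can be called just like any other cmdlet, for example:

Get-TimesResult -a 6 -b 8
# 48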

 

2) PowerShell Profile

If the function is something that you wish to use regularly in your interactive PowerShell sessions then you can place the function in your PowerShell Profile and it will be available every time you open your PowerShell console.

If you are unsure what a PowerShell profile is or how to use one, there is some good info here. A quick way to create one is:


New-Item -Path $profile -ItemType File -Force

Once you have created a PowerShell profile, place the function in the profile and save and close. Now every time you open your PowerShell console the function will be available.

Functions05
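If you’d rather do that step from the console than an editor, a quick sketch (assuming the profile file already exists) is:

# Append the Get-TimesResult function to the PowerShell profile
Add-Content -Path $profile -Value @'

function Get-TimesResult {
    Param ([int]$a, [int]$b)
    $c = $a * $b
    Write-Output $c
}
'@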

3) Directly In A Script

If you wish to use the function in a script, place the function in the script above the sections where you need to use it. Typically this will be towards the top. The plus side of doing it this way is that everything is contained in one file; the downside is that if you have a number of functions the readability of the script is reduced, since there may be a long way to scroll down before anything of significance starts to happen.

Functions06
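A rough sketch of that layout, using the same example function:

# Get-Results.ps1 - function defined at the top, called further down
function Get-TimesResult {
    Param ([int]$a, [int]$b)
    $c = $a * $b
    Write-Output $c
}

# ...rest of the script...
Get-TimesResult -a 6 -b 8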

4) Called From Another Script

One method I have seen quite often in the wild (and am not a particular fan of; point 5 is a much better approach) is to store all regularly used functions in a script file and dot source that file in the script where you need to use one or more of the functions.

Functions script file Tools.ps1:

Functions07

Get-Results script file calling Tools.ps1:

Note the dot and a space before the reference to the Tools.ps1 file:


. C:\Users\jmedd\Documents\WindowsPowerShell\Scratch\Tools.ps1

Get-TimesResult -a 6 -b 8

Functions08

 

5) Stored in a Module

Using a PowerShell module is a more advanced and significantly more structured and powerful method of achieving what was done in 4). If you haven’t used PowerShell modules before I wrote an introduction to PowerShell modules a while back which you can find here.

Essentially they are a method to package up your reusable functions and make them available in a manner similar to how other teams in Microsoft and third-parties produce suites of PowerShell cmdlets for consumption.

For this example I have created a Tools module to use, which essentially is the same content as the Tools.ps1 file, but stored in a *.psm1 file (Tools.psm1) in the Modules\Tools folder on my workstation.

Note: the name of the *.psm1 file should match that of the folder. It’s possible to create a more fully featured module by using a Module Manifest, but we don’t need that for the purposes of this post; it’s described further in the previously mentioned article.

Functions09
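If you want to script that setup, a quick sketch (assuming the default per-user module location under the WindowsPowerShell folder, and that Tools.ps1 is in the current directory) is:

# Create the Tools module folder and save the functions as Tools.psm1
# (the folder name and the .psm1 file name must match)
$ModuleFolder = Join-Path (Split-Path $profile) 'Modules\Tools'
New-Item -Path $ModuleFolder -ItemType Directory -Force
Copy-Item -Path .\Tools.ps1 -Destination (Join-Path $ModuleFolder 'Tools.psm1')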

Now we can use the *-Module PowerShell cmdlets to work with our content.

To observe the module we can use Get-Module:


Get-Module Tools -ListAvailable

Functions10

To use the functions contained in the module we can use Import-Module:


Import-Module Tools

Get-TimesResult -a 6 -b 8

 

Functions11

Note: since PowerShell v3, automatic cmdlet discovery and module loading has been supported (you can find out more about it here). Consequently, you don’t actually need to use Import-Module to get access to the functions, as long as you place the module in the correct location. However, it is good practice to add the Import-Module line to your script, so that another user is aware of where you are getting the functionality from.
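The “correct location” means one of the folders PowerShell searches for modules; you can check those with:

# Folders PowerShell searches for modules - auto-loading only works for
# modules placed in one of these
$env:PSModulePath -split ';'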

Presenting a Password Confirmation Form in vCO / vRO

Requirement:

Present a vCO / vRO form which contains two password entry fields using SecureStrings and a field which displays whether the two entered passwords match.

Using an if statement to test whether two SecureStrings are equal will fail even if the text entered is identical. As mentioned in this communities post, in a workflow it is possible to take the SecureStrings into a scriptable task and output them as Strings. However, in the presentation of the workflow this method is not possible.

Solution:

Create an action which converts a SecureString to a String. Call that action from another action that is used to display whether the two entered passwords match. Here are the details of how I did it.

Create an action secureStringToString


// Action input: text (SecureString); action return type: String -
// the action hands the value back as a plain String
var outputText = text;
return outputText;

PasswordForm01

 

Create an action testPasswords

 


var passwordTest1 = System.getModule("com.jm-test").secureStringToString(password1);
var passwordTest2 = System.getModule("com.jm-test").secureStringToString(password2);

if (passwordTest1 == passwordTest2){
    return "Matching Passwords";
}
else {
    return "Non-Matching Passwords";
}

PasswordForm02

Create a workflow with the following inputs:

PasswordForm03

Set the presentation for the first three inputs to be mandatory and the displayConfirmation input to use the testPasswords action:

username mandatory

PasswordForm04

displayConfirmation Data binding

PasswordForm05

displayConfirmation Data binding testPasswords action

PasswordForm06

Run the workflow and observe that the text in displayConfirmation changes depending on whether the passwords match:

PasswordForm07

 

PasswordForm08

I’d be interested to hear if anyone has a better way to do this because I reckon there might be one :-)

Enabling NFS VAAI Support in Synology 5.1

Synology enabled VAAI support for NFS in version 5.1 of their DSM software. In order to take advantage of this technology from ESXi hosts we need to do two things:

  • Upgrade DSM to at least version 5.1-5004 (2014/11/06)
  • Install the Synology NFS Plug-in for VMware VAAI

DSM

DSM can be upgraded from within the Control Panel application. Head to the Update & Restore section, then check for and install updates. This will likely require a reboot, so ensure anything or anyone using the Synology is shut down or notified.

NFSNAAI07

 

ESXi

Prior to installing the NFS plugin, my two NFS datastores do not show Hardware Acceleration support.

NFSNAAI02

 

From the 5.1-5004 Release Notes:

VMware VAAI NAS
Added NFS support for two primitives: Full File Clone and Reserve Space.
Please note that you should install the Synology NFS Plug-in for VMware VAAI and read the instructions in README.txt to make sure installation is successful.

Once the plugin has been downloaded it is possible to use either VMware Update Manager or esxcli to install the VIB. For the purposes of my home lab without Update Manager, I’m going to show you the esxcli way.

Upload the VIB to a datastore all hosts can access; the command to install it is:

esxcli software vib install -v /path/to/vib/esx-nfsplugin.vib

Once installed, the ESXi host will require a reboot

NFSNAAI04
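If you’d rather drive the reboot from PowerCLI than the vSphere Client, a sketch along these lines would do it (the host name is illustrative):

# Put the host into maintenance mode and then reboot it (host name is illustrative)
Get-VMHost esx01.lab.local | Set-VMHost -State Maintenance
Get-VMHost esx01.lab.local | Restart-VMHost -Confirm:$false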

After the reboot you can check all was successful by running:

esxcli software vib list | grep nfs

NFSNAAI05

and examining the NFS datastores

NFSNAAI06

Improving the PowerShell ISE Experience with ISESteroids 2

For a long time I’ve used the PowerShell ISE, built into Windows, for my PowerShell scripting. Most people tend to have a particular favourite editor for their coding, usually settled on after trialling a few different ones. For pretty much everything else I’ve settled on Sublime Text, but for PowerShell I use the ISE since I really like the integration with the PowerShell console.

The ISE was introduced in PowerShell 2.0 and, to be honest, was pretty basic back then. It has improved significantly through to version 4, but it still has areas that could be better and is missing features you might like to see.

Earlier in the year I tried out ISESteroids 1.0, which started to plug a number of the gaps I found in the ISE. Recently I had the chance to upgrade to ISESteroids 2 and it has improved even further.

For a quick preview of what is available check out the below video.

A few things in it I particularly like:

1) More distinct (yellow) highlighting of bracket matching or other sections

ISESteroids01

(single click on this double quoted string)

ISESteroids02

2) Block commenting (this was a real annoyance for me – there is a keyboard shortcut to do it in the standard ISE, but still fiddly)

Before:

ISESteroids03

After pressing the single button below:

ISESteroids04

ISESteroids05
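For anyone unfamiliar with PowerShell comment syntax, a commented-out block looks like one of the following (per-line # comments or a <# … #> block comment):

# Per-line comments
# Get-VM
# Get-VMHost

<#
A block comment
Get-VM
Get-VMHost
#>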

 

3) ScriptMap which allows you to easily navigate your way through long scripts

ISESteroids06

 

4) Manage Version History

ISESteroids07

Clicking on Compare opens WinMerge and a view of what has changed between versions

ISESteroids08

5) Autoselection. Click repeatedly to select various code segments.

ISESteroids12

ISESteroids09

ISESteroids10

ISESteroids11

 

6) Enhanced Debugging

Best explained in the following video

For a more in-depth look at some of the features, check out the below video with ISESteroids creator Dr Tobias Weltner and fellow PowerShell MVP Jeff Wouters.

Delegating Permissions to vCO Workflows and Publishing for Consumption

I needed to look into the possibilities around granting delegated access to vCO workflows and ways to consume them without necessarily using the standard vCO client. The general idea was to have one group of people authoring workflows and another group consume some of them.

vCO has the ability to delegate permissions to workflows and folders of workflows using users and groups; ideally you will have already set up vCO to use your AD for authentication, enabling this delegation via AD users and groups.

Each workflow or folder has a Permissions tab where users and groups can be added, with different rights selectable: View, Inspect, Admin, Execute and Edit.

 

DelegatedPermissions02

One thing to watch out for is that (at least) View permissions need to be set right at the top of the tree for a user to be able to authenticate with the vCO (or other) client. The top level has a similar Permissions tab, however it has no pencil-style edit button like all of the levels further down.

It turns out this is hidden away in the right-click context menu on the top level: Edit access rights…

Now we’ve figured that out, let’s take a scenario where we want to give an AD group access to run workflows in one folder, but not in another. We’ll use the AD group vCO_Users.

In the screenshot above vCO_Users have been given View and Inspect rights at the top level, which will filter all the way down. That will get our vCO_User authenticated, but not able to do too much.

On the folder JM-Dev vCO_Users have been given View, Execute and Inspect rights. Consequently, in the vCO client they are able to view and run the workflow, but not edit it – note the presence of the green Run button, but the Edit button is greyed out.

On the folder JM-Dev2 vCO_Users have only the View and Inspect rights which have filtered down from the top. Consequently, they can see the workflow, but neither run it nor edit it.

So we’ve got the permissions sorted for this example, but how about that requirement to not use the vCO client?

vCO has a built-in web client, known as the web operator. It is enabled by navigating to the Administer section of the client and publishing the default weboperator.

DelegatedPermissions07

Now we navigate to:

https://vcoserver.fqdn:8281/vco/vmo/weboperator/default.html

and log in with the credentials of a member of the vCO_Users group.

Now we can see the tree of workflows, similar to the vCO client view.

As this user we are able to run the workflow in the JM-Dev folder where we have the Execute permission:

but not the workflow in the JM-Dev2 folder which doesn’t have the Execute permission:

DelegatedPermissions11

I found the web interface to be pretty basic, but possibly it’s worth evaluating if it might meet your needs.

 

Automating Disk Zeroing on VM Deletion

A requirement for a project I had was to zero the VMDK of all VM disks at the time of VM removal.

smash_hard_drive

One consideration was to SSH into the host where the VM was located and use vmkfstools, as below, on each VMDK to zero the disk.

vmkfstools -w /vmfs/volumes/<…>.vmdk

Looking for alternatives I found that the PowerCLI cmdlet Set-HardDisk has a ZeroOut parameter. Note the text from the help (version 5.8 R1):

Specifies that you want to fill the hard disk with zeros. This parameter is supported only if you are directly connected to an ESX/ESXi host. The ZeroOut functionality is experimental.

The points to note are:

  • You will need to connect PowerCLI directly to the ESXi host that the VM is registered on. So you will most likely first need to connect to vCenter to find out where the VM lives (see the sketch after this list).
  • The functionality is ‘experimental’. A quick scan back through releases showed this had been the same for some time, and from my observations it appeared to work fine. There have been many things in vSphere over the years which have been ‘experimental’ but have usually worked fine.
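A rough sketch of that first step (finding the host via vCenter, then connecting to it directly), with illustrative server names:

# Find which ESXi host the VM is registered on via vCenter, then connect
# directly to that host (server names are illustrative)
Connect-VIServer vcenter.lab.local
$ESXiHost = (Get-VM -Name VM01).VMHost.Name
Disconnect-VIServer vcenter.lab.local -Confirm:$false

Connect-VIServer $ESXiHost -Credential (Get-Credential)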

So once you have identified where the VM is and connected to the ESXi host in question, it’s a case of simply looping through all of the disks and zeroing them (with a bit of logging thrown in) – note it will likely take a fair amount of time to zero each disk!


$VM = "VM01"
$HardDisks = Get-HardDisk -VM $VM

foreach ($HardDisk in $HardDisks){

    $HardDisk | Set-HardDisk -ZeroOut -Confirm:$false | Out-Null

    $Text = "Zeroed disk $($HardDisk.Filename) for VM $VM"
    $Text | Out-File -FilePath C:\log\zerodisk.log -Append
}

vFACTOR London VMUG January 2015

Ever thought about presenting at a user group, but not quite found that extra incentive to give it a go for the first time? Many people I know have benefited in both a personal and professional capacity from doing so, and I certainly have too. I’ve found that the opportunity to relate tales of a project at work, or of a particular piece of technology you have an interest in, can be quite an experience; the discussions afterwards can lead to new ideas about your topic, or even to people believing you’re an expert at something, just because you had the guts (and good sense) to stand up in front of your peers and talk about it.

If you haven’t done it before and need an incentive to help get you over that first hurdle, the London VMUG are offering significant encouragement to get you started. At the next London VMUG in January 2015 there are a set of prizes available for five people prepared to give a 10 minute lightning talk.

That’s right, all you need to do is be selected to give a talk for 10 mins and you are guaranteed one of the five prizes below – the winner to be determined by the audience.

The odds look pretty good to me – a 60% chance of winning an Apple product and the worst that can happen to you is you come away with an Amazon voucher and the respect of your peers for giving it a go!

mackbookair

1st Prize: MacBook Air
2nd Prize: iPad Air
3rd Prize: iPad Mini
4th Prize: Amazon voucher
5th Prize: Amazon voucher

Entries need to be submitted by the 19th December and all the details can be found here.

 

Resource Action Object Not Correctly Passed Through to ASD Form in vCAC 6.0.x

When using the Advanced Service Designer to create Resource Actions, you may hit the following issue in vCAC 6.0.x if you attempt to access the Input Resource as part of the workflow Presentation. While everything will appear to work correctly in vCO, when the form is accessed by a vCAC user the Input Resource may not be available to use (I say *may* because the behaviour is inconsistent: sometimes it works, sometimes it doesn’t!). Take the following (contrived) example.

You would expect things to work like the following:

Using a vCO action displayToolsStatus which takes the input of a vCAC:VirtualMachine and queries vCenter for the VMware Tools Status to display in the vCAC form (unlikely you would actually want to do this, but it works for the example).

ResourceAction01

In the workflow to use as the Resource Action, set the Presentation properties of the displayToolsStatus input to the displayToolsStatus action…

ResourceAction02

….using the vm Input as the parameter

ResourceAction03

However, when using the Resource Action, displayToolsStatus in the form shows the default value Unable to determine, since we hit the if (!vm) check in the action:


if (!vm) {
    return null;
}

ResourceAction04

 

You can either upgrade to vCAC 6.1 where this is resolved, or go with the following workaround:

Add an additional input to the vCO workflow of the same type as the Resource Action input – hiddenVM in this example

ResourceAction05

Set the Presentation to be Hidden and have a Data Binding mapped to the Resource Action parameter (vm)

ResourceAction06

Now set the parameter of the displayToolsStatus action to be the hiddenVM input instead of the vm input.

ResourceAction07

ResourceAction08

Remove and re-add the ASD Resource Action, making sure to still select the vm input as the Input Resource

ResourceAction09

 

and now the form will be correctly populated.

ResourceAction10

This VMware KB post relating to a similar issue helped me get to the bottom of this.

Deploying a vShield Edge: “The virtual machine is not supported on the target datastore”

FailedEdgeDeploy02

Attempting to deploy a vShield Edge (5.5) via an API call, I was greeted with the following error:

Content as string: <?xml version="1.0" encoding="UTF-8"?>
<error><details>Failed to publish configuration on vShield Edge. Failed to deploy edge appliance.</details><errorCode>10105</errorCode><rootCauseString>The virtual machine is not supported on the target datastore.</rootCauseString><moduleName>vShield Edge</moduleName></error>

 

The API call was the second of two calls to deploy an Edge device into a vSphere Datacenter and Cluster, i.e. one Edge per datacenter, into a specified cluster.

The first Edge was deployed successfully, but not the second. Both clusters were configured in a very similar manner, the only real difference being the name of the datacenter they belonged to. Everything was hosted in a lab based solution using nested ESXi hosts.

My GoogleFu revealed not a lot more than this API reference doc and this description:

The virtual machine is not supported on the target datastore. This fault is thrown by provisioning operations when an attempt is made to create a virtual machine on an unsupported datastore (for example, creating a non-legacy virtual machine on a legacy datastore).

Much troubleshooting ensued before a resolution. Given that it was a lab, all of the ESXi hosts in both vSphere datacenters had access to the same datastores. While not typically a good idea for production, it had been an easy way for me to get the lab up and running, and vSphere was quite happy for me to run things like this with VMs from both datacenters on the same datastores. vShield, however, was not so happy, and this turned out to be the reason it was failing to deploy with “The virtual machine is not supported on the target datastore”.

Reconfiguring the storage so that datastores were limited by datacenter then permitted successful Edge deployment in both sites.

FailedEdgeDeploy03