All posts by Jonathan Medd

PowervRO – Now available on macOS and Linux via PowerShell Core!

Back Story

Back in January 2017 Craig and I made PowervRA available for macOS and Linux via PowerShell Core. It was always our intention to do the same thing for PowervRO and, although slightly later than we hoped, we’re finally able to do that. PowerShell Core has come a long way itself over the last year, currently in Release Candidate and soon to be GA, and I’m sure a lot of the hard work and community feedback which has gone into that has helped make the job of PowervRO supporting PowerShell Core very straightforward.

In reality we had to make only a relatively small amount of changes to the code base, mostly around detecting which version of PowerShell is being used and consequently which method to use for making API calls to the vRO appliance when dealing with things like SSL certificates and protocols. There are a lot of great new things available in Invoke-RestMethod and Invoke-WebRequest in PowerShell Core which make API calls a lot simpler, so we take advantage of those.

Note: to take advantage of a lot of these new features we have raised the PowerShell version requirements for PowervRO 2.0.0 to Windows PowerShell 5.1 and PowerShell Core 6.0.0-rc.

Having invested a lot of time with the initial release of PowervRO in creating integration tests via Pester for each function, that really paid off for this release since we were very easily able to test everything against different versions of vRO with different versions of PowerShell across different operating systems. Again very little actually needed to be changed in the code for the functions themselves, which is a testament to the compatibility of PowerShell Core. Typically it was only things like cmdlet parameter changes, such as this one, which tripped us up.

Requirements

You will need:

PowerShell Core Release Candidate or later. Instructions on getting it installed for different OS flavours can be found here.

PowervRO 2.0.0 or later. Get a copy of PowervRO onto the Linux or macOS machine you want to run it from. Use the following to download it from the PowerShell Gallery:
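For example, something along these lines (assuming the PowerShellGet module is available):

```powershell
# Install PowervRO from the PowerShell Gallery for the current user
Install-Module -Name PowervRO -Scope CurrentUser
```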

or manually copy the module yourself to one of the locations listed in $env:PSModulePath, for example:
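A sketch of the manual approach (the source path is a placeholder; on Linux and macOS the per-user module path for PowerShell Core is typically ~/.local/share/powershell/Modules):

```powershell
# Copy the module folder into a path listed in $env:PSModulePath
Copy-Item -Path ./PowervRO -Destination ~/.local/share/powershell/Modules -Recurse
```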

In Action

macOS

Here’s PowervRO on my MacBook:

Connect to vRO:
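Something like the following (the server name is a placeholder; -IgnoreCertRequirements is useful for appliances with self-signed certificates):

```powershell
$credential = Get-Credential
Connect-vROServer -Server vro01.corp.local -Credential $credential -IgnoreCertRequirements
```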

Retrieve all Workflows, sort by CategoryName and display Name, CategoryName and Version:
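For example:

```powershell
Get-vROWorkflow | Sort-Object CategoryName |
    Select-Object Name, CategoryName, Version
```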

Invoke a Workflow:
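A minimal sketch, assuming a workflow named Test01 which takes a single string input called text (both names are placeholders):

```powershell
Get-vROWorkflow -Name 'Test01' |
    Invoke-vROWorkflow -ParameterName 'text' -ParameterValue 'Hello' -ParameterType 'string'
```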

 

CentOS 7.3

Here’s PowervRO on CentOS 7.3:

 

Connect to vRO:

Retrieve all Workflows, sort by CategoryName and display Name, CategoryName and Version:

Create a new Category:
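For example (the category name is a placeholder):

```powershell
New-vROCategory -Name 'TestCategory' -CategoryType WorkflowCategory
```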

 

Ubuntu 17.04

Here’s PowervRO on Ubuntu 17.04:

Connect to vRO:

Retrieve all Workflows, sort by CategoryName and display Name, CategoryName and Version:

Remove a Category:
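A sketch, assuming a workflow category named TestCategory exists:

```powershell
# Find the category by name, then remove it by Id
$category = Get-vROCategory -CategoryType WorkflowCategory |
    Where-Object {$_.Name -eq 'TestCategory'}
Remove-vROCategory -Id $category.Id -Confirm:$false
```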


Side Note

In PowervRO 2.0.0 we have also made some under the hood changes that it is worth being aware of (check the changelog for more details):

  • Module Restructure: we changed the functions from being their own individual nested modules in *.psm1 files to more simply being in *.ps1 files and made part of the module in a different way. The build process for a release now combines all of the functions in the individual *.ps1 files to a single *.psm1 module file.
  • The Password parameter of Connect-vROServer now requires a SecureString, not a String. Consequently, you will now need to supply a SecureString when using it, as in the example below:
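A sketch using PowervRO’s connection cmdlet (server name and credentials are placeholders):

```powershell
# Build a SecureString for the Password parameter
$securePassword = ConvertTo-SecureString -String 'VMware1!' -AsPlainText -Force
Connect-vROServer -Server vro01.corp.local -Username vcoadmin -Password $securePassword -IgnoreCertRequirements
```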

PowerShell Core does not have -Encoding Byte. Replaced with new parameter AsByteStream

Carrying out the following in Windows PowerShell worked, but didn’t always make a lot of sense because Byte is not really an Encoding type:
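For example, in Windows PowerShell this runs without error:

```powershell
# Write raw bytes to a file with -Encoding Byte (Windows PowerShell only)
$bytes = [System.Text.Encoding]::UTF8.GetBytes('Hello')
Set-Content -Path C:\Temp\test.bin -Value $bytes -Encoding Byte
```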

If you try to run the same command in PowerShell Core you will receive the following error:

Set-Content : Cannot bind parameter ‘Encoding’. Cannot convert the “Byte” value of type “System.String” to type “System.Text.Encoding”.

This is because Byte is no longer a valid selection for the Encoding parameter:

That’s because it has been replaced by a new parameter AsByteStream, which makes more sense for what you are typically actually trying to do.
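The equivalent in PowerShell Core looks like this:

```powershell
# -AsByteStream replaces -Encoding Byte in PowerShell Core
$bytes = [System.Text.Encoding]::UTF8.GetBytes('Hello')
Set-Content -Path ./test.bin -Value $bytes -AsByteStream
```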

Thanks to Mark Kraus for pointing this change out to me.

Using a Specific Security Protocol in PowervRA

A few months ago we had an issue logged in PowervRA where it was not possible to make a connection to the vRA appliance after it had been locked down following the VMware hardening guide. Specifically this was because SSLv3/TLSv1 (weak ciphers) had been disabled.

By default, Windows PowerShell 5.1 has the following Security Protocols available, Ssl3 and Tls – hence the above failure.

It’s possible to work around this by adding the required security protocol, in this case TLS 1.2:
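For example, for the current session:

```powershell
# Check which protocols are currently enabled for the session
[Net.ServicePointManager]::SecurityProtocol

# Add Tls12 to the existing set for this PowerShell session
[Net.ServicePointManager]::SecurityProtocol =
    [Net.ServicePointManager]::SecurityProtocol -bor [Net.SecurityProtocolType]::Tls12
```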

Note, however, that this change is for the PowerShell session itself, which may or may not be desired behaviour.

PowerShell Core adds in a new parameter SslProtocol to both of the web cmdlets, Invoke-WebRequest and Invoke-RestMethod. Consequently, this improvement means that you can specify a security protocol per request, not per PowerShell session. For example you could do something like this for Tls 1.2:
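For example (the URI is a placeholder):

```powershell
# The protocol applies to this request only, not the whole session
Invoke-RestMethod -Uri 'https://vra01.corp.local/identity/api/about' -SslProtocol Tls12
```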

In PowervRA 3.0.0 we’ve updated Connect-vRAServer to support this functionality, also with a SslProtocol parameter:
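A sketch (server and tenant names are placeholders):

```powershell
$credential = Get-Credential
Connect-vRAServer -Server vra01.corp.local -Tenant vsphere.local -Credential $credential -SslProtocol Tls12
```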

If you’re on Windows PowerShell we’ll add to the available security protocols in the PowerShell session (and remove it afterwards when you use Disconnect-vRAServer). If you’re using PowerShell Core we’ll use the SslProtocol parameter of Invoke-RestMethod so that the requested protocol is used per request.

The $vRAConnection variable has been updated with the SslProtocol property to show you whether your connection is using the default protocol or a specified one:

Final note: this was a breaking change for us, since we require Windows PowerShell 5.1 and PowerShell Core 6 release candidate to easily implement the above functionality. So make sure you are on either of those versions before trying PowervRA 3.0.0.

Accessing Content from Variables of Type Any in the vRO Client

One of my colleagues showed me how to do this, so I thought it worth sharing since it has bugged me ever since I started using vRO.

If you have run a vRO workflow and are looking at the output, specifically the Variables tab:

you can then view the value each variable held at the time of workflow completion. If the value is a string or something else simple you will get a nice view of it. However, if it is, say, a collection of properties you will see something similar to the below and typically you will not be able to scroll across to view them all.

What I have typically done until now is add a Scriptable Task as the next step in the workflow and log all of the properties out. However, she demonstrated to me that it is possible to copy them and then paste into a text editor.

Steps:

  1. Bring up the above view by clicking on the ‘i’, next to the magnifying glass
  2. Click once on the white section – in this example the word ‘Properties’
  3. Ctrl-A
  4. Ctrl-C

Even though there is no visual indication that everything was highlighted to be made available for copy, like in say a text editor, it has actually done it. The below is the copied output from the above:

Properties##[#sections#=#Properties##[#section#=#Properties##[#generationNumber#=#number#1.511870334212E12#+#name#=#string#TEST01#+#rule#=#Properties##[#appliedToList#=#Properties##[#appliedTo#=#Properties##[#isValid#=#boolean#true#+#name#=#string#TEST02#+#type#=#string#SecurityGroup#+#value#=#string#securitygroup-530#]##]##+#packetType#=#string#any#+#_disabled#=#boolean#false#+#_logged#=#boolean#false#+#destinations#=#Properties##[#destination#=#Properties##[#isValid#=#boolean#true#+#name#=#string#TEST01#+#type#=#string#SecurityGroup#+#value#=#string#securitygroup-101#]##+#_excluded#=#boolean#false#]##+#name#=#string#Test01#+#action#=#string#allow#+#id#=#number#2758.0#+#sectionId#=#number#1252.0#+#services#=#Properties##[#service#=#Properties##[#destinationPort#=#number#3453.0#+#protocol#=#number#6.0#+#protocolName#=#string#TCP#+#isValid#=#boolean#true#]##]##+#direction#=#string#inout#]##+#_class#=#string#section#+#_id#=#number#1252.0#+#type#=#string#LAYER3#+#_timestamp#=#number#1.51134543303912E12#]##]##]#

OK, it is not that easy to read, but it is pretty handy if you just want to quickly grab it and search for something in the list of Properties.

Preparing for 70-534: Architecting Microsoft Azure Solutions

I recently passed the exam 70-534: Architecting Microsoft Azure Solutions so thought I would share a few preparation materials here. From reading the exam blueprint you will notice a certain amount of crossover with 70-533 (and to a slightly lesser extent 70-532), so a fair amount of the resources I used for those exams are also relevant. See my pages here for info on those: https://www.jonathanmedd.net/2017/03/preparing-for-70-533-implementing-microsoft-azure-infrastructure-solutions.html and https://www.jonathanmedd.net/2017/10/preparing-for-70-532-developing-microsoft-azure-solutions.html

In addition for this exam I used the excellent 70-534 preparation course from Scott Duffy on Udemy: https://www.udemy.com/70532-azure/learn/v4/overview  Not only does it have excellent content, but it appears that Scott updates it on a regular basis. Even during the 3 – 4 weeks I was using the course there were updates and new information coming through from Scott which was really helpful. It’s also often available for an excellent price on Udemy, I managed to pick it up for £10.

Scott also has a set of practice questions available on the same site. Split into 3 tests, there are currently 150 questions. I managed to also pick these up for £10 and found them useful as part of the preparation.

———————————————————————–

Update 21/11/2017: Since I posted this blog I was made aware of the following about Udemy. I would suggest you read it, then make up your own mind about whether you still wish to take one of their courses.

———————————————————————–

There is a useful exam preparation session from Ignite 2017 which is well worth watching.

After completing the above I still had a week or so left to prepare for the exam so I picked up some practice questions from Measureup. These were a bit more pricey at £70 for 30 days access and while useful in terms of making me go read documentation on subjects I was not so good at, they felt a little out of date.

One additional thing to be aware of is that the 70-534 exam is due to expire on 31st December 2017, to be replaced by 70-535. Depending on where you are at in your study preparation, you have a decision to make on which exam to take. Scott Duffy has some useful info on the differences between the two exams which may be helpful in making a decision – his initial look suggests there is a significant amount of new content added to the blueprint for 70-535 and only a few items removed.

Having now passed 70-532, 70-533 and 70-534 I’m done with these Azure certifications for some time. Having been through this process, my recommendation if you are following the same path would be to take all three as close together as you can, given the overlap in content. I wasn’t able to for various reasons, but if I had to do it again, I would make more of an effort to make that happen.

Preparing for 70-532: Developing Microsoft Azure Solutions

I recently passed the exam 70-532: Developing Microsoft Azure Solutions so thought I would share a few preparation materials here. From reading the exam blueprint you will notice a certain amount of crossover with 70-533, so a fair amount of the resources I used for that exam are also relevant. See my page here for info on that one: https://www.jonathanmedd.net/2017/03/preparing-for-70-533-implementing-microsoft-azure-infrastructure-solutions.html

In addition for this exam I used the excellent 70-532 preparation course from Scott Duffy on Udemy: https://www.udemy.com/70532-azure/learn/v4/overview  Not only does it have excellent content, but it appears that Scott updates it on a regular basis. Even during the 3 – 4 weeks I was using the course there were updates and new information coming through from Scott which was really helpful. It’s also often available for an excellent price on Udemy, I managed to pick it up for £10.

Scott also has a set of practice questions available on the same site. Split into 3 tests, there are currently 150 questions. I managed to also pick these up for £10 and found them useful as part of the preparation.

———————————————————————–

Update 21/11/2017: Since I posted this blog I was made aware of the following about Udemy. I would suggest you read it, then make up your own mind about whether you still wish to take one of their courses.

———————————————————————–

BTW watch out since as of 12th October 2017 there are some changes to the 70-532 exam, Scott has some info on that here. By chance I happened to have mine scheduled for 10th October so I’m unable to comment on whether those changes have filtered through yet.

After completing the above I still had a week or so left to prepare for the exam so I picked up some practice questions from Measureup. These were a bit more pricey at £70 for 30 days access and while useful, they felt a little out of date. The biggest thing I got out of them was the realisation that there might be a fair bit of C# tested in this exam and that turned out to be the case.

While C# is not specifically called out in the objectives, it turns out that, just as being comfortable with PowerShell is really required for 70-533, the same comfort with C# is needed for 70-532. At a minimum I would say being familiar with how it’s laid out, having a reasonable idea of what a code section might do and being able to pick out what might be missing from a list of choices would be a good idea.

Finally I watched a session recorded at Ignite 2016: Cert Exam Prep: Exam 70-532: Developing Azure Solutions which had a few good tips for this exam.

Good luck if you are taking it yourself!

PSDayUK September 2017

On Friday 22nd September a group of people involved with organising PowerShell User Group events around the UK will be hosting a 1-day PowerShell conference in London, PSDayUK. It will take place following on from the 2-day WinOps conference at the same venue, Skills Matter CodeNode, on Wed 20th – Thu 21st September.

 

Headlining the day is Steven Murawski – Cloud Developer Advocate at Microsoft. Then there will be two separate tracks of sessions with many great speakers from the PowerShell community:

  • Track 1: PowerShell: The Door to DevOps
  • Track 2: DevOps with PowerShell

You can see the full agenda here.

Currently you can register your interest for the event to get update notifications; tickets will go on sale in the near future.

Tickets are now available. As of 4th Sep 2017 Early Bird pricing is still an option.

 

Changes to a PowerShell Module Manifest not viewable in Git

When you make a change to a PowerShell Module Manifest and go to commit those changes in Git, I have observed for some time that it was not possible to see the actual changes in the file because Git was showing it as a binary file. I finally found out why this happens…

Take the following example. I’ve created a module manifest for a module Test with New-ModuleManifest:
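For example, something like this (paths and names are placeholders):

```powershell
New-ModuleManifest -Path .\Test.psd1 -RootModule Test.psm1 -ModuleVersion '1.0'
```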

Now I make a change to the manifest, say bumping the version to 1.1:

When I go to commit the file, I don’t see what has actually changed in the file. Git reports that it is a binary file:

This is because the output from New-ModuleManifest uses UTF-16 encoding, hence Git sees it as a binary file.

Using the handy encoding functions from here we can check the encoding for the manifest file:

I learnt this from the regular PowerShell community call  where there was a discussion around standardising on the encoding for all cmdlets in PowerShell Core 6. In versions prior to 6 different cmdlets use different encodings types, so it seems like a good opportunity to standardise, particularly with the move to make PowerShell cross-platform and Linux having a default encoding of UTF8. There is a lot more information here on the proposal for PowerShell Core and encoding going forward.

So, let’s change the encoding of the manifest file to UTF8:
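For example:

```powershell
# Read the UTF-16 content and write it back out as UTF-8
$content = Get-Content -Path .\Test.psd1
$content | Set-Content -Path .\Test.psd1 -Encoding UTF8
```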

Now let’s change the manifest file again and see if we can view the changes in Git:

SourceTree:

 

Or in VS Code:

Happy Days 🙂

Thanks to Joey Aiello for sharing this on today’s community call.

 

PowerShell – where, .where or Where?

There are a number of different ways to filter data in PowerShell and the options have expanded since the original release of v1.0. I thought it worth summarising them here, particularly from my experiences of attempting to convey the different choices during PowerShell training I have delivered. Typically these revolve around a dataset and the word ‘where’ used in one form or another, however….

1) Filter on the left

The number 1 rule is if possible use the filtering options of the cmdlet you are originally working with to reduce the size of the dataset. Stay as far to the left of the command as possible, i.e. the first step of a pipeline. That way you will avoid generating a dataset larger than necessary and then having to use other ‘where’ tools to filter it down.

For example, Get-WmiObject contains a filter parameter which means you can reduce the data returned in a query. So if looking at NICs with IP enabled you can be smarter with a reduced dataset, rather than piping it to Where-Object.
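For example:

```powershell
# Filter at the source: WMI only returns NICs with IP enabled
Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled = 'True'"
```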

Note: not all Get-* cmdlets contain a filter type parameter, so you will need to check the help of the cmdlet you are using to see if it is possible.

2) Where-Object

Available since PowerShell v1.0, Where-Object is often one of the first cmdlets to learn about and enables you to take a dataset and pass it down the PowerShell pipeline for filtering.
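Continuing the NIC example:

```powershell
# Classic syntax: filter criteria in a script block, $_ is the current object
Get-WmiObject -Class Win32_NetworkAdapterConfiguration |
    Where-Object {$_.IPEnabled -eq $true}
```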

Note: just to confuse things, sometimes Where-Object is shortened to Where, since Where is an alias for Where-Object!

Where-Object is one of the fundamentals of PowerShell; however, often in classes students new to the language have struggled with the somewhat fiddly syntax of encapsulating the filter criteria in curly braces and particularly the use of $_.propertyname to refer to a property of the current object in the pipeline. So step forward……

3) Simplified Syntax

PowerShell version 3 introduced some simplified syntax for certain areas to help alleviate some of this syntax pain. (There is an excellent reference from Keith Hill’s blog here.) So from then onward it was possible to drop the curly braces and the $_ in a more SQL-style where query, which typically seems to be a more natural way to write these things.
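The same query in the simplified style:

```powershell
# No curly braces, no $_
Get-WmiObject -Class Win32_NetworkAdapterConfiguration |
    Where-Object IPEnabled -eq $true
```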

Note: just to confuse things again, it is possible to use the full Where-Object in this style too!

The question then became which one did you teach to students? Typically I went with the standard Where-Object first and then moved onto revealing the simplified syntax later. However, often we ended up covering it earlier because many students would write it the SQL style way without even knowing it was possible – it seemed to be the natural way they wanted to do it.

4) .where Method

Introduced in PowerShell v4 the .where method enables filtering a collection or set of properties if you do not require or want to stream data to the pipeline. Continuing our example, it is possible to do this:
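For example:

```powershell
# .where() runs against a collection already in memory; nothing is streamed
(Get-WmiObject -Class Win32_NetworkAdapterConfiguration).where({$_.IPEnabled -eq $true})
```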

Why would you want to do this, particularly given that it is possibly even fiddlier than the original Where-Object syntax to write and we now have the even clearer simplified syntax as well? The answer is performance. On a small dataset we are only talking milliseconds difference between pipeline and non-pipeline:

However, this can of course be a significant difference with a large dataset. In this contrived example, 6 seconds vs 2 seconds:
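An illustrative comparison with Measure-Command (the dataset is made up and timings will obviously vary):

```powershell
$data = 1..1000000

# Streaming through the pipeline
Measure-Command {$data | Where-Object {$_ -gt 500000}}

# Filtering the in-memory collection with .where()
Measure-Command {$data.where({$_ -gt 500000})}
```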

Depending on what you are doing, streaming vs non-streaming may be preferable for you, so worth trying out each one in your scenario to determine the best option for you.

The where method also includes some additional options (a mode parameter), documented here, which are quite nice.

Similar to Select-Object, there is a First and Last mode. So:
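For example:

```powershell
$data = 1..100
$data.where({$_ -gt 50}, 'First')      # first match only: 51
$data.where({$_ -gt 50}, 'Last', 2)    # last two matches: 99, 100
```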

Also interesting is the split mode, which effectively splits the collection into two parts: the first containing elements which meet the condition and the second those which don’t.
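For example:

```powershell
# Split returns two collections: matches first, non-matches second
$even, $odd = (1..10).where({$_ % 2 -eq 0}, 'Split')
$even   # 2 4 6 8 10
$odd    # 1 3 5 7 9
```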

 

Modifying Icons in vRA with PowerShell

There have recently been a number of blog posts around modifying the All Services icon in vRA, and how to change it programmatically:

We had a feature request open in PowervRA for a while to do the same thing, so I figured it was a good time to add it, so that the same change to the icon could be made from PowerShell. We decided to take a slightly more generic approach than just the All Services icon and make it possible to upload any icon and use it to modify any service or other element within vRA that supports modifying the icon via the API.

So in release 2.1 of PowervRA you will find some new functions for working with icons:

Modify the All Services Icon

Note: Modifying the All Services icon will affect all vRA tenants and requires admin permission to the default tenant. Ensure you are comfortable with this before going ahead!

The icon for All Services is known within vRA as cafe_default_icon_genericAllServices. You can find out more about it with Get-vRAIcon:
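For example:

```powershell
Get-vRAIcon -Id cafe_default_icon_genericAllServices
```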

To update it, use Import-vRAIcon. The API documentation lets us know that it will either create a new icon or overwrite an existing one. Since the All Services icon already exists, it will be overwritten when you import a new one:
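A sketch (the icon file path is a placeholder):

```powershell
# Importing with the Id of an existing icon overwrites it
Import-vRAIcon -Id cafe_default_icon_genericAllServices -File C:\Icons\NewIcon.png
```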

You can also set it back to the original brick icon with Remove-vRAIcon, since the API description states that for deleting an icon which is one of the default system icons, it will be reverted to its default state:
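For example:

```powershell
Remove-vRAIcon -Id cafe_default_icon_genericAllServices -Confirm:$false
```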

Modify a Custom Service

Note: for this piece you will need admin permissions in the Tenant the Service belongs to

Modifying the icon for your own created service is a very straightforward process: import a new icon to the Catalog Service and then update the existing service with the new icon. In this example we’ll modify the icon for Service02:

We can find the name of the currently used icon with Get-vRAService and see that the default icon name is cafe_default_icon_genericService:

To change it, do the following:
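A sketch of the two steps (the icon Id and file path are placeholders, and this assumes Set-vRAService accepts an IconId parameter):

```powershell
# Upload a new icon, then point the service at it
Import-vRAIcon -Id 'service02icon' -File C:\Icons\Service02.png
Get-vRAService -Name 'Service02' | Set-vRAService -IconId 'service02icon'
```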

Modify a Catalog Item

Note: for this piece you will need admin permissions in the Tenant the Catalog Item belongs to

As mentioned, we’re not just restricted to modifying Service icons, other icons can be changed too. For example we can update the icon of a Catalog Item. Again upload an icon first, then update the Catalog Item with it:

We can find the name of the currently used icon with Get-vRACatalogItem and see that the icon name is vcoIcon_256x256.png:

To update, use the following:
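A sketch along the same lines (names are placeholders, and this assumes Set-vRACatalogItem accepts an IconId parameter):

```powershell
# Upload a new icon, then update the Catalog Item with it
Import-vRAIcon -Id 'centos7icon' -File C:\Icons\CentOS.png
Get-vRACatalogItem -Name 'CentOS 7' | Set-vRACatalogItem -IconId 'centos7icon'
```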