All posts by Jonathan Medd

PSDayUK September 2017

On Friday 22nd September a group of people involved with organising PowerShell User Group events around the UK will be hosting a one-day PowerShell conference in London, PSDayUK. It will take place following on from the two-day WinOps conference (Wednesday 20th – Thursday 21st September) at the same venue, Skills Matter CodeNode.

 

Headlining the day is Steven Murawski – Cloud Developer Advocate at Microsoft. Then there will be two separate tracks of sessions with many great speakers from the PowerShell community:

  • Track 1: PowerShell: The Door to DevOps
  • Track 2: DevOps with PowerShell

You can see the full agenda here.

You can register your interest for the event to get update notifications.

Update: tickets are now available, and as of 4th September 2017 Early Bird pricing is still an option.

 

Changes to a PowerShell Module Manifest not viewable in Git

For some time I have observed that when you make a change to a PowerShell Module Manifest and go to commit those changes in Git, it is not possible to see the actual changes in the file because Git shows it as a binary file. I finally found out why this happens…

Take the following example. I’ve created a module manifest for a module Test with New-ModuleManifest:
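(A minimal sketch; the root module file name is illustrative.)

New-ModuleManifest -Path .\Test.psd1 -RootModule Test.psm1 -ModuleVersion 1.0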

Now I make a change to the manifest, say bumping the version to 1.1:
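(In Test.psd1 that is a one-line edit.)

ModuleVersion = '1.1'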

When I go to commit the file, I don’t see what has actually changed in the file. Git reports that it is a binary file:
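(From the command line, the output looks something like this.)

git diff Test.psd1
# ...
# Binary files a/Test.psd1 and b/Test.psd1 differ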

This is because New-ModuleManifest writes its output with UTF-16 encoding, hence Git sees it as a binary file.

Using the handy encoding functions from here we can check the encoding for the manifest file:
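(If you don't have those functions to hand, a quick check of the byte order mark achieves much the same thing.)

$bytes = [System.IO.File]::ReadAllBytes((Resolve-Path .\Test.psd1))
if ($bytes[0] -eq 0xFF -and $bytes[1] -eq 0xFE) { 'UTF-16 LE (Unicode)' }
elseif ($bytes[0] -eq 0xEF -and $bytes[1] -eq 0xBB -and $bytes[2] -eq 0xBF) { 'UTF-8 with BOM' }
else { 'No BOM (ASCII, ANSI or UTF-8 without BOM)' }

# A manifest fresh from New-ModuleManifest reports UTF-16 LE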

I learnt this from the regular PowerShell community call, where there was a discussion around standardising the encoding for all cmdlets in PowerShell Core 6. In versions prior to 6, different cmdlets use different encoding types, so it seems like a good opportunity to standardise, particularly with the move to make PowerShell cross-platform and Linux having a default encoding of UTF8. There is a lot more information here on the proposal for PowerShell Core and encoding going forward.

So, let’s change the encoding of the manifest file to UTF8:
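(One way to do it from PowerShell itself; note that -Encoding UTF8 writes a BOM in Windows PowerShell.)

(Get-Content .\Test.psd1) | Set-Content .\Test.psd1 -Encoding UTF8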

Now let’s change the manifest file again and see if we can view the changes in Git:

SourceTree:

 

Or in VS Code:

Happy Days 🙂

Thanks to Joey Aiello for sharing this on today’s community call.

 

PowerShell – where, .where or Where?

There are a number of different ways to filter data in PowerShell and the options have expanded since the original release of v1.0. I thought it worth summarising them here, particularly from my experiences of attempting to convey the different choices during PowerShell training I have delivered. Typically these revolve around a dataset and the word 'where' used in one form or another, however…

1) Filter on the left

The number one rule: if possible, use the filtering options of the cmdlet you are originally working with to reduce the size of the dataset. Stay as far to the left of the command as possible, i.e. the first step of a pipeline. That way you avoid generating a dataset larger than necessary and then having to use other 'where' tools to filter it down.

For example, Get-WmiObject has a Filter parameter, which means you can reduce the data returned by the query itself. So if you are looking at NICs with IP enabled, you can be smarter and request a reduced dataset, rather than piping everything to Where-Object.
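A sketch of the idea, pushing the filter into the WQL query itself:

Get-WmiObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled = 'True'"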

Note: not all Get-* cmdlets contain a filter type parameter, so you will need to check the help of the cmdlet you are using to see if it is possible.

2) Where-Object

Available since PowerShell v1.0, Where-Object is often one of the first cmdlets people learn, enabling you to take a dataset and pass it down the PowerShell pipeline for filtering.
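For example, the NIC query again, this time filtered on the client side:

Get-WmiObject -Class Win32_NetworkAdapterConfiguration | Where-Object {$_.IPEnabled -eq $true}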

Note: just to confuse things, sometimes Where-Object is shortened to Where, since Where is an alias for Where-Object!

Where-Object is one of the fundamentals of PowerShell; however, in classes, students new to the language have often struggled with the somewhat fiddly syntax of encapsulating the filter criteria in curly braces, and particularly the use of $_.propertyname to refer to a property of the current object in the pipeline. So step forward…

3) Simplified Syntax

PowerShell version 3 introduced some simplified syntax for certain areas to help alleviate some of this syntax pain. (There is an excellent reference on Keith Hill's blog here.) From then onward it was possible to drop the curly braces and the $_ in a more SQL-style form of where query, which typically seems to be a more natural way to write these things.
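The same filter, minus the braces and the $_:

Get-WmiObject -Class Win32_NetworkAdapterConfiguration | where IPEnabled -eq $true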

Note: just to confuse things again, it is possible to use the full Where-Object in this style too!

The question then became which one did you teach to students? Typically I went with the standard Where-Object first and then moved onto revealing the simplified syntax later. However, often we ended up covering it earlier because many students would write it the SQL style way without even knowing it was possible – it seemed to be the natural way they wanted to do it.

4) .where Method

Introduced in PowerShell v4, the .where method enables filtering a collection when you do not require or want to stream data to the pipeline. Continuing our example, it is possible to do this:
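(That is, calling the method directly on the collected results.)

(Get-WmiObject -Class Win32_NetworkAdapterConfiguration).where({$_.IPEnabled})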

Why would you want to do this, particularly given that it is possibly even fiddlier to write than the original Where-Object syntax, and we now have the even clearer simplified syntax as well? The answer is performance. On a small dataset we are only talking milliseconds difference between pipeline and non-pipeline:
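(A rough comparison with Measure-Command; your timings will vary.)

Measure-Command { Get-WmiObject Win32_NetworkAdapterConfiguration | Where-Object {$_.IPEnabled} }
Measure-Command { (Get-WmiObject Win32_NetworkAdapterConfiguration).where({$_.IPEnabled}) }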

However, this can of course be a significant difference with a large dataset. In this contrived example, 6 seconds vs 2 seconds:
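(Something along these lines reproduces that kind of gap; the dataset here is illustrative.)

$data = 1..5000000
Measure-Command { $data | Where-Object { $_ -gt 2500000 } }   # pipeline
Measure-Command { $data.where({ $_ -gt 2500000 }) }           # .where method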

Depending on what you are doing, streaming vs non-streaming may be preferable for you, so it is worth trying out each in your scenario to determine the best option.

The where method also includes some additional options (known as modes), documented here, which are quite nice.

Similar to Select-Object, there is a First and Last mode. So:
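(For example, with a simple range.)

(1..10).where({$_ -gt 5}, 'First')     # 6
(1..10).where({$_ -gt 5}, 'First', 2)  # 6, 7
(1..10).where({$_ -gt 5}, 'Last')      # 10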

Also interesting is the Split mode, which effectively splits the collection into two parts: the first containing elements which meet the condition, and the second those which don't:
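(For example, assigning each part to its own variable.)

$enabled, $disabled = (Get-WmiObject Win32_NetworkAdapterConfiguration).where({$_.IPEnabled}, 'Split')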

 

Modifying Icons in vRA with PowerShell

There have recently been a number of blog posts around modifying the All Services icon in vRA, and how to change it programmatically:

A feature request to do the same thing had been open in PowervRA for a while, so I figured it was a good time to add it, allowing the same icon change to be made from PowerShell. We decided to take a slightly more generic approach than just the All Services icon, making it possible to upload any icon and use it to modify any service or other element within vRA that supports changing its icon via the API.

So in release 2.1 of PowervRA you will find some new functions for working with icons:
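(The three used in the examples below.)

  • Get-vRAIcon
  • Import-vRAIcon
  • Remove-vRAIcon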

Modify the All Services Icon

Note: Modifying the All Services icon will affect all vRA tenants and requires admin permission to the default tenant. Ensure you are comfortable with this before going ahead!

The icon for All Services is known within vRA as cafe_default_icon_genericAllServices. You can find out more about it with Get-vRAIcon:
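(A sketch; the -Id parameter is an assumption, so check Get-Help Get-vRAIcon.)

Get-vRAIcon -Id cafe_default_icon_genericAllServices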

To update it, use Import-vRAIcon. The API documentation lets us know that it will either create a new icon or overwrite an existing one. Since the All Services icon already exists, it will be overwritten when you import a new one:
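(Something along these lines, with a hypothetical icon file; the parameter names are assumptions, so verify them with Get-Help Import-vRAIcon.)

Import-vRAIcon -Id cafe_default_icon_genericAllServices -File C:\Icons\NewAllServices.png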

You can also set it back to the original brick icon with Remove-vRAIcon, since the API description states that for deleting an icon which is one of the default system icons, it will be reverted to its default state:
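(Again assuming the -Id parameter.)

Remove-vRAIcon -Id cafe_default_icon_genericAllServices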

Modify a Custom Service

Note: for this piece you will need admin permissions in the Tenant the Service belongs to

Modifying the icon for a service you have created yourself is a very straightforward process: import a new icon to the Catalog Service, then update the existing service with the new icon. In this example we'll modify the icon for Service02.

We can find the name of the currently used icon with Get-vRAService and see that the default icon name is cafe_default_icon_genericService:
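(A sketch; the exact property name on the returned object may differ.)

Get-vRAService -Name Service02 | Select-Object Name, IconId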

To change it, do the following:
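(A sketch of the two steps; Set-vRAService and its -IconId parameter are assumptions on my part, so check Get-Command -Module PowervRA for the exact function names.)

# Upload the replacement icon (hypothetical id and file)
$icon = Import-vRAIcon -Id cafe_icon_Service02 -File C:\Icons\Service02.png

# Update the service to use it (cmdlet and parameter names assumed)
Get-vRAService -Name Service02 | Set-vRAService -IconId $icon.Id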

Modify a Catalog Item

Note: for this piece you will need admin permissions in the Tenant the Catalog Item belongs to

As mentioned, we're not restricted to just modifying Service icons; other icons can be changed too. For example, we can update the icon of a Catalog Item. Again, upload an icon first, then update the Catalog Item with it.

We can find the name of the currently used icon with Get-vRACatalogItem and see that the icon name is vcoIcon_256x256.png:
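(A sketch; the catalog item name is hypothetical and the property name may differ.)

Get-vRACatalogItem -Name CentOS7 | Select-Object Name, IconId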

To update, use the following:
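(The same upload-then-update pattern; Set-vRACatalogItem and -IconId are again assumptions to verify against your PowervRA version.)

$icon = Import-vRAIcon -Id cafe_icon_CentOS7 -File C:\Icons\CentOS7.png
Get-vRACatalogItem -Name CentOS7 | Set-vRACatalogItem -IconId $icon.Id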

UK South Coast PowerShell User Group

There are a number of PowerShell User Groups in the UK, but unfortunately none that are easy for me to get to, given my home location and work commitments. So I am gathering interest in a UK South Coast PowerShell User Group, for coders of all experience levels.

The purpose of this initial meetup is to test the viability of running a PowerShell meetup in Southampton on a regular basis. Hopefully we will get enough interest to take this forward and start running sessions with PowerShell content for everyone to learn from.

If you live in the area and have an interest in PowerShell, we’d love to see you there.

 

Preparing for 70-533: Implementing Microsoft Azure Infrastructure Solutions

I’m not a massive fan of certifications, but I understand why people do them and the benefits which can arise from the whole process of achieving them.  I did a lot of them in the past when my career was more geared around infrastructure work rather than coding. However, I wanted to learn about Microsoft Azure and since it is such a large topic to get to grips with, decided that pursuing the 70-533: Implementing Microsoft Azure Infrastructure Solutions exam would be a good way to focus on learning an initial subset of what is available to work with in Azure.

Currently, as of 20/03/2017, there are a couple of Azure exam bundle deals available which are worth checking out. Basically, for just under the cost of taking the exam on its own, you get a free resit voucher and a MeasureUp practice test thrown in as part of the bundle. I found the MeasureUp test a pretty good barometer for where I was with my learning, and ended up going over all of the topics again to gain the better understanding the practice test highlighted was needed. These practice tests can be a bit hit and miss in my past experience, but I thought this one was a pretty good indicator of what the actual exam turned out to be like.

 

Objectives

I used a lot of different resources to prepare for the exam; the exam homepage is the obvious first place to start, so you are aware of which areas of Azure are being tested.

https://buildazure.com has some info about how the exam changed towards the end of 2016 to be more focused on ARM rather than ASM:

Azure Infrastructure Exam (70-533) Gets ARM Refresh

I would suggest that being comfortable with PowerShell and JSON is a prerequisite before attempting any of the training.

Resources

Once familiar with the objectives I used some online training as the largest part of my learning experience:

Implementing Microsoft Azure Infrastructure Solutions (70-533) from Pluralsight; some of it is a little out of date given the above exam changes, but still a very useful starting point

Azure Resource Manager Deep Dive from Pluralsight

I also watched a few chapters from Architecting Microsoft Azure Solutions (70-534) on Pluralsight. Even though it was for a different exam, there is still a lot of crossover and it was useful for a review of topics I had already covered.

Having then started on the MeasureUp practice test and realised more work was required, I tried out the free Microsoft Azure training mentioned as part of the bundle offering above, which is free to anyone, even without signing up to one of the bundles. If you don't have access to Pluralsight then this would be a good place to start; in my case I found it useful to revisit topics I had already learnt. The site was also good for the extra practice questions it contains.

Craig Kilborn has some useful info on his site:

70-533: Implementing Microsoft Azure Infrastructure Solutions – Prep & Exam Experience

The Microsoft site Channel 9 has a large Azure section of videos to choose from. In particular, finding relevant videos in the Azure Fridays series was good as a refresher as the exam date approached, for example Azure Scale Sets or Azure CDN.

Finally, I found it really useful to team up with a colleague who is going for the same exam and regularly review things learnt and compare notes – I learnt a lot from doing this and would suggest trying the same if you can.

Last Minute Preparation

As some last-minute preparation for the exam I committed to memory as many as possible of the key facts around Azure Web Apps, Azure SQL and Azure VMs, such as those in the below screenshots from the Azure portal for Web Apps. Then at the beginning of the exam I wrote down as much as I could remember on the materials provided, before tackling the questions.

This is part of the reason why I don't like certs, since to me it is fairly pointless to memorise something that could easily be looked up if necessary. However, it was worth it for having a good awareness of, for instance, which Web App tier would be suitable for a described application type.

 

You can obtain similar information from the portal for Azure SQL and Azure VMs.

And then: Happy Days 🙂

powershell.exe version parameter

PowerShell v6 Alpha 17 has been released and contains an interesting change with the version parameter when applied to powershell.exe. Some discussion around it can be found here and here.

When using a Linux-based shell, supplying the version parameter returns the version of the shell:
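(For example, with Bash.)

bash --version
# GNU bash, version ... (version and licence details follow)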

You can now do a similar thing in PowerShell Core:
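(A sketch; the exact output format of the alpha builds may vary.)

powershell -version
# PowerShell v6.0.0-alpha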

Note that using $psversiontable still gives you fuller information:
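(For example.)

$PSVersionTable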

This is slightly different from pre-v6 PowerShell on Windows, where the version parameter requires an argument.

For example, you can start PowerShell version 2 from a PowerShell version 5.1 console:
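(Run from the 5.1 console.)

powershell.exe -Version 2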

 

There's discussion in the GitHub issues about whether that particular functionality of running different PowerShell versions will be taken forward in PowerShell Core.

Create an Azure Storage Blob Container with PowerShell

My observations so far with the Azure PowerShell experience have been somewhat mixed and the example in this post will give you a flavour of that. I wanted to create a new Storage Blob Container via PowerShell, rather than through the below process in the web portal:

I looked for cmdlets which could potentially be used:
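(Searching along these lines.)

Get-Command -Name *StorageContainer*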

However, it returned nothing from the AzureRM module, only the Azure module. (There are currently two modules you need to use when working with Azure; some more info here and here.) To say this can get confusing when you are new to the topic is an understatement; hopefully this situation is going to improve significantly ASAP.

So it looks like I need to use New-AzureStorageContainer from the original Azure module; however, there do not appear to be any examples which show how to create the container in the desired place, i.e. a particular Resource Group and Storage Account.

So far I have found two different ways to get this done:

1) Set the current Storage Account

I found a StackOverflow post with an example. You need to first of all call a cmdlet from the AzureRM module to set the current Storage Account. (Note the weird response you get from running the command: just a string with the name of the current Storage Account, not an object representing it.)
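(A sketch, with a hypothetical resource group name; jmtest01 is the storage account name used later in this post.)

Set-AzureRmCurrentStorageAccount -ResourceGroupName jmtest-rg -Name jmtest01
# Returns the plain string: jmtest01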

Now I can use New-AzureStorageContainer and it will get created in the correct place:
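(For example, with a hypothetical container name.)

New-AzureStorageContainer -Name testcontainer -Permission Off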

2) Use Storage Account Keys

Within a Storage Account are two Access keys which can be used for automation:

We only need one of the keys, but the following will retrieve both and then we pick out the first key value:
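(Along these lines; the property names on the returned key objects have changed between AzureRM versions, so check with Get-Member.)

$keys = Get-AzureRmStorageAccountKey -ResourceGroupName jmtest-rg -Name jmtest01
$key = $keys[0].Value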

Now using one of the key values we can set the Storage Context:
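(Using the key value retrieved above.)

$context = New-AzureStorageContext -StorageAccountName jmtest01 -StorageAccountKey $key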

Note: the above doesn't actually seem to perform any validation on whether a Storage Account with that name exists. I initially had a typo in the name, and the next command then generated the error: New-AzureStorageContainer : The remote name could not be resolved: 'jmtest01.blob.core.windows.net'

Now if we have used the correct name for an existing Storage Account we can create the Storage Container using the generated Storage Context:
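(Passing the generated context to the container creation.)

New-AzureStorageContainer -Name testcontainer -Context $context -Permission Off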

Please leave a comment if I have missed an easier way to do it, I’d love to know 🙂

New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name

New-AzureRmResourceGroupDeployment generates the following error:

New-AzureRmResourceGroupDeployment `
-Name $resourceDeploymentName `
-ResourceGroupName $resourceGroupName `
-TemplateFile $template `
@additionalParameters `
-Verbose -Force
New-AzureRmResourceGroupDeployment : A parameter cannot be found that matches parameter name 'xxxxxxxxxxx'.
At line:5 char:5
+ @additionalParameters `
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [New-AzureRmResourceGroupDeployment], ParameterBindingException
+ FullyQualifiedErrorId : NamedParameterNotFound,Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDeploymentCmdlet

This kind of error seems fairly in tune with the experience I have had so far with the AzureRM PowerShell module, i.e. the error message has seemingly nothing to do with the actual problem. While I spent a fair amount of time checking the parameter 'xxxxxxxxxxx' in the ARM JSON template and found nothing wrong, it turned out that a syntax error elsewhere in the file was causing the problem. An error message pointing to that kind of problem would have been a lot more helpful!

Solve the syntax issue and this error goes away.