Sidebar on Ed Wilson’s (Scripting Guy) Latest Book

A while back, I was asked to submit a sidebar for Ed Wilson’s (Microsoft’s Hey, Scripting Guy!) latest book, titled “Windows PowerShell Best Practices”, on my use of PowerShell in the environment.

Of course, I couldn’t refuse this offer and proceeded to submit one to him. If you want to check out my sidebar (as well as many other excellent sidebars from other members of the PowerShell community), then click on the link below to pick up a copy of the book. Hint: my sidebar is in the chapter about Modules (Chapter 10).

Windows PowerShell Best Practices (V3)

Posted in News, powershell

Avoiding System.Object[] (or Similar Output) when using Export-Csv

I’ve run into this issue a number of times, and I’ve seen others hit it as well when they pipe data that has a collection of items in one of its properties to a CSV file using Export-Csv. What happens can drive you batty, as in the image below.

[pscustomobject]@{
    First = 'Boe'
    Last = 'Prox'
    ExtraInfo = @(1,3,5,6)
    State = 'NE'
} | Export-Csv -notype Random.csv

image

As you can see, the ExtraInfo column has System.Object[] (or a different object type) instead of 1,3,5,6. This can be frustrating to look at, especially when you have hundreds or thousands of rows of data with multiple columns containing this type of information. Why does this happen? Because anything that goes through Export-Csv is cast as a string before being written, as in this example.

@(1,2,3,5).ToString()

image

There are a few ways you can resolve this so that the collection is unrolled (or expanded, if you will). They require a little bit of extra code, but they help make sure that you are getting human-readable information in the spreadsheet.

Using –Join

One approach to this is to use the –Join operator on those properties which will have a collection of items in it.

[pscustomobject]@{
    First = 'Boe'
    Last = 'Prox'
    ExtraInfo = (@(1,3,5,6) -join ',')
    State = 'NE'
} | Export-Csv -notype Random.csv

image

Looks nice and is presentable to a person looking at the spreadsheet. Depending on the information, this may be all you need. However, I’ve had data with 20 items in a collection, which can make the cell very long, and if there is other punctuation involved (such as when working with IP addresses), it can be harder to read.

[pscustomobject]@{
    First = 'Boe'
    Last = 'Prox'
    ExtraInfo = (@(1,3,5,6) -join ',')
    State = 'NE'
    IPs = (@('111.222.11.22','55.12.89.125','125.48.2.1','145.23.15.89','123.12.1.0') -join ',')
} | Export-Csv -notype Random.csv

image

I don’t know about you, but even if there were a space after each comma, it would still be painful to read. Because of that, I prefer the following approach of adjusting the output of the collection object.

Out-String and Trim()

My favorite approach (which requires a little more code and a little extra work at the end) is to display the expanded collection in the spreadsheet by using a combination of Out-String and Trim().

[pscustomobject]@{
    First = 'Boe'
    Last = 'Prox'
    ExtraInfo = (@(1,3,5,6) | Out-String).Trim()
    State = 'NE'
    IPs = (@('111.222.11.22','55.12.89.125','125.48.2.1','145.23.15.89','123.12.1.0') | Out-String).Trim()
} | Export-Csv -notype Random.csv

image

Ok, first off, you might be wondering where the rest of the data is. This is the part where you have to do a little formatting on the spreadsheet to get all of the data to show up. I typically click the upper-left corner to select everything, then double-click a row boundary to expand all of the cells, and double-click the columns to make sure it all looks good. I also set the vertical alignment to top.

image

After that, I then have this to view:

image

Now the IP addresses and the ExtraInfo show up as they normally would if we expanded them in the console. To me (and this is just my personal opinion), this is much better than the other method. When I prepare my reports, I will typically use the ‘Format as Table’ button in Excel to give it a little more color, and then I ship it off to whoever needs it.

image
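If you generate these reports often, the manual resizing can be scripted as well. Here is a rough sketch (this is not part of my workflow above, and the file path is hypothetical) that uses the Excel COM object to open the CSV, set the vertical alignment to top, and auto-fit the rows and columns:

```powershell
# Sketch only: requires Excel to be installed; the path is hypothetical.
$excel = New-Object -ComObject Excel.Application
$excel.DisplayAlerts = $False
$workbook = $excel.Workbooks.Open('C:\Reports\Random.csv')
$sheet = $workbook.Worksheets.Item(1)
$sheet.UsedRange.VerticalAlignment = -4160   # xlTop
$sheet.UsedRange.EntireColumn.AutoFit() | Out-Null
$sheet.UsedRange.EntireRow.AutoFit() | Out-Null
$workbook.Save()
$excel.Quit()
[Runtime.InteropServices.Marshal]::ReleaseComObject($excel) | Out-Null
```

The alignment constant (-4160) is Excel’s xlTop; releasing the COM object at the end keeps orphaned Excel processes from piling up.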

So there you go! These are just a couple of the available options (I have no doubt that there are others) that you can use to make sure your report is presentable to whoever needs to see it. As always, I am interested in seeing what others have done to get around this hurdle of sending objects with collections as properties to a spreadsheet.

A function to make things easier

I put together a function called Convert-OutputForCsv which serves as a middle man between the query for data and the exporting of that data to a CSV file using Export-Csv.

The function accepts input via the pipeline (the recommended approach) and allows you to determine whether you want the collection expanded into a comma-separated value (comma) or the stacked version I showed above (stack). By default, the data being passed from this function to Export-Csv will not retain its order of properties (I am working on finding a solution to this), but you do have the option of defining the order manually and passing it into the function.

Updated 02 FEB 2014: Removed OutputOrder parameter as it is no longer needed for this function. Bug has been fixed where output order didn’t match the order of the input object.
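To give you an idea of what a function like this does under the hood, here is a simplified sketch of the core idea (this is not the actual implementation of Convert-OutputForCSV; the name and structure here are my own): walk each property of the incoming object and flatten any collection into either a comma-joined string or a stacked Out-String value.

```powershell
# Simplified sketch of the idea only -- not the real Convert-OutputForCSV
Function Convert-CollectionProperty {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]$InputObject,
        [ValidateSet('Comma','Stack')][string]$OutputPropertyType = 'Stack'
    )
    Process {
        $hash = [ordered]@{}
        ForEach ($property in $InputObject.PSObject.Properties) {
            If ($property.Value -is [System.Collections.ICollection]) {
                # Flatten the collection one way or the other
                $hash[$property.Name] = If ($OutputPropertyType -eq 'Comma') {
                    $property.Value -join ','
                } Else {
                    ($property.Value | Out-String).Trim()
                }
            } Else {
                $hash[$property.Name] = $property.Value
            }
        }
        [pscustomobject]$hash
    }
}
```

Using [ordered] for the hash table (V3+) is what keeps the properties in their original order on the way out.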

After dot sourcing the script file (. .\Convert-OutputForCsv.ps1) to load the function into the current session, I will demonstrate an example of how this works.

The following example will gather information about the network adapter and display its properties first without the use of the function and then using the function.

$Output = 'PSComputername','IPAddress', 'IPSubnet',
'DefaultIPGateway','DNSServerSearchOrder'

Get-WMIObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled='True'" |
Select-Object $Output | Export-Csv -NoTypeInformation -Path NIC.csv 

 

image

Pretty much useless at this point. Now let’s run it again and throw my function into the middle.

$Output = 'PSComputername','IPAddress', 'IPSubnet', 'DefaultIPGateway','DNSServerSearchOrder'

Get-WMIObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled='True'" |
Select-Object $Output | Convert-OutputForCSV -OutputOrder $Output | 
Export-Csv -NoTypeInformation -Path NIC.csv   

image

That looks a whole lot better! And just for another example, let’s see this using the comma format as well.

$Output = 'PSComputername','IPAddress', 'IPSubnet', 'DefaultIPGateway','DNSServerSearchOrder'

Get-WMIObject -Class Win32_NetworkAdapterConfiguration -Filter "IPEnabled='True'" |
Select-Object $Output | Convert-OutputForCSV -OutputOrder $Output -OutputPropertyType Comma | 
Export-Csv -NoTypeInformation -Path NIC.csv   

 

image

One more, this time with Get-ACL

$Output = 'Path','Owner', 'Access'

Get-ACL .\.gitconfig | Select-Object Path, Owner, Access, SDDL, Group | 
Convert-OutputForCSV -OutputOrder Path,Owner,Access |
Export-Csv -NoTypeInformation -Path ACL.csv

 

image

Works like a champ! Anything that I didn’t specify in the OutputOrder will just get tossed in at the end in no particular order.

The download for this function is below. Give it a spin and let me know what you think!

Download Convert-OutputForCsv.ps1

Convert-OutputForCSV.ps1

Posted in powershell

Winter Scripting Games 2014 Tip #2: Use #Requires to let PowerShell do the work for you

In Version 2 of PowerShell, you had the ability to use #Requires -Version 2.0 to ensure that your scripts/functions would run only on a specified PowerShell version, preventing folks running an older version from wondering why things weren’t working that well.

image

In this article, I will show you a couple of new additions to the #Requires statement that will make your life easier when writing functions that have specific prerequisites, rather than coding your own checks.

Modules

This was fine, but it only lent itself to scripts that were not version compatible. Fortunately, with Version 3 we gained a better #Requires statement for modules. Rather than adding extra code to check whether a module exists, we can just add the following statement: if the module exists, the code continues to run, and if it doesn’t, a terminating error is thrown and the code stops running.

Let’s try it out!

#Requires -Modules ActiveDirectory

Ok, I have this statement placed right after my commented help block and before the [cmdletbinding()] statement, like so:

Function Get-SomeUser {
    #Requires -Modules ActiveDirectory
    [cmdletbinding()]
    Param ($User)
    Get-ADUser -Identity $User
}

When I dot source the function, nothing happens, meaning that the required module was found.

image

Nothing happens, as expected. The module was found and it was also loaded. So what happens when the module doesn’t exist on the system on which we are calling the function?

image

My DoesntExists module…well…doesn’t actually exist, and when I try to dot source the script to load the function that requires it, it fails, stating that the module is missing and it is therefore unable to proceed any further. Pretty handy if you do not want your code to run without a specific module available.
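As a side note, the -Modules parameter will take more than one module name, and (in later versions of PowerShell, if I recall the syntax correctly) a hashtable module specification that includes a minimum version; treat the hashtable form as something to verify against your own version before relying on it:

```powershell
#Requires -Modules ActiveDirectory, PSScheduledJob
#Requires -Modules @{ ModuleName = 'Pester'; ModuleVersion = '3.0.0' }
```

Either form throws the same terminating error at load time when a requirement is not met.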

Running as an Administrator

I actually wrote a small script back in the day, as well as a Hey, Scripting Guy! article, that detected whether the current user was running PowerShell ‘as an administrator’ before running a script. Another possibility is this:

[bool]((whoami /all) -match "S-1-16-12288")
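Another common pre-V4 check asks .NET directly whether the current token is in the built-in Administrators role:

```powershell
# Returns $True only when the console is elevated ('run as administrator')
$identity  = [Security.Principal.WindowsIdentity]::GetCurrent()
$principal = [Security.Principal.WindowsPrincipal]$identity
$principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)
```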

In PowerShell V4 we were gifted with one of the best little additions, one that can really cut down on the amount of code needed to detect whether the script/function is being ‘run as an administrator’. That little #Requires gem is called RunAsAdministrator.

Function Set-Something {
<#
    .SYNOPSIS
#>
#Requires -RunAsAdministrator
    [cmdletbinding()]
    Param ($Item)

}

image

Now let’s try this when I am not running my console as an administrator and see what happens.

image

Perfect! With just one small line of text, we have made sure that this script can only be run by someone with administrator rights who is running it in a console that was opened using the ‘run as administrator’ context.

By utilizing these two small things, you can ensure that you are letting PowerShell do the work for you and saving time coding checks that are already available to you out of the box!

Posted in powershell, V3, V4, Winter Scripting Games 2014

Winter Scripting Games 2014 Tip #1: Avoid the aliases

Having been a judge for the previous 2 Scripting Games competitions, as well as competing in the 2 before that, I have seen my share of submitted scripts that didn’t quite meet the cut of what I felt were the best scripts. It doesn’t mean that they wouldn’t work out in the real world in a production environment (ok, some wouldn’t), but some were just really hard to read, and others were doing things that I wouldn’t consider good practice.

I’m not judging this year; instead, I am taking on the role of a coach, which gives me the great opportunity to provide input on a submission while the event is ongoing, and also lets me blog about what I am seeing to help everyone out. My goal over the next few weeks is to provide some feedback based on the scripts that I have seen, as well as to bring up some past things that have hindered otherwise excellent scripts. Maybe you will agree with me, maybe you won’t. But if anything, it will make you think about what you might be writing and using in your environment.

I will start this little excursion by talking about the use of aliases in scripts. An alias is a shorthand way to run a command or use a parameter in a script/function. Here is an example:

ls -di | ? {
    $_.LastWriteTime -gt (date).AddMonths(-24)
} | % {
    mv -pat $_.fullname -des C:\Temp -wh 
}

This is probably a little extreme, but I think you can appreciate what I am trying to point out: aliases make the code pretty hard to read, especially if you are just learning PowerShell, or if you are trying to read someone else’s code and make sense of the direction they were going.

If you are just running code ad hoc from the shell, then this is perfectly fine to do, as only you are worried about what is being done and you have no plans of giving it to someone else (maybe you do, but you might just say “run this and don’t ask questions!”).

So back to our little code snippet above. It’s perfectly fine for a console run, but in a script it may present a headache for others. Let’s clean it up so everything is expanded and readable.

Get-ChildItem -Directory | Where-Object {
    $_.LastWriteTime -gt (Get-Date).AddMonths(-24)
} | ForEach-Object {
    Move-Item -Path $_.fullname -Destination C:\Temp -WhatIf 
}

Now I have a better idea of what is being done in this code snippet. This is not about using the fewest characters you can in a script; it is about making the script do what you want while making sure it is readable to whoever happens to be looking at it. This not only helps the next person understand what is going on, but also aids in troubleshooting if the script doesn’t work properly or more development is done on it.
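If you run into an alias you don’t recognize (or want to expand one in a script you inherited), Get-Alias will translate in both directions:

```powershell
# What do these aliases point to?
Get-Alias -Name ls, mv, '%'

# Which aliases exist for a given cmdlet?
Get-Alias -Definition ForEach-Object
```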

Posted in powershell, Winter Scripting Games 2014

Custom PowerShell Objects and Performance Revisited

Way back in my earlier days of blogging (September 2010), I wrote an article that talked about the performance differences between a few different ways you can create a custom PowerShell object (I recommend checking that one out for more information regarding some of the older styles of custom object creation).

At that time we were rocking PowerShell V2 and enjoying all of the great benefits it brought us. Fast forward to now: we are sitting at V4 and have a new way to create a custom PowerShell object, [pscustomobject]. What this brings us is a way to finally have the speed benefits of PSObject…

New-Object PSObject -Property @{
    Name = 'Boe'
    Number = 1
    ID = 007
}

…with the ability of keeping everything in the order we specify like when we use the Select-Object method:

$Object = '' | Select-Object Name, Number, ID
$Object.Name = 'Boe'
$Object.Number = 1
$Object.ID = 007

Another method which was discussed was the use of Add-Member to create objects:

$Object = New-Object PSObject
$Object | Add-Member -MemberType NoteProperty -Name Name -Value Boe -PassThru |
Add-Member -MemberType NoteProperty -Name Number -Value 1 -PassThru |
Add-Member -MemberType NoteProperty -Name ID -Value 007 -PassThru

The New Stuff

I won’t dive any deeper into these, as you can view that old blog post to catch up on the rest, but I will show off [pscustomobject] as well as take all of these for a spin again and show the performance differences between these four contenders.

[pscustomobject] is pretty simple to use:

[pscustomobject]@{
    Name= 'Boe'
    Number = 1
    ID = 007
}

The best part is that it keeps its order and is pretty quick as well. Just how quick is it? Sit back and check out the stats compared to the other methods. If you read the previous blog entry, you saw that Add-Member was by far the slowest method, while Select-Object and New-Object PSObject were neck and neck in speed, with PSObject pulling slightly ahead for the win.

After I initially published this article, I had some suggestions on other items to include in my testing. They are New-Object PSObject -Property ([ordered]@{}) and $prop = [ordered]@{}; [pscustomobject]$prop (labeled [pscustomobject][ordered]). Both of these output the same type of ordered object, meaning that the order in which you supply the data in the hash table is the order in which it will display on the console.
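Written out with the same properties as the earlier examples, those two additions look like this:

```powershell
# New-Object with an ordered hash table
New-Object PSObject -Property ([ordered]@{
    Name = 'Boe'
    Number = 1
    ID = 007
})

# [pscustomobject] cast against a pre-built ordered hash table
$prop = [ordered]@{
    Name = 'Boe'
    Number = 1
    ID = 007
}
[pscustomobject]$prop
```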

Keep in mind that this produces the exact same output as if you would use [pscustomobject] by itself.

image

So how will these newcomers compare with everything else? Let’s find out!

The Approach

If you saw the previous blog post where I talk about performance, you noticed that I had a script that helped determine the fastest approach. Well, that script was not exactly the best written and involved a lot of manual steps. I’ve updated my script to make it easier to set the number of cycles (basically the number of “systems” to run against), the number of properties in the custom object, and the number of times to repeat each operation. The script is available to download from the link at the end of this article if you are interested in running it yourself.
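The heart of the measurement is nothing more exotic than Measure-Command wrapped around each object-creation approach. A stripped-down sketch of the idea (this is not the actual script) looks like this:

```powershell
# Time two of the approaches over the same number of cycles
$cycles = 1000

$pscustom = Measure-Command {
    1..$cycles | ForEach-Object {
        [pscustomobject]@{ Name = 'Boe'; Number = 1; ID = 007 }
    }
}

$addMember = Measure-Command {
    1..$cycles | ForEach-Object {
        $object = New-Object PSObject
        $object | Add-Member -MemberType NoteProperty -Name Name -Value 'Boe'
        $object | Add-Member -MemberType NoteProperty -Name Number -Value 1
        $object
    }
}

'[pscustomobject]: {0:N0} ms; Add-Member: {1:N0} ms' -f $pscustom.TotalMilliseconds, $addMember.TotalMilliseconds
```

The real script repeats each operation multiple times and averages the results, which smooths out any one-off spikes.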

The Results

I started out running against 10 “systems” (cycles) while creating the following numbers of objects (1, 5, 10, 25, 50, 100), and repeated this 5 times. I then upped the cycles to the following values and ran against each one to record the time each type took to complete: 50, 100, 500, 1000, 5000, 10000. The first set of graphs shows everything at its own scale, meaning that the highest time for each cycle is the max on the graph. Because this doesn’t always tell the whole story, I also included a second set of graphs that uses the highest value recorded during the entire run (the 10000-cycle scan had the highest time taken). At the beginning you really won’t see much of anything, but as the cycles go on, you can see which approaches become slower and slower.

So with all of that out of the way, let me show the first set of graphs that show the results of my tests.

image

image

image

image

image

image

 

Both [pscustomobject] and the [pscustomobject][ordered] approach were the fastest ones in PowerShell, and for the most part they are interchangeable (assuming that you are running V3/V4, of course). Add-Member is definitely the slowest approach, much like it was when I first ran this test. The rest of the bunch went back and forth as far as which was faster at any given moment. As I said previously, these numbers are all scaled to their respective highest time. The graphs below show the scale from the 10000-cycle run, which had the highest time returned.

image

image

image

 

image

image

image

This shows that, at the same scale as the 10000-cycle run, the difference in performance doesn’t really come into play until you are running against 500 systems or more. Of course, outside conditions such as network latency and system performance come into play as well, but you get the idea.

Much like the tests I ran back in 2010, Select-Object and New-Object PSObject are practically neck and neck, and the choice really comes down to your preference: having your properties come out in the order they were coded (Select-Object), or a very slightly faster and much better looking (from a coding-style perspective) approach (PSObject).

I hope you enjoyed this article and found the information useful when it comes time to decide which method to take. My personal take: if you are running V3/V4, then you really should be using [pscustomobject], unless you need to add members to an existing object or add other levels of properties, in which case Add-Member should be used at the cost of performance. While [pscustomobject][ordered] was on par with [pscustomobject], they produce the same thing, and if you want to save an extra line of code, you should look at just using [pscustomobject]@{} by itself. But in the end, it is up to you to decide what you feel is the right choice based on your requirements.

Download Script

Technet Script Repository

Posted in powershell