Scripting Games 2013: Event 2 Notes

I spent some time last week and this weekend compiling a list of notes on what I have seen in the Event 2 submissions that could use improvement. I touched on some items in my previous article, where I picked out submissions that I liked and didn’t quite like, but I wanted to cover a few more things. Some of this feels like a repeat of last week, and even of last year’s games, but that is OK. This is all about learning, and as long as everyone takes in what the judges have been writing, there will be nothing but great improvement over the course of the games.

When is a one-liner not a one-liner?

The answer is: when someone uses a semicolon “;” and assumes the result still counts as a one-liner. Sorry folks, but this doesn’t work. A one-liner is a single, unbroken pipeline of commands. In PowerShell, a semicolon is a statement separator, which means the “one-liner” being put together has now become several statements crammed onto one line.

Here is what not to do:

$OS = Get-WMIObject -Class Win32_OperatingSystem ;$ComputerSystem = Get-WmiObject -Class Win32_ComputerSystem; New-Object PSObject -Property @{Computername=$ComputerSystem.Name;OS=$Os.Caption}

It looks like a one-liner, but with the semicolons, it is nothing more than a one-liner lie. In fact, the only legitimate semicolons here are the ones inside the hash table passed to New-Object, where they separate key/value pairs.

A one-liner is not required for the beginner events of the Scripting Games. It is nice to have, but it is also a risk: if you try to force one, it could come back to bite you when it comes to voting. My advice: split the code across multiple lines, because not only will it prevent possible issues in the code, it is going to look tons better!
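For illustration, here is the same logic from the example above split across lines — one statement per line, no semicolons needed:

```powershell
# Each statement gets its own line; the hash table keys get their own lines too
$OS = Get-WmiObject -Class Win32_OperatingSystem
$ComputerSystem = Get-WmiObject -Class Win32_ComputerSystem
New-Object PSObject -Property @{
    Computername = $ComputerSystem.Name
    OS           = $OS.Caption
}
```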

Formatting your code

Moving on from one-liners to proper code formatting, so code can easily be read not only by the voting community, but also by whoever has to read your production scripts six months from now, whether that is someone else or the future you. Ask yourself, “Will I know exactly what is going on here in six months?” If you cannot answer that question, or the answer is no, then stop, go back, and take the time to re-format the code so it is more readable.

Here are some bullet points to think about when writing code:

  • Space out your code; separate it into code blocks so it is not just one line of code after another.
  • Use the proper line continuation characters: a comma “,”, an opening curly brace “{“ or a pipe symbol “|” at the end of a line; please, for the love of everything good, do not use a backtick “`” as a line continuation. That is also a lie and must be stopped!
  • Indent your code where needed. Examples would be after a natural line break (first bullet point) or inside a ForEach/Where block or an If/Else block. Doing so will make your code easier to read and organize.
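To illustrate the natural line breaks (this is just a sketch; the cmdlet and filter are arbitrary):

```powershell
# The trailing pipe and comma let the parser continue on the next line;
# no backtick in sight.
Get-WmiObject -Class Win32_LogicalDisk -Filter 'DriveType=3' |
    Where-Object { $_.FreeSpace -lt 10GB } |
    Select-Object -Property DeviceID,
        @{Name = 'FreeGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 2) }}
```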

Splatting instead of back ticks for parameters

Splatting is not only great for working with various cmdlets and making input a lot easier inside a loop; it is also a smart way to make your code easier to read and to avoid the use of backticks when working with multiple parameters.

Send-MailMessage -To user@domain.com `
                 -From user1@domain.com `
                 -Subject Test `
                 -SMTPServer server.domain.com `
                 -Body "This is a test"

This is not a good way of handling the parameters in a script. Sure, it is better than having them all on one line and navigating that mess, but it still doesn’t follow good practice. This is where splatting comes into play.

$email = @{
    To = 'user@domain.com'
    From = 'user1@domain.com'
    Subject = 'Test'
    SMTPServer = 'server.domain.com'
}
...
#Some code
...
#Add the output into the body of the email
$email.Body = $outputText
Send-MailMessage @email

I can define most of my parameters for Send-MailMessage up front, and once I have the data for the Body, I can add it to the $email hash table before calling Send-MailMessage. Supplying the hash table to the cmdlet (using “@” to splat) runs the command just like usual.

Enough with $ErrorActionPreference

I have seen a lot of people setting $ErrorActionPreference up front in their submissions as a catch-all for error handling, some to Stop and some to Continue (which is already the default). This preference variable should never be touched in a script unless there is an excellent reason to do so (so far there hasn’t been one). Every cmdlet has a common parameter called -ErrorAction that accepts Continue, Inquire, Stop and SilentlyContinue. Unless you need to suppress error messages for some reason, I would recommend using -ErrorAction Stop and putting the command in a Try/Catch statement to handle any errors that come up.
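As a sketch of what I mean (the $Computer variable here is just for illustration):

```powershell
Try {
    # -ErrorAction Stop turns any error into a terminating one, so Catch sees it
    $OS = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $Computer -ErrorAction Stop
    $OS.Caption
} Catch {
    # Handle the failure per-computer instead of changing a global preference
    Write-Warning ("{0}: {1}" -f $Computer, $_.Exception.Message)
}
```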

Format* fail

The last thing that I want to go over is the use of the Format-* cmdlets as a final step to output data to the user. Bad, bad idea! What you are doing is ruining the object for the user: they can no longer manipulate the output with Sort-Object or Where-Object, and they cannot meaningfully send it to a file (it can be written to a .txt file, but it will look just like what is on screen, truncation and all).

Get-Service | Format-Table | Sort Name

[Screenshot: the resulting output, not sorted]

Doesn’t work out so well, does it? If only the Format-Table wasn’t there, I could have sorted the output.

Maybe this will work for a CSV file instead?

Get-Service | Format-Table | Export-Csv -NoTypeInformation Report.csv

[Screenshot: the command runs with no errors]

No errors, so it had to have worked correctly, right? Wrong!

[Screenshot: the contents of Report.csv, full of unreadable format data]

If you can read this, then you are a better person than me. The same goes for Format-List as well. Format-* cmdlets should never be used in a script; leave that decision to the user. It will save you the hassle of being asked why the output stinks, and it will also save you from losing points on a submission.
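The safe pattern is to emit plain objects and leave formatting for the very end, if it happens at all — something along these lines:

```powershell
# Sort while you still have real objects; the export works as expected
Get-Service | Sort-Object -Property Name | Export-Csv -NoTypeInformation Report.csv

# If you want a table on screen, make Format-Table the *last* command in the pipeline
Get-Service | Sort-Object -Property Name | Format-Table -AutoSize
```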

That’s it for now. Hopefully you can use these tips not only for the Scripting Games but also for “real world” scripts in your environment as well!

Posted in 2013 Scripting Games Judges Notes, powershell, Scripting Games 2013 | Tagged , , | 3 Comments

Scripting Games 2013: Event 2 ‘Favorite’ and ‘Not So Favorite’

Event 2 is in the books, and with that it is time to take a look at all of the scripts submitted and make the difficult decisions as to which ones I liked and which ones I didn’t quite like. Remember, just because a script landed on my ‘Not So Favorite’ list doesn’t mean it was terrible; it just means I felt there were some things here and there that could have been approached a little differently.

So without further ado, let’s dig into the submissions!

Advanced Category – Not So Favorite Submission

<#
.SYNOPSIS
    This script runs an inventory of OS, CPU cores and installed RAM on a list of computers
.DESCRIPTION
    .
.PARAMETER FilePath
    The path to the computer list text file
 
 
.EXAMPLE
        .\inventory.ps1 C:\Computernames.txt
.NOTES
    Author: Posh_London
    Date:   May 2013    
#>
 
# == Define params == #
 
param([Parameter(Mandatory=$true)][string]$filepath)
 
$output = @()
$Servers = gc $filepath
$Servers | % {
	$processor = gwmi win32_processor -computername $_
	$os = gwmi win32_operatingsystem -computername $_
	$m = gwmi win32_computersystem -computername $_
	$memGB = $m.totalphysicalmemory/1gb
	$memory = [math]::round($memGB)
	$osout = $os.caption
	$cores = $processor.numberofcores
	$servername = $m.name
	$output +="$ServerName   $osout    $cores CPU Cores       $memory GB RAM"
	$output +="`n"
 
	}
 
write-host $output -foregroundcolor "yellow"
  1. Line 1: Help text is lacking good examples and a Description.
  2. Line 19: FilePath is not really a good choice of parameter, especially when you might want to pass computer names to it. Computername would be the better choice in this case.
  3. Line 19: The lack of pipeline support really hurts the submission’s ability to pass content from a text file through to the script.
  4. This really should be a function if it is in the Advanced category; that makes it more of a usable tool.
  5. Line 19: Instead of making the parameter Mandatory, consider setting a default value such as $Env:Computername.
  6. Line 22,23: Aliases are bad news! Don’t use them.
  7. Line 23-37: No object is being output at all! In fact, because Write-Host is being used, you can’t redirect this to a file, sort it, or otherwise manipulate the output. I recommend the author read up on using New-Object to create custom objects that can then be exported to files or manipulated.
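As an example of what point 7 is getting at, here is one possible sketch (not the submitter’s code) that outputs real objects instead of Write-Host text:

```powershell
ForEach ($Computer in (Get-Content $FilePath)) {
    $OS  = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $Computer
    $CS  = Get-WmiObject -Class Win32_ComputerSystem  -ComputerName $Computer
    $CPU = Get-WmiObject -Class Win32_Processor       -ComputerName $Computer
    # A real object: sortable, filterable, exportable by whoever runs the script
    New-Object -TypeName PSObject -Property @{
        Computername = $CS.Name
        OS           = $OS.Caption
        Cores        = ($CPU | Measure-Object -Property NumberOfCores -Sum).Sum
        MemoryGB     = [math]::Round($CS.TotalPhysicalMemory / 1GB)
    }
}
```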

Advanced Category – Favorite Submission

<#
.SYNOPSIS
   Get inventory data for specified computer system.
.DESCRIPTION
   Get inventory data for provided host using wmi.
   Data proccessing use multithreading and support using timeouts in case of wmi problems.
   Target computer system must be reacheble using ICMP Echo.
   Provide ComputerName specified by user and HostName used by OS. Also provide OS version, CPU and memory info.
.PARAMETER ComputerName
   Specifies the target computer for data query.
.PARAMETER ThrottleLimit
   Specifies the maximum number of WMI operations that can be executed simultaneously
.PARAMETER Timeout
   Specifies the maximum time in second command can run in background before terminating this thread.
.PARAMETER ShowProgress
   Show progress bar information
.EXAMPLE
   PS > Get-AssetInfo -ComputerName test1
 
   ComputerName : hp-test1
   OSCaption    : Microsoft Windows 8 Enterprise
   Memory       : 5,93 GB
   Cores        : 2
   Sockets      : 1
 
   Description
   -----------
   Query information ablout computer test1
.EXAMPLE
   PS > Get-AssetInfo -ComputerName test1 -Credential (get-credential) | fromat-list * -force
 
   ComputerName   : hp-test1
   OSCaption      : Microsoft Windows 8 Enterprise
   OSVersion      : 6.2.9200
   Cores          : 2
   OSServicePack  : 0
   Memory         : 5,93 GB
   Sockets        : 1
   PSComputerName : test1
   Description
   -----------
   Query information ablout computer test1 using alternate credentials
.EXAMPLE
   PS > get-content C:\complist.txt | Get-AssetInfo -ThrottleLimit 100 -Timeout 60 -ShowProgress
 
   Description
   -----------
   Query information about computers in file C:\complist.txt using 100 thread at time with 60 sec timeout and showing progressbar
.NOTES
   Required: Powershell 2.0
   Info: WMI prefered over CIM as there no speed advantage using cimsessions in multitheating against old systems.
#>
function Get-AssetInfo
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, 
                   ValueFromPipeline=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [ValidateNotNullOrEmpty()]
        [Alias('DNSHostName','PSComputerName')]
        [string[]]
        $ComputerName,
 
        [Parameter(Position=1)]
        [ValidateRange(1,65535)]
        [int32]
        $ThrottleLimit = 32,
 
        [Parameter(Position=2)]
        [ValidateRange(1,65535)]
        [int32]
        $Timeout = 120,
 
        [Parameter(Position=3)]
        [switch]
        $ShowProgress,
 
        [Parameter(Position=4)]
        [System.Management.Automation.Credential()]
        $Credential = [System.Management.Automation.PSCredential]::Empty
    )
 
    Begin
    {
 
        Write-Verbose -Message 'Creating local hostname list'
        $IPAddresses = [net.dns]::GetHostAddresses($env:COMPUTERNAME) | Select-Object -ExpandProperty IpAddressToString
        $HostNames = $IPAddresses | ForEach-Object {
            try {
                [net.dns]::GetHostByAddress($_)
            } catch {
                # We do not care about errors here...
            }
        } | Select-Object -ExpandProperty HostName -Unique
        $LocalHost = @('', '.', 'localhost', $env:COMPUTERNAME, '::1', '127.0.0.1') + $IPAddresses + $HostNames
 
        Write-Verbose -Message 'Creating initial variables'
        $runspacetimers = [HashTable]::Synchronized(@{})
        $runspaces = New-Object -TypeName System.Collections.ArrayList
        $bgRunspaceCounter = 0
 
        Write-Verbose -Message 'Creating Initial Session State'
        $iss = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
        foreach ($ExternalVariable in ('runspacetimers', 'Credential', 'LocalHost'))
        {
            Write-Verbose -Message "Adding variable $ExternalVariable to initial session state"
            $iss.Variables.Add((New-Object -TypeName System.Management.Automation.Runspaces.SessionStateVariableEntry -ArgumentList $ExternalVariable, (Get-Variable -Name $ExternalVariable -ValueOnly), ''))
        }
 
        Write-Verbose -Message 'Creating runspace pool'
        $rp = [System.Management.Automation.Runspaces.RunspaceFactory]::CreateRunspacePool(1, $ThrottleLimit, $iss, $Host)
        $rp.Open()
 
        Write-Verbose -Message 'Defining background runspaces scriptblock'
        $ScriptBlock = {
            [CmdletBinding()]
            Param
            (
                [Parameter(Position=0)]
                [string]
                $ComputerName,
 
                [Parameter(Position=1)]
                [int]
                $bgRunspaceID
            )
            $runspacetimers.$bgRunspaceID = Get-Date
 
            if (Test-Connection -ComputerName $ComputerName -Quiet -Count 1 -ErrorAction SilentlyContinue)
            {
                try
                {
                    Write-Verbose -Message "WMI Query: $ComputerName"
                    $WMIHast = @{
                        ComputerName = $ComputerName
                        ErrorAction = 'Stop'
                    }
                    if ($LocalHost -notcontains $ComputerName)
                    {
                        $WMIHast.Credential = $Credential
                    }
 
                    $WMICompSystem = Get-WmiObject @WMIHast -Class Win32_ComputerSystem
                    $WMIOS = Get-WmiObject @WMIHast -Class Win32_OperatingSystem
                    $WMIProc = Get-WmiObject @WMIHast -Class Win32_Processor
 
                    if (@($WMIProc)[0].NumberOfCores) #Modern OS
                    {
                        $Sockets = @($WMIProc).Count
                        $Cores = ($WMIProc | Measure-Object -Property NumberOfLogicalProcessors -Sum).Sum
                    }
                    else #Legacy OS
                    {
                        $Sockets = @($WMIProc | Select-Object -Property SocketDesignation -Unique).Count
                        $Cores = @($WMIProc).Count
                    }
 
                    #region Create custom output object
                    #Due to some bug setting scriptblock directly as value can cause 'NullReferenceException' in v3 host
                    $MethodOptions = @{
                        Name = 'ToString'
                        MemberType = 'ScriptMethod'
                        PassThru = $true
                        Force = $true
                        Value = [ScriptBlock]::Create(@"
                            "{0:N1} {1}" -f @(
                                switch -Regex ([math]::Log(`$this,1024)) {
                                    ^0 {
                                        (`$this / 1), ' B'
                                    }
                                    ^1 {
                                        (`$this / 1KB), 'KB'
                                    }
                                    ^2 {
                                        (`$this / 1MB), 'MB'
                                    }
                                    ^3 {
                                        (`$this / 1GB), 'GB'
                                    }
                                    ^4 {
                                        (`$this / 1TB), 'TB'
                                    }
                                    default {
                                        (`$this / 1PB), 'PB'
                                    }
                                }
                            )
"@
                        )
                    }
 
                    $myObject = New-Object -TypeName PSObject -Property @{
                        'PSComputerName' = $ComputerName
                        'ComputerName' = $WMICompSystem.DNSHostName
                        'OSCaption' = $WMIOS.Caption
                        'OSServicePack' = $WMIOS.ServicePackMajorVersion
                        'OSVersion' = $WMIOS.Version
                        'Memory' = $WMICompSystem.TotalPhysicalMemory | Add-Member @MethodOptions
                        'Cores' = $Cores
                        'Sockets' = $Sockets
                    }
 
                    $myObject.PSObject.TypeNames.Insert(0,'My.Asset.Info')
                    $defaultProperties = @('ComputerName','OSCaption', 'Memory', 'Cores', 'Sockets')
                    $defaultDisplayPropertySet = New-Object System.Management.Automation.PSPropertySet(‘DefaultDisplayPropertySet’,[string[]]$defaultProperties)
                    $PSStandardMembers = [System.Management.Automation.PSMemberInfo[]]@($defaultDisplayPropertySet)
                    $myObject | Add-Member MemberSet PSStandardMembers $PSStandardMembers
                    #endregion
 
                    Write-Output -InputObject $myObject
                }
                catch
                {
                    Write-Warning -Message ('{0}: {1}' -f $ComputerName, $_.Exception.Message)
                }
            }
            else
            {
                Write-Warning -Message ("{0}: Unavailable" -f $ComputerName)
            }
        }
 
        function Get-Result
        {
            [CmdletBinding()]
            Param
            (
                [switch]$Wait
            )
            do
            {
                $More = $false
                foreach ($runspace in $runspaces)
                {
                    $StartTime = $runspacetimers.($runspace.ID)
                    if ($runspace.Handle.isCompleted)
                    {
                        Write-Verbose -Message ('Thread done for {0}' -f $runspace.IObject)
                        $runspace.PowerShell.EndInvoke($runspace.Handle)
                        $runspace.PowerShell.Dispose()
                        $runspace.PowerShell = $null
                        $runspace.Handle = $null
                    }
                    elseif ($runspace.Handle -ne $null)
                    {
                        $More = $true
                    }
                    if ($Timeout -and $StartTime)
                    {
                        if ((New-TimeSpan -Start $StartTime).TotalSeconds -ge $Timeout -and $runspace.PowerShell)
                        {
                            Write-Warning -Message ('Timeout {0}' -f $runspace.IObject)
                            $runspace.PowerShell.Dispose()
                            $runspace.PowerShell = $null
                            $runspace.Handle = $null
                        }
                    }
                }
                if ($More -and $PSBoundParameters['Wait'])
                {
                    Start-Sleep -Milliseconds 100
                }
                foreach ($threat in $runspaces.Clone())
                {
                    if ( -not $threat.handle)
                    {
                        Write-Verbose -Message ('Removing {0} from runspaces' -f $threat.IObject)
                        $runspaces.Remove($threat)
                    }
                }
                if ($ShowProgress)
                {
                    $ProgressSplatting = @{
                        Activity = 'Getting asset info'
                        Status = '{0} of {1} total threads done' -f ($bgRunspaceCounter - $runspaces.Count), $bgRunspaceCounter
                        PercentComplete = ($bgRunspaceCounter - $runspaces.Count) / $bgRunspaceCounter * 100
                    }
                    Write-Progress @ProgressSplatting
                }
            }
            while ($More -and $PSBoundParameters['Wait'])
        }
    }
    Process
    {
        foreach ($Computer in $ComputerName)
        {
            $bgRunspaceCounter++
            $psCMD = [System.Management.Automation.PowerShell]::Create().AddScript($ScriptBlock).AddParameter('bgRunspaceID',$bgRunspaceCounter).AddParameter('ComputerName',$Computer)
            $psCMD.RunspacePool = $rp
 
            Write-Verbose -Message ('Starting {0}' -f $Computer)
            [void]$runspaces.Add(@{
                Handle = $psCMD.BeginInvoke()
                PowerShell = $psCMD
                IObject = $Computer
                ID = $bgRunspaceCounter
           })
           Get-Result
        }
    }
 
    End
    {
        Get-Result -Wait
        if ($ShowProgress)
        {
            Write-Progress -Activity 'Getting asset info' -Status 'Done' -Completed
        }
        Write-Verbose -Message "Closing runspace pool"
        $rp.Close()
        $rp.Dispose()
    }
}
  1. Excellent use of help. Shows multiple examples of using this function and clearly defines the parameters.
  2. Line 58-69: Great use of parameter attributes to handle various situations. I like the use of [Alias()] with the ValueFromPipelineByPropertyName attribute for the pipeline. This allows you to pipe a call from Get-ADComputer directly into this function, and it will chug right along to perform the query.
  3. Line 82,83: Great handling of the Credential parameter. By doing it this way, you have several ways to supply the credential input (domain\username, username, a PSCredential object, or just the -Credential parameter by itself to prompt for input).
  4. The use of [net.dns]::GetHostByAddress() was unnecessary, as the WMI call would return a usable hostname for the object output.
  5. Kudos for the use of custom runspaces to make the query more efficient. Great job handling the runspaces, and the implementation of a timer to deal with runspaces that might be hung up.
  6. Handling the credential-against-local-system issue was nicely done to avoid problems.
  7. The use of splatted hash tables to pass parameters into cmdlets is great.
  8. Line 206-210: Smart use of enhancing the output object. I haven’t seen much of this done before.

 

Beginner Category – Not So Favorite Submission

#Pulls IPs from txt file.  Gets Machine Name, Number of Physical CPUs, Number of cores per CPU, RAM in GBs, and Windows OS install
ipcsv C:\ScriptingGames\Event2\IPList.txt | gwmi Win32_ComputerSystem | 
    Format-List Name,NumberOfProcessors, NumberOfLogicalProcessors, 
    @{Name="RAM"; Expression={[math]::round($($_.TotalPhysicalMemory/1GB), 2)}} ;
    gwmi win32_OperatingSystem |Format-list name

Code formatted so it can be viewed better

  1. This isn’t a CSV file, so I am unsure why Import-Csv is being used for the list of IPs.
  2. Get-WMIObject doesn’t support pipeline input at all. Check Get-Help Get-WmiObject -Parameter Computername. You can also check out my article on using Trace-Command to troubleshoot pipeline issues here. This causes the line to fail instantly. Instead, use a ForEach loop and the -Computername parameter with the current item in the loop.
  3. Format-List is a no-no. Once you use a Format-* cmdlet, you have lost all ability to use the object for anything such as sorting, outputting to a file, etc. See this article for more information.
  4. A better approach would have been to save the output of each WMI call to variables and then use New-Object to output a custom object that lists all of the information, allowing the user to pipe the object to another cmdlet or export it to a file.
  5. The second pipe into Get-WMIObject will fail pretty much like the first; again, this should have used the -Computername parameter, with the result saved to a variable for later object output.

 

Beginner Category – Favorite Submission

Get-Content -LiteralPath 'C:\IPLIST.txt' | ForEach-Object {
    (Get-WmiObject -Class "Win32_ComputerSystem" -ComputerName $_ | 
    Select-Object -Property Name,@{n="Total Memory (GB)";e={$_.TotalPhysicalMemory / 1GB}},
    @{n="Cores";e={$_.NumberOfLogicalProcessors}},@{n="Processors";e={$_.NumberOfProcessors}} | 
    Add-Member -Name "OS Version" -Value $(Get-WmiObject -Class Win32_OperatingSystem -ComputerName $_ | 
    Select-Object -ExpandProperty Caption) -MemberType NoteProperty -PassThru)
}

Code formatted so it can be viewed better

  1. Great use of a one-liner approach to solve this issue.
  2. I like the use of ForEach-Object to iterate through each of the items in the text file.
  3. Custom properties built with Select-Object @{n='name';e={$_}} allow for object output that can be sorted, exported to a file, etc.
  4. Clever use of Add-Member to attach the Caption property to the current object.

 

That’s it for Event 2! I will say that it was very tough deciding which submissions I wanted as ‘Favorite’, as there were some great submissions, which meant that I really had to be picky about what I liked. Same for the ‘Not So Favorites’: it is never easy to single one out that I just don’t like as much as another, but it goes with the territory.

I always appreciate feedback on how you think I did with picking the submissions. Disagree with my decisions? Let me know! I am just one person out of many and as we all know, not everyone has the same opinions on what they think is a better or not so good solution. As long as we can provide a learning experience, then we all win in this competition!


Tips on Implementing Pipeline Support

Something that I was seeing during the first event of the Scripting Games was the use (or misuse) of pipeline support for a parameter in a function or script. While most people did this correctly, I saw a decent number of people do things that would never work at all if someone attempted to pipe data into their function. I want to clarify and expand on some things that I talked about in a previous article: what should be done, and why some of the methods being used will not work like you think.

We all know that being able to pass objects (not text!) through the pipeline with PowerShell is amazing and very powerful. Taking output from one cmdlet and streaming it into another command allows us to chain commands seamlessly, without effort. In most cases it also throttles the amount of memory the current session allocates for the commands.

Getting started…

Want to know more about the pipeline? Then do the right thing and explore PowerShell’s awesome help system with the following command:

Get-Help about_pipelines

What you may not know is how to properly implement this to get the benefit of the pipeline in your functions. By this, I am talking about the Begin, Process and End blocks in the code. I am going to show some mistakes that could be made with this implementation and how to overcome them.

First off, how do I allow my parameter to accept pipeline input? By specifying one of the following parameter attributes:

  1. ValueFromPipeline
    1. Accepts whole objects from the pipeline that are of the type the parameter expects, or that can be converted to that type.
  2. ValueFromPipelineByPropertyName
    1. Accepts input from a property of the incoming object whose name matches the parameter name (or one of its aliases); the property value must be of, or convertible to, the parameter’s type.

Now with that out of the way, let’s look at what you might expect to see:

    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )

The Computername parameter allows for pipeline support by value for anything that is a string or a collection of strings. If we were accepting pipeline input by a property that has the same name as Computername (or any aliases defined with the [Alias()] attribute), we would use the following:

    Param (
        [parameter(ValueFromPipelineByPropertyName)]
        [Alias('IPAddress','__Server','CN')]
        [string[]]$Computername
    )

This allows me to do something like pipe the output of a WMI query using Get-WMIObject into a function, and it will grab the __Server property of the object and use it in the pipeline of the function. Pretty cool stuff! If you use the *ByPropertyName attribute, please make sure that the incoming object actually has a matching property, or that you supply an [Alias()] attribute containing a name that maps to a property the incoming object does have.
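For example, a minimal sketch where the function body just echoes whatever was bound from the pipeline:

```powershell
Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [Alias('__Server')]
        [string[]]$Computername
    )
    Process {
        $Computername
    }
}

# The __SERVER property of each WMI object binds to -Computername
Get-WmiObject -Class Win32_OperatingSystem | Get-Something
```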

Now on to the main point of this article which is setting up the guts of the function to process this correctly.

Begin, Process and End with no pipeline support

First off, if you are not accepting pipeline input, you really have no need to use Begin, Process and End, because frankly they are doing nothing for you other than taking up space in your code. I know that people may be doing this as a way to organize their code, but there is a better way, which I will show you in a moment.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter()]
        [string[]]$Computername
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }
    Process {
        Write-Verbose "Stuff in Process block to perform"
        ForEach ($Computer in $Computername) {
            $Computer
        }
    }

    End {
        Write-Verbose "Final work in End block"
    }
}

[Screenshot: the function’s output]

This gives a false sense of the blocks doing something; they are simply running in the order they appear in the code. Instead, take advantage of the PowerShell V3 ISE and its ability to use code folding with regions to organize your code accordingly.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter()]
        [string[]]$Computername
    )
    #region Initialization code
    Write-Verbose "Initialize stuff in Begin block"
    #endregion Initialization code

    #region Process data
    Write-Verbose "Stuff in Process block to perform"
    ForEach ($Computer in $Computername) {
        $Computer
    }
    #endregion Process data

    #region Finalize everything
    Write-Verbose "Final work in End block"
    #endregion Finalize everything
}

[Screenshot: the function’s output, identical to before]

Same output, but now without the Begin, Process and End blocks. I’ll repeat it again: if you don’t allow for pipeline input, just stick with #region/#endregion tags to organize your code (in fact, you should do this regardless of whether you accept pipeline input).

Pipeline support with no Process block support

OK, so what happens if we do specify a parameter that has pipeline support but have NO Process block? This was something I saw frequently during Event 1, and I will show you what happens when you try to run a command that is set up this way.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Write-Verbose "Initialize stuff in Begin block"

    Write-Verbose "Stuff in Process block to perform"
    $Computername

    Write-Verbose "Final work in End block"
}

What do you think will happen when I run this with pipeline input? Will it process everything? Will it process nothing? Let’s find out!

[Screenshot: only the last item from the pipeline is output]

If you thought that it would only show the last item in the pipeline, then you are the winner! What is happening is that without a Process block, the function body behaves like an End block: it runs once, after all pipeline input has been received, so only the last item bound to $Computername is left to display.

The way to do it…

So what is the proper way to accomplish this? Let me show you with the following example.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }

    Process {
        Write-Verbose "Stuff in Process block to perform"
        $Computername
    }

    End {
        Write-Verbose "Final work in End block"
    }
}

[Image: each pipeline item is processed and displayed]

Works like a champ now. But take a look at something here: the Write-Verbose statement runs for each item processed from the pipeline. What does this mean? It means you have to be careful about what you put in the Process block, because it will run once for every item passed through the pipeline. In other words, don't re-create your output file, or the main array that will hold your data, inside the Process block, as in this example:

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }

    Process {
        $report = @()
        Write-Verbose "Stuff in Process block to perform"
        $report += $Computername
    }

    End {
        Write-Verbose "Final work in End block"
        $Report
    }
}

[Image: only the last item remains in the output]

All of the data collected was overwritten with each item because the array was re-created in the Process block. I saw something similar to this in a few Event 1 submissions. Be careful not to make this mistake!
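For comparison, here is a corrected sketch of the same function: the array is created once in the Begin block, so each pipeline item is appended instead of wiped out.

```powershell
Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
        # Create the collection ONCE, before any pipeline input arrives
        $report = @()
    }
    Process {
        Write-Verbose "Stuff in Process block to perform"
        # Append each pipeline item instead of re-creating the array
        $report += $Computername
    }
    End {
        Write-Verbose "Final work in End block"
        # Emit everything that was collected
        $report
    }
}
```

Running 1,2,3,4,5 | Get-Something against this version emits all five items from the End block.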

Do I need all of these Begin, Process and End blocks?

With all of this information presented, does this mean that you only have to specify a Process block in your function? Well, yes and no. Yes, if all you have is pipeline input to process and nothing to initialize at the beginning. If you do have things to spin up, then add a Begin block to handle that; otherwise your function will fail when written like this:

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Write-Verbose "Initialize stuff in Begin block"
    Process {
        Write-Verbose "Stuff in Process block to perform"
        $Computername
    }
}

The function will actually load into memory without issue, but check out what happens when you attempt to run it.

[Image: error showing the Process keyword being resolved as the Get-Process command]

All seems well until we get to the Process piece. Instead of being read as a Process block, it is misinterpreted as Get-Process, which obviously fails. The point here is to keep everything inside the Begin, Process, and End blocks if you use any of them.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Computername
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }
    Process {
        Write-Verbose "Stuff in Process block to perform"
        $Computername
    }
}

[Image: the function now processes pipeline input correctly]

Much better!

Multiple parameters that accept pipeline input

Now for something a little different. I saw at least one submission that had multiple parameters accepting pipeline input and wondered how that was going to work (turns out, not so well!). See this example:

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Name,
        [parameter(ValueFromPipeline=$True)]
        [string[]]$Directory
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }

    Process {
        Write-Verbose "Process block"
        Write-Host "Name: $Name"
        Write-Host "Directory: $Directory"
    }

    End {
        Write-Verbose "Final work in End block"
        $Report
    }
}

Instead of the usual numbers into the pipeline, I am going to use Get-ChildItem and pipe that into my function to see what happens.

[Image: both parameters bind the same value for each pipeline object]

Weird, isn’t it? Both parameters bind the same value for each object, simply because both accept pipeline input by value. How do we get around this issue? Use the ValueFromPipelineByPropertyName attribute instead.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [string[]]$Name,
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [string[]]$Directory
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }

    Process {
        Write-Verbose "Process block"
        Write-Host "Name: $Name"
        Write-Host "Directory: $Directory"
    }

    End {
        Write-Verbose "Final work in End block"
        $Report
    }
}

[Image: Name and Directory each bind their own property values]

Now we are able to pull two separate values with two parameters that accept pipeline input. Another option would be to use parameter sets, but then only one parameter or the other could accept pipeline input on a given call, and you wouldn't get the output above, where multiple parameters accept input and are used side by side in the function.
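For illustration, a rough sketch of what the parameter-set approach might look like (the set names here are made up); only the set that actually binds is processed on a given call:

```powershell
Function Get-Something {
    [cmdletbinding(DefaultParameterSetName='ByName')]
    Param (
        [parameter(ValueFromPipeline=$True,ParameterSetName='ByName')]
        [string[]]$Name,
        [parameter(ValueFromPipeline=$True,ParameterSetName='ByDirectory')]
        [string[]]$Directory
    )
    Process {
        # Only one parameter set can be active per call
        Switch ($PSCmdlet.ParameterSetName) {
            'ByName'      { "Name: $Name" }
            'ByDirectory' { "Directory: $Directory" }
        }
    }
}
```

Piped input lands in the default set's parameter ($Name here); to target $Directory you would have to pass it by name instead, which is exactly the limitation described above.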

One last thing: take care when using both ValueFromPipeline and ValueFromPipelineByPropertyName on multiple parameters, as it can cause some craziness in the output.

Function Get-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [string[]]$Name,
        [parameter(ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [string[]]$Directory
    )
    Begin {
        Write-Verbose "Initialize stuff in Begin block"
    }

    Process {
        Write-Verbose "Process block"
        Write-Host "Name: $Name"
        Write-Host "Directory: $Directory"
    }

    End {
        Write-Verbose "Final work in End block"
    }
}

[Image: unexpected binding when both pipeline attributes are combined]

In fact, this completely freaks out the Directory parameter, which never actually binds to anything. This is due to the order of binding that applies when you use both of these attributes.

Order of Parameter Binding Process From Pipeline

  1. Bind parameter by Value with same Type (No Coercion)
  2. Bind parameter by PropertyName with same Type (No Coercion)
  3. Bind parameter by Value with type conversion (Coercion)
  4. Bind parameter by PropertyName with type conversion (Coercion)

You can use Trace-Command to dig deeper and really see what is happening. Working with Trace-Command can be complicated, and reading all of the output can certainly be overwhelming, so use it at your own discretion!

As a little bonus, here is some content on working with Trace-Command to see where the parameter binding takes place, and when Coercion versus No Coercion is used. I’ll be covering 6 one-liners to highlight specific items with the parameter binding.

As a baseline, I will create one object whose timestamp is a string and another whose timestamp is a [datetime] type, then pipe them into 5 functions to show each method of binding.

#String time
$nonType = New-Object PSObject -Property @{Datetime = "5:00 PM"}
#[datetime] type
$Type = New-Object PSObject -Property @{Datetime = [datetime]"5:00 PM"}
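You can confirm the difference between the two objects with GetType() (re-creating them here so this runs on its own):

```powershell
# Re-create the two baseline objects
$nonType = New-Object PSObject -Property @{Datetime = "5:00 PM"}
$Type    = New-Object PSObject -Property @{Datetime = [datetime]"5:00 PM"}

$nonType.Datetime.GetType().Name   # String
$Type.Datetime.GetType().Name      # DateTime
```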

Looking at how parameter binding handles different types

Here we will look at a simple function that accepts pipeline input by PropertyName to handle incoming data.

Function Get-Something_PropName {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [datetime[]]$Datetime
    )
    Process {$Datetime}
}

First, let's run my variable with the [datetime] type already defined and see where the parameter binding takes place:

Trace-Command parameterbinding {$Type | get-Something_PropName} -PSHost

[Image: Trace-Command output showing a SUCCESSFUL NO COERCION bind ByPropertyName]

You can see at line 5 that it first checks whether the bind can succeed with NO COERCION via the ByPropertyName attribute, validating that the property is already of type [datetime], and the result is SUCCESSFUL.

Next up: the non-type property for the datetime parameter.

Trace-Command parameterbinding {$nonType | get-Something_PropName} -PSHost

[Image: Trace-Command output showing the NO COERCION attempt SKIPPED, then a SUCCESSFUL COERCION bind]

Remember how the NO COERCION bind worked on the last run because the property was the same type as the parameter? Well, it doesn't work out so well with my string value of “5:00 PM”; you can see where it fails with a SKIPPED. Next up is the attempt to cast the input (COERCION) to the [datetime] type so it matches what the $DateTime parameter requires. This is done using [System.Management.Automation.ArgumentTypeConverterAttribute], and in this case it is SUCCESSFUL.

As a side note, I will be using the $nonType variable from here on out, to show each time how the NO COERCION attempt fails before the COERCION attempt.

Working with the [Alias()] attribute

Writing advanced functions that support pipelining also means potentially using the [Alias()] parameter attribute to handle incoming property names that don't match your parameter. This is most important when working with ByPropertyName.

Function Get-Something_PropName_NoAlias {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [datetime[]]$Date
    )
    Process {$Date}
}
Trace-Command parameterbinding {$nonType | get-Something_PropName_NoAlias} -PSHost

[Image: Trace-Command output showing the ByPropertyName bind failing with a PSCustomObject argument]

Well, that is certainly interesting. If you look at line 4 here, it shows an arg of System.Management.Automation.PSCustomObject, which isn’t all that useful. This is because the property being passed is DateTime while the parameter of this function is Date. ByPropertyName completely fails because it has no idea what to do with the incoming data. So will ByValue work instead? Let's find out.
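Get-Something_Value_NoAlias is assumed here to be the ByValue twin of the function above, swapping only the pipeline attribute (its definition is a reconstruction; it was not shown earlier):

```powershell
Function Get-Something_Value_NoAlias {
    [cmdletbinding()]
    Param (
        # ByValue instead of ByPropertyName; still no [Alias()]
        [parameter(ValueFromPipeline=$True)]
        [datetime[]]$Date
    )
    Process {$Date}
}
```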

Trace-Command parameterbinding {$nonType | get-Something_Value_NoAlias} -PSHost

[Image: Trace-Command output showing the ByValue attempt also failing]

A little better this time around, but still a failure. Since it is ByValue, the parameter doesn't care about the name of the object being passed through. It does see the input as a hash table with the data viewable, but it still fails because the object is neither of type [datetime] nor convertible to that type. Just for fun, let's pass a single integer in and see how it works out.

Trace-Command parameterbinding {1 | get-Something_Value_NoAlias} -PSHost

[Image: Trace-Command output showing the integer converted to [datetime] and bound]

Obviously it was never going to be of the [datetime] type, but it was easily converted into a [datetime] object, so it was able to bind to the parameter even without the Alias attribute.

Ok, now we are going to add an Alias attribute for ‘DateTime’ to handle the incoming object.

Function Get-Something_PropName_Alias {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipelineByPropertyName=$True)]
        [Alias('DateTime')]
        [datetime[]]$Date
    )
    Process {$Date}
}
Trace-Command parameterbinding {$nonType | get-Something_PropName_Alias} -PSHost

[Image: Trace-Command output showing the alias allowing a SUCCESSFUL COERCION bind]

As expected with the nontype input, the first check fails, and then, thanks to the alias that maps the incoming DateTime property, the COERCION check succeeds. Rather than show what would happen with the ByValue attempt, what do you think will happen this time around? HINT: History will repeat itself.

Working with both ByPropertyName and ByValue and Aliases

Up until now, I have been working with either ByValue or ByPropertyName, but never combining both in one function. That changes with the following example: here both are configured, along with an Alias, to show what happens with the nontype input.

Function Get-Something_PropName_Value_Alias {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [Alias("Datetime")]
        [datetime]$Date
    )
    Process {$Date}
}

[Image: Trace-Command output walking through each binding attempt before succeeding]

This shows a perfect example of the binding order I listed above. The parameter first attempts ByValue and then ByPropertyName with NO COERCION (matching the object's type to the parameter), then proceeds to the type conversion (COERCION) with ByValue, before finally succeeding with the ByPropertyName type conversion.

Piping a 1 instead of $nonType, as I did earlier, shows a different result: it succeeds on ValueFromPipeline WITH COERCION.

[Image: Trace-Command output showing success on ValueFromPipeline WITH COERCION]

Also expect some issues if your function's parameters use no aliases: if the object you pass in doesn't have a property matching the parameter name, ByPropertyName will fail regardless of how the pipeline attributes are set. The only way binding succeeds then is if you pass a single value (as opposed to an object with multiple properties) that works for ByValue, being either the same type as the parameter or convertible to it.

That wraps up this post on implementing pipeline support, as well as our swim into using Trace-Command to debug parameter binding. Hopefully this has left you feeling better prepared to implement pipeline support, and to troubleshoot it when it fails.

Posted in 2013 Scripting Games Judges Notes, powershell, Scripting Games 2013 | 27 Comments

Scripting Games 2013: Event 1 ‘Favorite’ and ‘Not So Favorite’ Submissions

As a follow-up to my previous blog post, I plan to pick out a submission or two or three that stood out as my personal favorites and least favorites, and tell you why, by pointing out pieces of code that were either put together nicely or could have been improved in one way or another. Depending on my time, I will cover at least 1 Advanced and 1 Beginner submission for both ‘Favorite’ and ‘Not So Favorite’. I’ll start by listing the code and then discussing it bullet-point style to highlight my thoughts. So with that, let's begin this journey through the Event 1 submissions.

One last thing: please don’t take any of this personally if I negatively critique your code. I will be a little tougher on Advanced than on Beginner, because if you are competing in the Advanced category, I expect that you know what you are doing. As someone who has competed in the games in the past and who publishes various scripts and articles online, I can say that getting reviewed (and corrected) by your peers is one of the best ways to better yourself as a scripter.

Advanced Category – Not So Favorite Submission

<#
	.SYNOPSIS
		script that can run against local folder structure to move 
		program logs older than specified time to specified location.
		The script will keep the folder structure intact, creating
		subfolders as needed to match the folder structure of the source.
 
	.DESCRIPTION
		Advanced 1 4-29-13
		Dr. Scripto is in a tizzy! It seems that someone has allowed a
		series of application log files to pile up for around two years,
		and they’re starting to put the pinch on free disk space on a
		server.  Your job is to help get the old files off to a new location.
		Actually, this happened last week, too. You might as well create a
		tool to do the archiving.
		The current set of log files are located in C:\Application\Log. There
		are three applications that write logs here, and each uses its own
		subfolder.  For example, C:\Application\Log\App1, C:\Application\Log\OtherApp,
		and C:\Application\Log\ThisAppAlso.  Within those subfolders, the
		filenames are random GUIDs with a .LOG filename extension. Once created
		on disk, the files are never touched again by the applications. Your
		goal is to grab all of the files older than 90 days and move them to
		\\NASServer\Archives ‐ although that path will change from time to time.
		You need to maintain the subfolder structure, so that files from
		C:\Application\Log\App1 get moved to \\NASServer\Archives\App1, and
		so forth.  Make those paths parameters, so that Dr. Scripto can just
		run this tool with whatever paths are appropriate at the time.
		The 90 ‐day period should be a parameter too.  You want to ensure that
		any errors that happen during the move are clearly displayed to whoever
		is running your command. If no errors occur, your command doesn’t need
		to display any output – “no news is good news.”
 
		.EXAMPLE
	Advanced_1.ps1
		This will move the logs from the source folder to the default destination folder
		"\\NASServer\Archives\" with the default time period of 90 days or older.
 
	Advanced_1.ps1 -d RemoteLocation
		This will move the logs from the source folder to the specified destination folder
		with the default time period of 90 days.  The path should have a trailing backslash.
 
	Advanced_1.ps1 -t FilesOlderThanThisNumber
		This will move the logs from the source folder to the default destination folder
		"\\NASServer\Archives\" with the specified time period.  This number should be
		represented by a negative number.
 
	Advanced_1.ps1 -d RemoteLocation -t FilesOlderThanThisNumber
		This will move the logs from the source folder to the specified destination folder
		with the specified time period.
#>
 
Param(
	[alias("d")][string]$RootDestination="\\scanbox\workstation\archive\",
	[alias("t")][string]$TimePeriod="-90"
	)
 
GCI C:\Application\Log -Recurse -Filter "*.log" | `
?{$_.LastWriteTime -lt (get-date).AddDays($TimePeriod)} | `
%{$Dest=$_.Directory.Name;`
If(-not(Test-Path -Path $RootDestination$Dest))`
{NI -itemtype directory $RootDestination$Dest | Out-Null};`
MV -Path $_.FullName -Destination $RootDestination$Dest}
  • Has commented help, BUT it is missing the parameters that are in this script. For an advanced entry, the help has to be complete if this is going to be a re-usable tool.
  • Aliases are everywhere! From GCI to % to ?, this makes it very hard to read.
  • Using backticks “`” as line breaks is completely unnecessary, especially when you have pipes “|” that serve as natural breaks.
  • A lack of any error handling always hurts a script/function.
  • Give the script a name that better represents what it does; Move-LogFile.ps1 or something similar is much better than advanced_1.ps1.
  • Kudos for the default parameter values; too bad there weren't a few more, as hard-coded values were used later in the script. “.Log” and the source location should have been their own parameters.
  • Would be better as a function for re-usability, rather than calling the script each time.

 

Advanced Category – Favorite Submission

function Move-LogFile {
 
<#
.SYNOPSIS
Move files with a .log extension from the SourcePath to the DestinationPath that are older than Days. 
.DESCRIPTION
Move-LogFile is a function that moves files with a .log extension from the first level subfolders that are specified via
the SourcePath parameter to the same subfolder name in the DestinationPath parameter that are older than the number of days
specified in the Days parameter. If a subfolder does not exist in the destination with the same name as the source, it will
be created. This function requires PowerShell version 3.
.PARAMETER SourcePath
The parent path of the subfolders where the log files reside. Log files in the actual SourcePath folder will not be archived,
only first level subfolders of the specified SourcePath location.
.PARAMETER DestinationPath
Parent Path of the Destination folder to archive the log files. The name of the original subfolder where the log files reside
will be created if it doesn't already exist in the Destination folder. Destination subfolders are only created if one or more
files need to be archived based on the days parameter. Empty subfolders are not created until needed.
.PARAMETER Days
Log files not written to in more than the number of days specified in this parameter are moved to the destination folder location.
.PARAMETER Force
Switch parameter that when specified overwrites destination files if they already exist.
.EXAMPLE
Move-LogFile -SourcePath 'C:\Application\Log' -DestinationPath '\\NASServer\Archives' -Days 90
.EXAMPLE
Move-LogFile -SourcePath 'C:\Application\Log' -DestinationPath '\\NASServer\Archives' -Days 90 -Force
#>
 
    [CmdletBinding()]
    param (
        [string]$SourcePath = 'C:\Application\Log',
        [string]$DestinationPath = '\\NASServer\Archives',
        [int]$Days = 90,
        [switch]$Force
    )
 
    BEGIN {        
        Write-Verbose "Retrieving a list of files to be archived that are older than $($Days) days"
        try {
            $files = Get-ChildItem -Path (Join-Path -Path $SourcePath -ChildPath '*\*.log') -ErrorAction Stop |
                     Where-Object LastWriteTime -lt (Get-Date).AddDays(-$days)
        }
        catch {
            Write-Warning $_.Exception.Message
        }
 
        $folders = $files.directory.name | Select-Object -Unique
        Write-Verbose "A total of $($files.Count) files have been found in $($folders.Count) folders that require archival"
    }
 
    PROCESS {
        foreach ($folder in $folders) {
 
            $problem = $false
            $ArchiveDestination = Join-Path -Path $DestinationPath -ChildPath $folder
            $ArchiveSource = Join-Path -Path $SourcePath -ChildPath $folder
            $ArchiveFiles = $files | Where-Object directoryname -eq $ArchiveSource
 
            if (-not (Test-Path $ArchiveDestination)) {
                Write-Verbose "Creating a directory named $($folder) in $($DestinationPath)"
                try {
                    New-Item -ItemType directory -Path $ArchiveDestination -ErrorAction Stop | Out-Null
                }
                catch {
                    $problem = $true
                    Write-Warning $_.Exception.Message
                }
            }
 
            if (-not $problem) {
                Write-Verbose "Archiving $($ArchiveFiles.Count) files from $($ArchiveSource) to $($ArchiveDestination)"
                try {
                    If ($Force) {
                        $ArchiveFiles | Move-Item -Destination $ArchiveDestination -Force -ErrorAction Stop
                    }
                    Else {
                        $ArchiveFiles | Move-Item -Destination $ArchiveDestination -ErrorAction Stop
                    }
                }
                catch {
                    Write-Warning $_.Exception.Message
                }
            }
 
        }
    }
 
    END {
        Remove-Variable -Name SourcePath, DestinationPath, Days, Force, files, folders, folder,
        problem, ArchiveDestination, ArchiveSource, ArchiveFiles -ErrorAction SilentlyContinue
    }
 
}
  • Great use of commented help, including all parameters and nice examples
  • Use of a function (with proper naming convention), which is great!
  • Using Write-Verbose along with [cmdletbinding()] lets the user specify the –Verbose parameter to get more information in the console at run time.
  • Great use of error handling to deal with anything that comes up
  • I wish there had been a Filter parameter so it could handle other file extensions.
  • Makes good use of the Begin, Process, and End blocks; these were not technically needed, as none of the parameters accepted pipeline input.
  • I like the use of Join-Path to combine the folder paths; it makes the code look cleaner
  • Also liked the clever use of ‘*\*.log’ in the Get-ChildItem command to get all of the files

Beginner Category – Not So Favorite Submission

Get-ChildItem -Path 'C:\Application\Log' -Recurse | Where-Object {
$_.LastWriteTime -lt (Get-Date).AddDays(-90)
} | Move-Item -Force -Destination '\\NASServer\Archives'

Note: This is a one-liner that I intentionally broke up so it can all fit without much scrolling.

  • This assumes that there will only be .log files; it should have used –Filter to grab only the .log files. Never assume with a file search.
  • No destination folders are created to archive each log file from its source directory.
  • Because the proper destination directories don’t exist, the logs are all bundled together in one folder.
  • Commented help is always great, even in a beginner event. The sooner you start adding help, the better off you will be.
  • I do like that full cmdlet names were used instead of aliases

 

Beginner Category – Favorite Submission

<#
.SYNOPSIS
    Moves application log files that are older than 90 days to \\NASServer\Archives.
.DESCRIPTION
    This script was originally written for Event 1 of the 2013 Scripting Games (Beginners Track).
.EXAMPLE
    C:\Scripts\Move-LogFiles.ps1  
#>
 
$files = Get-ChildItem -Path "C:\Application\Log" -Filter *.log -Recurse | Where-Object -FilterScript {$_.CreationTime -lt (Get-Date).AddDays(-90)}
 
foreach($file in $files)
{
    if(!(Test-Path -LiteralPath "\\NASServer\Archives\$($file.Directory.Name)"))
    {
            New-Item -Path "\\NASServer\Archives\$($file.Directory.Name)" -ItemType Directory -ErrorAction Stop | Out-Null
    }
 
    Move-Item -Path $file.FullName -Destination "\\NASServer\Archives\$($file.Directory.Name)\$($file.Name)"
}
 
  • Kudos for the commented help in this script. It makes it easier for someone to pick the script up and run Get-Help against it to better understand how it works.
  • Good use of –Filter against the .log extension
  • I like that there was no vbscript-style concatenation happening with the folder paths
  • Per the submission requirement, using Out-Null with New-Item to create each directory suppresses the object output that New-Item normally produces.
  • Clean code that is easy to read and understand. A line break at the Where-Object would have helped, but it is still readable.

That’s it for Event 1. Thanks again to everyone who has competed this year and submitted their entries! Everyone here has done incredible work and I am truly looking forward to seeing what is submitted for the future events! Keep it up!

Posted in 2013 Scripting Games Judges Notes, powershell, Scripting Games 2013 | 5 Comments

Scripting Games 2013: Thoughts After Event 1

With Event 1 of the 2013 Scripting Games in the books, we are now in the voting period, where the community gets the chance to play judge by voting and commenting on the submissions. My next articles will cover my “Favorite” and “Least Favorite” submitted scripts, along with my reasons why. For this article, I want to cover some of the approaches I have seen that are not the best when writing your script/function.

Where is the help!?!

Yes, where indeed is the help? I was amazed by just how many of the submissions I reviewed didn't have inline (comment-based) help. For the advanced entries, there really is no excuse for leaving help out. By the way, this is not acceptable help:

Function Test-Something {
    <#
        Script does something

        Usage: Test-Something -Computername H1 -Source C:\
        
        More stuff here    
    #>

}

There is a specific format to the help that you need to use to make it work properly.

The same goes for when help is included but missing key things such as parameters. If you have a parameter in the function, it had better be listed in the help. I saw one example that looked something like this:

Function Test-Something {
    <#
        .SYNOPSIS
            This does something

        .DESCRIPTION
            This does something

        .PARAMETER Computername
            Name of computer

        .PARAMETER Source
            Source of something

        .EXAMPLE
            Test-Something -Computername H1 -Source "C:\" -Destination "D:\"
            
    #>
    [cmdletbinding()]
    Param (
        $Computername,
        $Source,
        $Destination
    )
    ...
}

Where is the Destination parameter in the help? This really is the easiest part of the script: just add all of your parameters, give a decent description, and include some good examples (in fact, one of your examples could be based on the requirements of the event!).

Naming conventions for your functions and scripts

I actually have an article on this here. Short story: use Verb-SingularNoun for your names. That is all.
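PowerShell can even tell you which verbs are approved; a quick check with Get-Verb before naming your function goes a long way:

```powershell
# List all approved verbs, grouped by category
Get-Verb | Sort-Object Verb

# Spot-check a single verb before committing to a name;
# Move is approved, so Move-LogFile is a good choice
Get-Verb Move
```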

Aliases…Don’t use them

Aliases work great for one-liners and interactive work at the console; in those cases they are a handy way to use PowerShell. What they aren't so great for is scripts (and especially scripts submitted for the Scripting Games!). If your code looks like the example below, then you need to re-think your strategy.

GCI C:\Application\Log -Recurse -Filter "*.log" | `
?{$_.LastWriteTime -lt (get-date).AddDays($TimePeriod)} | `
%{$Dest=$_.Directory.Name

This is difficult to read unless you know what all of the aliases mean. In this case it might not be that bad, but imagine hundreds of lines of code that you are working through to troubleshoot. Do you really want to do that? I wouldn't, for this reason and another that segues into it nicely…

Formatting your code

What else is wrong with the above code? If you said the backtick “`”, then you are correct! This is not a good way to break a line. Use what PowerShell has provided for you: the pipe “|”, the opening curly brace “{“, and the comma “,”. Each of these serves as a natural line break that lets you continue onto the next line without worrying about errors. The same goes whenever you have an opportunity to break a line before it gets too long.

Get-ChildItem C:\Application\Log -Recurse -Filter "*.log" | Where {
        $_.LastWriteTime -lt (get-date).AddDays($TimePeriod)
} | ForEach {
    $Dest=$_.Directory.Name
}

Easier to read and troubleshoot. It might look fine to you now, but think about who else could be looking at your code, and the likelihood that you will be back in 6 months to troubleshoot or extend it. It might not look so pretty then.

Use parameters instead of Where-Object for filtering

The parameters are there for a reason: they can filter out the data you want far more efficiently than post-processing. This especially applies if you are writing your scripts for PowerShell V3. Take this example:

Get-ChildItem -Path $Source -Filter *.log | Where {$_.PSIsContainer}

While I’m happy that Where-Object is being used correctly, in V3 it is completely unneeded here. Get-ChildItem has a –Directory parameter that filters for directories only; the same goes for files with the –File parameter. This practice should be used on any event or “real world” script you write.
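A quick side-by-side (using $env:TEMP here as a stand-in path, since C:\Application\Log only exists in the event scenario):

```powershell
# V2 style: filter after the fact with Where-Object
Get-ChildItem -Path $env:TEMP | Where-Object {$_.PSIsContainer}

# V3 style: let the cmdlet do the filtering
Get-ChildItem -Path $env:TEMP -Directory

# Same idea for files only
Get-ChildItem -Path $env:TEMP -File -Filter *.log
```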

Useful Parameters

While we are on the subject of parameters, make your parameter names useful. Look at what already exists in PowerShell and make those names your baseline: Computername, Source, Destination, etc. should be what you aim for, not Computer, Server, DestinationFolder, A, B, C, etc. If nothing similar exists, keep the name singular when you create it, not plural. Strive for consistency and it will pay off!

This is PowerShell, not Vbscript

Please do not follow a vbscript mentality. When you create a variable, make its name meaningful to what you are saving, not to the type of object you are saving. Any of these are bad:

  • strcomputer
  • arrnumbers
  • objwmi

Just name the variable for what it represents instead of the type of data, and you will be following a better PowerShell scripting practice.

The same goes for the “catch all” error handling that vbscript did with On Error Resume Next. Do not use $ErrorActionPreference=’SilentlyContinue’ at the beginning of your scripts. This masks every error that occurs and will usually work against you, especially when troubleshooting why the script isn’t doing what you think it should.
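Instead of suppressing everything globally, scope your error handling to the commands that can actually fail. A minimal sketch using -ErrorAction Stop with try/catch (the path is just an example, mirroring the event scenario):

```powershell
try {
    # -ErrorAction Stop turns a non-terminating error into one that try/catch can see
    $files = Get-ChildItem -Path 'C:\Application\Log' -Filter *.log -ErrorAction Stop
}
catch {
    # Surface the error to the user instead of silently swallowing it
    Write-Warning $_.Exception.Message
}
```

This way only the errors you anticipate are handled, and everything else still shows up where you can see it.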

Accepts pipeline input, but where is the Process block?

If your script has a parameter that accepts pipeline input, then you had better have Begin, Process, and End blocks (or at the very least a Process block; and you had better have EVERYTHING in that Process block, or it will cause you issues!). If you do not have a Process block, then you are missing the whole point of allowing pipeline input. Check the following out:

Function Test-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeLine=$True)]
        [string[]]$Computername
    )
    ForEach ($Computer in $Computername) {
        $Computer
    }
}

Let’s run this and see what happens with pipeline input.

1,2,3,4,5 | Test-Something

What do you think will happen? Will it show everything, or just the last item? Let's find out!

[Image: only the last item, 5, is displayed]

Well, how about that! It only displays the last item in the collection passed to the pipeline. Why did this happen? In short, because we didn't use a Process block to handle the pipeline input correctly. Let's fix that!

Function Test-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeLine=$True)]
        [string[]]$Computername
    )
    Process {
        ForEach ($Computer in $Computername) {
            $Computer
        }
    }
}

Now I will run the same command, and let's see what happens this time!

[Image: all five items are displayed]

Now that is much better! It is behaving exactly as it should with pipeline input. Remember, the Process block runs once for each item passed through the pipeline, so do not put anything in that block that you plan on using at the end, such as creating an empty array to store each processed item for a report (yes, this did happen in a submission).

Function Test-Something {
    [cmdletbinding()]
    Param (
        [parameter(ValueFromPipeLine=$True)]
        [string[]]$Computername
    )
    Process {
        Write-Verbose "Processing pipeline"
        ForEach ($Computer in $Computername) {
            $Computer
        }
    }
}

[Image: the Process block's verbose message repeats for each pipeline item]

Use Begin{} to initialize anything you need prior to the Process{} block, and then use End{} to wrap everything up.

I think I have covered enough for one night. Keep these tips in mind not only as you progress through the rest of the events, but also as you work on future scripts out in the ‘real world’.

Posted in 2013 Scripting Games Judges Notes, Scripting Games 2013 | 10 Comments