PowerShell Server Inventory, Part 2: Collecting and Sending Data

This is at least a 3 part series in which I will go over each aspect of my build, from the database design to data collection to reporting.

Picking up where we left off, we have created the database and tables that will store all of the data on our servers. Next up is to query each server in the domain, pull as much information as we can, and then send that data up to SQL so we can view it later on.

For this part of querying and sending data to the SQL server, I felt that it would be best to use multithreading to cut down the time it takes to hit all of the systems, rather than perform the work sequentially. My multithreading approach of choice is my module, PoshRSJob. You can grab this module from my GitHub repo and place it on whichever server you will be using to schedule the task that performs the scans and uploads to SQL.
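As a quick illustration of the fan-out approach (not the inventory script itself), here is a minimal PoshRSJob pattern for running a scriptblock against a list of computers; the computer names and the WMI query are placeholders, not what the real script does:

```powershell
# Requires the PoshRSJob module: https://github.com/proxb/PoshRSJob
Import-Module PoshRSJob

$Computers = 'Server1','Server2','Server3'   # placeholder names

# One runspace job per computer, throttled so only 10 run at once
$Computers | Start-RSJob -Name {$_} -Throttle 10 -ScriptBlock {
    Param ($Computer)
    # Each job queries its own system; a simple WMI call as an example
    Get-WmiObject -ComputerName $Computer -Class Win32_OperatingSystem |
        Select-Object CSName, Caption, LastBootUpTime
}
```

Because each computer runs in its own runspace, a slow or unreachable server does not hold up the rest of the scan.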

I say that you can schedule a job to run the next script, but this is something that could be performed manually as well if need be. In fact, I would recommend a manual run first to ensure that no errors are thrown during the script execution.

The script that I wrote for this is called Invoke-ServerInventoryDataGathering.ps1, and it has a single parameter that accepts the SQL server the data will be sent to. The script itself is over 1000 lines, most of which consists of various regions that group each type of data pull, ranging from user account information to drive data to system configurations. In fact, about half of the code consists of helper functions used for data gathering or, in the case of Get-Server, for performing a domain-wide lookup of all of the servers. This is a more dynamic approach that ensures all servers are being checked rather than relying on a file that lists the systems.


There is a variable called $ServerGroup which holds the value 'MemberServer'. You can disregard this (but keep it uncommented) as I have my lab set up differently: Domain Admins do not have access to anything other than domain controllers, which requires a second script to query the domain controllers and then write that data to the SQL server (which does allow Domain Admins access for the purpose of writing the data to SQL).

Continuing to step through the code, I have several regions that group together each type of query. If more queries need to be added to grab more information, I can simply create a new region and add the necessary code to gather and send the data.


Each region follows the same type of process:

  1. Query the remote system for the particular information that I need
  2. If the query was successful, check the table that matches this type of data for existing rows for the same system
  3. If data already exists, then remove the existing data
  4. Send the new data up to the appropriate table based on the current data being collected
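The steps above can be sketched like this; the table name, columns, and WMI class are hypothetical stand-ins for what each region actually collects, and Invoke-SQLCmd is the author's own helper function rather than the SqlServer module cmdlet:

```powershell
$Computer = 'Server1'    # placeholder name

# 1. Query the remote system for this region's data
$Data = Get-WmiObject -ComputerName $Computer -Class Win32_LogicalDisk -ErrorAction SilentlyContinue

If ($Data) {
    # 2./3. Remove any existing rows for this system from the matching table
    $TSQL = "DELETE FROM tbDrives WHERE ComputerName = '$Computer'"
    Invoke-SQLCmd -Computername VSQL -Database ServerInventory -TSQL $TSQL -CommandType Query

    # 4. Send the new data up to the appropriate table
    ForEach ($Item in $Data) {
        $TSQL = "INSERT INTO tbDrives (ComputerName, DeviceID, Size)
                 VALUES ('$Computer','$($Item.DeviceID)','$($Item.Size)')"
        Invoke-SQLCmd -Computername VSQL -Database ServerInventory -TSQL $TSQL -CommandType Query
    }
}
```

The delete-then-insert pattern is what keeps each table holding only the most recent scan of a system.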

Kicking off the script manually is just a matter of running it. In my case, VSQL is hard coded as the parameter value for the SQL server. In other cases, you would want to supply your own value.


I use Wait-RSJob with the -ShowProgress switch so I can track the status of each job. Each job represents a computer that is being scanned, with the gathered data sent to a SQL database for later viewing.
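Once the jobs have been started, waiting on them and collecting results looks something like this (assumes the PoshRSJob module is loaded and jobs are running):

```powershell
# Block until all jobs complete, showing a progress bar along the way
Get-RSJob | Wait-RSJob -ShowProgress

# Pull back each job's output, then clean up the finished jobs
Get-RSJob | Receive-RSJob
Get-RSJob | Remove-RSJob
```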


Of course, the preferred use of this script is to put it into a scheduled task so it can run on a daily schedule to handle all of the updates or new systems that come into the environment.


With all of this done, we can quickly verify that data actually exists. We can either look in SQL Server Management Studio for the data, or run a simple PowerShell command using the Invoke-SQLCmd function against one of the tables.


PowerShell Example

$TSQL = @"
SELECT TOP 1000 [ComputerName]
  FROM [ServerInventory].[dbo].[tbGeneral]
"@

Invoke-SQLCmd -Computername VSQL -TSQL $TSQL -Database ServerInventory -CommandType Query


All of this code as well as the rest of the Server Inventory code is available at my GitHub repo here: https://github.com/proxb/ServerInventoryReport

The last part in this series will take us through building a tool that will pull the data from SQL and provide users with a graphical way to view and generate reports.


PowerShell Server Inventory, Part 1: SQL Database Build

If you’re like me, you might not have SCOM or SCCM or some other 3rd party solution that you can leverage to track all of the systems in your environment, letting you know at a glance what software is on a server or what local accounts have been created on it. Fortunately, with the combined forces of PowerShell and SQL, we can build our own solution that provides this information. This is where I came up with a project to roll my own server inventory system.


This is at least a 3 part series in which I will go over each aspect of my build, from the database design to data collection to reporting.

This article will focus on the SQL database and table build using a few techniques. I consider this almost a living script as I have added and even removed a few things since its original inception. I opted to create tables for each section that will be inventoried. Currently, the sections are (in no particular order):

  • AdminShares – Holds the admin shares of a server
  • Drives – Holds the different drives on a server such as CD-ROMs and hard drives
  • General – Holds information that doesn’t really apply to the other tables
  • Groups – Holds local group information on the server
  • Memory – Holds the memory information of a server
  • Network – Holds networking configuration information of a server
  • OperatingSystem – Holds operating system information
  • Processor – Information about the processors on a server
  • ScheduledTasks – Lists all scheduled tasks on a server
  • ServerRoles – Lists all server roles, if applicable, on a server
  • Services – Lists all services on a server
  • Software – Lists all installed software on a server
  • Updates – Lists all installed updates on a server
  • Users – Lists all local users on a server
  • UserShares – Lists all user created shares on a server

Of course, this could change over time depending on various requirements, but I am hoping that for the most part things stay the same and the only updates needed will be to the content in each table, such as adding or removing various properties. Adding new tables is as easy as copying the code from a different table creation region and changing the names and properties to include.
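As a rough idea of what one of those table creation regions looks like, here is a hedged sketch; the table name and columns are illustrative placeholders, and Invoke-SQLCmd is the author's helper function from the build script, not the SqlServer module cmdlet:

```powershell
# Hypothetical new table region; copy, rename, and adjust columns as needed
$TSQL = @"
CREATE TABLE tbExample (
    ComputerName varchar(255) NOT NULL,
    Property     varchar(255) NULL,
    Value        varchar(max) NULL
)
"@
Invoke-SQLCmd -Computername VSQL -Database ServerInventory -TSQL $TSQL -CommandType Query
```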

One last requirement that I had was that this data only needs to last until the next run of the scheduled task that collects the data and sends it to SQL. This means that whatever data existed prior to the run will be replaced by the new data. It is probably not the most efficient way to handle the data, but there is always room for improvement with this project.

The script that I wrote to help build not only the database, but the tables where I will push all of my data to is found at the following link: https://github.com/proxb/ServerInventoryReport/blob/master/Invoke-ServerInventorySQLBuild.ps1

I have defaulted the SQL server name to vSQL as that is my server in my lab environment, so feel free to update it as needed or just call the script with the -SQLServer parameter and it will do the rest of the work for you! You can also see that I have my own version of Invoke-SQLCmd here. There are much better versions of this out in the wild, so feel free to rip this one out and replace it with a different one if you wish.

As for the column names and types, I tried to pick something that would 'just work'. This came after some trial and error in sizing the columns, where I ran into errors when sending the data to SQL because something like a display name was longer than what the column allowed. This might still be an issue for some of you depending on what you have running, but for that you can just edit the code and re-build the database.

Running the code is simple: just make sure that you point it to a valid SQL server and let it run!
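Assuming the script is sitting in your current directory, that looks like:

```powershell
# Point the build script at your SQL server (vSQL is the lab default)
.\Invoke-ServerInventorySQLBuild.ps1 -SQLServer vSQL
```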



It shouldn’t take too long to finish and after that, you can use SQL Server Management Studio (or whatever else you want to use) to validate that the database and tables have been created.


Part 2 will take us through performing the queries against the remote servers and sending the collected data up to the SQL server, using a script that I wrote which leverages my PoshRSJob module for multithreading.


Speaking at Austin PowerShell User Group


This Thursday at 6pm CST I will be speaking at the Austin PowerShell User Group. I won’t actually be there in person (I wish I could be) but will instead be presenting remotely from my house. My topic will be PowerShell runspaces, which happens to be one of my favorite things to talk about.

This was supposed to have happened last month, but I ended up being pretty sick and had to reschedule. This week I am feeling great and looking forward to sharing my knowledge on PowerShell and runspaces!

Be sure to check out the link here for more information: https://www.meetup.com/Austin-PowerShell/events/237381901/


New Updates in PoshRSJob

I recently pushed out some new updates and bug fixes to my PoshRSJob module. I am continuing to provide updates to this module based not only on the feedback on the GitHub Issues page, but also on things that I happen to think about as well to continue to make this an amazing module to use!

This update included 3 major fixes, each one important in making the module more stable and its data more accurate. I’ll highlight each update and provide a little bit of information on each one.


This one was an odd bug that I hadn’t encountered while using or testing the module. Fortunately for me, others have really put this module through the wringer and have found some bugs that I doubt I would ever have found myself.

What was happening was that the runspacepool monitor was determining that a newly created runspacepool was already past its expiration timestamp and would dispose of it, which resulted in errors when runspaces built on that runspacepool were invoked. The solution was actually very simple: when the runspacepool is created, set the expiration time 5 minutes in the future to give Start-RSJob enough time to create and start the runspaces within that runspacepool. Once the runspaces are going, the runspacepool monitor will keep updating the timestamp as long as runspaces are still being used within the pool.
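A simplified sketch of that fix might look like the following; the property names are illustrative stand-ins, not the module's actual internals:

```powershell
# Create and open a runspacepool (min 1, max 5 runspaces)
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, 5)
$RunspacePool.Open()

# Stamp a grace period so the monitor can't expire a brand-new pool
$PoolEntry = [pscustomobject]@{
    RunspacePool = $RunspacePool
    Expiration   = (Get-Date).AddMinutes(5)   # 5 minutes in the future
}

# The monitor would only dispose of an idle pool once the grace period lapses
$IsIdle    = $PoolEntry.RunspacePool.GetAvailableRunspaces() -eq 5
$IsExpired = (Get-Date) -gt $PoolEntry.Expiration
If ($IsIdle -and $IsExpired) {
    $PoolEntry.RunspacePool.Dispose()   # never true right after creation
}
```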


This one is not exactly a major bug, but still one that had to be dealt with and squashed.

Instead of using Break to halt all activity in the Wait-RSJob command, I now use return, which allows subsequent commands to run after it. I had originally used Break because I wanted to ensure that no other commands within Wait-RSJob would run, since they could cause potential errors later on in the command.
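The difference is easy to demonstrate outside the module: inside a function with no enclosing loop, break propagates up the call stack and can terminate the caller's loop, while return only exits the current function.

```powershell
Function Use-Break  { break }    # break with no loop escapes the function
Function Use-Return { return }   # return only exits this function

$WithBreak = ForEach ($i in 1..3) {
    Use-Break
    "iteration $i"    # never reached; break killed the caller's loop
}

$WithReturn = ForEach ($i in 1..3) {
    Use-Return
    "iteration $i"    # still runs; return stayed inside the helper
}

"break emitted anything: $($null -ne $WithBreak)"   # False
"return emitted: $($WithReturn -join ', ')"         # iteration 1, iteration 2, iteration 3
```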


This one was an issue that was reported long after I built this module, but in the back of my mind it was something that I really wanted to solve. Being able to track each runspace as its own job, and to know whether it was truly running or waiting to run, would really change how PoshRSJob looks. Prior to this, all jobs showed as Running regardless of how many were actually executing. This led to a lot of confusion as to whether jobs were actually being throttled (they were), and that just added fuel to my fire to solve this issue.

This led me down the path of finding a useful way of determining which runspaces in a runspacepool were actually running while others were still waiting in the queue to kick off. In the end, I took what I learned with that and applied it to PoshRSJob. You can see an example in the image below.
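The module's own tracking is more involved, but a rough way to see how busy a runspacepool is uses its GetAvailableRunspaces() method; comparing it against the maximum tells you how many runspaces are actually executing versus waiting:

```powershell
$Max = 5
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, $Max)
$RunspacePool.Open()

# Anything not currently available is busy running work
$Running = $Max - $RunspacePool.GetAvailableRunspaces()
"$Running of $Max runspaces are running"   # 0 here, since nothing was queued

$RunspacePool.Dispose()
```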


There is still plenty to do with this module and I hope to continue to implement the various feature requests that others have asked for as well as the things that I would like to see done to make this an amazing module. Of course, I am always looking for others to help out with this if they want to contribute to any of the posted issues or if they happen to have other ideas.


2017 PowerShell Resolutions

I’m a little late this year in getting this blog posted, but I definitely wanted to get this out before we got too far past the end of 2016.

As with the last couple of years (2014, 2015), I wanted to look back on the previous years resolutions and see how I did as well as setting up my 2017 ones for everyone to check out.

So without further ado, let’s see what I had for last year and see if I managed to complete my resolutions!

  • Speak at a local conference (SQL Saturday, InfoTech, Nebraska.Code(), etc…). Yes, I am using this one again, but speaking at these things is never guaranteed, so making it into one would be great!

Well, I submitted to SQL Saturday as well as Nebraska.Code() but unfortunately didn’t get any of my talks accepted. I won’t call this one a bust since I did the tough part of preparing and submitting sessions; the approval process was out of my control.

  • Work with PowerShell Classes, etc…. These are new in PowerShell V5 and I honestly haven’t even messed with these much, so it is on my TODO list.

I did spend some time with classes in PowerShell V5 and even added some of them to my PoshRSJob module where appropriate, as well as blogging about them on MCPMag.com. I’d call this one a success!

  • Start working on re-writing a module as a binary module in C# to better learn the language.

Unfortunately working with bug and feature requests on PoshRSJob took away from my attempts to convert a module from a script module to a binary module. Maybe next year!

  • Do something with web APIs and OAuth stuff. I keep meaning to invest some time with this but always get side tracked.

While I wanted to get to this and work more learning OAuth and working with web APIs, it just didn’t pan out for me. I was looking to use Yahoo Fantasy Football as a great means to do this but time wasn’t on my side.

Looking back at last year’s resolutions, I was probably 1.5 out of 4 (I’m counting my conference one as at least .5 for effort). Not the best showing, but that is the way it goes. Maybe 2017 will be a little kinder to me and I’ll knock some of my new ones out.

Speaking of 2017, my PowerShell resolutions for the coming year are ones that I hope to knock out during the year barring any sort of craziness which can and will happen. So here they are:

  • Speak at a user group or conference this year. I figure that I will lump these into one thing as I am sure that I will hopefully accomplish one of these in 2017.
  • Write at least 2 blog posts that relate to SQL or InfoSec. I recently took on a position as a SQL DBA after many years of working more on the infrastructure side of Windows with Active Directory, DNS, etc…. At the same time, I am also very interested in the InfoSec field, so I feel that this would be a good way to get more into both topics using PowerShell.
  • Start a new PowerShell project. I have been working a lot on PoshRSJob to make it a great module and fix a ton of the bugs and feature requests that I have received for it this year, but I want to be able to put out a new project sometime this year. I have a few ideas but we will see which one will get pushed out.
  • Operational Validation with PowerShell and Pester. This is something that I have been starting to look at but plan on doing more in 2017 both in my job as well as with a blog or two.

What about you? I’d love to hear about what you are planning on doing in 2017 with PowerShell so be sure to leave them in the comments!

Happy New Year!
