
    PowerShell Problem Solver: Processor Loads

    Posted by Jeff Hicks in PowerShell

    Today’s PowerShell Problem Solver comes from a post I read in a forum. The original question involved the best way to get the average processor utilization for a remote server, as well as the top five processes using the most CPU. There’s a lot to work with here, and it might take more than one article, but let’s break this down and use the scenario as a learning exercise. Along the way, you might even end up with a useful PowerShell tool.

    Using WMI

    The easiest way to get processor load information is with WMI and the Win32_Processor class.

    Getting processor information (Image Credit: Jeff Hicks)
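    The command in that screenshot can be sketched along these lines. This is a minimal reconstruction, and the remote computer name is a placeholder:

```powershell
# Query the Win32_Processor class; each physical processor comes back
# as its own instance with a LoadPercentage property.
Get-WmiObject -Class Win32_Processor -ComputerName SRV1 |
Select-Object -Property PSComputerName, DeviceID, LoadPercentage
```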

    I’m using Get-WmiObject, but you could just as easily use Get-CimInstance. The WMI class has a property called LoadPercentage. According to the MSDN documentation for this class, this property is the “load capacity of each processor, averaged to the last second. Processor loading refers to the total computing burden for each processor at one time.”

    Because my computer only has one physical processor, there isn’t an average. But you might have computers with multiple processors, in which case you will get a WMI instance back for each one, each with its own LoadPercentage property. No matter: we can easily calculate an average with Measure-Object.

    Getting a LoadPercentage average (Image Credit: Jeff Hicks)
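    A sketch of that measurement, assuming a local query:

```powershell
# Average the LoadPercentage values across all processor instances.
# Note that Measure-Object emits a GenericMeasureInfo object, which
# no longer carries the PSComputerName property.
Get-WmiObject -Class Win32_Processor |
Measure-Object -Property LoadPercentage -Average
```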

    I don’t have anything with more than one processor, but I’m going to pretend that I do for the sake of demonstration. In this example, it is pretty easy to look at the screen and see that the information is for a specific computer. But let’s keep in mind that PowerShell is about managing at scale: if I can run this command for one server, then I can run it for many, and I need a way to correlate the computername with the output. Measure-Object writes a different type of object to the pipeline, which means I potentially lose the PSComputername property I would otherwise get from Get-WmiObject.

    To demonstrate, I want to get the average LoadPercentage from several servers.

    Because I can’t determine which output object came from which computer, I will have to run my WMI command sequentially for each server.

    Getting Average LoadPercentage for multiple servers (Image Credit: Jeff Hicks)

    The trick is to use a new PowerShell 4.0 feature called PipelineVariable. This is a new common parameter that makes it easier to capture information in one part of a pipelined expression and use it later. In my example, the output from Get-WmiObject for each computer is also saved to a variable I called pv. You can call it anything you want; just remember that you specify only the name. You only need the $ symbol when you want to reference the variable, which I do at the end, where I create a custom property called Computername that uses the PSComputername property from my pipeline variable.
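    Put together, the pipeline might look something like this. The server names are placeholders, and this is a sketch of the technique rather than the exact command from the screenshot:

```powershell
$computers = 'SRV1', 'SRV2', 'SRV3'

$computers | ForEach-Object {
    # -PipelineVariable saves each Get-WmiObject result as $pv so the
    # computer name survives past Measure-Object
    Get-WmiObject -Class Win32_Processor -ComputerName $_ -PipelineVariable pv |
    Measure-Object -Property LoadPercentage -Average |
    Select-Object -Property @{Name = 'Computername'; Expression = { $pv.PSComputername }},
                            @{Name = 'Average'; Expression = { $_.Average }}
}
```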

    Using Remoting Techniques

    A variation you might consider is a PowerShell workflow. This feature was introduced in PowerShell 3.0. When you run a workflow, the command runs on the remote computer. Workflows are much more than another type of script, but for our purposes they work rather nicely.

    One benefit of a workflow is that all of the remoting is built in and the results always include the remote computername.

    Using a PowerShell workflow (Image Credit: Jeff Hicks)
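    A minimal workflow along those lines might look like this. I’m using Get-CimInstance here because CIM cmdlets run natively as workflow activities; the workflow name and server names are placeholders:

```powershell
Workflow Get-AvgLoad {
    Get-CimInstance -ClassName Win32_Processor |
    Measure-Object -Property LoadPercentage -Average |
    Select-Object -Property Average
}

# The workflow body runs on each remote computer, and
# PSComputerName is added to the results automatically.
Get-AvgLoad -PSComputerName SRV1, SRV2, SRV3
```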

    Another advantage is performance. Whereas my first attempt ran through the list of computers sequentially, this command runs in parallel remotely and finishes in a fraction of the time. To be honest, I probably could have done the same thing using Invoke-Command.

    Using remoting (Image Credit: Jeff Hicks)
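    An Invoke-Command version of the same measurement, assuming PowerShell remoting is enabled on the targets (server names are placeholders):

```powershell
# Invoke-Command stamps each result with a PSComputerName property,
# so the output is already correlated to its source server.
Invoke-Command -ScriptBlock {
    Get-WmiObject -Class Win32_Processor |
    Measure-Object -Property LoadPercentage -Average |
    Select-Object -Property Average
} -ComputerName SRV1, SRV2, SRV3
```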

    Bear in mind that when measuring something, the mere act of observing can affect the results. I’ve always assumed this meant the best way to measure is from a distance. With that in mind, let’s go back to workflow and take advantage of its parallel processing capabilities.

    Because of the way variables are scoped in a workflow, I had to modify my logic a little bit. This workflow is going to run locally and make parallel WMI connections to $computers.

    Using a parallel workflow (Image Credit: Jeff Hicks)
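    Sketched out, the parallel version might look like this. The foreach -parallel construct fans out the WMI connections from the local machine; server names are placeholders, and depending on your workflow configuration you may need to wrap the WMI call in InlineScript:

```powershell
Workflow Get-AvgLoadParallel {
    param([string[]]$Computers)

    # Each branch queries one computer; the branches run concurrently
    ForEach -Parallel ($computer in $Computers) {
        Get-WmiObject -Class Win32_Processor -ComputerName $computer |
        Measure-Object -Property LoadPercentage -Average |
        Select-Object -Property @{Name = 'Computername'; Expression = { $computer }},
                                @{Name = 'Average'; Expression = { $_.Average }}
    }
}

Get-AvgLoadParallel -Computers SRV1, SRV2, SRV3
```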

    But this too runs pretty quickly.

    There are other ways to see processor load or utilization, and we’ll explore those in another article.
