Performance Data Gathering

As a DBA, one of the things I struggled with was gathering all my important system information in one place, so I could easily see where my problems might be. Gathering perfmon data, combining it with trace data, adding in disk space information – it was hard to do this automatically.

This is probably why I like PowerShell so much. I can do this from a single script and don’t have to try to merge data later.

The perfmon data gathering was always a problem for me – trying to export the log files into something useful. I’ve found that .NET has a class specifically for that: System.Diagnostics.PerformanceCounter. By creating (instantiating) objects of the PerformanceCounter type, I can collect the same counter data that perfmon does, and I can control where that data goes. Here’s the PowerShell code to capture some key counters:

# CPU: percent processor time across all processors
$ppt = New-Object System.Diagnostics.PerformanceCounter
$ppt.CategoryName = 'Processor'
$ppt.CounterName = '% Processor Time'
$ppt.InstanceName = '_Total'
$pptv = $ppt.NextValue()   # "seed" the counter - the first value returned is always zero
# Memory: available megabytes
$mab = New-Object System.Diagnostics.PerformanceCounter
$mab.CategoryName = 'Memory'
$mab.CounterName = 'Available MBytes'
# CPU pressure: threads waiting for a processor
$pql = New-Object System.Diagnostics.PerformanceCounter
$pql.CategoryName = 'System'
$pql.CounterName = 'Processor Queue Length'
# SQL Server: how often pages are found in the buffer cache
$bch = New-Object System.Diagnostics.PerformanceCounter
$bch.CategoryName = 'SQLServer:Buffer Manager'
$bch.CounterName = 'Buffer cache hit ratio'
# SQL Server: workload volume
$brs = New-Object System.Diagnostics.PerformanceCounter
$brs.CategoryName = 'SQLServer:SQL Statistics'
$brs.CounterName = 'Batch Requests/sec'

So, in these few lines we’re setting up the collection for Percent Processor Time, Available Megabytes of Memory, Processor Queue Length, Buffer Cache Hit Ratio and Batch Requests/sec. Not a bad start to building our performance baseline. Notice that I called the NextValue method for the Percent Processor Time counter. I did this because this counter needs to be “seeded” – the first returned value is always zero.
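To illustrate that seeding behavior, here’s a small sketch (the sleep length is arbitrary – any interval gives the counter something to measure across):

```powershell
$ppt.NextValue()   # first call on a rate counter: always returns 0
Start-Sleep -s 1   # give the counter an interval to measure across
$ppt.NextValue()   # second call: an actual % Processor Time reading
```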

Next we need to have a place to put the values. I like the fact that PowerShell uses objects, and learned that you can create your own objects, with whatever properties you’d like. COOL!

$perf = new-object object

Now I have an object called $perf with no properties. I need to add the properties corresponding to the counters I’m collecting, so I’ll use the Add-Member cmdlet to do this. (Note that piping the $perf object into the add-member cmdlet causes the object to become the value passed via the -InputObject parameter to add-member. The pipeline doesn’t really go right-to-left here.)

In my case I’m setting up each additional property as a NoteProperty, and will assign values to them. I could also have used the ScriptProperty member type with a scriptblock, but then the script block would run every time the property is referenced, and I wanted the results to be static in time.

$perf | add-member NoteProperty Server 'MyServer'
$perf | add-member NoteProperty Date (get-date)
$perf | add-member NoteProperty PctProc $ppt.NextValue()
$perf | add-member NoteProperty Memory $mab.NextValue()
$perf | add-member NoteProperty ProcQueLn $pql.NextValue()
$perf | add-member NoteProperty BufCchHit $bch.NextValue()
$perf | add-member NoteProperty BatRecSec $brs.NextValue()
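As an aside, here’s what the ScriptProperty alternative I mentioned would look like – a hypothetical sketch (the property name PctProcLive is my own, not part of the original script):

```powershell
$perf | Add-Member ScriptProperty PctProcLive { $ppt.NextValue() }
# Each time $perf.PctProcLive is read, the scriptblock runs again,
# so the value changes over time instead of staying fixed.
```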

I do need a place to put the object once it’s populated, so I’ll create an empty array like this:

$sysperf = @()

Then, at the end of each iteration I add the object to the array.

$sysperf += $perf

Now I can just loop through this last bit of code every 15 seconds or so to get a feel for how the system is performing, like this:

Start-Sleep -s 15
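Putting those last pieces together, a minimal sketch of the collection loop might look like this (it assumes the counter setup above has already run; the sample count of 10 is arbitrary). Note that a fresh $perf object is created on each pass – otherwise the array would hold multiple references to the same object:

```powershell
$sysperf = @()
for ($i = 0; $i -lt 10; $i++) {
    $perf = New-Object object
    $perf | Add-Member NoteProperty Server 'MyServer'
    $perf | Add-Member NoteProperty Date (Get-Date)
    $perf | Add-Member NoteProperty PctProc $ppt.NextValue()
    $perf | Add-Member NoteProperty Memory $mab.NextValue()
    $perf | Add-Member NoteProperty ProcQueLn $pql.NextValue()
    $perf | Add-Member NoteProperty BufCchHit $bch.NextValue()
    $perf | Add-Member NoteProperty BatRecSec $brs.NextValue()
    $sysperf += $perf
    Start-Sleep -s 15   # wait before taking the next sample
}
```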

Finally, when I’ve got the information I want, I can use the export-csv cmdlet to load the data into a CSV file.

$sysperf | export-csv -noTypeInformation sysperf.csv

I could also use the Invoke-SQLCMD cmdlet to load it directly to SQL Server if I chose, but this is easier at a client site.
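If I did go the SQL Server route, it might look something like this sketch – the table dbo.PerfBaseline and its columns are my own assumptions, not part of the original script, and Invoke-Sqlcmd requires the SQL Server module or snap-in to be loaded:

```powershell
foreach ($row in $sysperf) {
    $query = "INSERT INTO dbo.PerfBaseline
              (Server, CollectionDate, PctProc, MemoryMB, ProcQueLn, BufCchHit, BatRecSec)
              VALUES ('$($row.Server)', '$($row.Date)', $($row.PctProc),
                      $($row.Memory), $($row.ProcQueLn), $($row.BufCchHit), $($row.BatRecSec))"
    Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'DBAUtil' -Query $query
}
```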

At the PASS Community Summit in Seattle next month I’ll be demonstrating the full script using this code. Hope to see you there.

Allen

(This blog was first posted October 9, 2009.)
