
Create a Search Service Status Dashboard

Last year I was introduced to the amazing work of Brian Pendergrass, Russ Maxwell, Brent Groom, and Eric Dixon in the form of the SRx Core, the SharePoint Search Health Reports. This suite of reports was created to help evaluate, monitor, and maintain on-premises SharePoint farms by providing deep feedback and analysis of the SharePoint Search Service Application. The core is a set of “tests” that you can run collectively or independently to determine, in detail, the overall health of your SharePoint Search Service Application (SSA).

Running the Tests

Running the tests starts with initializing the reports. Run:

. .\InitSRx.ps1

This builds a local cache and initializes the objects needed to run the reports.

Next you can run the tests to evaluate your SSA:

New-SRxReport -RunAllTests

Now you should begin to see why I am so excited about these reports. They are more than simple “point in time” status reports; they provide insight into performance and optimization tweaks you can make to your environment to get the maximum performance possible from your SharePoint SSA. For example, the OSPowerPlan test verifies that the servers in the Search Topology are all running in “High Performance” mode. The OSVolumeProperties test verifies that the drives used by the Index are formatted correctly and are not using compression or native indexing, either of which will hinder the performance of the SharePoint indexing process. You can review the details of all the tests, with a summary of each, here.
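If one of these checks flags something, you can re-run just that test by name rather than the whole suite. For example, to look only at the power plan check (assuming the same -Test syntax used for OSVolumeProperties later in this post):

```powershell
# Run a single SRx test by name instead of the full suite
New-SRxReport -Test OSPowerPlan
```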

There are a ton more reports than these tests, too. I hope to go into more detail about them in future articles.

Running Detailed Reports

As you look at the summary of test results you will notice that Normal, is good (because it’s green…). Warning may be a cause for concern. Your goal is to eliminate the warning indicators through configuration and management of the Search Topology. You can begin this process by running more detailed reports for each test. This can be done in two ways, you can either use the -Details flag when you -RunAllTest though I think that’s a bit hard to process.
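That all-in-one run looks like this:

```powershell
# Detailed output for every test in a single pass (a lot to scroll through)
New-SRxReport -RunAllTests -Details
```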

Or you can run each test independently and evaluate the results separately. For example, the OSVolumeProperties test looks like this:

New-SRxReport -Test OSVolumeProperties -Details

Creating a Dashboard

All this reporting and output is amazing for those of us thirsting for guidance and tools for monitoring our Search Service. But the report output is a bit awkward for pure reporting. What we need is an object that we can mess with to produce a report. Enter the Test-SRx cmdlet. Test-SRx returns a PowerShell object that we can pass down the pipeline to do cool stuff. For example:

$result = Test-SRx -RunAllTests

$result | Where-Object { $_.Level -ne "Normal" } | Select-Object Name, Level, Headline

Now that I have a handle on my result object, I can trick out the output and push it somewhere I can report on it, like Excel or, better yet, Azure SQL. (Or I could have someone else do it… because he already did.) The last couple of years in Orlando at SharePoint Live! 360, I have had the pleasure of learning PowerShell tricks from Ben Stegink. (He says he’s the only Ben Stegink on the planet, so here’s how to find him.) Ben does a very cool demo of monitoring SharePoint with PowerShell and then pumping the reports out to Azure SQL data tables so you can do trend analysis. You can get the code for those talks from his GitHub Presentations repo. So that’s what I did: grabbed his code, added my reports, and created my data tables in SQL. He also has a script that creates an encrypted file for your credentials so you don’t give your passwords away accidentally…
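If you want to roll your own credential file, PowerShell’s built-in Export-Clixml/Import-Clixml pair is one common approach. This is a generic sketch, not Ben’s actual script; note that the file is encrypted with DPAPI, so it only decrypts for the same user on the same machine:

```powershell
# One-time setup: prompt for the SQL credentials and save them encrypted (DPAPI)
Get-Credential | Export-Clixml -Path .\sqlCred.xml

# Later, in the reporting script: load the credential back and unpack it
$sqlCred   = Import-Clixml -Path .\sqlCred.xml
$sqlUser   = $sqlCred.UserName
$sqlPasswd = $sqlCred.GetNetworkCredential().Password
```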

So, the meat of the script just takes the result object and then sends it to Excel (thanks Ben) or to SQL.

# Assumes earlier setup in the script: $destination ("SQL" or "XLS"),
# $batchTimeStamp, an open SqlCommand in $cmd (SQL path), and a
# $searchReport array (XLS path).
Test-SRx -RunAllTests | ForEach-Object {

    if($destination -eq "SQL"){
        # Normalize the timestamp into a SQL-friendly string.
        # (Don't type-constrain the variable with [datetime], or the
        # formatted string gets cast right back to a DateTime.)
        $timestamp = ([datetime]$_.Timestamp).ToString("yyyy-MM-dd HH:mm:ss")

        # Parenthesize the string so -f binds to the whole literal
        $cmd.CommandText = ("INSERT INTO SrxReportSummary (BatchTimeStamp,Data,Alert,Category,Headline," +
            "Timestamp,Result,Details,Name,FarmId,Level,RunId,ControlFile,Dashboard,Source) " +
            "VALUES('{0}','{1}','{2}','{3}','{4}','{5}','{6}','{7}','{8}','{9}','{10}','{11}','{12}','{13}','{14}')") -f
            $batchTimeStamp, $_.Data, $_.Alert, $_.Category, $_.Headline.Replace("'","''"), $timestamp, $_.Result,
            $_.Details, $_.Name, $_.FarmId, $_.Level, $_.RunId, $_.ControlFile, $_.Dashboard, $_.Source

        $nada = $cmd.ExecuteNonQuery()
    }
    elseif($destination -eq "XLS"){
        # Build a flat object per test result and collect it for export
        $reportObject = New-Object PSObject
        $reportObject | Add-Member -MemberType NoteProperty -Name "Data" -Value $_.Data
        $reportObject | Add-Member -MemberType NoteProperty -Name "Alert" -Value $_.Alert
        $reportObject | Add-Member -MemberType NoteProperty -Name "Category" -Value $_.Category
        $reportObject | Add-Member -MemberType NoteProperty -Name "Headline" -Value $_.Headline
        $reportObject | Add-Member -MemberType NoteProperty -Name "Timestamp" -Value $_.Timestamp
        $reportObject | Add-Member -MemberType NoteProperty -Name "Result" -Value $_.Result
        $reportObject | Add-Member -MemberType NoteProperty -Name "Details" -Value $_.Details
        $reportObject | Add-Member -MemberType NoteProperty -Name "Name" -Value $_.Name
        $reportObject | Add-Member -MemberType NoteProperty -Name "FarmId" -Value $_.FarmId
        $reportObject | Add-Member -MemberType NoteProperty -Name "Level" -Value $_.Level
        $reportObject | Add-Member -MemberType NoteProperty -Name "RunId" -Value $_.RunId
        $reportObject | Add-Member -MemberType NoteProperty -Name "ControlFile" -Value $_.ControlFile
        $reportObject | Add-Member -MemberType NoteProperty -Name "Dashboard" -Value $_.Dashboard
        $reportObject | Add-Member -MemberType NoteProperty -Name "Source" -Value $_.Source

        $searchReport += $reportObject
    }
}

In Power BI on the web, I can view the summary report and ask questions about it.

Using the Power BI app, I can monitor the report and set alerts in case the data changes.

Conclusion

So, there you have it: one approach to creating the beginning of what could easily become a complete dashboard solution for your SharePoint Search Service Application. Ben does some great demos of Site Collection and Content Database reporting too; you should check out his sessions in Orlando later this year. Since this is what I call a “Franken Project,” I have created a GitHub repository for the full scripts, but you will need to download the SRx Reports and Ben’s project separately. The instructions for the directory structure are included in my project, and I also included the SQL script to provision the Azure SQL database used in this demo. OK, that’s it for this initial post. I plan to continue decorating the dashboard with additional reports, and I would love to see how folks use this approach to monitor and manage their Search Service Applications.
