Tuesday 13 September 2011

Scripted - Creating a perfmon counter collection for monitoring the SharePoint crawl


Contents:

  • Folder Structure
  • XCopy
  • The CrawlLogs_Scripts folder
  • Create the data collector
  • Crawlcounters.config file
  • Create the Start_Collector task
  • Create the Stop_Collector task
  • Create the set-execpolicy unrestricted task
  • Run the deletelogs PowerShell script
  • Create the set-execpolicy allsigned task
  • Conclusion

This post describes a fully scripted perfmon counter collection deployment used to collect data about SharePoint crawls, complete with scheduled tasks to start and stop the data collection and a PowerShell script to remove old logs.

Folder Structure

First, on the Index server, create two folders and grant Everyone the Modify permission:

I:\Logfiles\CrawlLogs
I:\Logfiles\CrawlLogs_Scripts

XCopy


Next I use XCOPY to copy the CrawlLogs_Scripts folder from my desktop to the server:

cmd.exe /c  XCOPY "\_SOURCE\Crawllogs_scripts" "I:\LogFiles\Crawllogs_scripts" /E /V /I /Y > "c:\logs\copy_Crawllogs_scripts.log"
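A quick note on the XCOPY switches: /E copies all subdirectories (including empty ones), /V verifies each file as it is written, /I treats the destination as a directory so it is created if missing, and /Y suppresses overwrite prompts. The output is redirected to a log file so the copy can be checked afterwards.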

The CrawlLogs_Scripts folder

This contains the PowerShell scripts, batch files and config file that will be used to configure crawl logging.


Create the data collector


Do this by running create_data_collector.bat. Its contents are listed below: a single logman command that creates a data collector named "Osearch-crawl-counters", logging in .csv format to a "crawllog" file in the "I:\LogFiles\CrawlLogs" path, using the counters listed in the crawlcounters.config file:
logman create counter Osearch-crawl-counters -f csv -max 250 -cnf -si 00:00:15 -o "I:\LogFiles\Crawllogs\crawllog" -cf "I:\LogFiles\Crawllogs_scripts\crawlcounters.config"
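Breaking down the logman switches: -f csv sets the log format, -max 250 caps each log file at 250 MB, -cnf rolls over to a new file when that cap is hit, -si 00:00:15 samples every 15 seconds, -o sets the output file, and -cf points at the file listing the counters to collect.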

Crawlcounters.config file


This is used to list the perfmon counters that will be logged – in this case:
"\Office Server Search Gatherer\*"
"\Office Server Search Archival Plugin(Portal_Content)\*"
"\Office Server Search Gatherer Projects(Portal_Content)\*"
"\Office Server Search Indexer Catalogs(Portal_Content)\*"
"\Office Server Search Schema Plugin(Portal_Content)\*"
The crawl in our environment runs every weekday from 17:00, so we'll now set the counters to collect data over this period.

Create the Start_Collector task


Now we create a scheduled task to run start_collector.bat at 16:30 Mon-Fri using the SPFarm account credentials:
cmd.exe /c  if not exist C:\WINDOWS\tasks\Start_Collector.job SCHTASKS /create /tn "Start_Collector" /tr "I:\LogFiles\Crawllogs_scripts\start_collector.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 16:30 /ru DOMAIN\SPfarm /rp password > "c:\logs\add-schtask-startcollector.log"
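start_collector.bat itself isn't listed here - a minimal sketch would simply start the data collector created above:
logman start Osearch-crawl-counters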

Create the Stop_Collector task


Now we create a scheduled task to run stop_collector.bat at 04:30 Mon-Fri using the SPFarm account credentials:
cmd.exe /c  if not exist C:\WINDOWS\tasks\Stop_Collector.job SCHTASKS /create /tn "Stop_Collector" /tr "I:\LogFiles\Crawllogs_scripts\stop_collector.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 04:30 /ru DOMAIN\SPfarm /rp password > "c:\logs\add-schtask-stopcollector.log"
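Likewise, a minimal stop_collector.bat would simply stop the collector:
logman stop Osearch-crawl-counters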

Create the set-execpolicy unrestricted task


Next we create a scheduled task to set the PowerShell execution policy to unrestricted at 05:29 Mon-Fri. This will enable us to then run deletelogs.ps1, a script that deletes counter logs over 10 days old and thus prevents the drive from filling up:
cmd.exe /c  if not exist C:\WINDOWS\tasks\Set-execpolicy-un.job SCHTASKS /create /tn "Set-execpolicy-un" /tr "I:\LogFiles\Crawllogs_scripts\set_un.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:29 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask3.log"
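set_un.bat isn't listed either, but by analogy with the set_re.bat shown at the end it would be:
powershell -command set-executionpolicy unrestricted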

Run the deletelogs PowerShell script


This task runs 1 minute after the execution policy has been set to unrestricted:
cmd.exe /c  if not exist C:\WINDOWS\tasks\deletelogs.job SCHTASKS /create /tn "deletelogs" /tr "I:\LogFiles\Crawllogs_scripts\deletelogs.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:30 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask4.log"
The task runs deletelogs.bat, which uses the & call operator to invoke the PowerShell script:
powershell -command "& 'I:\LogFiles\Crawllogs_scripts\deletelogs.ps1' "

Deletelogs.ps1:

               
# Delete counter logs in the target folder that are more than 10 days old
$TargetFolder = "I:\LogFiles\Crawllogs"
$cFiles = Get-ChildItem $TargetFolder
foreach ($cFile in $cFiles)
{
    # A new log file is created each run, so CreationTime reflects the log's age
    if ($cFile.CreationTime -lt ((Get-Date).AddDays(-10)))
    {
        Write-Host "Deleting File $cFile" -ForegroundColor "Red"
        $targetFile = "$TargetFolder\$cFile"
        del $targetFile
    }
}

Create the set-execpolicy allsigned task


This task resets the PowerShell execution policy to allsigned at 05:35, after deletelogs.ps1 has been run:
cmd.exe /c  if not exist C:\WINDOWS\tasks\Set-execpolicy-re.job SCHTASKS /create /tn "Set-execpolicy-re" /tr "I:\LogFiles\Crawllogs_scripts\set_re.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:35 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask5.log"
It runs the set_re.bat:
powershell -command set-executionpolicy allsigned
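As an aside, a per-process alternative that avoids toggling the machine-wide policy would be powershell -ExecutionPolicy Bypass -File I:\LogFiles\Crawllogs_scripts\deletelogs.ps1 (supported from PowerShell 2.0), which would remove the need for the set_un/set_re tasks entirely.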

Conclusion


At the end of all this you get log files for the last 10 days detailing every major aspect of crawl activity - very useful.



Hope this helps - I did all this because next I added the PDF IFilter and wanted to measure the impact.
In the next blog I'll go over how I scripted the PDF IFilter install on a MOSS 2007 environment.
Cheers
