Tuesday, 13 September 2011
Scripted - Creating a perfmon counter collection for monitoring the SharePoint crawl:
First, on the Index server create two folders – I:\LogFiles\Crawllogs and I:\LogFiles\Crawllogs_scripts – and grant Everyone the Modify permission.
Next I use XCOPY to copy the CrawlLogs_Scripts from my desktop to the server:
cmd.exe /c XCOPY "\_SOURCE\Crawllogs_scripts" "I:\LogFiles\Crawllogs_scripts" /E /V /I /Y > "c:\logs\copy_Crawllogs_scripts.log"
Do this by running create_data_collector.bat – its contents are listed below. It creates a counter data collector named “Osearch-crawl-counters” that logs to a .csv file named “crawllog” in the “I:\LogFiles\Crawllogs” path, using the counter list in the crawlcounters.config file. The -si 00:00:15 switch samples every 15 seconds, -max 250 caps each log file at 250 MB, and -cnf rolls over to a new file when that limit is reached:
logman create counter Osearch-crawl-counters -f csv -max 250 -cnf -si 00:00:15 -o "I:\LogFiles\Crawllogs\crawllog" -cf "I:\LogFiles\Crawllogs_scripts\crawlcounters.config"
The crawlcounters.config file lists the perfmon counters that will be logged – in this case:
"\Office Server Search Gatherer\*"
"\Office Server Search Archival Plugin(Portal_Content)\*"
"\Office Server Search Gatherer Projects(Portal_Content)\*"
"\Office Server Search Indexer Catalogs(Portal_Content)\*"
"\Office Server Search Schema Plugin(Portal_Content)\*"
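Once the collector has been created, you can sanity-check it with logman query, which reports its status, output path and counter list:

```bat
logman query Osearch-crawl-counters
```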
The crawl in our environment runs every weekday from 17:00 – so we’ll now set the counters to collect data over this period
Now we create a scheduled task to run the “start_collector.bat” at 16:30 Mon-Fri using the SPFarm account credentials
cmd.exe /c if not exist C:\WINDOWS\tasks\Start_Collector.job SCHTASKS /create /tn "Start_Collector" /tr "I:\LogFiles\Crawllogs_scripts\start_collector.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 16:30 /ru DOMAIN\SPfarm /rp password > "c:\logs\add-schtask-startcollector.log"
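The contents of start_collector.bat aren't listed here – assuming it simply starts the data collector created earlier, a minimal version would be a one-liner:

```bat
logman start Osearch-crawl-counters
```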
Now we create a scheduled task to run the “stop_collector.bat” at 04:30 Mon-Fri using the SPFarm account credentials
cmd.exe /c if not exist C:\WINDOWS\tasks\Stop_Collector.job SCHTASKS /create /tn "Stop_Collector" /tr "I:\LogFiles\Crawllogs_scripts\stop_collector.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 04:30 /ru DOMAIN\SPfarm /rp password > "c:\logs\add-schtask-stopcollector.log"
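The stop_collector.bat would be the mirror image, stopping the collector so the current .csv log is closed off cleanly:

```bat
logman stop Osearch-crawl-counters
```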
Next we create a scheduled task to set the PowerShell execution policy to Unrestricted at 05:29 Mon-Fri. This lets us then run deletelogs.ps1 – a script that deletes counter logs over 10 days old and so prevents the drive from filling up
cmd.exe /c if not exist C:\WINDOWS\tasks\Set-execpolicy-un.job SCHTASKS /create /tn "Set-execpolicy-un" /tr "I:\LogFiles\Crawllogs_scripts\set_un.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:29 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask3.log"
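The set_un.bat isn't listed either – assuming it mirrors the set_re.bat shown at the end of this post, it would be:

```bat
powershell -command set-executionpolicy unrestricted
```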
This task runs 1 minute after the execution policy has been set to unrestricted
cmd.exe /c if not exist C:\WINDOWS\tasks\deletelogs.job SCHTASKS /create /tn "deletelogs" /tr "I:\LogFiles\Crawllogs_scripts\deletelogs.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:30 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask4.log"
This task runs the deletelogs.bat:
powershell -command "& 'I:\LogFiles\Crawllogs_scripts\deletelogs.ps1' "
The deletelogs.bat runs the deletelogs.ps1:
$TargetFolder = "I:\LogFiles\Crawllogs"
$cFiles = Get-ChildItem $TargetFolder
foreach ($cFile in $cFiles)
{
    # Delete any counter log older than 10 days
    If ($cFile.CreationTime -lt ((Get-Date).AddDays(-10)))
    {
        Write-Host "Deleting File $cFile" -ForegroundColor "Red"
        $TargetFile = "$TargetFolder\$cFile"
        Remove-Item $TargetFile
    }
}
This task resets the PowerShell execution policy at 05:35 - after the deletelogs.ps1 has been run
cmd.exe /c if not exist C:\WINDOWS\tasks\Set-execpolicy-re.job SCHTASKS /create /tn "Set-execpolicy-re" /tr "I:\LogFiles\Crawllogs_scripts\set_re.bat" /sc weekly /D "MON,TUE,WED,THU,FRI" /st 05:35 /ru DOMAIN\SPfarm /rp password > "c:\install\logs\add-schtask5.log"
It runs the set_re.bat:
powershell -command set-executionpolicy allsigned
At the end of all this you get rolling log files for the last 10 days that detail all the major aspects of crawl activity – very useful
Hope this helps – I did all this because next I added the PDF IFilter and I wanted to measure the impact
In the next blog I’ll go over how I scripted the PDF IFilter install on a MOSS 2007 environment