Month: March 2013

I was working on a search server recently and started getting errors that the crawl component was failing to “CreateTempFolderForCacheFiles”. As it turns out, the environment I was working in was an extremely secure farm: permissions were locked down, shares were often not permitted, and accounts were granted only the minimum permissions they needed in order to run. In this case, the local temporary folder where the index files are created had been deleted, and the search service did not have permission to recreate the folder. This blocked the crawls from proceeding; they just sat there. To fix the issue, the folders need to be recreated. What is nice is that using PowerShell you can quickly recreate the folders in the correct location using the following script:

$app = Get-SPEnterpriseSearchServiceApplication "<my SSA>"
$crawlComponents = Get-SPEnterpriseSearchCrawlComponent -CrawlTopology $app.CrawlTopologies.ActiveTopology | where { $_.ServerName.ToLower().Equals($Env:COMPUTERNAME.ToLower()) }
foreach ($component in $crawlComponents) {
    $path = $component.IndexLocation + "\" + $component.Name
    if (Test-Path $path -PathType Container) {
        Write-Host "Directory" $path "already exists"
    }
    else {
        Write-Host "Creating directory:" $path
        New-Item $path -ItemType Directory | Write-Output
    }
}...

I’m not an expert with FAST; I just have to deal with it. This is a fun little thing that happened recently. SharePoint adoption has been going really well: more people are using it, more people are adding content, more content is being indexed, and more space is being used. The drive that we installed FAST Search on is fairly small for drives these days, roughly 136GB of free disk space. This particular company also has a policy that when a drive hits 80% utilization, an alert goes off telling someone to look at the server's disk utilization and reduce it. As I know from getting these alerts, when FAST is building an index, there are times when the %FASTSEARCH%\tmp and %FASTSEARCH%\data\data_index directories get pretty full. Like an extra 60GB worth of full. That, along with the other items on the drive, is enough to tip past 80% utilization, and I get the email alert. This happens because FAST Search Server keeps a read-only binary index file set to serve queries while building the next index file set; the worst-case disk space usage for index data is approximately 2.5 times the size of a single index file set. This generally happens at night, and by morning all the indexing is done and the drives have plenty of space. It’s not really worth ordering another drive at this point...
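As a rough illustration of that 80% alert, a short PowerShell check can report whether a drive has tipped past a utilization threshold. This is a sketch, not the company's actual monitoring; the drive letter (D) and the 0.80 threshold are assumptions for the example:

# Hypothetical drive-utilization check; drive letter and threshold are assumed.
$threshold = 0.80
$drive = Get-PSDrive -Name D
$total = $drive.Used + $drive.Free
$utilization = $drive.Used / $total
if ($utilization -gt $threshold) {
    Write-Host ("Drive {0}: {1:P1} used - over the {2:P0} threshold" -f $drive.Name, $utilization, $threshold)
}
else {
    Write-Host ("Drive {0}: {1:P1} used - OK" -f $drive.Name, $utilization)
}

Scheduling something like this would only tell you what the morning email already does; the point of the story is that the spike is transient, so the alert clears on its own.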

Have I done a soap-box post recently? Yup, this one. There is an awful lot of misunderstanding about the concept of Alternate Access Mappings (AAMs) in the SharePoint world. It seems like every SharePoint consultant I talk to has a different opinion on why and how the AAM settings should be used, with a lot of it boiling down to the old stand-by of “it depends”. So, this is how I’ve set up my standards so that I always have a certain level of consistency across the farm. At its most basic level, AAMs are configuration settings set in...