This blog post will show you how to write your own custom security post-trimmer for SharePoint 2013. Not only that, we will take you through the steps of deploying and registering the trimmer using a crawl rule before putting the trimmer to work.

Please visit the official MSDN documentation for the overview and the definitive reference for this feature:

Why Post-Security Trimming

There are two kinds of custom security trimming. Pre-trimming refers to pre-query evaluation, where the backend rewrites the query, adding security information before the lookup in the search index. Post-trimming refers to post-query evaluation, where the search results are pruned before they are returned to the user.

Post-trimming evaluates each search result after the query has been executed. Performance costs and potentially incorrect refiner data and hit counts aside, sometimes it is necessary to perform last-minute security evaluation “when the results are served”.

One scenario could be to hide documents from the search results on machines that do not have proper anti-virus software. Another could be to restrict certain documents from being visible in the search results outside a given time of day.

Requirements

SharePoint 2013 Server

Visual Studio 2012

A Custom Connector sending claims (see previous post on this blog).

The Trimmer Design

Let’s create a simple post-security trimmer. It should remove documents from the search results when a certain hint is present on the document. We will use the string field docaclmeta for this purpose. The rules are simple: documents will be removed if this field contains the text ‘deny’. If this field is empty or contains anything else (e.g. ‘allow’), the documents will remain visible in the search results.

The Code

This MSDN article offers useful starting tips on creating the security post-trimmer project in Visual Studio and adding references to both Microsoft.Office.Server.Search.dll and Microsoft.IdentityModel.dll.

The following is added to the using directives at the beginning of the class file, SearchPostTrimmer.cs:

using System.Collections.Specialized;
using System.Security.Principal;
using Microsoft.Office.Server.Search.Administration;
using Microsoft.Office.Server.Search.Query;

We then define the class as implementing the ISecurityTrimmerPost interface in the class declaration:

public class SearchPostTrimmer : ISecurityTrimmerPost

We do not need any additional settings for this trimmer to work. Thus, the Initialize method remains empty:
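Below is a minimal sketch of the whole class, assuming the ISecurityTrimmerPost signatures as documented on MSDN and that the docaclmeta value for each result arrives through the documentAcls parameter; it additionally needs using System, System.Collections and System.Collections.Generic on top of the directives above:

public class SearchPostTrimmer : ISecurityTrimmerPost
{
    // No configuration is needed, so Initialize is intentionally empty.
    public void Initialize(NameValueCollection staticProperties,
                           SearchServiceApplication searchApplication)
    {
    }

    // One bit per result: true keeps the document, false trims it away.
    public BitArray CheckAccess(IList<string> documentCrawlUrls,
                                IList<string> documentAcls,
                                IDictionary<string, object> sessionProperties,
                                IIdentity passedUserIdentity)
    {
        var verdicts = new BitArray(documentCrawlUrls.Count);
        for (int i = 0; i < documentCrawlUrls.Count; i++)
        {
            // Trim only when the docaclmeta hint contains 'deny'; an empty
            // field or any other value (e.g. 'allow') keeps the result visible.
            string hint = (documentAcls != null && i < documentAcls.Count)
                ? documentAcls[i] : null;
            verdicts[i] = string.IsNullOrEmpty(hint) ||
                          hint.IndexOf("deny", StringComparison.OrdinalIgnoreCase) < 0;
        }
        return verdicts;
    }
}

To deploy and register the trimmer against the crawl rule (the crawl rule itself is created in the connector post of this series), here is a sketch from a SharePoint 2013 Management Shell, with placeholder assembly details for a GAC-installed assembly:

$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchSecurityTrimmer -SearchApplication $ssa -TypeName "<Namespace>.SearchPostTrimmer, <AssemblyName>, Version=1.0.0.0, Culture=neutral, PublicKeyToken=<token>" -RulePath "xmldoc://*" -Id 1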

Testing the Trimmer

Now you can issue queries, and the beauty of the post-trimmer logic should reveal itself on each query evaluated. Try modifying the docaclmeta field contents in the Product.xml file for the custom connector, run a full crawl, and repeat the query.

This blog post will show you how to write your own custom security pre-trimmer for SharePoint 2013. We will take you through the steps of deploying and registering the trimmer before putting the trimmer to work.

Please visit the official MSDN documentation for the overview and the definitive reference for this feature:

Why Use Pre-Security Trimmers

Pre-trimming refers to pre-query evaluation, where the backend rewrites the query, adding security information before the lookup in the search index. Post-trimming refers to post-query evaluation, where search results are pruned before they are returned to the user.

We recommend the use of pre-trimming for performance and general correctness; pre-trimming prevents information leakage through refiner data and hit counts.

Requirements

SharePoint 2013 Server

Visual Studio 2012

A Custom Connector sending claims (see previous post on this blog)

The Trimmer Design

Let’s create a simple pre-security trimmer: one that reads group membership data from a text file, performs a user lookup against that data, and then adds claims to the query tree based upon it. In short, the trimmer code needs to figure out which user is issuing the query, perform a group membership lookup on that user, and then add claims for that user depending on the group membership.

The Code

This MSDN article offers useful starting tips on creating the security pre-trimmer project in Visual Studio and adding references to both Microsoft.Office.Server.Search.dll and Microsoft.IdentityModel.dll.

Add the following to the using directives at the beginning of the class file, SearchPreTrimmer.cs:

using System.Security.Principal;
using Microsoft.IdentityModel.Claims;
using Microsoft.Office.Server.Search.Administration;
using Microsoft.Office.Server.Search.Query;

We then define the class as implementing the ISecurityTrimmerPre interface in the class declaration:

public class XmlContentSourcePreTrimmer : ISecurityTrimmerPre

We have to define a few defaults at the beginning of the class. These values may be overridden by the static properties given when the trimmer is registered with SharePoint.
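A sketch of those defaults and of Initialize, with hypothetical property names; the sketches in this post additionally assume using System, System.Collections.Generic, System.Collections.Specialized and System.IO on top of the directives above:

// Hypothetical defaults; adjust the path to wherever datafile.txt lives.
private const string DefaultDataFilePath = @"C:\CustomSecurityConnector\datafile.txt";
private const int DefaultRefreshIntervalSeconds = 30;

private string _dataFilePath = DefaultDataFilePath;
private int _refreshIntervalSeconds = DefaultRefreshIntervalSeconds;
private DateTime _lastRefresh = DateTime.MinValue;
private Lookup _membership = new Lookup();

public void Initialize(NameValueCollection staticProperties,
                       SearchServiceApplication searchApplication)
{
    // Static properties can be handed to the trimmer at registration time
    // (the -Properties argument of New-SPEnterpriseSearchSecurityTrimmer).
    if (staticProperties == null)
        return;
    if (!string.IsNullOrEmpty(staticProperties["DataFilePath"]))
        _dataFilePath = staticProperties["DataFilePath"];
    if (!string.IsNullOrEmpty(staticProperties["RefreshIntervalSeconds"]))
        _refreshIntervalSeconds = int.Parse(staticProperties["RefreshIntervalSeconds"]);
}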

The AddAccess method of the trimmer is responsible for returning the claims to be added to the query tree. It refreshes the group membership data if needed, then figures out the user ID to use as the key into the group membership structure loaded from the text file.
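A minimal sketch of AddAccess, assuming the ISecurityTrimmerPre signature documented on MSDN; the claim type, value type and issuer below are illustrative only and must match the claims your connector writes into the index:

public IEnumerable<Tuple<Claim, bool>> AddAccess(
    IDictionary<string, object> sessionProperties, IIdentity passedUserIdentity)
{
    var claims = new List<Tuple<Claim, bool>>();
    if (passedUserIdentity == null)
        return claims;

    RefreshMembershipIfNeeded();

    // The user id serving as the key into the membership lookup, e.g. 'domain\user'.
    string userId = passedUserIdentity.Name;

    string groups;
    if (_membership.TryGetValue(userId, out groups))
    {
        foreach (string group in groups.Split(';'))
        {
            // Hypothetical claim type and issuer, for illustration only.
            var claim = new Claim("http://example.com/claims/group", group,
                                  ClaimValueTypes.String, "customtrimmer", "customtrimmer");
            claims.Add(new Tuple<Claim, bool>(claim, true));
        }
    }
    return claims;
}

private void RefreshMembershipIfNeeded()
{
    // Reload the data file at most once per refresh interval (30 seconds by default).
    if ((DateTime.UtcNow - _lastRefresh).TotalSeconds < _refreshIntervalSeconds)
        return;
    _membership = Lookup.LoadFromFile(_dataFilePath);
    _lastRefresh = DateTime.UtcNow;
}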

We need a class to act as a simple wrapper of a Dictionary<string, string>. It essentially serves as a key-value lookup from a user ID to that user’s group membership data. Each user’s group membership has the form group1;group2;group3, where “;” separates the group entries. This class is simply called Lookup.
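A sketch of Lookup, assuming a data file with one user|group1;group2;group3 entry per line (the delimiter between user and groups is our assumption; use whatever format your datafile.txt actually has):

public class Lookup
{
    private readonly Dictionary<string, string> _groupsByUser =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

    public bool TryGetValue(string userId, out string groups)
    {
        return _groupsByUser.TryGetValue(userId, out groups);
    }

    // Assumed file format: one 'user|group1;group2;group3' entry per line.
    public static Lookup LoadFromFile(string path)
    {
        var lookup = new Lookup();
        foreach (string line in File.ReadAllLines(path))
        {
            string[] parts = line.Split('|');
            if (parts.Length == 2)
                lookup._groupsByUser[parts[0].Trim()] = parts[1].Trim();
        }
        return lookup;
    }
}

Before testing, the pre-trimmer has to be registered with the Search service application. A sketch from a SharePoint 2013 Management Shell, with placeholder assembly details (no crawl rule path here, as the pre-trimmer applies to the query side):

$ssa = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchSecurityTrimmer -SearchApplication $ssa -TypeName "<Namespace>.XmlContentSourcePreTrimmer, <AssemblyName>, Version=1.0.0.0, Culture=neutral, PublicKeyToken=<token>" -Id 2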

Testing the Trimmer

Now, you can issue queries and see the beauty of the pre-trimmer logic in action for every query evaluated. Try modifying datafile.txt to vary the group membership per user (keep in mind that the contents of the data file are only reloaded every 30 seconds; the refresh interval is a constant defined in the code). Enjoy!

This blog entry describes how to take an existing custom XML connector for the Search service application, modify the connector to submit security ACLs as claims, install and deploy it for SharePoint 2013, and test it. Using this blog post, you can index external sources together with their security models within SharePoint itself. Keep reading and we will show you how.

Why Use Claims

Claims have the potential to simplify authentication logic for external contents in SharePoint 2013. Claims-based identity enables applications to know a few facts about the user’s permission to view content. Thus, we can render legacy or different security models into custom claims in SharePoint 2013.

Modifying the XML Connector to send ACL information as claims should be helpful to demonstrate how to “light up” external content in search results, i.e. how to search securely in any content.

Requirements

SharePoint 2013 Server

Visual Studio 2012

The Starting Point: XML Connector

With this XML Connector as a starting point, we need to modify the connector code as it does not send security claims with its document submissions.

The next two sub-sections are “greatly copied” from Anders Fagerhaug’s blog post. A huge thank-you and many kudos goes out to him for producing a great starting point for a wonderful custom connector!

Install Custom Connector (Thanks Anders!)

Download the zip archive attached to the bottom of this blog entry.

Unzip the contents of the zip archive to a folder on your computer, e.g. C:\CustomSecurityConnector

On the Start menu, choose All Programs, choose Microsoft Visual Studio 2012, and then choose Visual Studio Tools and open a Visual Studio command prompt.

To install the XmlFileConnector.dll, type the following command at the command prompt:
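A sketch of the installation, assuming you extracted to C:\CustomSecurityConnector; gacutil puts the assembly in the GAC, and the connector is then registered from a SharePoint 2013 Management Shell:

gacutil /i C:\CustomSecurityConnector\XmlFileConnector.dll

$searchapp = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchCrawlCustomConnector -SearchApplication $searchapp -Protocol xmldoc -Name xmldoc -ModelFilePath "C:\CustomSecurityConnector\Model.xml"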

The expected output from this should be a protocol “xmldoc” with a ModelFilePath pointing to the Model.xml given above.

Finally, we need to restart the search service. Type the following commands in the command prompt:

net stop osearch15
net start osearch15

Create Crawled Property for the Custom XML Connector (Thanks Anders!)

When the custom XML connector crawls content, the crawled properties discovered during crawl will have to be added to a crawled property category. You need to create this category. Note: The user that performs this operation has to be an administrator for the Search service application.

On the Start menu, choose All Programs, then choose Microsoft SharePoint 2013 Products and open a SharePoint 2013 Management Shell as an administrator.

To create a new crawled property category, type the following commands at the command prompt and run them, where <ConnectorName> is the name you want to give the custom XML connector, for example Custom XML Connector:
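A sketch of those commands; the -PropSet GUID is a placeholder and must match the crawled-property set GUID defined in the XmlFileConnector source:

$searchapp = Get-SPEnterpriseSearchServiceApplication
New-SPEnterpriseSearchMetadataCategory -SearchApplication $searchapp -Name "<ConnectorName>" -PropSet "<PropsetGuidFromConnectorSource>"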

Create Content Source for the XML Contents (Thanks Anders!)

To specify what, when and how the XML content should be crawled, you have to create a new content source for your XML content. Note: The user that performs this operation has to be an administrator for the Search service application.

On the home page of the SharePoint Central Administration website, in the Application Management section, choose Manage service applications.

On the Manage Service Applications page, choose Search service application.

On the Search Service Administration Page, in the Crawling section, choose Content Sources.

On the Manage Content Sources page, choose New Content Source.

On the Add Content Source page, in the Name section, in the Name box, type a name for the new content source, for example XML Connector.

In the Content Source Type section, select Custom repository.

In the Type of Repository section, select xmldoc.

In the Start Address section, in the Type start addresses below (one per line) box, type the address from where the crawler should begin crawling the XML content. The start address syntax differs depending on where the XML content is located.

If the XML content is located on a local drive, use the following syntax:
xmldoc://localhost/<XMLcontentfolder>/#x=doc:id;;urielm=url;;titleelm=title#

If the XML content is located on a network drive, use the following syntax:
xmldoc://<SharedNetworkPath>/#x=doc:id;;urielm=url;;titleelm=title#

For the XML content supplied with this blog entry (given that you extracted the zip archive to C:\CustomSecurityConnector), use:
xmldoc://localhost/C$/CustomSecurityConnector/#x=Product:ID;;titleelm=Title;;urlelm=Url#

Verify that your newly created content source is shown on the Search Service Application page.

The CustomSecurityConnector folder given with the ZIP archive contains Product.xml, which is a sample of a small product catalog.

Create a Crawl Rule for Content Source

It is useful to create a crawl rule if we want to do post-security trimming for these documents.

Go to SharePoint Central Admin, choose Search Administration.

Under the Crawling section, choose Crawl Rules and then New Crawl Rule.

In the Path field, type xmldoc://*, which should match the crawl URLs of the documents in our example Product.xml file. Even though the Url field in our data lists URLs along the lines of http://wfe/site/catalog, remember that these are display URLs. The SharePoint gatherer/crawler matches crawl rules against the access URL given in the content source settings.

Select “Include all items in this path” and then create the rule by selecting OK.

Crawl

Finally, start a full crawl of the newly created content source. Then select “View Crawl Logs” when the content source status returns to Idle. You should see 11 items successfully crawled.

Modifying the Connector to Send Custom Claims for Security

The Model.xml

First, we need to change the Model.xml of the connector. To enable sending claims from the connector, we need to submit a binary security descriptor, a boolean saying that we provide our own type of security, and finally an optional string field (docaclmeta).

Essentially, we need to notify the connector framework of our security field type descriptors plus set a few properties to enable this in the model.

Let’s start with the TypeDescriptors first. For every item that we wish to enforce custom security on, we have to set the type descriptors for the following fields:

UsesPluggableAuth as a boolean field type (if true, the document is secured by custom security claims instead of the Windows security descriptors described next)

SecurityDescriptor as a byte array for the actual encoded claims data

docaclmeta as an optional string field, which will only be displayed in the search results if populated. This field is not queryable in the index.

In the model file itself, the added lines for TypeDescriptors are encapsulated by the XML comments for Claims Security, part 1/2, like this:
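A sketch of the three TypeDescriptors, following general BDC model syntax; the exact placement inside your Model.xml’s TypeDescriptor tree must match the existing Document structure:

<!-- Claims Security, part 1/2 -->
<TypeDescriptor Name="UsesPluggableAuth" TypeName="System.Boolean" />
<TypeDescriptor Name="SecurityDescriptor" TypeName="System.Byte[]" />
<TypeDescriptor Name="docaclmeta" TypeName="System.String" />
<!-- End Claims Security, part 1/2 -->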

The Entities.cs

We need to modify the corresponding C# code for the objects (i.e. documents) that the connector will submit with custom security ACLs. We have to adjust the Document class with the same fields as specified in the Model.xml:

SecurityDescriptor

UsesPluggableAuth

docaclmeta

Thus, we added these three properties in the Entities.cs file as shown next:
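A sketch of the additions; the property names must match the TypeDescriptor names in Model.xml:

public class Document
{
    // ...the existing connector properties (ID, Title, Url, ...) remain unchanged...

    // Byte-encoded claim ACL for the document (built in XmlFileLoader.cs).
    public byte[] SecurityDescriptor { get; set; }

    // True when SecurityDescriptor holds custom claims rather than Windows ACLs.
    public bool UsesPluggableAuth { get; set; }

    // Optional hint string surfaced to post-trimmers; displayed but not queryable.
    public string docaclmeta { get; set; }
}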

The XmlFileLoader.cs

Finally, we need to modify the connector code to get the input data for the security data, translate this into a corresponding byte array of claims and set the proper field values of the Document class.

Key points to note:

UsesPluggableAuth should only be set to true if we will indeed supply claim ACLs with the document.

Claims are encoded as a binary byte stream. In this example the claim data type is always string, but this is not a requirement of the SharePoint backend.

The encoding is done according to the protocol documentation, where:

– The first byte signals an allow or deny claim
– The second byte is always 1 to indicate that this is a non-NT security ACL (i.e. it is a claim ACL type)
– The next four bytes give the size of the following claim value array
– The claim value string follows as a Unicode byte array
– The next four bytes following the claim value array give the length of the claim type
– The claim type string follows as a Unicode byte array
– The next four bytes following the claim type array give the length of the claim data type
– The claim data type string follows as a Unicode byte array
– The next four bytes following the claim data type array give the length of the claim original issuer
– The claim issuer string finally follows as a Unicode byte array
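A sketch of an encoder following that layout (the helper names are ours; whether 1 in the first byte means allow is an assumption to verify against the protocol documentation, and BinaryWriter writes the four-byte lengths little-endian). It requires using System.IO and System.Text in XmlFileLoader.cs:

private static byte[] EncodeClaimAcl(bool allow, string claimValue, string claimType,
                                     string claimDataType, string claimIssuer)
{
    using (var stream = new MemoryStream())
    using (var writer = new BinaryWriter(stream))
    {
        writer.Write(allow ? (byte)1 : (byte)0);  // allow or deny claim (assumed 1 = allow)
        writer.Write((byte)1);                    // always 1: non-NT, i.e. claim ACL type
        WriteLengthPrefixedUnicode(writer, claimValue);
        WriteLengthPrefixedUnicode(writer, claimType);
        WriteLengthPrefixedUnicode(writer, claimDataType);
        WriteLengthPrefixedUnicode(writer, claimIssuer);
        return stream.ToArray();
    }
}

private static void WriteLengthPrefixedUnicode(BinaryWriter writer, string value)
{
    byte[] bytes = Encoding.Unicode.GetBytes(value);
    writer.Write(bytes.Length);  // four-byte length of the following byte array
    writer.Write(bytes);
}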

The Input XML

We modified the input XML from the original connector, adding the claimtype, claimvalue, claimissuer and claimaclmeta fields to the input:
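An illustrative Product.xml entry with the new fields; the element names follow the list above, and all values are made up:

<Product>
  <ID>1</ID>
  <Title>Adventure Works Laptop</Title>
  <Url>http://wfe/site/catalog/1</Url>
  <claimtype>http://example.com/claims/group</claimtype>
  <claimvalue>group1</claimvalue>
  <claimissuer>customtrimmer</claimissuer>
  <claimaclmeta>allow</claimaclmeta>
</Product>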

A typical scenario to support could be to enable secure search in 3rd-party content sources.

Back in SharePoint 2010, only post-trimming was offered for solving this scenario. Post-trimming refers to post-query evaluation, where search results are pruned before they are returned to the user. With the new and shiny SharePoint 2013, we now have two types of security trimmers in our toolbox:

Pre-Trimmer

Post-Trimmer

The new pre-trimmer type is where the trimmer logic is invoked before query evaluation. The search backend rewrites the query, adding security information before the lookup in the search index.

It is our hope to present you with explanations and examples showing you:

How to take an existing connector, modify it to submit our own security ACLs directly as security claims, and install and deploy it

How to write and deploy a custom security pre-trimmer

How to write and deploy a custom security post-trimmer

I will write up a few blog posts on this in the coming days. Stay tuned!