A common feature request is the ability to limit the crawler to a certain number of levels, or directories, when crawling a site. For example, a crawler limited to 2 levels would index "http://xav.com/1/2.html" but not "http://xav.com/1/2/3.html".

This functionality can be realized using Filter Rules. Go to "Admin Page" => "Filter Rules" and choose "Create New Rule". Use the following parameters:

Name: "Limited Levels"

Enabled: checked

Action: Deny

Analyze: URL

Apply rule *unless* analyzed text contains at least *1* of

Substring: ^http://[^/]+(/[^/]*){0,2}$
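
The effect of this pattern can be checked outside the crawler. The following is a minimal Python sketch (the function name `allowed` and the `MAX_LEVELS` constant are illustrative, not part of the product): the rule denies any URL that does not match the pattern, so a match means the URL is within the level limit.

```python
import re

# Maximum number of path segments to allow; this becomes the final
# integer of the quantifier in the rule's pattern.
MAX_LEVELS = 2

# Same pattern as the Filter Rule above: scheme and host, then at most
# MAX_LEVELS path segments, anchored at both ends.
pattern = re.compile(r"^http://[^/]+(/[^/]*){0,%d}$" % MAX_LEVELS)

def allowed(url):
    """True if the URL is within the level limit (the Deny rule is skipped)."""
    return pattern.match(url) is not None

print(allowed("http://xav.com/1/2.html"))    # two segments: allowed
print(allowed("http://xav.com/1/2/3.html"))  # three segments: denied
```

Because the pattern is anchored with ^ and $, a URL with more path segments than the quantifier allows cannot match, so the Deny action applies to it.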

The final integer in the quantifier (here 2) is the maximum number of levels allowed. Examples: