Saturday, September 21, 2013

If you face the problem that the ASP.NET SessionID changes on each request, then after googling for a solution the most frequent suggestion you will find is to store something in the session: when the session is empty, ASP.NET generates a new SessionID for each request by design, which confuses many developers. But what should you do if you tried that and it didn't help?

We encountered this problem on one user's computer. The customer had a site on SharePoint 2010 with two environments: QA and production. The problem was reproducible only on QA, but since a working QA environment was critical for this project, it had to be fixed. The problem occurred in both IE and Firefox, and only from the user's computer, i.e. it was not reproducible when we opened the site directly in an RDP session.

First of all I created a simple application layouts page in order to check that the problem was still there with a non-empty session:

<%@ Page Language="C#" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
    <title></title>
</head>
<body>
    <form id="form1" runat="server">
    <%
        HttpContext.Current.Session["foo"] = "bar";
        this.lbl.Text = HttpContext.Current.Session.SessionID;
    %>
    <asp:Label ID="lbl" runat="server" />
    </form>
</body>
</html>

It showed the same session id for me on each request, but a different one each time for the user. This made me think the problem was in the client's browser configuration. The fact that everything worked on production helped, because I could compare client-side and server-side configurations for both environments. I verified that there were no significant differences between the web.configs of the two environments. Also, both the QA and production URLs (HTTPS was used for both) were added to the trusted sites list, and the proxy configurations were the same.

The solution was found quite unexpectedly: we removed the QA URL from the trusted sites list, and the session id stopped changing on each request. This was the first time in my practice that removing a site from trusted sites fixed a problem. It may of course be some customer-specific environment issue, but if you have tried all the usual solutions and they didn't work, this is one more thing to try.

It works, but it contains the domain name (example.com), which makes it specific to a particular site. You can't just copy and paste it into another web.config: you would need to change the values to the new host header.

Here I used the possibility of adding negative conditions (the host doesn't match the www pattern on line 4) and of using server variables in actions ({HTTP_HOST} in the action URL on line 6). It works like the first rule, but can be used as-is on any site accessed over HTTP (to make it work for HTTPS, change http to https on line 7; note also that in a real implementation lines 6 and 7 should be on a single line).
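A hedged sketch of an IIS URL Rewrite rule with the features just described (the rule name and match pattern are illustrative assumptions): the negative condition fires only when the host does not start with www, and {HTTP_HOST} is reused in the redirect URL so no domain name is hardcoded:

```xml
<rule name="Redirect to www" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <!-- negate="true": the rule fires only when the host does NOT start with www -->
    <add input="{HTTP_HOST}" pattern="^www\." negate="true" />
  </conditions>
  <!-- {HTTP_HOST} keeps the rule independent of a particular domain name -->
  <action type="Redirect" url="http://www.{HTTP_HOST}/{R:1}" />
</rule>
```

For a site served over HTTPS, change http to https in the action URL.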

Friday, September 20, 2013

If you find that after logging in to your WordPress blog the dashboard (/wp-admin) is empty, most of the suggestions you will find tell you to check for extra lines and spaces in wp-config.php and functions.php. First of all, try to enable additional logging by setting WP_DEBUG to true in wp-config.php:

define('WP_DEBUG', true);

It may show the following warning on the wp-login.php page:

Cannot modify header information - headers already sent by (output started at …/wp-config.php:1) in wp-login.php on line xxx

If you tried the other suggestions and they didn't help, try saving wp-config.php in ANSI encoding (regular Windows Notepad allows this). It may have been saved as Unicode, in which case it has several hidden bytes at the beginning, which you won't be able to see or remove in most common editors.

Saturday, September 14, 2013

In this post I would like to describe one interesting problem with the search crawler and managed metadata fields in SharePoint 2013. In some situations you may find that your taxonomy fields are not crawled. Let's assume that we provisioned the site and created several managed metadata fields using the following declarations:

Here, as always, we provision two fields: the hidden Note field and the managed metadata field itself, which references the Note field. After that we create some content in a document library or list using a content type which contains this field (this is an important step: without at least one item with non-empty values in the managed metadata fields, crawled and managed properties for them won't be created) and run a full crawl of the site. During crawling, if everything goes properly, SharePoint creates two crawled properties and one managed property for each managed metadata field (it actually creates them for other field types too, but in this article we are talking only about managed metadata):

As shown in the picture above, the following crawled and managed properties are created automatically:

Crawled property           Mapped managed property
ows_{field name}           -
ows_taxId_{field name}     owstaxId{field name}

One crawled property (ows_MyManagedMetadataField) is not mapped to a managed property initially (after the first full crawl); the other (ows_taxId_MyManagedMetadataField) is mapped to a managed property (owstaxIdMyManagedMetadataField), which is also created automatically during the crawl. After that you should create a second managed property and map it to the crawled property that has no mapping. In all your queries, content by search web parts, search result sources, display templates, etc., you should use this second, manually mapped managed property, not the one created automatically (for the manually created managed property you may set its properties like Searchable, Queryable, etc. as you need, while for the automatically created property SharePoint sets them itself and it is better not to change them). After that run a full crawl again.

This is how the search schema should look if everything went correctly. However, if you provisioned managed metadata fields using the code shown above, you will have the following problem: after crawling, either no crawled properties are created at all, or only one crawled property is created, without a mapping to a managed property. This fact by itself is not critical. The problem, however, is that your KQL queries which filter data based on managed metadata fields won't return any results. It indicates that something went wrong during crawling.

The problem is caused by the way the hidden Note field of the managed metadata field is provisioned. As you can see above, it has the following name: MyManagedMetadataField_0, i.e. it uses the format {managed metadata field name}_0. But as it turned out, in order to have managed metadata fields crawled correctly, it should use another format:

{managed metadata field name}TaxHTField

i.e. it should be named MyManagedMetadataFieldTaxHTField:

<Field Type="Note"
       DisplayName="MyManagedMetadataFieldTaxHTField"
       MaxLength="255"
       Group="My Fields"
       ID="{4FCE0732-EF53-4b43-B678-3D2FC28D9A29}"
       StaticName="MyManagedMetadataFieldTaxHTField"
       Name="MyManagedMetadataFieldTaxHTField"
       Hidden="TRUE"
       ShowInViewForms="FALSE"
       Description="" />

This fact is not documented, but this is how SharePoint provisions its own managed metadata fields (e.g. Enterprise Keywords). If you search the SharePoint search assemblies in a reflector for "TaxHTField", you will see usages of this suffix in the code, i.e. the search crawler really depends on it. This is the only change needed in the above example to make MyManagedMetadataField properly crawlable. Now you know how to properly provision managed metadata fields declaratively :).
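For completeness, the companion managed metadata field declaration that references this hidden Note field might look like the sketch below (the taxonomy field's ID is a made-up example; the TextField property value is the Note field's ID from the declaration above):

```xml
<Field Type="TaxonomyFieldType"
       DisplayName="MyManagedMetadataField"
       Group="My Fields"
       ID="{B3A1D2C4-5E6F-4a7b-8C9D-0E1F2A3B4C5D}"
       StaticName="MyManagedMetadataField"
       Name="MyManagedMetadataField"
       ShowField="Term1033">
  <Customization>
    <ArrayOfProperty>
      <Property>
        <Name>TextField</Name>
        <!-- Points at the hidden Note field declared above -->
        <Value xmlns:q6="http://www.w3.org/2001/XMLSchema"
               p4:type="q6:string"
               xmlns:p4="http://www.w3.org/2001/XMLSchema-instance">{4FCE0732-EF53-4b43-B678-3D2FC28D9A29}</Value>
      </Property>
    </ArrayOfProperty>
  </Customization>
</Field>
```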

Saturday, September 7, 2013

During implementation of a search-driven site on SharePoint 2013, we faced the following problem: for some pages, titles were cut off on the search results page:

I.e. for most pages titles were fine, but for some of them they were cut as shown in the picture above, e.g. "Find the right solution" was cut to "he right", "We as business partner" to "as busine", etc. These titles are not exactly the same as on the real site, but they illustrate the basic problem. There was no visible consistent pattern to the cutting: for some titles several full words were included, for others words were cut, and the part shown didn't contain the keywords entered by the user in the search box.

Investigation showed that the problem comes from the standard Item_CommonItem_Body.html display template. The title is rendered with the following code inside it:

I.e. at first it gets the highlighted title using the Srch.U.getHighlightedProperty() method (line 1), and then trims it for insertion into the "a" tag using the Srch.U.trimTitle() method (line 8). The first thing that came to my mind was that the trimTitle method worked incorrectly. This method (as well as getHighlightedProperty) is defined in the Search.ClientControls.js file, which is located in the 15/template/layouts folder. I added temporary traces to it using console.log and found that the wrong title is already returned from the getHighlightedProperty method. Here is the code of this method:

The key parameter passed to this method is the id of the item shown in the search results, result is ctx.CurrentItem, and property in our case is "Title". Using traces I checked that it reaches line 12 (property = 'HHTitle') and then reads the property "HHTitle" from the object returned by the call Srch.U.$5H(result['HitHighlightedProperties']) (lines 6-7). I.e. the problem is caused by incorrect calculation of the highlighted title stored in ctx.CurrentItem['HitHighlightedProperties'].

Tests showed that the current browser language affects this method, i.e. a title may be cut for one language but shown properly for another. As a workaround for this problem you may change the code of the Item_CommonItem_Body.html display template to the following:
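A minimal sketch of one possible change (an assumption, not necessarily the author's exact workaround): bypass the broken hit-highlighting entirely and render the plain Title property of the current item:

```html
<!--#_
// Take the plain (non-highlighted) title and HTML-encode it, so the broken
// HitHighlightedProperties value is never consulted.
var plainTitle = $htmlEncode(ctx.CurrentItem.Title);
_#-->
<a href="_#= ctx.CurrentItem.Path =#_" title="_#= plainTitle =#_">_#= plainTitle =#_</a>
```

The trade-off is that keyword highlighting in the title is lost, but titles are no longer cut.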
